Categories
2.1 Advanced and Experimental 3D Computer Animation Techniques, Session with George, Session with Serra

Artifact _ Final + Showreel

Story:

In my life, I often procrastinate on various tasks, such as doing my homework, doing laundry, and washing dishes. Whenever I procrastinate, I console myself by thinking that I’m passing these tasks on to “tomorrow’s me,” knowing full well that the future me will be very annoyed. Sure enough, the next day I am indeed annoyed, yet I still end up passing the unfinished tasks on to the me of the day after. This cycle continues until the final deadline approaches, forcing me to face the mountain of tasks I’ve put off. This vicious cycle of procrastination leaves me extremely stressed every time, yet I find it hard to break free from it.


Artifact 2.2_After Effects

The post-production work for my project is primarily done in After Effects (AE). By using AE, I can finely tune and optimize the animation, including adding special effects, color correction, and compositing.

Tracking

In the production process, I utilized the “motion tracking” technique. Videos shot with a mobile phone often have slight shakes, and if you place a static animation into such footage, these shakes become very noticeable and disruptive. To address this, I used motion tracking to capture the shakes in the video. I then applied the analyzed position data to the animation layer and manually adjusted the overall position. This approach ensures that the animation syncs with the shakes in the video, making the entire scene appear more natural and cohesive, thereby enhancing the viewer’s sense of immersion and visual experience.
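The idea behind this step can be sketched in plain Python: the per-frame positions produced by the tracker are combined with a manual offset to drive the animation layer’s position. The numbers below are made-up values for illustration, not data from the project and not AE’s actual scripting API.

```python
# Hypothetical per-frame positions (x, y in pixels) produced by motion tracking.
tracked = [(640.0, 360.0), (641.5, 358.8), (639.2, 361.1), (642.0, 359.5)]

# Manual offset that places the animation where it belongs in the shot.
offset = (120.0, -45.0)

def apply_track(tracked, offset):
    """Add the tracked camera shake to the animation layer's base position."""
    return [(x + offset[0], y + offset[1]) for x, y in tracked]

# The animation layer now shakes together with the footage.
animation_positions = apply_track(tracked, offset)
print(animation_positions[0])  # first-frame position of the animation layer
```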

Adjusting the color

After completing the animation rendering, I noticed that the brightness of the animation did not match the original video. To better integrate the animation into the video, I added a color mask to the animation layer. Based on the environment, I chose a gray-green color for the mask and then adjusted the blend mode and opacity of the color layer. This approach made the entire scene look more harmonious. This method not only corrected the brightness discrepancy but also enhanced the cohesion between the animation and the actual scene, resulting in a more natural and unified final effect.
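The effect of a color layer with a blend mode and reduced opacity can be illustrated with the simplest case, a “normal” blend. This is a minimal sketch; the gray-green RGB value is an assumed example, not the exact color used in the project.

```python
def blend_normal(base, color, opacity):
    """Mix a flat color layer over a base pixel: out = base*(1-a) + color*a."""
    return tuple(round(b * (1 - opacity) + c * opacity) for b, c in zip(base, color))

base_pixel = (200, 180, 170)   # an overly bright pixel from the rendered animation
gray_green = (110, 130, 115)   # assumed gray-green mask color
corrected = blend_normal(base_pixel, gray_green, 0.3)
print(corrected)  # the pixel is pulled toward the environment's tone
```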

Shine a light

In my project, I used a lighting plugin called “CC Light Sweep.” This plugin can analyze the contours of my animation and generate an outline light effect. By using CC Light Sweep, I can precisely adjust the direction and intensity of the light source, making the lighting in the animation appear more natural and realistic. This plugin not only enhances the visual impact of the animation but also improves the overall texture of the scene, making the characters and environments look more three-dimensional and vivid under the lighting.

Roto Brush

In some scenes, to place the animation behind certain props, I used the “Roto Brush” tool. First, I used the Roto Brush to outline the required parts and then analyzed the entire video to determine the position of the props. This process extracts the selected parts, generating a transparent background. Next, I placed the processed video above the animation layer to establish the correct occlusion relationship. To make the occluded areas look more natural, I adjusted the “Feather” and “Contrast” settings of the Roto Brush. This method ensures that the animation integrates more seamlessly with the video, and the occlusion effect appears more realistic.
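Feathering an alpha edge amounts to softening the hard 0/1 boundary of the matte. The toy 1-D box blur below is a sketch of the general idea, not Roto Brush’s actual algorithm.

```python
def feather(alpha, radius):
    """Soften a hard matte edge by averaging each alpha value with its neighbors."""
    out = []
    for i in range(len(alpha)):
        window = alpha[max(0, i - radius): i + radius + 1]
        out.append(sum(window) / len(window))
    return out

hard_edge = [0.0, 0.0, 0.0, 1.0, 1.0, 1.0]  # hard matte boundary
print(feather(hard_edge, 1))                # the edge becomes a gradual ramp
```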

Tracking

Regarding motion tracking, there are some additional details to consider. The tool includes two frames: an outer frame and an inner frame. The outer frame analyzes the selected area, which can be set to the estimated maximum required range. The inner frame specifically analyzes the information around the focal point. If the outer frame’s area is too large, the analysis time will significantly increase, but the analysis will be more accurate. However, if the tracked object’s movement speed is fast and the outer frame’s range is too small, tracking errors may occur. Although manual adjustments can be made in such cases, they may lack precision. Therefore, when using the motion tracking tool, it is crucial to choose an appropriate outer frame size, balancing the analysis time and tracking accuracy.
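The outer-frame tradeoff can be demonstrated with a toy 1-D tracker, where the search range plays the role of the outer frame: a larger range means more comparisons (longer analysis) but tolerates faster motion, while a range that is too small loses the feature. The signal values are made up for illustration.

```python
def track_step(frame, template, prev_pos, search_range):
    """Find the template near prev_pos. A larger search_range costs more
    comparisons (slower analysis) but tolerates faster motion."""
    best_pos, best_err = None, float("inf")
    lo = max(0, prev_pos - search_range)
    hi = min(len(frame) - len(template), prev_pos + search_range)
    for pos in range(lo, hi + 1):
        err = sum((frame[pos + i] - t) ** 2 for i, t in enumerate(template))
        if err < best_err:
            best_pos, best_err = pos, err
    return best_pos

template = [9, 9]               # the feature under the inner frame
frame = [0, 0, 0, 0, 0, 9, 9]   # the feature jumped from index 1 to index 5
print(track_step(frame, template, 1, 2))  # range too small: the feature is lost
print(track_step(frame, template, 1, 5))  # large enough range: finds index 5
```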


Artifact 2.1_Animation

In Maya

To ensure the accuracy of the animation’s positioning, I first used my phone to film the necessary scenes and blocking of the storyline. Then, I used Premiere Pro (PR) to export the image sequence and imported these frames into Maya as a reference for the animation. By doing so, I can precisely match the actual scenes and adjust the positions of characters and objects, making the animation more realistic and cohesive. This process not only improves the accuracy of the animation but also makes the entire creation process more efficient and organized.

In my project, A stands for animation, and S stands for shots. Whole numbers represent the scenes from the first day, while those with additional digits represent the scenes from the second day. This naming convention helps me clearly organize and distinguish between animations and shots from different timelines, making the entire creation process more orderly and manageable. For example, A_S1 denotes the animations and shots from the first day, whereas A_S1.2 denotes those from the second day. This approach allows me to efficiently arrange and adjust the content from different days, ensuring the coherence and logic of the story.
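The convention can be captured in a small helper function; this is just a sketch of the naming rule described above, not a script used in the project.

```python
def parse_shot_name(name):
    """Split names like 'A_S1' or 'A_S1.2' into (shot, day).
    A whole number means the first day; a trailing '.2' marks the second day."""
    assert name.startswith("A_S")
    number = name[len("A_S"):]
    if "." in number:
        shot, day = number.split(".")
        return int(shot), int(day)
    return int(number), 1

print(parse_shot_name("A_S1"))    # shot 1, day 1
print(parse_shot_name("A_S1.2"))  # shot 1, day 2
```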

In my project, I use Maya for rendering and export the output in PNG format. This approach makes it easier to integrate the characters and scenes in After Effects (AE).


Artifact 2.0

After my last attempt to scan my room, I suddenly came up with a new theme. I decided to try showcasing both the scene and character animation together. As a result, I have revised and adjusted the story to fit the new theme and presentation style.

This is a short story about procrastination. Procrastination is a widespread and impactful issue, affecting many people in various aspects of their lives, including work, study, and daily tasks. Procrastination not only affects personal efficiency and productivity but also leads to stress and anxiety. Research shows that approximately 15% to 20% of adults frequently procrastinate. In the student population, this percentage is even higher, with nearly 80% to 90% of college students admitting that they sometimes procrastinate. These statistics indicate that procrastination is not a problem faced by a few but a common phenomenon that deserves our attention and discussion.


Since this story is based on my life, I created the character model directly using myself as a reference. When I am at home, I usually use hair clips to pin up my long hair, which often creates different hairstyles, making it look like the hair is blossoming. To make the character more authentic and relatable, I designed the pajamas as a loose T-shirt and shorts, which is exactly what I usually wear at home. This design not only makes the character look more natural and realistic but also allows the audience to relate more easily.

Character Model
Rigging by Mixamo and Advanced Skeleton (Adv)

First, I uploaded the model to Mixamo for quick rigging. This step quickly generates the basic skeleton structure and weight distribution. Then, I imported the rigged model into Maya and used the Advanced Skeleton (Adv) tool to match the corresponding bones and generate controllers. Next, I used Adv for facial rigging, which allows me to create expressive facial animations. Through these steps, I can ensure that the model’s movements and expressions are naturally and smoothly rendered, laying a solid foundation for the subsequent animation production.

Materials by Substance Painter (SP)

I used Substance Painter (SP) to create the materials, giving the clothes and pants different textures. In my story, although it is the same character, the events happen on different days. Therefore, I used different colors for the clothes, pants, and slippers to indicate the different dates. Specifically, yellow represents the first day, blue represents the second day, and red represents the third day, which appears for only 10 seconds. This approach allows me to clearly show the passage of time, making the storyline more coherent and easier to understand.


Artifact 1.1

The inspiration for my story comes from my own life, so I chose my current bedroom as the setting for my story. By using a familiar environment, I can more accurately depict every detail in the scene, enhancing the story’s realism and intimacy.

I hoped that scanning this familiar environment would provide a precise and realistic scene reference for my animation. This approach not only enhances the detail and authenticity of the animation but also saves me a significant amount of time and effort in manual modeling. By utilizing the existing bedroom setting, I can focus more on the design of the characters and the storyline, making the entire creation process more efficient. Additionally, a realistic setting helps to enhance the audience’s sense of immersion and resonance.

Scanning my room
Final results

However, the scanning results were not satisfactory, with many interpenetration and misalignment issues. There were also blank areas where the scan failed to capture certain parts. This significantly compromised the integrity and accuracy of the scene and would require additional time and effort to fix and fill in the gaps. The situation made me reconsider the feasibility of using the scanned data and look for alternative solutions to ensure the quality of the final scene and the smooth progress of the animation.


Week 5: TouchDesigner to UE

In this lesson, we streamed effects from TD onto UE objects in real time.

  1. Setting up TouchDesigner (TD): In TD, add a “Spout Out” node after the node you want to output; this shares the video effect between applications in real time. Note that the output name must be the same in both programs.
  2. Setting up Unreal Engine (UE): In UE, make sure the “Off-World Live” plugin is enabled. Place an “Actor” in the scene to receive the TD data; this “Actor” is the “OWL Spout Receiver Manager”.
  3. Adjust the settings in the Details panel of the “OWL Spout Receiver Manager” to configure the receiver, including specifying input names and creating render targets.
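The detail that most often breaks this setup is a name mismatch between the two programs. The check itself is trivial but worth stating; the names below are hypothetical.

```python
def spout_link_ok(td_sender_name, ue_input_name):
    """The UE receiver only finds the Spout stream when TD's sender name and
    the input name configured on the OWL Spout Receiver Manager match
    (exact string equality is an assumption of this sketch)."""
    return td_sender_name == ue_input_name

print(spout_link_ok("TD_out", "TD_out"))  # True: UE can receive the TD texture
print(spout_link_ok("TD_out", "td_out"))  # False: names differ, nothing arrives
```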

Week 4: TouchDesigner

TouchDesigner is a node-based visual programming language for real-time interactive multimedia content. It can be used to create interactive installations, live visuals, and generative art. TouchDesigner enables artists and designers to craft immersive, dynamic, and highly responsive multimedia projects. It can seamlessly integrate various data sources, sensors, and multimedia elements to produce captivating and engaging interactive experiences.

Two factors were interacting with the visuals:

  • The ambient music.
  • A microphone placed in the middle of the installation.

I believe that in TD (TouchDesigner), as long as you can connect the nodes, you can create truly amazing effects. Each node represents different functions and operations, and by flexibly combining and connecting these nodes, you can achieve various complex visual and interactive effects. Whether it’s creating real-time audiovisual experiences or building dynamic multimedia displays, the connections between nodes offer endless possibilities and creativity. Once you master the use of these nodes, you can explore an infinite array of artistic expressions and technical innovations in TD.

Four effects I tried out.


Artifact 1.0

Storyboard:

In the bedroom at night, a character is sitting on the bed playing with his mobile phone. The time displayed on the phone is ??:??. The character lies down and closes his eyes. The next second, he opens his eyes in a white room. In front of him is a canvas, and underneath it are several paint buckets and a roller brush. He picks up the brush and makes a stroke on the canvas, and colors and patterns appear. He continues with several more strokes, and different colors and patterns appear. He paints faster and faster. Suddenly, the dream is over. He sits up in reality with his eyes open and picks up his mobile phone, which shows the time ??:??.

Research board


Week 3: Live Link VCAM

Live Link VCAM is a tool that can link mobile devices with Unreal Engine. It allows viewers to see and experience scenes from the project in the first-person perspective, essentially functioning as a simplified VR device, though it is presented in a flat format. I believe this tool greatly facilitates the creation of first-person view (POV) animations for creators. When creating a POV shot, I need to consider its position, angle, movement speed, and even the shake and amplitude of the camera. This usually requires numerous adjustments to achieve a good result. However, with this tool, I can easily create these shots. It allows me to intuitively adjust and optimize the first-person perspective, saving a lot of time and effort while enhancing the overall quality and viewing experience of the animation.

First, we need a UE project (this is mandatory).

Then we need to enable five plugins:

  1. Take Recorder
  2. Apple ARKit
  3. Live Link
  4. RemoteSession
  5. VirtualCamera

1. If the computer is connected to the phone’s WiFi.

After the computer successfully connects to the phone’s WiFi, on the PC press Win+R, type cmd, and press Enter. At the command prompt (after “Users\name>”), enter ipconfig.

Then find your computer’s IPv4 address and enter it into UE as the “Unicast Endpoint”, appending “:0”. For example, my computer’s IP address is 172.20.10.8, so I enter 172.20.10.8:0.
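Building the endpoint string is simple enough to sketch; the helper below is an illustration using the example address from the text, not part of UE itself.

```python
def unicast_endpoint(ipv4, port=0):
    """Build the "Unicast Endpoint" string UE expects, in the form ip:port.
    The trailing :0 is the port suffix described above."""
    octets = ipv4.split(".")
    assert len(octets) == 4 and all(o.isdigit() and 0 <= int(o) <= 255 for o in octets)
    return f"{ipv4}:{port}"

print(unicast_endpoint("172.20.10.8"))  # -> 172.20.10.8:0
```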

In this way, the Live Link VCAM app on the mobile device can search for the computer and connect. We can then make it “active” in the project.

2. If at home (using public WiFi), we need to set it up differently.

This is a video showing my phone successfully connected to the computer and “active” in my project. As you can see, the connection between the camera and the phone is very successful, allowing me to view my project environment in a 360-degree panorama. However, when I tried to walk around, the camera hardly moved. I later discovered that this was because my scene was too large, making the movement distance almost negligible within the project. Fortunately, this issue can be adjusted using tools on both sides. I can make the necessary adjustments in the settings of both the phone and the project tools to ensure that the camera more noticeably follows and reflects the movement accordingly.


Week 1: Motion Capture

Thank you, actor Kai!

Today we tried the Vicon optical motion-capture system. It was very interesting!

Vicon’s motion capture solution can precisely track and record the movements of humans and objects, providing highly accurate data for animation, analysis, and research. The system uses 16 ultra-sensitive cameras arranged in a circular setup on the ceiling to capture the movement of 54 reflective markers attached to the clothing (excluding the fingers). This high-precision capture system ensures that every subtle movement is accurately recorded, providing a solid foundation for animation production and motion research. It can capture complex motion details and significantly improve the efficiency and quality of animation production, allowing animators to focus more on creative expression and storytelling. The application of this technology makes animation production more convenient and efficient.

Motion Capture by Kai

A funny bit of motion capture: lying on the ground, being dragged along.