This week’s feedback can be summarized as follows: in certain expressions where the eyes are not looking directly forward, the facial lines form an angle that opens in the direction of the gaze. Some actions and transitions lack breakdowns before moving to spline. As for mouth shapes, they are typically defined as trapezoids, with one side of the lips flatter and the other more curved.
As for body posture, it’s important to reduce excessive movement, and drag is essential during turns to keep the transition natural.
Resolume Arena is a powerful real-time video mixing and VJ software designed for live performances and multimedia displays. It offers a flexible interface and rich features, including multi-layer video mixing, audio-reactive visuals, numerous effects and transitions, advanced Output Mapping, multi-screen output, and support for Spout and NDI for real-time video signal transmission with other software and devices. Additionally, it provides convenient media management tools, making it ideal for concerts, art installations, and other multimedia events.
The software can also be controlled from the keyboard.
Clips can be dragged and dropped together or separately into different layers and columns; the layers are mixed into the output in order.
Clip details can be adjusted in the clip panel. Effects can be dragged and dropped onto the composition or onto individual clips: an effect applied at the composition level affects all layers, while one applied to a specific clip affects only that layer.
We can customize the shortcuts ourselves, and by changing a shortcut’s target we can choose whether it controls all layers or just one clip.
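The layer and effect behavior described above can be summarized in a small toy model. This is only an illustrative sketch of the mixing logic, not Resolume’s actual engine or API: layers are mixed bottom-to-top by opacity, clip effects apply to their own clip, and composition-level effects apply to the mixed result.

```python
# Toy model of Resolume-style layer stacking (illustrative only, not Resolume's
# API): layers mix bottom-to-top, clip effects touch one clip, and
# composition-level effects touch the final mixed output.
def render(layers, composition_effects=()):
    out = 0.0
    for clip_value, clip_effects, opacity in layers:  # bottom to top
        for fx in clip_effects:           # effect on a clip: only this clip
            clip_value = fx(clip_value)
        out = out * (1.0 - opacity) + clip_value * opacity
    for fx in composition_effects:        # effect on the composition: all layers
        out = fx(out)
    return out

invert = lambda v: 1.0 - v
layers = [
    (0.2, (), 1.0),         # bottom layer, no clip effects, fully opaque
    (0.8, (invert,), 0.5),  # top layer with an invert effect at 50% opacity
]
result = render(layers)
```

Here a single grayscale value stands in for a whole frame, which is enough to see why the same effect behaves differently depending on where it is dropped.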
When attempting to import the initial rig (character skeleton) into Unreal Engine, we encountered difficulties, particularly with texture mapping. To address this, I tried MetaHuman, Unreal Engine’s built-in character creation tool, but ran into real-time facial capture issues when retargeting to Unreal Engine. After two weeks of effort with no progress, we decided to revert to the initial model.
Subsequently, I took steps to improve the initial model: I cleaned up the model’s history, re-mapped the UVs, used Substance Painter to create new textures, and re-bound the character’s skeleton with the Advanced Skeleton tool.
Process in Substance Painter
Process in Marvelous Designer
After completing animation work in Maya, I conducted cloth testing and simulation using Marvelous Designer. Marvelous Designer is a professional cloth simulation software that can simulate the movement and deformation of cloth, adding a more realistic clothing effect to characters.
“I’m sorry that people are so jealous of me, but I can’t help it that I’m popular.” After discussion, we picked this dialogue. The next step is to record our own reference.
I’ve sketched out some ideas, and this week’s task is to begin our blocking.
First, I created a circle. After adjusting its size, I edited the edges of the circle using noise. Then, I used the Level and Composite nodes to modify the inside of the circle.
Next, I imported a photo and connected it, changing the base circle’s color. I added a transform and adjusted the rotation.
Finally, I added an Edge node to enhance the edge variety, then adjusted the color and noise details. I used a Movie File Out node to export the final result.
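The node chain above runs on the GPU inside TouchDesigner, but the same idea can be roughly approximated on the CPU with NumPy. This is only a sketch of what each operator computes; the function names and parameters are my own stand-ins, not TouchDesigner’s actual operator API.

```python
import numpy as np

def circle_mask(size, radius):
    """Rough analogue of a Circle node: white disc on black."""
    y, x = np.mgrid[0:size, 0:size]
    c = (size - 1) / 2.0
    return ((x - c) ** 2 + (y - c) ** 2 <= radius ** 2).astype(np.float32)

def add_edge_noise(img, amount, seed=0):
    """Rough analogue of perturbing the circle's edge with noise."""
    rng = np.random.default_rng(seed)
    noise = rng.random(img.shape).astype(np.float32)
    gy, gx = np.gradient(img)
    edges = np.abs(gy) + np.abs(gx)          # nonzero only near the rim
    return np.clip(img + amount * noise * (edges > 0), 0.0, 1.0)

def levels(img, black, white):
    """Rough analogue of a Level node: remap black/white points."""
    return np.clip((img - black) / (white - black), 0.0, 1.0)

def composite_over(fg, bg, mask):
    """Rough analogue of a Composite node in 'over' mode."""
    return fg * mask + bg * (1.0 - mask)

base = circle_mask(256, 100)
noisy = add_edge_noise(base, 0.5)
graded = levels(noisy, 0.1, 0.9)
photo = np.full((256, 256), 0.6, dtype=np.float32)  # stand-in for the imported photo
out = composite_over(photo, np.zeros_like(photo), graded)
```

The single-channel float images here stand in for the RGBA textures TouchDesigner actually passes between nodes; the point is the order of operations, not the pixel math.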
I am hesitant to continue using Marvelous Designer (MD).
Using MD requires a great deal of patience. For example, the wind field effect actually blew my cape into the sky, which made me feel helpless. Also, in Unreal Engine, character mapping was problematic and I had to cover the character’s neck with fabric.
After two days of hard MD research, I decided to make two shots in Unreal Engine. One referenced The Fall, the other referenced Euphoria.
Below are some of the MD tests. In some of them the fabric material could not be calculated correctly, and combined with the poor animation the results were not ideal, so I will not show those here.
Today we received the final polishing feedback: I need to track the arcs not only of the box but also of the hands. There are also many parts where I can add slow in and slow out to show weight and speed.
For the acting part, we need to choose three five-second audio clips without video, imagine the setting the character is in and the story before and after the line, and then find the right rig.
Use a Spout Out node in TouchDesigner (TD) to output the visual data. Spout is a system for sharing textures between applications in real time.
Spout Out sends the entire texture to be shared with other applications, such as Unreal Engine.
Adjust settings in TouchDesigner as needed, including specifying the output name.
Setting up Unreal Engine:
Ensure Off-World Live plugin is enabled.
Place a receiver actor in the scene to receive data from TouchDesigner. This actor is the “OWL Spout Receiver Manager.”
Customize the receiver manager as needed, specifying the input name and creating a render target to display the received data.
Apply the render target to a surface or shape in the Unreal scene to visualize the data.
Adjusting Settings:
In Unreal Engine, adjust settings in the Details panel of the OWL Spout Receiver Manager to configure the receiver, including specifying the input name and creating a render target.
Testing and Troubleshooting:
Test the setup to ensure the data is correctly transmitted from TouchDesigner to Unreal Engine.
Troubleshoot any issues that may arise, such as incorrect naming or missing configurations.
Workflow Considerations:
Understand the workflow for sending and receiving data between TouchDesigner and Unreal Engine.
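Spout itself shares GPU textures between applications by name: a sender publishes a texture under an agreed name, and a receiver attaches to it by that same name, which is why the input/output names above must match. As a loose CPU analogy (not Spout’s actual mechanism or API, and with illustrative names), Python’s named shared memory follows the same publish-by-name, attach-by-name pattern:

```python
# CPU analogy for Spout's name-based texture sharing: a "sender" publishes a
# frame buffer under an agreed name, a "receiver" attaches by that name.
# The sender name and frame size below are illustrative.
import numpy as np
from multiprocessing import shared_memory

SENDER_NAME = "td_spout_demo"   # both sides must agree on this name
H, W, C = 720, 1280, 4          # RGBA frame

# Sender side: publish a frame under the agreed name.
frame = np.zeros((H, W, C), dtype=np.uint8)
frame[..., 3] = 255             # opaque alpha
shm = shared_memory.SharedMemory(name=SENDER_NAME, create=True, size=frame.nbytes)
sender_view = np.ndarray(frame.shape, dtype=frame.dtype, buffer=shm.buf)
sender_view[:] = frame

# Receiver side: attach by name, as the OWL Spout Receiver resolves its input name.
rx = shared_memory.SharedMemory(name=SENDER_NAME)
received = np.ndarray((H, W, C), dtype=np.uint8, buffer=rx.buf).copy()

rx.close()
shm.close()
shm.unlink()
```

If the names don’t match, the receiver simply finds nothing, which mirrors the “incorrect naming” failure mode in the troubleshooting step above.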
Vicon Mocap data Into Unreal
Importing Mocap Animation:
If animation is not available, reimport the animation using FBX import options.
Choose the Vicon female/male skeleton.
Import all and ignore any prompts.
Export Access for Animation:
If animation export access is unclear due to angles, adjust the import rotation.
Skeleton Adjustment:
If animation appears stretched, adjust the skeleton tree in the animation sequencer.
Retain hips in animation but change everything else to skeleton.
Editing Animations:
Two methods are shown for editing animations:
Easy way: directly manipulate bones in the skeleton tree by keyframing.
Harder, proper way: create an IK rig for precise adjustments, especially for Mannequin characters.
IK Rig and Retargeting:
Set up IK rigs for proper retargeting from Vicon to Mannequin models.
Cleaning up Mocap:
Align the source (Vicon) and target (Mannequin) models.
Adjust rotations and positions to match.
Ensure foot rotations match leg adjustments to prevent sideways steps.
Exit edit mode and export the retargeted animation.
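Conceptually, the retargeting step above maps each source bone to its counterpart on the target skeleton and carries the motion across, with per-bone corrections like the foot-rotation fix. A minimal toy sketch of that mapping step (all bone names are hypothetical, and real retargeters compose quaternion rotations rather than adding Euler angles as done here):

```python
# Toy sketch of the bone-mapping step in retargeting: each Vicon-style source
# bone maps to a Mannequin-style target bone, with an optional per-bone Euler
# offset (e.g. the foot-rotation correction). Bone names are hypothetical.
VICON_TO_MANNEQUIN = {
    "Hips": "pelvis",
    "Spine": "spine_01",
    "LeftUpLeg": "thigh_l",
    "LeftLeg": "calf_l",
    "LeftFoot": "foot_l",
}

def retarget_pose(source_pose, offsets=None):
    """source_pose: {vicon_bone: (rx, ry, rz)} -> mannequin pose dict."""
    offsets = offsets or {}
    target_pose = {}
    for src, tgt in VICON_TO_MANNEQUIN.items():
        if src not in source_pose:
            continue  # bone absent in this frame's data
        rx, ry, rz = source_pose[src]
        ox, oy, oz = offsets.get(tgt, (0.0, 0.0, 0.0))
        target_pose[tgt] = (rx + ox, ry + oy, rz + oz)
    return target_pose

frame = {"Hips": (0.0, 90.0, 0.0), "LeftFoot": (10.0, 0.0, 0.0)}
pose = retarget_pose(frame, offsets={"foot_l": (-10.0, 0.0, 0.0)})
```

The per-bone offset is where adjustments like “ensure foot rotations match leg adjustments” would live; a bone with no mapping, or no data in a frame, is simply skipped.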
Mocap Cleanup:
Import retargeted animation into sequencer.
If animation is not visible, close and reopen Unreal Engine.
Bake the animation to the control rig to generate keyframes.
Make necessary adjustments to clean up mocap data.
Final Steps:
After cleaning up mocap, continue regular adjustments as needed.
This week’s feedback focused on refining the curves of various body parts and ensuring that the hips follow smoothly. Easing in and out remains a critical aspect to pay close attention to, especially in mastering the rhythm (timing).
It’s worth noting that, concerning the curves, I only observed the curves of the box but overlooked those of the hands, which are also crucial as they are prominently visible to the camera.