I am hesitant to continue using Marvelous Designer (MD).
Using MD requires a great deal of patience. For example, the wind field effect blew my cape straight into the sky, which left me feeling helpless. In Unreal Engine, the character's mapping was also problematic, and I had to cover the character's neck with fabric.
After two days of hard MD research, I decided to make two shots in Unreal Engine instead: one referencing The Fall, the other referencing Euphoria.
In some of my MD tests, the fabric material could not be simulated correctly, and combined with the poor animation the results were not ideal, so I will not show them here.
Use a Spout Out node in TD to output the visual data. Spout is a system for sharing textures between applications in real time.
Spout Out publishes the texture so other applications, such as Unreal Engine, can receive it.
Adjust settings in TouchDesigner as needed, including specifying the output (sender) name, as in the sketch below.
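To make the TD side concrete, here is a minimal Python sketch you could run from a Text DAT inside TouchDesigner. The operator class and parameter names follow current TD builds (verify against your version); the source TOP path and the sender name 'TDOut' are placeholders of my own.

```python
# Run inside TouchDesigner, e.g. from a Text DAT.
# '/project1/comp1' is a placeholder for the TOP holding your final visuals.
project = op('/project1')

# Create a Syphon Spout Out TOP and wire the visuals into it
spout_out = project.create(syphonspoutoutTOP, 'spout_out1')
spout_out.inputConnectors[0].connect(op('/project1/comp1'))

# The sender name is the handle Unreal's OWL Spout Receiver looks for
spout_out.par.sendername = 'TDOut'
```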
Setting up Unreal Engine:
Ensure the Off-World Live plugin is enabled.
Place a receiver actor in the scene to receive data from TouchDesigner. This actor is the “OWL Spout Receiver Manager.”
Customize the receiver manager as needed, specifying the input name and creating a render target to display the received data.
Apply the render target to a surface or shape in the Unreal scene to visualize the data.
Adjusting Settings:
In Unreal Engine, configure the receiver in the Details panel of the OWL Spout Receiver Manager; the input name must match the sender name set in TouchDesigner.
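As a sketch of the render-target half of this setup, the Unreal Python below creates a render target asset the receiver can write into. The asset name, path, and resolution are placeholder assumptions; the OWL manager itself is still configured in its Details panel as described above.

```python
import unreal

# Create a render target asset for the Spout receiver to write into.
# 'RT_SpoutIn' and '/Game/Spout' are hypothetical placeholders.
asset_tools = unreal.AssetToolsHelpers.get_asset_tools()
rt = asset_tools.create_asset(
    'RT_SpoutIn', '/Game/Spout',
    unreal.CanvasRenderTarget2D, unreal.CanvasRenderTarget2DFactoryNew())

# Match the resolution TouchDesigner is sending
rt.set_editor_property('size_x', 1920)
rt.set_editor_property('size_y', 1080)
```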
Testing and Troubleshooting:
Test the setup to ensure the data is correctly transmitted from TouchDesigner to Unreal Engine.
Troubleshoot any issues that may arise, such as incorrect naming or missing configurations.
Workflow Considerations:
Understand the workflow for sending and receiving data between TouchDesigner and Unreal Engine.
Vicon Mocap Data into Unreal
Importing Mocap Animation:
If animation is not available, reimport the animation using FBX import options.
Choose the Vicon female/male skeleton.
Import all and ignore any prompts.
Export axes for animation:
If the animation comes in at the wrong angle, adjust the Import Rotation in the FBX import options.
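For reference, this is a hedged Unreal Python sketch of the same import, including the skeleton choice and an import-rotation fix; the FBX path, skeleton asset path, and rotation values are placeholders, not the actual project settings.

```python
import unreal

# Placeholder paths; point these at your own FBX and Vicon skeleton asset
task = unreal.AssetImportTask()
task.filename = 'C:/Mocap/take01.fbx'
task.destination_path = '/Game/Mocap'
task.automated = True          # suppress the import dialog ("import all")
task.replace_existing = True

options = unreal.FbxImportUI()
options.import_animations = True
options.import_mesh = False
options.skeleton = unreal.load_asset('/Game/Vicon/SK_Vicon_Skeleton')

# Fix a wrong import angle here instead of rotating the asset afterwards
options.anim_sequence_import_data.import_rotation = unreal.Rotator(0.0, 0.0, 90.0)

task.options = options
unreal.AssetToolsHelpers.get_asset_tools().import_asset_tasks([task])
```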
Skeleton Adjustment:
If the animation appears stretched, adjust the retargeting options in the skeleton tree.
Keep the hips set to Animation, but change the Translation Retargeting of every other bone to Skeleton.
Editing Animations:
Two methods are shown for editing animations:
a. Easy way: directly manipulate bones in the skeleton tree by keyframing.
b. Harder, proper way: create an IK rig for precise adjustments, especially for Mannequin characters.
IK Rig and Retargeting:
Set up IK rigs for proper retargeting from the Vicon skeleton to Mannequin models.
Cleaning up Mocap:
Align the source (Vicon) and target (Mannequin) models.
Adjust rotations and positions to match.
Ensure foot rotations match leg adjustments to prevent sideways steps.
Exit edit mode and export the retargeted animation.
Mocap Cleanup:
Import retargeted animation into sequencer.
If animation is not visible, close and reopen Unreal Engine.
Bake the animation to the control rig to generate keyframes (a scripted sketch follows this list).
Make necessary adjustments to clean up mocap data.
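For the baking step, UE 5.x exposes ControlRigSequencerLibrary to Python. The sketch below assumes hypothetical asset paths, a hypothetical control rig class, and that the character is the sequence's first binding; verify the exact call signature against your engine version before relying on it.

```python
import unreal

# Hypothetical assets: a level sequence and a control rig blueprint class
sequence = unreal.load_asset('/Game/Cinematics/Seq_Dance')
rig_class = unreal.load_object(None, '/Game/Rigs/CR_Mannequin.CR_Mannequin_C')

world = unreal.EditorLevelLibrary.get_editor_world()
binding = sequence.get_bindings()[0]   # assumes the character is the first binding

# Bake the imported animation onto the control rig so every frame gets keys
unreal.ControlRigSequencerLibrary.bake_to_control_rig(
    world, sequence, rig_class, unreal.AnimSeqExportOption(),
    reduce_keys=False, tolerance=0.001, binding=binding)
```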
Final Steps:
After cleaning up mocap, continue regular adjustments as needed.
I plan to make a rather absurd, dreamlike emotional short film with no narrative. The overall atmosphere is carried mainly by the cuts between shots, the camera movement, and the background music. The main keywords for the scene are: medieval churches, a religious feel, dragons and machinery, light and shadow, and dance.
After finding the fantasy scene I needed, I first tried to make a character animation.
Then I made the clothes in MD, mainly referencing some skirts from the show. I wanted a long skirt in the style of a Greek goddess.
Then I imported the character animation and the dress into UE.
Live Link VCAM is a real-time virtual camera solution developed by Epic Games for Unreal Engine. It turns a mobile device into a virtual camera: the device's tracked movement drives a camera inside the engine, while the rendered scene is streamed back to the device's screen in real time. This lets filmmakers, content creators, and developers frame and capture shots inside the virtual scene as if holding a physical camera. Live Link VCAM enhances the production workflow by tightly integrating handheld camera work with virtual environments, resulting in more immersive and visually compelling content.
Start by enabling the five required plugins.
Open Project Settings and change the Frame Buffer Pixel Format in the Default Settings to 8bit RGBA.
Make sure your computer is connected to the same Wi-Fi network as your phone.
Press Win+R, type cmd, and press Enter; at the command prompt, run ipconfig and note the IPv4 address of the active adapter.
Then append :0 to that address when entering it on the phone (a small Python sketch for finding the same address follows).
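If you would rather script that lookup, the small Python sketch below reports the same IPv4 address ipconfig shows: it opens a UDP socket toward a public address purely to select the active network interface, and no packets are actually sent.

```python
import socket

# Print the local IPv4 address the phone should target
# (the same value ipconfig shows for the active adapter)
with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
    s.connect(('8.8.8.8', 80))   # picks a route; nothing is transmitted
    print(s.getsockname()[0])
```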
If we are at home:
Then download Live Link VCAM on the Apple device.
Then drag a VCam actor into the scene.
To adjust the position of the virtual camera, first deactivate the VCam; after moving the view to the proper position, right-click the VCam in the Outliner on the right and click Snap Object to View.
Note that you must turn off the computer's firewall to successfully connect your phone to your computer.
For more information, follow the link: https://docs.unrealengine.com/4.27/zh-CN/AnimatingObjects/VirtualCamera/VirtualCameraActorQuickStart/
Today we tried the Vicon optical mocap solution. It was a totally new and professional experience.
Vicon is a leading company specializing in motion capture technology. Their systems are widely used in various industries, including film, video games, sports, and biomechanics. Vicon’s motion capture solutions enable precise tracking and recording of human and object movements, providing highly accurate data for animation, analysis, and research. The technology is known for its reliability, high resolution, and the ability to capture complex motions in real-time, making it an essential tool for professionals seeking detailed motion analysis and realistic animation.
In essence, 16 ultra-sensitive cameras arranged in a ring on the ceiling capture the movement of 54 reflective markers (excluding fingers) attached to the suit.
First we need a £5,000 T-shaped calibration wand to calibrate the cameras.
For the fingers, the reflective markers sit at the fingertips.
After all the preparatory work is completed, the formal recording of the movement can begin.
This term, we aim to create an engaging and immersive User Experience by leveraging the capabilities of Touch Designer (TD) and Resolume. Our goal is to seamlessly integrate these powerful tools to design interactive visual environments and dynamic live performances. By utilizing Touch Designer’s real-time 3D graphics and interactive multimedia capabilities alongside Resolume’s robust video mixing and projection mapping features, we plan to develop innovative visual installations that captivate and engage our audience.