Artifact process 02

I am hesitant to continue using Marvelous Designer (MD).

Using MD requires a great deal of patience. For example, the wind field effect blew my cape straight into the sky, which left me feeling helpless. In Unreal Engine, the character's mapping was also problematic, so I ended up covering the character's neck with fabric.

After two days of hard MD research, I decided to make two shots in Unreal Engine. One referenced The Fall, the other referenced Euphoria.

The following are some of the MD tests. The ones where the fabric material was set up incorrectly and could not be simulated properly, combined with the poor animation, turned out badly, so I will not show those here.

Here are some UE screenshots.

Week 5 : TD to Unreal + Vicon in Unreal

TD cam into Unreal texture (real time)

  1. Setting up TouchDesigner (TD):
    • Use a Spout Out node in TD to output the visuals. Spout is a system for sharing textures between applications in real time.
    • Spout Out sends the texture so it can be picked up by other applications, such as Unreal Engine (a scripted version of this setup is sketched after this list).
    • Adjust settings in TouchDesigner as needed, including specifying the sender name.
  2. Setting up Unreal Engine:
    • Ensure the Off World Live (OWL) plugin is enabled.
    • Place a receiver actor in the scene to receive data from TouchDesigner. This actor is the “OWL Spout Receiver Manager.”
    • Customize the receiver manager as needed, specifying the input name and creating a render target to display the received data.
    • Apply the render target to a surface or shape in the Unreal scene to visualize the data.
  3. Adjusting Settings:
    • In Unreal Engine, open the Details panel of the OWL Spout Receiver Manager and check that the input name matches the TD sender name and that the render target is assigned.
  4. Testing and Troubleshooting:
    • Test the setup to ensure the data is correctly transmitted from TouchDesigner to Unreal Engine.
    • Troubleshoot any issues that may arise, such as incorrect naming or missing configurations.
  5. Workflow Considerations:
    • Understand the workflow for sending and receiving data between TouchDesigner and Unreal Engine.
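
For reference, the TD side of this setup can also be built from the textport with a few lines of Python. This is only a minimal sketch under assumptions: the /project1 network path, the node names, and the sendername parameter name are guesses that may differ in your TD build.

```python
# TouchDesigner textport sketch: build a tiny network that sends a texture to
# Spout so Unreal's OWL Spout Receiver can pick it up. Names are assumptions.
net = op('/project1')                               # assumed network path

src = net.create(moviefileinTOP, 'source')          # stand-in for your final composite
out = net.create(syphonspoutoutTOP, 'spout_out')    # the Spout sender
out.inputConnectors[0].connect(src)                 # wire source -> sender

# The sender name must match the input name set on the OWL Spout Receiver Manager.
# (Parameter name 'sendername' is assumed; check out.pars('send*') if it differs.)
out.par.sendername = 'TD_to_UE'
```

In Unreal, the input name on the OWL Spout Receiver Manager would then be set to TD_to_UE (step 2 above).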

Vicon mocap data into Unreal

  1. Importing Mocap Animation:
    • If the animation is missing, reimport it using the FBX import options (a scripted version of this import is sketched after this list).
    • Choose the Vicon female / male skeleton.
    • Choose Import All and ignore any prompts.
  2. Export Axes for Animation:
    • If the animation's orientation is wrong because of the export axes, adjust the import rotation.
  3. Skeleton Adjustment:
    • If the animation appears stretched, adjust the bone retargeting settings in the animation editor's skeleton tree.
    • Keep the hips set to Animation, but change every other bone to Skeleton.
  4. Editing Animations:
    • Two methods were shown for editing animations: (a) the easy way: keyframe bones directly in the skeleton tree; (b) the harder, proper way: create an IK Rig for precise adjustments, especially for Mannequin characters.
  5. IK Rig and Retargeting:
    • Set up IK Rigs for proper retargeting from the Vicon skeleton to Mannequin models.
  6. Retargeting:
    • Align the source (Vicon) and target (Mannequin) models.
    • Adjust rotations and positions so the poses match.
    • Make sure foot rotations match the leg adjustments to prevent sideways steps.
    • Exit edit mode and export the retargeted animation.
  7. Mocap Cleanup:
    • Import retargeted animation into sequencer.
    • If animation is not visible, close and reopen Unreal Engine.
    • Bake the animation to the control rig to generate keyframes.
    • Make necessary adjustments to clean up mocap data.
  8. Final Steps:
    • After cleaning up mocap, continue regular adjustments as needed.
    • Ensure the process is understood for future use.
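
As a reference for step 1, the FBX animation import can also be scripted with Unreal's Python editor API. This is a minimal sketch under assumptions: the file path, destination folder, and skeleton asset path are placeholders, and the import rotation line only marks where the step 2 adjustment would go.

```python
import unreal

# Minimal sketch: import a Vicon mocap take (FBX) as an animation sequence.
options = unreal.FbxImportUI()
options.import_mesh = False
options.import_animations = True
options.import_as_skeletal = False
options.mesh_type_to_import = unreal.FBXImportType.FBXIT_ANIMATION
options.skeleton = unreal.load_asset('/Game/Vicon/Vicon_Female_Skeleton')  # placeholder skeleton asset

# Step 2 ("adjust the import rotation") would be set here if the axes come in wrong.
options.anim_sequence_import_data.import_rotation = unreal.Rotator(0.0, 0.0, 0.0)

task = unreal.AssetImportTask()
task.filename = 'C:/Mocap/take_01.fbx'   # placeholder path to the exported take
task.destination_path = '/Game/Mocap'
task.options = options
task.automated = True                    # suppress dialogs ("import all and ignore any prompts")
task.save = True

unreal.AssetToolsHelpers.get_asset_tools().import_asset_tasks([task])
```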

Week 4 : TD & TD music

Today we were introduced to TouchDesigner.

  1. Introduction to TouchDesigner (TD):
    • TD is a node-based visual programming environment used across the creative industries.
    • It's easy to learn and well suited to quick tasks compared to heavier software like Maya or Unreal.
  2. Comparison to Processing and other creative coding platforms:
    • Similar to Processing and p5.js, TD offers visual coding capabilities.
    • It’s favored for its ease of use and quick experimentation.
  3. Features and Capabilities:
    • TD supports real-time compositing, rendering, and data visualization.
    • It’s versatile, handling various media inputs and outputs, including videos, images, and data.
  4. Operators in TouchDesigner:
    • Texture Operators (TOPs) for 2D images and textures.
    • Channel Operators (CHOPs) for time-based channel data.
    • Surface Operators (SOPs) for 3D geometry.
    • Compositing Operators (COPs) for compositing and processing tasks.
    • Material Operators (MATs) for handling materials.
    • Component Operators (COMPs) for organizing operators into networks.
  5. Best Practices:
    • Keep projects organized with groups and annotations.
    • Organize nodes horizontally for clarity.
    • Use Nulls to mark the end of logic.
    • Save frequently and start with smaller projects.
    • Embrace trial and error and be open to unexpected results.
  6. Cost and Resolution:
    • TD is cost-effective: the free non-commercial license covers good resolutions and frame rates.
    • Higher output resolutions (beyond 1280 × 1280) require a paid license and more powerful hardware.
  7. Tasks for Practice:
    • Create composited sequences using film footage.
    • Remix soundtracks using film sounds and apply audio-reactive effects (a minimal sketch of this is included after this list).
    • Experiment with OSC control and feedback loops.
  8. Resource Sharing:
    • Shared project files for hands-on practice.
    • Encouragement to follow demonstrations and explore tasks.
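
For the audio-reactive task, here is roughly what such a network looks like when built from the textport in Python. It is only a sketch under assumptions: the operator class names follow TD's usual naming, but the /project1 path, the Analyze CHOP's 'rms' menu token, and the scale factor of 4 are guesses to adjust.

```python
# TouchDesigner textport sketch: drive a Level TOP's brightness from the
# soundtrack's loudness. Names and menu tokens below are assumptions.
net = op('/project1')                               # assumed network path

music = net.create(audiofileinCHOP, 'music')        # film soundtrack
loud  = net.create(analyzeCHOP, 'loudness')         # collapse the audio to one value per frame
movie = net.create(moviefileinTOP, 'footage')       # film footage
react = net.create(levelTOP, 'react')               # the effect being driven

loud.inputConnectors[0].connect(music)
react.inputConnectors[0].connect(movie)
loud.par.function = 'rms'                           # RMS power (menu token assumed)

# Bind the TOP's brightness to the analysed audio level with a parameter expression.
react.par.brightness1.expr = "op('loudness')[0] * 4"
react.par.brightness1.mode = ParMode.EXPRESSION
```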

Artifact process 01

I plan to make a rather absurd, dreamy, emotional short film with no narrative. The overall atmosphere is built mainly through the cuts between shots, the camera movement, and the background music. The main keywords for the scene are: medieval churches, a sense of religion, dragons and machinery, light and shadow, and dance.

After finding the fantasy scene I needed, I first tried to make a character animation.

Then I made the clothes in MD, mainly referencing some of the skirts from the show. Here I wanted a long dress in the style of a Greek goddess.

Then I imported the character animation and the dress into UE.

Week 3 : Live Link VCAM

Live Link VCAM is a real-time virtual camera app developed by Epic Games for use with Unreal Engine. It turns a mobile device into a virtual camera: the device's movement drives a camera inside Unreal Engine, while the engine streams its viewport back to the device, so you can frame shots as if holding a physical camera. This lets filmmakers, content creators, and developers visualize and capture shots within virtual environments in real time, providing seamless integration between physical camera operation and virtual scenes and resulting in more immersive and visually compelling content.

Start by enabling five plugins:

Open Project Settings and change the Frame Buffer Pixel Format under Default Settings to 8bit RGBA.

Once your computer is connected to your phone's wifi:

Press Win+R, type cmd, and press Enter. At the command prompt (the line ending in something like Lenovo>), run ipconfig and note the IPv4 address shown.

Then append :0 to that address.
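
If you prefer a scripted check, this small Python snippet prints the same local IPv4 address that ipconfig reports (it opens a UDP socket but sends nothing). This is just a convenience, not part of the official workflow.

```python
import socket

# Print the machine's local IPv4 address (the one ipconfig lists as "IPv4 Address").
s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
s.connect(('8.8.8.8', 80))          # no packets are sent; this just selects a route
print(s.getsockname()[0])           # e.g. 192.168.x.x -- the address to enter in the VCAM app
s.close()
```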

If you are working from home:

Then download Live Link VCAM on your Apple device.

Then drag a VCam actor into the scene window.

If you want to adjust the position of the virtual camera, first turn off the VCam's activation; after moving the viewport to the proper position, right-click the VCam in the Outliner on the right and click Snap Object to View.

Note that you must turn off the computer's firewall for the phone to connect to the computer successfully.

For more information, follow this link: https://docs.unrealengine.com/4.27/zh-CN/AnimatingObjects/VirtualCamera/VirtualCameraActorQuickStart/

Week 2 : Mocap workshop

Today we tried Vicon's optical motion-capture solution. It was a totally new and professional experience.

Vicon is a leading company specializing in motion capture technology. Their systems are widely used in various industries, including film, video games, sports, and biomechanics. Vicon’s motion capture solutions enable precise tracking and recording of human and object movements, providing highly accurate data for animation, analysis, and research. The technology is known for its reliability, high resolution, and the ability to capture complex motions in real-time, making it an essential tool for professionals seeking detailed motion analysis and realistic animation.

In essence, 16 ultra-sensitive cameras arranged in a ring on the ceiling capture the movement of 54 reflective markers (not counting the fingers) attached to the suit.

First we need a T-shaped calibration wand, worth about £5,000, to calibrate the cameras.

The reflective points for the fingers are at the fingertips.

After all the preparatory work is complete, the formal recording of the movements can begin.

Week 1 : Introduction to this term

This term, we aim to create an engaging and immersive User Experience by leveraging the capabilities of Touch Designer (TD) and Resolume. Our goal is to seamlessly integrate these powerful tools to design interactive visual environments and dynamic live performances. By utilizing Touch Designer’s real-time 3D graphics and interactive multimedia capabilities alongside Resolume’s robust video mixing and projection mapping features, we plan to develop innovative visual installations that captivate and engage our audience.