MYTY Kit
For Avatar Creator

6. Before Rigging

This document was updated alongside the MYTY Kit v1.0 release in October 2022.


MYTY Kit’s Motion Tracking Solution

Understanding the overall motion tracking process is required before the Bone Rigging, Controller, and MotionAdapter steps. We recommend reading the Prerequisites first.
The MYTY ecosystem aims to let Avatar Creators and metaverse app developers easily create and use various live avatars. Accordingly, MYTY Kit addresses the following needs of Avatar Creators and metaverse app developers.

Needs of Avatar Creators:

  1. Even if I am not familiar with the technical background of motion capture, I want to apply motion capture data to my avatars freely.
  2. I want my avatar to work well, with compatibility, across various metaverse apps.

Needs of Metaverse app developers:

  1. I want to use various motion capture solutions to fit my multi-platform applications and apply them to MYTY Avatars.
  2. I want to use live avatars without compromising the Avatar Creator's motion design intent.
To achieve the above purpose, MYTY Kit provides DefaultMotionTemplate and MotionAdapter. The overall motion data pipeline covered by MYTY Kit is shown below.

Motion Capture Data Pipeline

Motion Data Pipeline of MYTY Kit v1.0
MYTY Kit's motion capture data pipeline is divided into a part handled by metaverse app developers and a part handled by Avatar Creators.
  1. When we move our body or face in front of the camera, that movement is captured and transmitted as data.
  2. The captured data is categorized through the TemplateBridge of MYTY Kit and flows to the MotionTemplate.
  3. Motion data delivered to the MotionTemplate is used as input for a MotionAdapter and flows to the Controller connected to that adapter.
  4. The Controller expresses the movement by transforming the bones or sprites of the connected avatar.
  5. The result is an avatar that follows you in real time.
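The five steps above can be sketched as a simple data flow. The following Python sketch is purely conceptual: MYTY Kit itself runs inside Unity, and every function and channel name here (`capture`, `template_bridge`, `motion_adapter`, `bone_controller`, `EyeBlink`) is a hypothetical stand-in, not a MYTY Kit API.

```python
# Conceptual sketch of the MYTY Kit motion data pipeline.
# All names are illustrative; the real pipeline runs inside Unity.

def capture(frame):
    """Step 1: raw motion capture output (e.g. face landmark values)."""
    return {"face.eye_openness_raw": frame["eye"], "face.yaw_raw": frame["yaw"]}

def template_bridge(raw):
    """Step 2: categorize raw data into named MotionTemplate channels."""
    return {"EyeBlink": raw["face.eye_openness_raw"], "HeadYaw": raw["face.yaw_raw"]}

def motion_adapter(template, channel, lo, hi):
    """Step 3: normalize a template channel into a 0..1 controller input."""
    value = template[channel]
    return min(1.0, max(0.0, (value - lo) / (hi - lo)))

def bone_controller(t, closed_y, open_y):
    """Steps 4-5: interpolate a rigged bone between its saved extremes."""
    return closed_y + (open_y - closed_y) * t

frame = {"eye": 0.8, "yaw": 0.0}
t = motion_adapter(template_bridge(capture(frame)), "EyeBlink", 0.0, 1.0)
eyelid_y = bone_controller(t, closed_y=0.0, open_y=1.0)
```

The point is the shape of the pipeline: raw capture values are categorized into named template channels, normalized by an adapter, and only then applied to the rig by a controller.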

DefaultMotionTemplate included in MYTY Kit

MYTY Kit provides a MotionTemplate that can be used as a sample by utilizing Google's MediaPipe and Mefamo. This is called DefaultMotionTemplate. By using this, Avatar Creators can obtain stable motion data automatically without applying a separate motion capture solution.
What is Google's MediaPipe?
MYTY Kit's DefaultMotionTemplate supports motion tracking using Google's MediaPipe, a machine learning solution for live and streaming media. In particular, MYTY Kit utilizes MediaPipe's Face Mesh solution to track facial expressions in real-time. Since machine learning is employed to infer the position of facial and upper body landmarks as 3D coordinates, motion tracking requires just a single camera input, without the need for a dedicated depth sensor.
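As a rough illustration of why a single RGB camera is enough: once a landmark's relative depth (z) has been inferred, head rotation can be estimated with plain geometry. The sketch below derives a yaw angle from two mirrored cheek landmarks; the coordinates are hypothetical, and this is not MediaPipe's or MYTY Kit's actual computation.

```python
import math

def head_yaw_degrees(left_cheek, right_cheek):
    """Estimate head yaw from the depth difference of two mirrored
    (x, y, z) landmarks: turning the head pushes one side of the face
    away from the camera, so its inferred z changes."""
    dx = right_cheek[0] - left_cheek[0]
    dz = right_cheek[2] - left_cheek[2]
    return math.degrees(math.atan2(dz, dx))

# Facing the camera straight on: both cheeks at the same depth -> yaw 0.
print(round(head_yaw_degrees((0.3, 0.5, 0.0), (0.7, 0.5, 0.0)), 1))  # 0.0
```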
What is MeFaMo?
MeFaMo calculates the facial key points and blend shapes of a user. Instead of using the built-in iPhone blend shape calculation (as the LiveLinkFace app does), it uses Google's MediaPipe to calculate the facial key points. Those key points are then used to calculate several facial blend shapes (eyebrows, blinking, smiling, etc.).
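As a rough illustration of how a blend shape can be derived from key points, the sketch below computes a blink value from eye landmark distances (an eye-aspect-ratio style heuristic). The landmark values and the open/closed ratio thresholds are hypothetical; this is not MeFaMo's actual formula.

```python
def blink_blendshape(upper_lid, lower_lid, eye_left, eye_right,
                     open_ratio=0.30, closed_ratio=0.05):
    """Map the eye's height/width ratio to 0.0 (open) .. 1.0 (closed)."""
    height = abs(upper_lid[1] - lower_lid[1])
    width = abs(eye_right[0] - eye_left[0])
    ratio = height / width
    t = (open_ratio - ratio) / (open_ratio - closed_ratio)
    return min(1.0, max(0.0, t))

open_eye = blink_blendshape((0.5, 0.40), (0.5, 0.55), (0.3, 0.5), (0.8, 0.5))
shut_eye = blink_blendshape((0.5, 0.49), (0.5, 0.515), (0.3, 0.5), (0.8, 0.5))
print(open_eye, shut_eye)  # ~0.0 for the open eye, ~1.0 for the nearly shut eye
```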

Check the motion data in real time

1. Drag and drop Assets > MYTYKit > MotionTemplate > DefaultMotionTemplate into an empty Hierarchy window.
2. Drag and drop Assets > MYTYKit > MotionTemplate > Motion Source Samples > MediapipeMotionPack into an empty Hierarchy window.
3. In the Inspector window of MediapipeMotionPack > MediapipeMotionSource object, set Motion Template Mapper to DefaultMotionTemplate.
4. Select DefaultMotionTemplate > Parametric > SimpleFaceParam and press the ▶️ (Play) button to run the project.
5. You can see the facial motion data values tracked in real time in the Inspector window.
Is motion tracking not working? The input camera may be configured incorrectly. Refer to "The MediaPipe object can't track my movements" for a solution.

Get familiar with making live avatars

Key terms

The process of taking this 3D coordinate information and connecting it to the 2D NFT Avatar can be broken down into smaller steps of bone rigging, Controller, and MotionAdapter. Before you proceed further, please familiarize yourself with the following key terms.
MYTY Avatar
An avatar created using MYTY Kit. It is both a Live Avatar and an NFT Avatar, and is compatible with MYTY Ecosystem.
Motion capture
Motion capture(Mo-Cap) is the process of capturing the movement of people, animals or objects, transferring it to a 2D or 3D model and animating it with the set of recorded movements.
Motion tracking data (Motion data)
Movement data tracked by Mo-Cap solution.
Motion Adapter
A MYTY Kit tool that processes motion tracking data and connects it to another MYTY Kit tool, the Controller.
Controller
A MYTY Kit Tool that controls the Bones and Sprites (Image) of MYTY Avatars.
Bone rigging
The process of connecting Bones to the 2D Avatar Image to enable joint movements.
Sprite Mesh
A sprite's mesh type determines how Unity renders the sprite; the traditional way of rendering sprites is Full Rect mode.

What kind of motion data can be tracked?

In DefaultMotionTemplate of MYTY Kit, motion data is classified into 47 templates.
MYTY Kit v1.0 provides a stable motion tracking environment only for Face parameters and Head. Other motion data for full-body tracking will be provided in a later version.

The Controller-MotionAdapter Relationship

Which Controllers and MotionAdapters will you need to make your desired animation?
  • MYTY Kit tools Controller and MotionAdapter connect the user’s Motion tracking data to their MYTY Avatar.
    • MotionAdapter is a tool that processes raw Motion input and delivers it to the related Controller.
    • Controller is a tool that uses the processed Motion input to control an avatar’s Bones and Sprites.
  • Let’s find out more about the four types of Controllers included in MYTY Kit: what they do, and which MotionAdapters they can be used with.
Bone 1D Controller
  • Input: the float value of 1 MotionAdapter output.
  • Output: transformation of connected bone(s).
  • Related MotionAdapters: 👉 BoneTiltOrPosition, 👉 (Bone) EyeBlink & EyeBlinkAll, 👉 (Bone) EyeBrow & EyeBrowAll
  • Usage example: using the user's eyebrow movements in the vertical axis to set movements of the avatar's eyebrow bone.

Bone 2D Controller
  • Input: the float values of 2 MotionAdapter outputs.
  • Output: transformation of connected bone(s).
  • Related MotionAdapters: 👉 Bone2D, 👉 (Bone) Pupil & PupilAll
  • Usage examples:
    • Using the user's face movements in the vertical and horizontal axes (face direction) to set movements of the avatar's FaceBone.
    • Using the user's chest movements in the vertical and horizontal axes to set movements of the avatar's BodyBone.
Sprite 1D Range Controller
  • Input: the float value of 1 MotionAdapter output.
  • Output: selection of the appropriate sprite image(s) to display.
  • Related MotionAdapters: 👉 (Sprite) EyeBlink & EyeBlinkAll, 👉 (Sprite) EyeBrow & EyeBrowAll
  • Usage example: using the user's eye size in the vertical axis to display an appropriate eye-shape sprite.

Sprite 2D Nearest Controller
  • Input: the float values of 2 MotionAdapter outputs.
  • Output: selection of the appropriate sprite image(s) to display.
  • Related MotionAdapter: 👉 (Sprite) MouthSprite
  • Usage example: using the user's mouth size in the vertical and horizontal axes to display an appropriate mouth-shape sprite.

Rigged Sprite 2D Nearest Controller
  • Input: the float values of 2 MotionAdapter outputs.
  • Output: selection of the appropriate bone-rigged sprite image(s) to display.
  • Related MotionAdapter: 👉 (Sprite) MouthSprite
  • Usage example: using the user's mouth size in the vertical and horizontal axes to display an appropriate mouth mesh sprite rigged with bones.
👉 Please refer to 8. MYTY Kit Controllers for more information on the Controller.
👉 Please refer to 9. Apply Motion Tracking for more information on the MotionAdapter.

Usage Example

There are two ways to make an eye-blinking animation.

1. Using Bone Animation

An example of making the eyes blink using Bone animation.
You can make it using the Bone 1D Controller and the (Bone)EyeBlink Adapter.
  1. Rig the EyelidBone to the Eyelid Sprite.
  2. Create an empty GameObject (naming example: EyeBoneController) within the Controller Group, and add the Bone 1D Controller component.
  3. Open the Bone 1D Controller window. Specify EyeBoneController, and add the EyelidBone object to the list in the Hierarchy window.
  4. Save the EyelidBone's positions for when the eye is open (Max) and closed (Min).
  5. Add the (Bone)EyeBlink Adapter Prefab to the Hierarchy. Specify the SimpleFaceParam template and EyeBoneController as the Template and Controller, respectively.
  6. Press the Project play button to check that the animation works as intended.
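Conceptually, the Bone 1D Controller interpolates the bone between the saved Min and Max positions using the adapter's 0..1 output. A minimal Python sketch of that interpolation (the function name and the position values are illustrative, not MYTY Kit API):

```python
def bone_1d(t, min_pos, max_pos):
    """Interpolate a bone position between its saved Min (eye closed)
    and Max (eye open) poses, driven by a 0..1 adapter output."""
    t = min(1.0, max(0.0, t))          # clamp the adapter output
    return tuple(lo + (hi - lo) * t for lo, hi in zip(min_pos, max_pos))

closed = (0.0, -0.25)   # saved Min: eyelid down
opened = (0.0, 0.25)    # saved Max: eyelid up
print(bone_1d(1.0, closed, opened))   # fully open -> (0.0, 0.25)
print(bone_1d(0.0, closed, opened))   # fully closed -> (0.0, -0.25)
```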

2. Using Frame Animation

An example of making the eyes blink using Frame animation.
You can make it using the Sprite 1D Range Controller and (Sprite)EyeBlink Adapter.
  1. Create an empty GameObject (naming example: EyeBoneController) within the Controller Group, and add the Sprite 1D Range Controller component.
  2. Open the Sprite 1D Range Controller window. Specify EyeBoneController, and add the Eyelid object to the list in the Hierarchy window.
  3. Press Auto Label to automatically bring up the Sprite Library Labels.
  4. Assign Min and Max values to the Sprite Label for when the eye is open, to determine the data range within which the open-eye Sprite will be shown.
  5. Assign Min and Max values to the Sprite Label for when the eye is closed, to determine the data range within which the closed-eye Sprite will be shown.
  6. Add the (Sprite)EyeBlink Adapter Prefab to the Hierarchy. Specify the SimpleFaceParam template and EyeBoneController as the Template and Controller, respectively.
  7. Press the Project play button to check that the animation works as intended.
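The Min/Max labeling in steps 4 and 5 amounts to mapping ranges of the adapter's output to sprite labels. A minimal Python sketch of that selection (the label names and range values are illustrative, not MYTY Kit API):

```python
def sprite_1d_range(t, labeled_ranges):
    """Pick the sprite label whose [min, max] range contains the
    adapter's 0..1 output, mirroring the Min/Max values assigned
    in the Sprite 1D Range Controller window."""
    for label, lo, hi in labeled_ranges:
        if lo <= t <= hi:
            return label
    return None  # no label covers this value

eyelid_sprites = [
    ("eye_closed", 0.0, 0.3),   # closed-eye sprite shown for low values
    ("eye_open",   0.3, 1.0),   # open-eye sprite shown for high values
]
print(sprite_1d_range(0.9, eyelid_sprites))  # eye_open
print(sprite_1d_range(0.1, eyelid_sprites))  # eye_closed
```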
Have a question or an idea? If you have a question to ask or an idea to share, participate in the MYTY Kit Community. We’d love to hear from you.