6. Before Rigging
This document was updated alongside the MYTY Kit v1.0 release in October 2022.
You need to understand the overall Motion Tracking process before moving on to the Bone Rigging, Controller, and MotionAdapter steps. It is recommended that you first read the Prerequisites.
The MYTY ecosystem aims to enable Avatar Creators and metaverse app developers to easily create and use a variety of live avatars. Accordingly, MYTY Kit addresses the following needs of both groups.
Avatar Creators:
- 1. Even if I am not familiar with the technical background of motion capture, I want to freely apply motion capture data to my avatars.
- 2. I want my avatar to work well and stay compatible across various metaverse apps.
Metaverse app developers:
- 1. I want to use various motion capture solutions that fit my multi-platform applications and apply them to MYTY Avatars.
- 2. I want to use live avatars without compromising the Avatar Creator's motion design intent.
To meet these needs, MYTY Kit provides the DefaultMotionTemplate and MotionAdapter. The overall motion data pipeline covered by MYTY Kit is shown below.

Motion Data Pipeline of MYTY Kit v1.0
MYTY Kit's motion capture data pipeline is divided into a part handled by metaverse app developers and a part handled by Avatar Creators.
- 1. When we move our body or face in front of the camera, that movement is captured and transmitted as data.
- 2. The captured data is categorized through the TemplateBridge of MYTY Kit and flows to the MotionTemplate.
- 3. Motion data delivered to the MotionTemplate is used as input for a MotionAdapter and flows to the Controller connected to that adapter.
- 4. The Controller expresses the movement by transforming the bones or sprites of the connected avatar.
- 5. The result is an avatar that follows you in real time.
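The sketch below summarizes this flow in Python. It is only a conceptual illustration: the class and field names (MotionTemplate, TemplateBridge, MotionAdapter, Controller, mouth_open) mirror the concepts described above but are hypothetical and do not reflect the actual MYTY Kit (Unity/C#) API.

```python
# Hypothetical sketch of the capture -> bridge -> template -> adapter -> controller flow.
# Names are illustrative only; the real MYTY Kit components are Unity C# objects.
from dataclasses import dataclass


@dataclass
class MotionTemplate:
    """Normalized motion parameters filled in by the TemplateBridge."""
    head_rotation: tuple = (0.0, 0.0, 0.0)
    mouth_open: float = 0.0


class TemplateBridge:
    """Categorizes raw capture data into MotionTemplate fields."""
    def update(self, raw_capture: dict, template: MotionTemplate) -> None:
        template.head_rotation = raw_capture.get("head_rotation", (0.0, 0.0, 0.0))
        template.mouth_open = raw_capture.get("mouth_open", 0.0)


class Controller:
    """Transforms the avatar's bones or sprites according to the driven value."""
    def drive(self, value: float) -> None:
        print(f"moving jaw bone by {value:.2f}")


class MotionAdapter:
    """Maps a MotionTemplate value onto the Controller chosen by the Avatar Creator."""
    def __init__(self, controller: Controller):
        self.controller = controller

    def apply(self, template: MotionTemplate) -> None:
        self.controller.drive(template.mouth_open)


# One frame of the pipeline: raw capture data drives the avatar through the template.
template = MotionTemplate()
adapter = MotionAdapter(Controller())
TemplateBridge().update({"mouth_open": 0.6}, template)
adapter.apply(template)
```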
MYTY Kit provides a MotionTemplate that can work with various motion capture solutions such as Google's MediaPipe. This is called the DefaultMotionTemplate. By using it, Avatar Creators can automatically obtain stable motion data without integrating a separate motion capture solution.
MYTY Kit's DefaultMotionTemplate supports motion tracking using Google's MediaPipe, a machine learning solution for live and streaming media. In particular, MYTY Kit utilizes MediaPipe's Face Mesh solution to track facial expressions in real-time. Since machine learning is employed to infer the position of facial and upper body landmarks as 3D coordinates, motion tracking requires just a single camera input, without the need for a dedicated depth sensor.
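To give a rough sense of the kind of data Face Mesh produces, the snippet below uses MediaPipe's Python API to infer 3D facial landmarks from a single webcam frame. It is not part of MYTY Kit (which consumes MediaPipe inside Unity), and the landmark index printed is just an example.

```python
import cv2
import mediapipe as mp

# Face Mesh infers ~468 3D landmarks from a plain RGB camera, no depth sensor needed.
face_mesh = mp.solutions.face_mesh.FaceMesh(
    max_num_faces=1,
    refine_landmarks=True,
    min_detection_confidence=0.5,
    min_tracking_confidence=0.5,
)

cap = cv2.VideoCapture(0)  # single webcam input
for _ in range(300):  # process a few hundred frames for this demo
    ok, frame = cap.read()
    if not ok:
        break
    # MediaPipe expects RGB images; OpenCV captures BGR.
    results = face_mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_face_landmarks:
        landmarks = results.multi_face_landmarks[0].landmark
        # Each landmark has normalized x, y and a model-inferred relative depth z.
        tip = landmarks[1]  # landmark near the nose tip, used here only as an example
        print(f"nose area: x={tip.x:.3f} y={tip.y:.3f} z={tip.z:.3f}")

cap.release()
face_mesh.close()
```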

Click Menu > MYTY Kit > Create DefaultMotionTemplate to add the respective object to the Hierarchy.
It is highly recommended that you test your work-in-progress (WIP) avatars using the MYTY Avatar Viewer (https://viewer.myty.space/). It gives the most accurate picture of what users will experience with your avatar before it is officially released.
Alternatively, you can test your avatars in real time in Unity. Please refer to the document below: