Control Avatar (ARKit)
Describes how to control MYTY Avatar movement with ARKit.
Background
Motion capture results must be provided so that MYTY Avatars can reflect the user's movement. This interface defines how to deliver ARKit face-tracking results to the SDK.
Input parameters
FacePosition
Normalized position of the detected face. This is used to position the AR Face on the screen.
FaceScale
Scale of the detected face. This is used to scale the AR Face on the screen.
Up
Up vector of the detected face. This is used to rotate the head around the z-axis in FullBody mode.
Forward
Forward vector of the detected face. This is used to rotate the head around the y-axis in FullBody mode.
Blendshapes
Blendshape results from ARKit. These are used to apply facial expressions to the MYTY Avatar. As of April 2023, MYTY SDK uses the coefficients listed below.
browDownLeft
browDownRight
browOuterUpLeft
browOuterUpRight
eyeBlinkLeft
eyeBlinkRight
jawOpen
mouthClose
mouthPucker
mouthSmileLeft
mouthSmileRight
mouthStretchLeft
mouthStretchRight
For more details about blendshapes in ARKit, refer to Apple's ARFaceAnchor.BlendShapeLocation documentation.
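The parameters above can be grouped into a single per-frame payload before being handed to the SDK. The sketch below is illustrative only: the class name ARKitFaceResult and its field names and types are assumptions made for this example, not part of the MYTY SDK API.

```dart
// Illustrative sketch only: ARKitFaceResult and its field names are
// assumptions for this example, not the actual MYTY SDK types.
class ARKitFaceResult {
  /// Normalized position of the detected face in screen space.
  final double faceX;
  final double faceY;

  /// Scale of the detected face.
  final double faceScale;

  /// Up vector of the detected face (z-axis head rotation in FullBody mode).
  final List<double> up;

  /// Forward vector of the detected face (y-axis head rotation in FullBody mode).
  final List<double> forward;

  /// ARKit blendshape coefficients keyed by name, e.g. "jawOpen": 0.42.
  final Map<String, double> blendshapes;

  const ARKitFaceResult({
    required this.faceX,
    required this.faceY,
    required this.faceScale,
    required this.up,
    required this.forward,
    required this.blendshapes,
  });
}
```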
Example Usage (Dart)
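The original Dart example is not reproduced here; the snippet below is a minimal sketch of how one frame of ARKit capture results might be forwarded to the SDK. The MytySdk class and its updateARFaceResult method are assumed names for illustration and may differ from the actual MYTY SDK Flutter API.

```dart
// Minimal sketch, assuming a hypothetical MytySdk entry point.
// Class and method names are illustrative, not the real plugin API.

/// Hypothetical SDK facade used only for this example.
class MytySdk {
  static final MytySdk instance = MytySdk();

  /// Assumed method that forwards one frame of ARKit results to the avatar.
  void updateARFaceResult(ARKitFaceResult result) {
    // In the real SDK this would hand the data to the native layer.
  }
}

/// Example: build a result from raw ARKit output and forward it each frame.
void onARKitFrame({
  required double faceX,
  required double faceY,
  required double faceScale,
  required List<double> up,
  required List<double> forward,
  required Map<String, double> blendshapes,
}) {
  final result = ARKitFaceResult(
    faceX: faceX,
    faceY: faceY,
    faceScale: faceScale,
    up: up,
    forward: forward,
    blendshapes: blendshapes,
  );
  MytySdk.instance.updateARFaceResult(result);
}
```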