Control Avatar (ARKit)

Describes how to control MYTY Avatar movement with ARKit.

Background

Motion capture results must be provided so that MYTY Avatars can reflect the user's movement. This interface defines how to deliver ARKit face-tracking results to the SDK.

Input parameters

FacePosition

Normalized position of the detected face, used to position the AR face on the screen.
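For illustration, a face position can be normalized from screen coordinates like this (a minimal sketch; `screenCoordinates`, `screenWidth`, and `screenHeight` are assumed to come from your ARKit session and view, as in the full example at the end of this page):

```dart
import 'package:vector_math/vector_math_64.dart' as vm;

// Convert the detected face's screen coordinates into a position
// normalized to [0, 1] on each axis. Inputs are hypothetical names.
vm.Vector3 normalizedFacePosition(
    vm.Vector2 screenCoordinates, double screenWidth, double screenHeight) {
  return vm.Vector3(screenCoordinates.x / screenWidth,
      screenCoordinates.y / screenHeight, 0);
}
```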

FaceScale

Scale of the detected face, used to scale the AR face on the screen.
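One way to derive a uniform scale, matching the full example at the end of this page, is to scale inversely with the face's distance from the camera (a sketch; the `0.35` constant comes from that example and may need tuning for your scene):

```dart
import 'package:vector_math/vector_math_64.dart' as vm;

// The farther the face is from the camera, the smaller the avatar.
// `facePositionInCameraSpace` is a hypothetical camera-space position.
vm.Vector3 faceScaleFromDistance(vm.Vector3 facePositionInCameraSpace) {
  final scale = 0.35 / facePositionInCameraSpace.length;
  return vm.Vector3(scale, scale, scale);
}
```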

Up

Up vector of the detected face, used to rotate the head around the z-axis in FullBody mode.

Forward

Forward vector of the detected face, used to rotate the head around the y-axis in FullBody mode.
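To illustrate how these vectors encode rotation, the roll angle around the z-axis can be recovered from the up vector alone (a sketch of the underlying math, not SDK code; a perfectly upright head has up = (0, 1, 0) and zero roll):

```dart
import 'dart:math' as math;
import 'package:vector_math/vector_math_64.dart' as vm;

// Roll (z-axis rotation) implied by a face's up vector, in radians.
double rollFromUp(vm.Vector3 up) => math.atan2(-up.x, up.y);
```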

Blendshapes

Blendshape results from ARKit, used to apply facial expressions to the MYTY Avatar. As of April 2023, MYTY SDK uses the coefficients below.

browDownLeft, browDownRight, browOuterUpLeft, browOuterUpRight, eyeBlinkLeft, eyeBlinkRight, jawOpen, mouthClose, mouthPucker, mouthSmileLeft, mouthSmileRight, mouthStretchLeft, mouthStretchRight

For more details about blendshapes in ARKit, refer to Apple's blendShapes reference in the Apple Developer Documentation.

Example Usage (Dart)

// Forward one ARKit face-tracking result to the MYTY SDK via the Unity bloc.
context.read<UnityBloc>().add(
    UnityMotionCapturedEvent(
      arKitData: ARKitData(
        // Normalize the face position to screen-relative coordinates.
        facePosition: vm.Vector3(screenCoordinates.x / screenWidth,
            screenCoordinates.y / screenHeight, 0),
        // Scale inversely with the face's distance from the camera.
        faceScale: vm.Vector3(0.35 / facePosition.length,
            0.35 / facePosition.length, 0.35 / facePosition.length),
        // The up vector's x component is negated to mirror the head roll
        // for the front-facing camera.
        up: vm.Vector3(-upVector.x, upVector.y, upVector.z),
        forward: anchor.transform.forward,
        // Map ARKit blendshape coefficients to the SDK's fields,
        // defaulting to 0 when a coefficient is missing.
        blendshapes: ARKitBlendShape(
            browDownLeft: anchor.blendShapes['browDown_L'] ?? 0,
            browDownRight: anchor.blendShapes['browDown_R'] ?? 0,
            browOuterUpLeft: anchor.blendShapes['browOuterUp_L'] ?? 0,
            browOuterUpRight: anchor.blendShapes['browOuterUp_R'] ?? 0,
            eyeBlinkLeft: anchor.blendShapes['eyeBlink_L'] ?? 0,
            eyeBlinkRight: anchor.blendShapes['eyeBlink_R'] ?? 0,
            jawOpen: anchor.blendShapes['jawOpen'] ?? 0,
            mouthClose: anchor.blendShapes['mouthClose'] ?? 0,
            mouthPucker: anchor.blendShapes['mouthPucker'] ?? 0,
            mouthSmileLeft: anchor.blendShapes['mouthSmile_L'] ?? 0,
            mouthSmileRight: anchor.blendShapes['mouthSmile_R'] ?? 0,
            mouthStretchLeft: anchor.blendShapes['mouthStretch_L'] ?? 0,
            mouthStretchRight: anchor.blendShapes['mouthStretch_R'] ?? 0),
      ),
    ),
  );

Last updated 2 years ago
