# Control Avatar (ARKit)

## Background

Motion-capture results must be delivered to the MYTY SDK so that MYTY Avatars can reflect the user's movement. This interface defines how to deliver ARKit face-tracking results.

## Input parameters

### FacePosition

Normalized screen position of the detected face. This is used to position the AR Face on the screen.
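As a rough sketch of what "normalized" means here, the face's pixel coordinates are divided by the screen dimensions so each component falls in [0, 1]. The helper name and parameters below are illustrative, not part of the SDK:

```python
def normalize_face_position(screen_x, screen_y, screen_width, screen_height):
    """Map the face's pixel coordinates to a normalized (x, y, 0) position."""
    return (screen_x / screen_width, screen_y / screen_height, 0.0)
```

This mirrors the `screenCoordinates.x / screenWidth` division in the Dart example at the bottom of this page.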

### FaceScale

Scale of the detected face. This is used to scale the AR Face on the screen.
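One way to derive a uniform scale is from the face's distance to the camera, as the Dart example on this page does with `0.35 / facePosition.length`. The sketch below reproduces that heuristic; the `0.35` constant comes from that example and is not a fixed SDK requirement:

```python
import math

def face_scale(face_position_in_camera_space, base=0.35):
    """Uniform (x, y, z) scale, inversely proportional to face distance."""
    distance = math.sqrt(sum(c * c for c in face_position_in_camera_space))
    s = base / distance  # nearer faces get a larger on-screen scale
    return (s, s, s)
```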

### Up

Up vector of the detected face. This is used to rotate the head around the z-axis in FullBody mode.

### Forward

Forward vector of the detected face. This is used to rotate the head around the y-axis in FullBody mode.
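To make the two vectors concrete, here is one common way an up vector yields a roll angle (z-axis) and a forward vector yields a yaw angle (y-axis). The angle conventions are an assumption for illustration, not the SDK's exact math:

```python
import math

def head_roll_from_up(up):
    """Roll: tilt of the up vector (x, y, z) away from vertical."""
    return math.atan2(up[0], up[1])

def head_yaw_from_forward(forward):
    """Yaw: rotation of the forward vector (x, y, z) around the y-axis."""
    return math.atan2(forward[0], forward[2])
```

An upright, camera-facing head (`up = (0, 1, 0)`, `forward = (0, 0, 1)`) gives zero roll and zero yaw under this convention.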

### Blendshapes

Blendshape results from ARKit. These are used to apply facial expressions to the MYTY Avatar. As of April 2023, the MYTY SDK uses the coefficients below.

> <mark style="color:purple;">`browDownLeft`</mark>, <mark style="color:purple;">`browDownRight`</mark>, <mark style="color:purple;">`browOuterUpLeft`</mark>, <mark style="color:purple;">`browOuterUpRight`</mark>, <mark style="color:purple;">`eyeBlinkLeft`</mark>, <mark style="color:purple;">`eyeBlinkRight`</mark>, <mark style="color:purple;">`jawOpen`</mark>, <mark style="color:purple;">`mouthClose`</mark>, <mark style="color:purple;">`mouthPucker`</mark>, <mark style="color:purple;">`mouthSmileLeft`</mark>, <mark style="color:purple;">`mouthSmileRight`</mark>, <mark style="color:purple;">`mouthStretchLeft`</mark>, <mark style="color:purple;">`mouthStretchRight`</mark>

For more details about blendshapes in ARKit, refer to the document below.

{% embed url="https://developer.apple.com/documentation/arkit/arfaceanchor/2928251-blendshapes" %}
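Note that ARKit exposes these coefficients under raw keys such as `browDown_L`, and a key may be absent on a given frame. The sketch below shows the lookup-with-default pattern (the `?? 0` in the Dart example); the dict-based access is an assumption for illustration:

```python
# The 13 ARKit blendshape keys used by the MYTY SDK (as of April 2023).
ARKIT_KEYS = [
    'browDown_L', 'browDown_R', 'browOuterUp_L', 'browOuterUp_R',
    'eyeBlink_L', 'eyeBlink_R', 'jawOpen', 'mouthClose', 'mouthPucker',
    'mouthSmile_L', 'mouthSmile_R', 'mouthStretch_L', 'mouthStretch_R',
]

def read_blendshapes(anchor_blendshapes):
    """Return each coefficient, substituting 0.0 (neutral) for missing keys."""
    return {key: anchor_blendshapes.get(key, 0.0) for key in ARKIT_KEYS}
```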

## Example Usage (Dart)

```dart
context.read<UnityBloc>().add(
  UnityMotionCapturedEvent(
    arKitData: ARKitData(
      // Normalize the face's screen coordinates to the [0, 1] range.
      facePosition: vm.Vector3(screenCoordinates.x / screenWidth,
          screenCoordinates.y / screenHeight, 0),
      // Uniform scale, inversely proportional to the face's distance
      // from the camera.
      faceScale: vm.Vector3(0.35 / facePosition.length,
          0.35 / facePosition.length, 0.35 / facePosition.length),
      // Up vector of the face; x is negated here to match the
      // mirrored front-camera view.
      up: vm.Vector3(-upVector.x, upVector.y, upVector.z),
      forward: anchor.transform.forward,
      // Missing blendshape coefficients default to 0 (neutral).
      blendshapes: ARKitBlendShape(
        browDownLeft: anchor.blendShapes['browDown_L'] ?? 0,
        browDownRight: anchor.blendShapes['browDown_R'] ?? 0,
        browOuterUpLeft: anchor.blendShapes['browOuterUp_L'] ?? 0,
        browOuterUpRight: anchor.blendShapes['browOuterUp_R'] ?? 0,
        eyeBlinkLeft: anchor.blendShapes['eyeBlink_L'] ?? 0,
        eyeBlinkRight: anchor.blendShapes['eyeBlink_R'] ?? 0,
        jawOpen: anchor.blendShapes['jawOpen'] ?? 0,
        mouthClose: anchor.blendShapes['mouthClose'] ?? 0,
        mouthPucker: anchor.blendShapes['mouthPucker'] ?? 0,
        mouthSmileLeft: anchor.blendShapes['mouthSmile_L'] ?? 0,
        mouthSmileRight: anchor.blendShapes['mouthSmile_R'] ?? 0,
        mouthStretchLeft: anchor.blendShapes['mouthStretch_L'] ?? 0,
        mouthStretchRight: anchor.blendShapes['mouthStretch_R'] ?? 0,
      ),
    ),
  ),
);
```

