Version: 14 Oct 2024

Hand Tracking Developer Guide

Hand tracking lets users interact intuitively with virtual content by using natural hand gestures as an input method. The Magic Leap 2 headset camera detects hand movement in real time.

Magic Leap supports hand tracking for the Magic Leap native C-API (MLSDK), MRTK 2.8, MRTK 3, Unity, and OpenXR.

note

End users can enable additional hand-tracking capabilities available in the Magic Leap 2 OS. See the Hand Tracking Guide in the Magic Leap Care portal. Enabling or disabling these features does not affect hand tracking in your apps.

Unity with OpenXR

OpenXR provides a standardized interface for accessing hand tracking data from compatible hardware devices.

If you are developing apps in Unity with the Magic Leap 2 OpenXR Unity SDK, refer to the Unity (OpenXR) hand-tracking topics for detailed information.

The Magic Leap 2 OpenXR Unity SDK supports the OpenXR hand interaction extension, XR_EXT_hand_interaction, and the OpenXR palm pose extension XR_EXT_palm_pose.
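
For a concrete starting point, the following is a minimal sketch of polling joint poses through Unity's XR Hands package (com.unity.xr.hands), which can sit on top of OpenXR hand tracking. It assumes the package is installed and an OpenXR hand-tracking provider is active; the class name HandJointLogger is illustrative, not part of the Magic Leap SDK.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.Hands;

// Minimal sketch: poll the left index fingertip each frame through Unity's
// XR Hands package. Assumes com.unity.xr.hands is installed and an OpenXR
// hand-tracking provider is active; HandJointLogger is an illustrative name.
public class HandJointLogger : MonoBehaviour
{
    XRHandSubsystem handSubsystem;

    void Update()
    {
        if (handSubsystem == null)
        {
            var subsystems = new List<XRHandSubsystem>();
            SubsystemManager.GetSubsystems(subsystems);
            if (subsystems.Count == 0)
                return;
            handSubsystem = subsystems[0];
        }

        XRHand leftHand = handSubsystem.leftHand;
        if (!leftHand.isTracked)
            return;

        // Joint IDs follow the OpenXR hand joint conventions (wrist, palm,
        // and per-finger joints from metacarpal to tip).
        XRHandJoint indexTip = leftHand.GetJoint(XRHandJointID.IndexTip);
        if (indexTip.TryGetPose(out Pose pose))
            Debug.Log($"Left index tip (XR origin space): {pose.position}");
    }
}
```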

This illustration shows the hand joint conventions used by the OpenXR Hand Interactions Profile.

OpenXR hand joint conventions for left and right hands.

These illustrations show the key poses used by the OpenXR Hand Interactions Extension: aim, grip, pinch, and poke.

OpenXR aim key pose for left and right hands.
OpenXR grip key pose for left and right hands.
OpenXR pinch key pose for left and right hands.
OpenXR poke key pose for left and right hands.

These illustrations show examples of palm poses used by the OpenXR Palm Pose extension.

OpenXR Palm Pose with Controller.
OpenXR Palm Pose tracked hand.
OpenXR Palm Pose digital hand avatar.
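
As a rough illustration of anchoring content to a palm pose, the sketch below reads the palm joint through the XR Hands package rather than an XR_EXT_palm_pose input binding; how your project actually surfaces the palm pose depends on its OpenXR configuration. PalmFollower and its target field are hypothetical names.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.Hands;

// Illustrative sketch: keep a GameObject aligned with the right palm joint.
// Assumes the XR Hands package; PalmFollower and the target field are
// hypothetical names, not part of a Magic Leap API.
public class PalmFollower : MonoBehaviour
{
    public Transform target;   // content to anchor to the palm
    XRHandSubsystem handSubsystem;

    void Update()
    {
        if (handSubsystem == null)
        {
            var subsystems = new List<XRHandSubsystem>();
            SubsystemManager.GetSubsystems(subsystems);
            if (subsystems.Count == 0)
                return;
            handSubsystem = subsystems[0];
        }

        XRHand rightHand = handSubsystem.rightHand;
        if (target == null || !rightHand.isTracked)
            return;

        // Joint poses are reported relative to the XR origin; transform them
        // into world space if your XR Origin is not at the world origin.
        XRHandJoint palm = rightHand.GetJoint(XRHandJointID.Palm);
        if (palm.TryGetPose(out Pose pose))
            target.SetPositionAndRotation(pose.position, pose.rotation);
    }
}
```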

Unity with MLSDK

If you are developing apps in Unity with the MLSDK, use the Magic Leap Hand Tracking API for Unity and the Gesture Classification API for Unity to access hand-tracking capabilities and assign actions to hand gestures. The Gesture Classification API helps you create more intuitive user experiences as users interact with virtual content using their hands.

Using hand-tracking keypoint data, the Gesture Classification API recognizes when the geometric properties of the user's hands conform to certain gestures, called key poses, and you assign user interactions to these gestures. The API provides a set of predefined key poses (see Supported Key Poses below), and you can also create your own custom key poses. The API exposes all the hand characteristics you need, such as the relative positions and angles of the keypoints.

You can also use the Gesture Classification API to query data about the user’s hand, such as finger length and extension angle.
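
As an API-agnostic illustration of the underlying idea, the sketch below classifies a simple custom pinch-like key pose directly from two keypoint positions. The 2 cm threshold and the names are assumptions made for the sketch, not values taken from the Gesture Classification API.

```csharp
using UnityEngine;

// API-agnostic sketch: classify a simple custom pinch-like key pose directly
// from two keypoint positions. The 2 cm threshold is an assumption for the
// sketch, not a value taken from the Gesture Classification API.
public static class CustomKeyPoseSketch
{
    const float PinchThresholdMeters = 0.02f;

    public static bool IsPinching(Vector3 thumbTip, Vector3 indexTip)
    {
        return Vector3.Distance(thumbTip, indexTip) < PinchThresholdMeters;
    }
}
```

Feeding this predicate with thumb-tip and index-tip keypoint positions gives a rough stand-in for a pinch gesture; the API's predefined key poses apply more robust criteria across the whole hand.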

See this overview of the Gesture Classification API for more information.

Hand-Tracking Keypoints

Hand-tracking keypoints are locations that correspond to hand joints, finger joints, and fingertips. You can use these keypoints to design interactions and to attach virtual content to the hand.

A drawing of a human hand with MLSDK hand-tracking keypoints shown.

When you're using the MLSDK, the Magic Leap 2 device assigns 26 keypoints to each hand to create a three-dimensional representation of the hand. Twenty-one keypoints correspond to visible points on the hand. The other five keypoints are calculated based on these 21 keypoints.

Using this model of the user's hands, the device tracks hand position and rotation in three dimensions. This lets it locate each hand in space, classify gestures, and distinguish the right hand from the left, all at the same time.
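
One way to read keypoint poses in Unity is through Unity's generic XR input hand API, sketched below. Whether your Magic Leap Unity SDK setup surfaces hand data through this path is an assumption here, and KeypointReader is an illustrative name.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;

// Sketch: read keypoint positions through Unity's generic XR input hand API.
// Whether your Magic Leap Unity SDK setup exposes hand data this way is an
// assumption; KeypointReader is an illustrative name.
public class KeypointReader : MonoBehaviour
{
    readonly List<InputDevice> devices = new List<InputDevice>();
    readonly List<Bone> indexBones = new List<Bone>();

    void Update()
    {
        InputDevices.GetDevicesWithCharacteristics(
            InputDeviceCharacteristics.HandTracking | InputDeviceCharacteristics.Left,
            devices);
        if (devices.Count == 0)
            return;

        // Bones are returned base-to-tip, so the last entry approximates
        // the fingertip keypoint of the requested finger.
        if (devices[0].TryGetFeatureValue(CommonUsages.handData, out Hand hand) &&
            hand.TryGetFingerBones(HandFinger.Index, indexBones) &&
            indexBones.Count > 0 &&
            indexBones[indexBones.Count - 1].TryGetPosition(out Vector3 tip))
        {
            Debug.Log($"Left index fingertip keypoint at {tip}");
        }
    }
}
```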

Supported Key Poses

Key poses are static hand position classifications based on how closely the hand’s keypoints align with specified criteria. Each key pose falls under a more general grouping of hand positions called postures.

Each supported key pose is listed below with a description and its posture grouping.

OK: The thumb and index finger are flexed and their tips touch to form a circle. The other three fingers are partially extended. Posture: Pinch.
C: Fingers and thumb are flexed. Fingers are together and curved slightly. The hand resembles the letter C. Posture: Pinch.
Pinch: The thumb and index finger are flexed. The tips of the thumb and index finger are close to each other but not touching. The other fingers are flexed toward the palm. Posture: Pinch.
Finger: The index finger is extended. The thumb and other fingers are flexed toward the palm. Posture: Point.
L: The thumb is extended to the side of the palm and the index finger is extended to form a shape that resembles the letter L. The other fingers are flexed toward the palm. Posture: Point.
Thumb: The thumb is extended to the side of the palm. All other fingers are flexed toward the palm. Posture: Grasp.
Fist: All fingers, including the thumb, are flexed into the palm. Posture: Grasp.
Open: All fingers, including the thumb, are extended. Posture: Open.

Hand Characteristics

Hand characteristics are the geometric properties of the hand based on its keypoints. You can use these hand characteristics to create your own hand interaction constraints and custom key poses; a minimal calculation sketch follows the list below.

Hand transformation: Describes the directional and rotational movement of the hand center.
Splay angles (thumb-to-index, index-to-middle, middle-to-ring, and ring-to-pinky): The angles between adjacent fingers, capturing the splay of the hand. The splay angle of each adjacent pair of fingers is calculated using the tips of the two fingers and the midpoint between their MCP keypoints.
Finger-to-palm angle: The angle between a finger and the palm. It's calculated using the wrist-to-center vector of the hand and the MCP-to-tip vector of the finger.
Finger extension (illustrated at 90, 135, and 180 degrees of index finger extension): The angle at the first joint of the finger. It's calculated using the MCP-to-PIP vector and the PIP-to-DIP vector of the finger.
Pinch angle: The angle between the index finger and thumb when the hand is in the Pinch posture. It's calculated using the tip of the thumb, the tip of the index finger, and the midpoint between the MCP keypoints of the thumb and index finger.
Pinch distance: The distance between the tip of the thumb and the tip of the index finger when the hand is in the Pinch posture. It's calculated using the tip of the thumb and the tip of the index finger.
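
As a rough illustration of the calculations described above, the sketch below derives these characteristics from keypoint positions using plain vector math. It mirrors the formula descriptions in the list; it is not the Gesture Classification API itself, and HandCharacteristicsSketch and its method names are illustrative.

```csharp
using UnityEngine;

// Geometric sketch of the hand characteristics described above, computed
// directly from keypoint positions. This mirrors the formula descriptions
// in the list; it is not the Gesture Classification API itself.
public static class HandCharacteristicsSketch
{
    // Splay angle between two adjacent fingers: measured at the midpoint of
    // their MCP keypoints, toward each fingertip.
    public static float SplayAngle(Vector3 tipA, Vector3 mcpA, Vector3 tipB, Vector3 mcpB)
    {
        Vector3 mcpMidpoint = (mcpA + mcpB) * 0.5f;
        return Vector3.Angle(tipA - mcpMidpoint, tipB - mcpMidpoint);
    }

    // Finger-to-palm angle: between the wrist-to-hand-center vector and the
    // finger's MCP-to-tip vector.
    public static float FingerToPalmAngle(Vector3 wrist, Vector3 handCenter, Vector3 mcp, Vector3 tip)
    {
        return Vector3.Angle(handCenter - wrist, tip - mcp);
    }

    // Finger extension: derived from the MCP-to-PIP and PIP-to-DIP vectors,
    // where 180 degrees corresponds to a fully extended finger.
    public static float FingerExtension(Vector3 mcp, Vector3 pip, Vector3 dip)
    {
        return 180f - Vector3.Angle(pip - mcp, dip - pip);
    }

    // Pinch angle: measured at the midpoint of the thumb and index MCP
    // keypoints, toward the thumb tip and index tip.
    public static float PinchAngle(Vector3 thumbTip, Vector3 thumbMcp, Vector3 indexTip, Vector3 indexMcp)
    {
        Vector3 mcpMidpoint = (thumbMcp + indexMcp) * 0.5f;
        return Vector3.Angle(thumbTip - mcpMidpoint, indexTip - mcpMidpoint);
    }

    // Pinch distance: straight-line distance between the thumb tip and the
    // index fingertip.
    public static float PinchDistance(Vector3 thumbTip, Vector3 indexTip)
    {
        return Vector3.Distance(thumbTip, indexTip);
    }
}
```

Because 180 degrees corresponds to a fully extended finger in this formulation, the 90, 135, and 180 degree extension illustrations above map directly onto the FingerExtension return value.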

Interaction Points

An interaction point is a transform that dynamically adapts its translation and rotation to the current posture of the hand. It provides a single, optimal point of reference that simplifies common direct and indirect hand interactions.

Two interaction points are available.

Point: The tip of the index finger when the hand is in the Point posture.
Pinch: The midpoint between the tip of the thumb and the tip of the index finger when the hand is in the Pinch posture.
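
As a small illustration of the idea, the sketch below builds a pinch interaction point pose from the thumb-tip and index-tip poses. Placing it at their midpoint follows the description above; reusing the index tip's rotation for the combined pose is an assumption made for the sketch, and the names are illustrative.

```csharp
using UnityEngine;

// Illustrative sketch: derive a single pinch interaction point from the thumb
// and index fingertip poses. Positioning it at their midpoint follows the
// description above; reusing the index tip's rotation is an assumption.
public static class InteractionPointSketch
{
    public static Pose PinchInteractionPoint(Pose thumbTip, Pose indexTip)
    {
        Vector3 midpoint = (thumbTip.position + indexTip.position) * 0.5f;
        return new Pose(midpoint, indexTip.rotation);
    }
}
```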

Design Considerations

For guidelines on designing apps that use hand tracking, see the Hand tracking design guide.