70 docs tagged with "Input"

API Overview

Magic Leap's Eye Tracking data is retrieved in two ways.

API Overview

This section provides information on the Gesture Classification API and how to enable it inside your application.

API Overview

This guide provides detailed instructions and examples on how to use Magic Leap's MLPowerManager API for managing the power states and properties of components in Unity. It first explains the concept of power states, showing how to get all available states and retrieve a specific state for a device. It then walks through setting a power state with thorough examples, explains how to get and update a component's properties (such as battery info, battery level, charging state, and connection state), and finally covers how to handle power management events such as errors, state changes, and property changes through callbacks. By following this guide, developers can efficiently control and manage Magic Leap 2's power states and properties inside their Unity application.

Audio Capture

An overview of the audio capture demo scene included in the Magic Leap 2 Examples Project, which uses Unity's XR Interaction Toolkit.

Bluetooth

Magic Leap 2 provides support for Android’s default Bluetooth stack which includes both Classic Bluetooth and Bluetooth Low Energy. Using Bluetooth, Magic Leap 2 can create personal area networks to send and receive data with nearby Bluetooth devices. Integration with Android’s standard Bluetooth API means that users can connect existing Bluetooth devices that are supported on Android without any modification. Bluetooth devices can be connected in Magic Leap 2’s system settings using the steps below.

C-API Samples Overview

The list below contains descriptions of each of the C-API samples available in the ML Hub. The README.md files of each example contain instructions on how to build, install and uninstall the example, as well as its expected behavior and GUI.

Controller

An overview of the controller's buttons and sensors, touchpad gestures, and important changes from the Magic Leap 1 SDK.

Controller

An overview of the controller demo scene included in the Magic Leap 2 Examples Project, which uses Unity's XR Interaction Toolkit.

Controller Gesture Events

This section demonstrates how to use Magic Leap's Gesture subsystem to receive the input events that are triggered when the user performs a gesture on the controller's touchpad.

Controller Input Events

This section demonstrates how to use Unity's Input System to access input events from the Magic Leap 2 controller.

Controller Input Values

Using Unity's Input System, you can read Magic Leap 2's controller input directly using the InputAction.ReadValue() method. View Unity's Input System documentation to learn more.
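
A minimal sketch of the ReadValue() pattern, assuming a generic XR controller binding; in practice you would reference an action from your project's .inputactions asset (such as the one shipped with the Magic Leap samples):

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

public class TriggerReader : MonoBehaviour
{
    private InputAction triggerAction;

    private void OnEnable()
    {
        // "<XRController>{RightHand}/trigger" is a generic XR binding used here for illustration.
        triggerAction = new InputAction(binding: "<XRController>{RightHand}/trigger");
        triggerAction.Enable();
    }

    private void Update()
    {
        // Poll the current trigger value each frame.
        float triggerValue = triggerAction.ReadValue<float>();
        if (triggerValue > 0.5f)
            Debug.Log($"Trigger pressed: {triggerValue}");
    }

    private void OnDisable()
    {
        triggerAction.Disable();
    }
}
```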

Controller Overview

Unity developers can use the OpenXR Magic Leap 2 Controller Interaction Profile to access the controller's input using Unity's Input System. This profile can be enabled inside your project's OpenXR Settings (Window > XR Plugin Manager > OpenXR Settings).

Controller Overview

The Magic Leap 2's controller input can be accessed using Unity's Input System. The Magic Leap 2 SDK includes predefined action mappings, so developers can access controller input in a familiar way.

Deploying Custom Voice Commands

The Magic Leap 2 Voice Input framework supports App Specific Voice Intents, which are custom voice intents you can develop to use within your app. You can develop a full set of voice intents to incorporate into your applications with the assistance of the Voice Intent Development Toolkit (VIDTK).

Examples

This section demonstrates how to use the Gesture Classification API.

Eye Gaze Overview

Unity developers can use the OpenXR Eye Gaze Interaction Profile to determine what a user is looking at, allowing a hands-free method of interacting with their application. To access the Eye Gaze data, the interaction profile needs to be enabled in your project's OpenXR Settings (Window > XR Plugin Manager > OpenXR Settings).
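
A minimal sketch, assuming the Eye Gaze Interaction Profile is enabled; the "<EyeGaze>/pose/position" and "<EyeGaze>/pose/rotation" binding paths are the generic paths exposed by the Eye Gaze device layout and should be adjusted to your own action asset:

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

public class EyeGazeReader : MonoBehaviour
{
    private InputAction gazePosition;
    private InputAction gazeRotation;

    private void OnEnable()
    {
        gazePosition = new InputAction(binding: "<EyeGaze>/pose/position");
        gazeRotation = new InputAction(binding: "<EyeGaze>/pose/rotation");
        gazePosition.Enable();
        gazeRotation.Enable();
    }

    private void Update()
    {
        // Draw a debug ray from the current gaze origin along the gaze direction.
        Vector3 position = gazePosition.ReadValue<Vector3>();
        Quaternion rotation = gazeRotation.ReadValue<Quaternion>();
        Debug.DrawRay(position, rotation * Vector3.forward, Color.green);
    }

    private void OnDisable()
    {
        gazePosition.Disable();
        gazeRotation.Disable();
    }
}
```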

Eye Tracking

An overview of the eye tracking demo scene included in the Magic Leap 2 Examples Project, which uses Unity's XR Interaction Toolkit.

Eye Tracking Overview

If your Application collects, stores, transfers or otherwise uses data off the Magic Leap 2 device that is received via this API, then you must comply with the Magic Leap 2 Eye Tracking Data Transparency Policy.

Eye Tracking Tracked Pose Driver

If your Application collects, stores, transfers or otherwise uses data off the Magic Leap 2 device that is received via this API, then you must comply with the Magic Leap 2 Eye Tracking Data Transparency Policy.

Eye Tracking Tracked Pose Driver

If your Application collects, stores, transfers or otherwise uses data off the Magic Leap 2 device that is received via this API, then you must comply with the Magic Leap 2 Eye Tracking Data Transparency Policy.

Generic Eye Tracking Data

If your Application collects, stores, transfers or otherwise uses data off the Magic Leap 2 device that is received via this API, then you must comply with the Magic Leap 2 Eye Tracking Data Transparency Policy.

Gesture Classification

An overview of the gesture classification demo scene included in the Magic Leap 2 Examples Project, which uses Unity's XR Interaction Toolkit.

Hand Tracking

An overview of the hand tracking demo scene included in the Magic Leap 2 Examples Project, which uses Unity's XR Interaction Toolkit.

Hand Tracking Overview

The Magic Leap 2 OpenXR Unity SDK supports hand tracking via the Hand Interaction Profile. The interaction profile can be enabled by selecting Edit > Project Settings > XR Plug-in Management > OpenXR, then adding the interaction profile into the Enabled Interaction Profiles section.

Hand Tracking Overview

This section provides information on the core Hand Tracking API and how to enable it inside your applications.

Handling Tracking Loss Events

If the Magic Leap 2 can't locate its position in an environment, it experiences "tracking loss". The Magic Leap 2 lets developers manage their own tracking loss behavior: some developers may want to pause the update loop and display a splash image, while others may want the app to continue playing.

Handling Tracking Loss Events

If the Magic Leap 2 can't locate its position in an environment, it experiences "tracking loss". The Magic Leap 2 lets developers manage their own tracking loss behavior: some developers may want to pause the update loop and display a splash image, while others may want the app to continue playing.

Head Tracking Overview

Head tracking uses cameras on the headset to track the movement of the user’s head. This allows the headset to display 3D content at a specific point in the user's viewing area.

Head Tracking Overview

Head tracking uses cameras on the headset to track the movement of the user’s head. This allows the headset to display 3D content at a specific point in the user's viewing area.

HMD Tracked Pose Driver

In order for your Unity camera to track with head pose, it needs to have the Tracked Pose Driver component with the following settings. Depending on your app setup, you can either add this component to the main camera yourself, or get it through prefabs like the XRRig (from the XR Interaction Toolkit) or the "Main Camera" (from our SDK package).
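
A minimal sketch of adding and configuring the Input System's Tracked Pose Driver from code; the generic XR HMD binding paths and the tracking/update settings shown here are assumptions, so follow the values listed in the guide itself:

```csharp
using UnityEngine;
using UnityEngine.InputSystem;
using UnityEngine.InputSystem.XR;

public class HeadPoseSetup : MonoBehaviour
{
    private void Awake()
    {
        // Attach a Tracked Pose Driver to the main camera and drive it from the HMD pose.
        var driver = Camera.main.gameObject.AddComponent<TrackedPoseDriver>();
        driver.trackingType = TrackedPoseDriver.TrackingType.RotationAndPosition;
        driver.updateType = TrackedPoseDriver.UpdateType.UpdateAndBeforeRender;
        driver.positionInput = new InputActionProperty(
            new InputAction(binding: "<XRHMD>/centerEyePosition"));
        driver.rotationInput = new InputActionProperty(
            new InputAction(binding: "<XRHMD>/centerEyeRotation"));
    }
}
```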

Input Bindings

Unity's Input System can locate Controls using paths. Bindings on Input Actions utilize this feature to identify the Control(s) they receive input from. You can also use paths to directly look up Controls and Devices, or to have the Input System search for Controls among all devices using the InputSystem.FindControls method.
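
A minimal sketch of looking up Controls by path with InputSystem.FindControls; the "<XRController>/trigger" path is a generic example, not a Magic Leap-specific binding:

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

public class ControlLookupExample : MonoBehaviour
{
    private void Start()
    {
        // Find every trigger control on any connected XR controller.
        using (var triggers = InputSystem.FindControls("<XRController>/trigger"))
        {
            foreach (var control in triggers)
                Debug.Log($"Found control: {control.path} on {control.device.name}");
        }
    }
}
```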

Input Bindings

This guide provides information on the input paths that are supported on Magic Leap 2 when using the OpenXR Eye Gaze Interaction Profile. For general information about OpenXR input in Unity, see the Unity OpenXR Plugin Input Manual.

Input Bindings

This guide provides information on the Input Control Paths and Interaction Profiles that are supported on Magic Leap 2. For general information about the OpenXR Hand Interaction Profile in Unity, see the Unity OpenXR Hand Interaction Input Manual.

Input Device Feature Values

It is recommended that developers read the controller input using Unity's Input System. However, developers can obtain the controller's input directly from the InputDevice. This section provides an example of reading input using the TryGetFeatureValue method and XRCommonUsages features.
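
A minimal sketch of the TryGetFeatureValue pattern using Unity's built-in CommonUsages; the guide's XRCommonUsages features follow the same pattern:

```csharp
using UnityEngine;
using UnityEngine.XR;

public class DirectControllerReader : MonoBehaviour
{
    private void Update()
    {
        // Grab the controller device directly rather than going through the Input System.
        InputDevice controller = InputDevices.GetDeviceAtXRNode(XRNode.RightHand);
        if (!controller.isValid)
            return;

        if (controller.TryGetFeatureValue(CommonUsages.trigger, out float trigger))
            Debug.Log($"Trigger: {trigger}");

        if (controller.TryGetFeatureValue(CommonUsages.primary2DAxis, out Vector2 touchpad))
            Debug.Log($"Touchpad: {touchpad}");
    }
}
```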

Input Device Feature Values

It is recommended that developers use the Unity Input System to obtain the Gaze Input. However, developers can also obtain eye tracking input directly from the InputDevice. This section provides an example of how to read input using the TryGetFeatureValue method and EyeTrackingUsages features.
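
A minimal sketch of reading eye data directly from the InputDevice; this uses Unity's generic CommonUsages.eyesData usage, while the guide's EyeTrackingUsages features are read with the same TryGetFeatureValue pattern:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;

public class DirectEyeReader : MonoBehaviour
{
    private void Update()
    {
        // Look up the device that reports eye tracking data.
        var devices = new List<InputDevice>();
        InputDevices.GetDevicesWithCharacteristics(InputDeviceCharacteristics.EyeTracking, devices);
        if (devices.Count == 0)
            return;

        if (devices[0].TryGetFeatureValue(CommonUsages.eyesData, out Eyes eyesData) &&
            eyesData.TryGetFixationPoint(out Vector3 fixationPoint))
        {
            Debug.Log($"Fixation point: {fixationPoint}");
        }
    }
}
```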

Input Device Feature Values

It is recommended that developers read the controller input using Unity's Input System. However, developers can obtain the controller's input directly from the InputDevice. This section provides an example of reading input using the TryGetFeatureValue method and XRCommonUsages features.

Inputs Sample

This section describes how to read Magic Leap 2's controller input using Unity's Input System and Input Actions. The samples in this category assume that you are using the MagicLeapOpenXRInput.inputactions asset provided in the Magic Leap Samples. However, they can be easily modified to support custom input actions.

Magic Leap Eye Tracking Data

To obtain Magic Leap device-specific features, such as checking the eye tracking FixationConfidence status or whether the user is blinking, use Magic Leap's InputSubsystem.Extensions.

Migrating from ML1

This article provides an overview of the changes that were made to the Magic Leap SDK and how to access Magic Leap 1 equivalent features on the Magic Leap 2.

Quick Start

This section covers how to use the Magic Leap 2 Controller Interaction Profile with Unity's Input System. For more information about obtaining OpenXR input, see Unity's OpenXR Input documentation.

Quick Start

This section includes information on how to get started using Hand Tracking via Unity's XR Hands package and the OpenXR Hand Interaction Profile. See the Unity Manual for more information about the XR Hands package and OpenXR input.
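
A minimal sketch, assuming the XR Hands package and hand tracking are enabled: locate the XRHandSubsystem and read the right index fingertip pose.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.Hands;

public class FingertipReader : MonoBehaviour
{
    private XRHandSubsystem handSubsystem;

    private void Update()
    {
        // Lazily locate the running hand subsystem.
        if (handSubsystem == null)
        {
            var subsystems = new List<XRHandSubsystem>();
            SubsystemManager.GetSubsystems(subsystems);
            if (subsystems.Count > 0)
                handSubsystem = subsystems[0];
            return;
        }

        XRHand rightHand = handSubsystem.rightHand;
        if (!rightHand.isTracked)
            return;

        // Read the index fingertip joint pose.
        XRHandJoint indexTip = rightHand.GetJoint(XRHandJointID.IndexTip);
        if (indexTip.TryGetPose(out Pose pose))
            Debug.Log($"Index tip position: {pose.position}");
    }
}
```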

Reference Space Overview

Reference Spaces are different methods of defining the origin and orientation of the coordinate system used by the device within your application. They affect how the device tracks the position and rotation of the user and the environment. Different modes are suitable for various types of applications and experiences.
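
A related sketch using Unity's generic XRInputSubsystem API to query and set the tracking origin mode; the Magic Leap-specific reference space options are described in the guide itself, so treat this as an illustration of the general idea rather than the exact API:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;

public class TrackingOriginExample : MonoBehaviour
{
    private void Start()
    {
        var subsystems = new List<XRInputSubsystem>();
        SubsystemManager.GetSubsystems(subsystems);
        foreach (var subsystem in subsystems)
        {
            // Log what the runtime supports, then request a floor-relative origin.
            Debug.Log($"Supported modes: {subsystem.GetSupportedTrackingOriginModes()}");
            subsystem.TrySetTrackingOriginMode(TrackingOriginModeFlags.Floor);
        }
    }
}
```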

Runtime Configuration

This section provides details on how developers can create and register voice commands dynamically at runtime. This feature can be helpful when loading content dynamically.

Runtime Voice Intents Example

This section provides details on how developers can create voice commands at runtime. This feature can be helpful when loading content dynamically.

Simple Example

This section includes code examples for developers to reference when implementing voice input in their applications.

Start/Stop Input

This section provides details on how developers can start and stop tracking voice intents inside their applications.

Voice Input System Settings

The Voice Intent API will only register voice commands if Voice Input is enabled inside the Magic Leap 2's System Settings (Settings > Magic Leap Inputs > Voice). This section demonstrates how to check if a user has enabled Voice Input.

Voice Intents Overview

Magic Leap recommends using the Voice Intent Development Toolkit (VIDTK) to create and deploy custom voice commands. The toolkit provides validation and guidance within the toolkit UI that are not available in the Unity Editor. For more information, see Voice Commands.

Voice Slots

A Slot is a placeholder string for a set of potential values. The utterance will use one of the values, and the developer can apply different logic based on which value was spoken. To indicate the use of a slot, put the slot name within { } in the command.
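
A purely illustrative sketch of the slot concept, with a hypothetical slot named "Color" and a command that references it; the actual registration is done through the Voice Intents configuration described in this guide:

```csharp
using System.Collections.Generic;

public static class ColorSlotExample
{
    // Hypothetical slot values: the utterance will contain exactly one of these.
    public static readonly Dictionary<string, string[]> Slots = new Dictionary<string, string[]>
    {
        { "Color", new[] { "red", "green", "blue" } }
    };

    // The slot name wrapped in { } marks where a slot value may be spoken.
    public static readonly string Command = "Change the cube to {Color}";
}
```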

WebView

An overview of the WebView scene included in the Magic Leap 2 Examples Project, which uses Unity's XR Interaction Toolkit.

XRI Hand Interaction Demo

This guide will demonstrate how to configure the Unity XRI Hands Demo Scene to work with the Magic Leap 2.