Version: 14 Oct 2024

Overview

The transition to OpenXR involves the phasing out of MLSDK in favor of OpenXR extensions. To align with this change, Magic Leap’s Unity SDK is shifting from Unity's platform specific XR Plugin (ie: com.unity.xr.magicleap) to Unity's OpenXR Plugin (com.unity.xr.openxr). This shift is accompanied by the addition of new OpenXR Features and API for accessing the functions previously defined by MLSDK C-APIs. This document will outline the rationale behind the move to OpenXR, detail the steps involved in the migration process, and provide insights into the changes and enhancements introduced by this transition.

Why OpenXR?

OpenXR is an industry-standard framework for developing cross-platform XR applications. This migration aligns with Magic Leap’s transition from a custom Android-based OS (Lumin) to AOSP, simplifying the development process for Magic Leap 2. Adopting OpenXR offers the following advantages:

  • Cross-platform support: Facilitates the creation of XR solutions deployable across various platforms.
  • Reduced development overhead: Allows developers to use a wider range of development tools and resources.

MLSDK Interoperability

During the OpenXR transition, certain Magic Leap APIs may not be immediately available. However, developers can still use the MLSDK APIs where needed. In general, all existing ML APIs (Voice Intents, MLAudio, Camera, etc.) should continue to work when using OpenXR, with the exception of:

  • Graphics related features such as Global / Segmented Dimming and Headlocked mode
  • Subsystem-related logic, including Input and Meshing

OpenXR Features & Extensions

By default, the OpenXR API provides a limited set of functionality guaranteed to be available on all conformant devices. However, most devices support additional functionality beyond what the core OpenXR API provides. When developing an OpenXR application, if you wish to access functionality that is not part of the core OpenXR API, you must specify at build-time that your application requires the extension corresponding to that functionality. For example, if you want an application to be able to access hand tracking, you must specify the hand-tracking extension (XR_EXT_hand_tracking).

Unity’s OpenXR Plugin provides a C# API and a user interface within Project Settings to easily enable or disable functionality needed by the app, through a concept Unity calls OpenXR Features. When you define an OpenXR Feature in your project code, you specify one or more OpenXR extensions to be included at build-time when the Feature is enabled. In theory, the application and all the specified Features should work on any device whose runtime supports those extensions. Commonly available extensions bear the EXT or KHR prefix in their names, while Magic Leap vendor-specific extensions use the ML prefix and may not be part of other vendors’ runtimes. Magic Leap is developing its own set of vendor extensions to provide access to Magic Leap 2-specific features previously available via the MLSDK C-APIs.
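As a rough illustration of the Feature concept, the sketch below shows the general shape of a custom OpenXR Feature class in Unity's OpenXR Plugin. The class name, UiName, and FeatureId are invented for this example; the attribute's OpenxrExtensionStrings property is what declares the extension requirement at build-time.

```csharp
// Illustrative sketch only; names other than the Unity OpenXR Plugin types
// (OpenXRFeature, OpenXRFeatureAttribute) are hypothetical.
#if UNITY_EDITOR
using UnityEditor;
using UnityEditor.XR.OpenXR.Features;
#endif
using UnityEngine.XR.OpenXR.Features;

#if UNITY_EDITOR
[OpenXRFeature(UiName = "Example Hand Tracking Feature",     // hypothetical name
    BuildTargetGroups = new[] { BuildTargetGroup.Android },
    OpenxrExtensionStrings = "XR_EXT_hand_tracking",         // requested at build-time
    FeatureId = "com.example.feature.handtracking")]         // hypothetical id
#endif
public class ExampleHandTrackingFeature : OpenXRFeature
{
    // Called after the XrInstance is created. The requested extension is only
    // usable if the device's runtime actually supports it.
    protected override bool OnInstanceCreate(ulong xrInstance)
    {
        return base.OnInstanceCreate(xrInstance);
    }
}
```

Application code does not usually subclass OpenXRFeature itself; the Magic Leap Unity SDK ships such Feature classes, and developers simply enable them in Project Settings.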

Developers can access the OpenXR APIs immediately after installing the Magic Leap Unity SDK alongside Unity's OpenXR Plugin in their Unity projects.

Changes to the com.unity.xr.magicleap Dependency

The OpenXR Plugin (com.unity.xr.openxr) replaces the previous Magic Leap XR Plugin as the XR provider for Magic Leap devices, and includes an alternative to the ml_graphics API for rendering content to the Magic Leap 2.

Replacing the Magic Leap XR Plugin with the OpenXR Plugin achieves the following:

  • Unity renders to the ML2 display via OpenXR
  • ml_graphics is migrated to corresponding OpenXR extensions as Features within our SDK.
  • Magic Leap features can be accessed directly using OpenXR Features API

OpenXR Feature API

Previously, developers needed to implement Magic Leap-specific code for their application to function on Magic Leap 2; the C-API headers were all Magic Leap specific. With the transition to OpenXR, developers can create Magic Leap 2 applications using the standard OpenXR workflow, implementing device-specific functionality through Unity's OpenXR Features API.

note

OpenXR Extensions and features cannot be enabled at runtime. Developers should ensure that the extensions and interaction profiles are enabled inside their project's OpenXR Settings before building their applications.
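Since extensions cannot be enabled at runtime, a defensive check at startup can catch misconfigured projects early. The sketch below uses Unity's OpenXRRuntime.IsExtensionEnabled to verify that an extension was actually enabled for the session; the extension name shown is the XR_ML_global_dimmer extension cited later in this document, and the class name is illustrative.

```csharp
// Sketch of a startup sanity check, assuming Unity's OpenXR Plugin is active.
using UnityEngine;
using UnityEngine.XR.OpenXR;

public class ExtensionCheck : MonoBehaviour
{
    void Start()
    {
        // Reports whether the runtime enabled the extension for this session.
        if (!OpenXRRuntime.IsExtensionEnabled("XR_ML_global_dimmer"))
        {
            Debug.LogWarning("Global dimmer extension is not enabled. " +
                "Enable the corresponding feature in Project Settings > " +
                "XR Plug-in Management > OpenXR before building.");
        }
    }
}
```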

Example

In the Unity SDK, each OpenXR Feature is encapsulated within its own class, providing a structured approach to accessing the new API equivalents of the legacy ML APIs. Developers can interact with these new APIs by utilizing corresponding Feature classes.

Take the GlobalDimmer as an example. Previously, under the MLSDK, setting the dimmer value to 0.6 would involve the following code:

Legacy MLSDK API
MLGlobalDimmer.SetValue(0.6f);

With the transition to OpenXR, the GlobalDimmer functionality is now accessible through the MagicLeapRenderingExtensionsFeature class. The new code to achieve the same result is:

Open XR API
var renderFeature = OpenXRSettings.Instance.GetFeature<MagicLeapRenderingExtensionsFeature>();
renderFeature.GlobalDimmerValue = 0.6f;

OpenXR Features

User Calibration

In the legacy Magic Leap SDK, user calibration was handled by two separate APIs:

Legacy MLSDK
MLHeadsetFit.GetState(out state);
MLEyeCalibration.GetState(out state);

With OpenXR, these functionalities are unified into a single feature, streamlining the process for developers:

Open XR API
userCalibrationFeature = OpenXRSettings.Instance.GetFeature<MagicLeapUserCalibrationFeature>();

userCalibrationFeature.GetLastHeadsetFit(out var headsetFitData);
userCalibrationFeature.GetLastEyeCalibration(out var eyeCalibrationData);

Planes

The C-API implementation of planes was integrated into Unity's Planes Subsystem, and that subsystem is retained in the OpenXR transition. Because OpenXR continues to use the same Planes Subsystem, external developers should not need to make any changes, ensuring a seamless migration.
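Because the Planes Subsystem is unchanged, the usual AR Foundation workflow still applies. A minimal sketch using ARPlaneManager (from com.unity.xr.arfoundation); the component and listener names here are illustrative:

```csharp
// Sketch, assuming an AR Foundation scene with an ARPlaneManager attached.
using UnityEngine;
using UnityEngine.XR.ARFoundation;

[RequireComponent(typeof(ARPlaneManager))]
public class PlaneListener : MonoBehaviour
{
    ARPlaneManager planeManager;

    void OnEnable()
    {
        planeManager = GetComponent<ARPlaneManager>();
        planeManager.planesChanged += OnPlanesChanged;
    }

    void OnDisable()
    {
        planeManager.planesChanged -= OnPlanesChanged;
    }

    void OnPlanesChanged(ARPlanesChangedEventArgs args)
    {
        // args also exposes updated and removed planes.
        foreach (var plane in args.added)
            Debug.Log($"Plane detected: {plane.trackableId}, size {plane.size}");
    }
}
```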

Marker Understanding

In the C-API implementation, a significant amount of setup and parsing was required to read the marker data, which added a layer of difficulty to using this API.

Legacy MLSDK
//Setup
MLMarkerTracker.TrackerSettings.Create(customProfile: customSettings);
MLMarkerTracker.SetSettingsAsync(newSettings);
MLMarkerTracker.StartScanningAsync(currentSettings);

//Event
MLMarkerTracker.OnMLMarkerTrackerResultsFound

//Shutdown
MLMarkerTracker.StopScanningAsync();

OpenXR simplifies this with a more uniform approach:

Open XR API
//Setup
markerFeature = OpenXRSettings.Instance.GetFeature<MagicLeapMarkerUnderstandingFeature>();
markerFeature.CreateMarkerDetector(settings);

//Update
markerFeature.UpdateMarkerDetectors();

//Modify Marker Detector Settings
markerFeature.ModifyMarkerDetector(settings, ref markerDetectorToModify);

//Read results from a detector
var markerDetector = markerFeature.MarkerDetectors[currentMarkerIndex];

//When removing or shutting down
markerFeature.DestroyMarkerDetector(markerDetector);

Segmented Dimmer

Segmented Dimmer is enabled in OpenXR by setting the XrEnvironmentBlendMode to ALPHA_BLEND. Set the blend mode to ADDITIVE to disable the Segmented Dimmer in Unity.

This approach mirrors the legacy ML API, where enabling the Segmented Dimmer was an opt-in feature:

Legacy MLSDK API:

Legacy MLSDK
MLSegmentedDimmer.Activate(); // under the hood, this just changes BlendMode to ALPHA_BLEND
MLSegmentedDimmer.Deactivate(); // and this changes it back to ADDITIVE

OpenXR API:

Open XR API
renderFeature = OpenXRSettings.Instance.GetFeature<MagicLeapRenderingExtensionsFeature>();
renderFeature.BlendMode = XrEnvironmentBlendMode.AlphaBlend; // enables Segmented Dimmer
// To disable, switch back to additive blending:
// renderFeature.BlendMode = XrEnvironmentBlendMode.Additive;

Global Dimmer

The Global Dimmer is a part of the RenderingExtensions feature, which provides the XR_ML_global_dimmer extension.

OpenXR Method

Open XR API
renderFeature = OpenXRSettings.Instance.GetFeature<MagicLeapRenderingExtensionsFeature>();
renderFeature.GlobalDimmerEnabled = true;
renderFeature.GlobalDimmerValue = 1.0f;

Setting Focus Distance

Setting the focus distance is done using the RenderingExtensions feature.

OpenXR Method

Open XR API
renderFeature = OpenXRSettings.Instance.GetFeature<MagicLeapRenderingExtensionsFeature>();
mainCamera.stereoConvergence = 100f;
renderFeature.FocusDistance = mainCamera.stereoConvergence;

Interaction Profiles

Controller

Legacy MLSDK

There are several ways of getting data from an input device. Previously, we recommended using the MagicLeapInputs.ControllerActions class that was part of the pre-defined Magic Leap Inputs input asset, created specifically for Magic Leap 2:

Legacy MLSDK
controllerActions = new MagicLeapInputs.ControllerActions(mlInputs);

OpenXR

In OpenXR, the controller data is accessed through the standard Unity XR structures and features, such as:

  • TrackedPoseDriver
  • XRController
  • Input Devices
  • Input Action Map

The new SDK supports the common input bindings that are available across a wide range of platforms. Developers can use them to get the data they need from the controller, such as devicePosition, deviceRotation, menuButton, etc. The Magic Leap 2 Controller Interaction Profile defines which XRCommonUsages are supported by the controller, and maps them to the corresponding Magic Leap controller inputs.
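As a hedged sketch of this workflow, the component below reads controller data through Unity's Input System using common XRController binding paths, assuming the Magic Leap 2 Controller Interaction Profile is enabled in the project's OpenXR settings. The class name is illustrative; in practice, bindings are usually defined in an Input Action asset rather than in code.

```csharp
// Sketch of reading controller pose and button state via the Input System.
using UnityEngine;
using UnityEngine.InputSystem;

public class ControllerReader : MonoBehaviour
{
    InputAction positionAction;
    InputAction menuAction;

    void OnEnable()
    {
        // Common usages exposed by the Magic Leap 2 Controller Interaction Profile.
        positionAction = new InputAction(binding: "<XRController>{RightHand}/devicePosition");
        menuAction = new InputAction(binding: "<XRController>{RightHand}/menuButton");
        positionAction.Enable();
        menuAction.Enable();
    }

    void Update()
    {
        Vector3 position = positionAction.ReadValue<Vector3>();
        if (menuAction.WasPressedThisFrame())
            Debug.Log($"Menu pressed at {position}");
    }

    void OnDisable()
    {
        positionAction.Disable();
        menuAction.Disable();
    }
}
```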

Features Not Yet Supported

  • Controller Gestures

Gaze Tracking

The gaze tracking feature enables eye tracking on the ML2, which allows developers to get the position and rotation of the user’s gaze, as well as the tracking state.

Legacy MLSDK

Previously, there were two ways to get eye data.

  1. Eye Actions
Legacy MLSDK
eyesActions = new MagicLeapInputs.EyesActions(mlInputs); 
eyesActions.Data.ReadValue<UnityEngine.InputSystem.XR.Eyes>();
  2. Input Device
Legacy MLSDK
eyesDevice = InputSubsystem.Utils.FindMagicLeapDevice(InputDeviceCharacteristics.EyeTracking | InputDeviceCharacteristics.TrackedDevice);

// Eye data specific to Magic Leap
InputSubsystem.Extensions.TryGetEyeTrackingState(eyesDevice, out var trackingState);

OpenXR

In OpenXR, the eye data is accessed through the standard Input Devices API, without the need for any Magic Leap specific functions:

Open XR API
var inputDeviceList = new List<InputDevice>();
InputDevices.GetDevicesWithCharacteristics(InputDeviceCharacteristics.EyeTracking, inputDeviceList);
var eyeTracking = inputDeviceList.FirstOrDefault();
eyeTracking.TryGetFeatureValue(CommonUsages.isTracked, out bool isTracked);
eyeTracking.TryGetFeatureValue(EyeTrackingUsages.gazePosition, out Vector3 position);
eyeTracking.TryGetFeatureValue(EyeTrackingUsages.gazeRotation, out Quaternion rotation);

This allows developers to get the eye data as common usages that are supported by the Unity Gaze Interaction Profile, which is a standard profile for eye tracking devices. The gaze position and rotation represent the combined gaze of both eyes, and the tracking state indicates whether the device is tracking the user’s eyes or not.