Version: 21 Aug 2024

API Overview

This extension allows developers to interact with the Magic Leap 2's pixel sensors within their applications. It provides APIs for managing sensor data acquisition from various sensor types, each with unique capabilities and requirements. The following sections will delve deeper into these concepts and API usage.

Namespace
using MagicLeap.OpenXR.Features.PixelSensors;
caution

This feature requires the Magic Leap 2 Pixel Sensor OpenXR Feature to be enabled in your project's OpenXR Settings (Edit > Project Settings > XR Plug-in Management > OpenXR).

Key Concepts

Pixel Sensor: A device that collects data using a grid-based array. The runtime may support multiple types of pixel sensors, each tailored to specific sensing tasks. Examples include a depth sensing camera, two eye sensing cameras, and two world sensing cameras.

  • Sensor Permissions: Before an application can access data from a sensor, appropriate permissions must be granted. For detailed guidance on configuring permissions, refer to the Sensor Permissions section.
  • Sensor Stream: Sensors may support multiple data streams, each optimized for different ranges or types of data collection. For instance, a depth pixel sensor might offer both short-range and long-range sensing streams. More information can be found in the Sensor Streams section.
  • Sensor Capability: Sensors possess configurable capabilities that can be tailored to the needs of the application, such as frame rate, resolution, and exposure time. Detailed configuration options are discussed in the Sensor Capabilities section.
  • Sensor Metadata: In addition to raw data, sensors often provide metadata that offers additional context about the captured data, such as exposure time and the camera model used. This is explored further in the Sensor Metadata section.
  • Camera Models: Each sensor is associated with a camera model that mathematically describes how it interprets and converts the three-dimensional world into two-dimensional images. For more information, visit the Camera Models section.

API Functions

The extension provides methods for:

  • Enumerating all available sensors.
  • Creating and destroying sensor handles.
  • Monitoring sensor availability.
  • Querying the number of streams a sensor supports.
  • Enumerating stream capabilities.
  • Querying and configuring sensor capabilities.
  • Enumerating and accessing sensor metadata.
  • Starting and stopping sensor streams.
  • Querying sensor data.

Enumerate the sensors

This API call retrieves a list of all sensor types supported by the device. It does not indicate whether these sensors are currently available or enabled.

Method Definition

 public List<PixelSensorId> GetSupportedSensors()

This function returns a List<PixelSensorId>, where each PixelSensorId represents a unique sensor supported by the device, providing its name and XR path.

Example Usage

The following Unity script demonstrates how to enumerate all supported sensors at the start of the application. It also checks if the MagicLeapPixelSensorFeature is enabled before attempting to list the sensors.

using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.OpenXR;
using MagicLeap.OpenXR.Features.PixelSensors;

public class PixelSensorScratchPad : MonoBehaviour
{
    private MagicLeapPixelSensorFeature pixelSensorFeature;

    // This method is called before the first frame update
    void Start()
    {
        // Attempt to retrieve the MagicLeapPixelSensorFeature from the OpenXR settings
        pixelSensorFeature = OpenXRSettings.Instance.GetFeature<MagicLeapPixelSensorFeature>();

        // Check if the feature is available and enabled
        if (pixelSensorFeature == null || !pixelSensorFeature.enabled)
        {
            Debug.LogWarning("Magic Leap Pixel Sensor Feature is not available or not enabled.");
            enabled = false; // Disable this script if the feature is unavailable or disabled
            return;
        }

        // Get the list of supported sensors and log their details
        List<PixelSensorId> supportedSensors = pixelSensorFeature.GetSupportedSensors();
        foreach (var sensor in supportedSensors)
        {
            Debug.Log($"Supported Pixel Sensor: {sensor.SensorName} (XR Path: {sensor.XrPath})");
        }
    }
}

Notes

  • The GetSupportedSensors method returns all sensors supported by the hardware and firmware of the device. This does not necessarily mean that all returned sensors are currently operable or available for use in the application. Sensors may be unavailable due to privacy settings, user permissions, or other runtime conditions.
  • Developers should implement additional checks and handling based on the application's requirements and expected sensor availability.
  • The list returned by GetSupportedSensors may change with different versions of hardware or software, so it is advisable to query this list dynamically rather than caching the results.

Supported Sensors & Paths

This extension supports the following cameras:

| Sensor Name | XR Path | Description |
| --- | --- | --- |
| Picture Center | /pixelsensor/picture/center | RGB camera on the front of the headset. |
| World Left | /pixelsensor/world/left | Camera located on the left corner of the headset. |
| World Center | /pixelsensor/world/center | Camera located on the front of the headset. |
| World Right | /pixelsensor/world/right | Camera located on the right corner of the headset. |
| Depth Center | /pixelsensor/depth/center | Depth camera located on the front of the headset. |
| Eye Temple Left | /pixelsensor/eye/temple/left | Left temple eye camera. |
| Eye Nasal Left | /pixelsensor/eye/nasal/left | Left nasal eye camera. |
| Eye Nasal Right | /pixelsensor/eye/nasal/right | Right nasal eye camera. |
| Eye Temple Right | /pixelsensor/eye/temple/right | Right temple eye camera. |

Sensor Permissions

Sensors require permissions before an application can access their data. Android applications must declare the required permission, listed in the table below, in their manifest before opening the corresponding sensors.

| Permission | Sensor Id |
| --- | --- |
| com.magicleap.permission.DEPTH_CAMERA (protection level: dangerous) | /pixelsensor/depth/center |
| com.magicleap.permission.EYE_CAMERA (protection level: dangerous) | /pixelsensor/eye/temple/left, /pixelsensor/eye/nasal/left, /pixelsensor/eye/nasal/right, /pixelsensor/eye/temple/right |
| android.permission.CAMERA (protection level: dangerous) | /pixelsensor/world/left, /pixelsensor/world/center, /pixelsensor/world/right, /pixelsensor/picture/center |
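Because these permissions have a dangerous protection level, they must be declared in the application's AndroidManifest.xml and also requested at runtime. The following is a minimal sketch of a runtime check using Unity's Android permission API; the permission string is taken from the table above, and the class name is an example:

```csharp
using UnityEngine;
using UnityEngine.Android;

public class SensorPermissionRequester : MonoBehaviour
{
    // Permission string from the table above. It must also be declared in AndroidManifest.xml:
    // <uses-permission android:name="com.magicleap.permission.DEPTH_CAMERA" />
    private const string DepthCameraPermission = "com.magicleap.permission.DEPTH_CAMERA";

    void Start()
    {
        // Dangerous permissions must be requested at runtime in addition to
        // being declared in the manifest.
        if (!Permission.HasUserAuthorizedPermission(DepthCameraPermission))
        {
            var callbacks = new PermissionCallbacks();
            callbacks.PermissionGranted += _ => Debug.Log("Depth camera permission granted.");
            callbacks.PermissionDenied += _ => Debug.LogWarning("Depth camera permission denied.");
            Permission.RequestUserPermission(DepthCameraPermission, callbacks);
        }
    }
}
```

Only open a sensor gated by a dangerous permission after the corresponding grant callback has fired.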

Create Pixel Sensor

Creates a pixel sensor of the given type. Returns true if the sensor was created, and false if the sensor is not available.

When the sensor is created, the runtime will mark the sensor as unavailable.

Method Definition

public bool CreatePixelSensor(PixelSensorId sensorType)

Obtaining a Pixel Sensor ID

You can find a pixel sensor by name after querying the supported sensors with the feature's GetSupportedSensors function.

       PixelSensorId sensorType = availableSensors.Find(x => x.SensorName == "World Center");

Example Usage

The following example demonstrates how to create a sensor of type "World Center". It assumes that MagicLeapPixelSensorFeature has been properly initialized and enabled.

using UnityEngine;
using UnityEngine.XR.OpenXR;
using MagicLeap.OpenXR.Features.PixelSensors;

public class PixelSensorManager : MonoBehaviour
{
    private MagicLeapPixelSensorFeature pixelSensorFeature;

    void Start()
    {
        pixelSensorFeature = OpenXRSettings.Instance.GetFeature<MagicLeapPixelSensorFeature>();

        // Assuming the feature is initialized and enabled
        var sensorType = pixelSensorFeature.GetSupportedSensors().Find(s => s.SensorName == "World Center");
        bool wasCreated = pixelSensorFeature.CreatePixelSensor(sensorType);
        Debug.Log($"Sensor creation was successful: {wasCreated}");
    }
}

Destroy Sensor

Destroys a previously created pixel sensor. Returns true if the sensor was successfully destroyed. This is required to release the sensor so that it can be used by other processes.

Method Definition

public bool DestroyPixelSensor(PixelSensorId sensorType)

Example Usage

The following example demonstrates how to destroy a sensor of a given type. It assumes that MagicLeapPixelSensorFeature has been properly initialized and enabled.

using UnityEngine;
using UnityEngine.XR.OpenXR;
using MagicLeap.OpenXR.Features.PixelSensors;

public class PixelSensorManager : MonoBehaviour
{
    private MagicLeapPixelSensorFeature pixelSensorFeature;

    void Start()
    {
        pixelSensorFeature = OpenXRSettings.Instance.GetFeature<MagicLeapPixelSensorFeature>();

        // Assuming the feature is initialized and enabled
        var sensorType = pixelSensorFeature.GetSupportedSensors().Find(s => s.SensorName == "World Center");
        bool wasDestroyed = pixelSensorFeature.DestroyPixelSensor(sensorType);
        Debug.Log($"Sensor destruction was successful: {wasDestroyed}");
    }
}

Sensor Availability Changed

Sensor availability can change dynamically due to various reasons, such as hardware limitations, system policies, or user permissions. Applications must handle these changes to manage sensor operations appropriately. The MagicLeapPixelSensorFeature class provides an event to notify about these changes.

Event Declaration

  public event Action<PixelSensorId, bool> OnSensorAvailabilityChanged

This event is raised whenever the availability of a sensor changes, passing the PixelSensorId of the affected sensor and a bool indicating its new availability state (true for available, false for unavailable). Applications can use this event to wait for a sensor to become available before calling CreatePixelSensor().

Usage Example

using UnityEngine;
using UnityEngine.XR.OpenXR;
using MagicLeap.OpenXR.Features.PixelSensors;

public class PixelSensorManager : MonoBehaviour
{
    private MagicLeapPixelSensorFeature pixelSensorFeature;

    void Start()
    {
        pixelSensorFeature = OpenXRSettings.Instance.GetFeature<MagicLeapPixelSensorFeature>();
        pixelSensorFeature.OnSensorAvailabilityChanged += HandleSensorAvailabilityChanged;
    }

    private void HandleSensorAvailabilityChanged(PixelSensorId sensorId, bool isAvailable)
    {
        Debug.Log($"Sensor {sensorId.SensorName} availability changed: {(isAvailable ? "Available" : "Unavailable")}");
    }

    void OnDestroy()
    {
        // It's important to unsubscribe from the event when the object is destroyed
        if (pixelSensorFeature != null)
        {
            pixelSensorFeature.OnSensorAvailabilityChanged -= HandleSensorAvailabilityChanged;
        }
    }
}
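The handler in the example above only logs the change. A common pattern is to defer sensor creation until the sensor reports as available. The following is a sketch of such a handler (a drop-in variant of HandleSensorAvailabilityChanged, assuming pixelSensorFeature was initialized as above; the sensor name is an example):

```csharp
// Sketch: create the sensor only once it is reported as available.
private void HandleSensorAvailabilityChanged(PixelSensorId sensorId, bool isAvailable)
{
    // Ignore unavailability events and sensors we are not interested in.
    if (!isAvailable || sensorId.SensorName != "World Center")
    {
        return;
    }

    bool wasCreated = pixelSensorFeature.CreatePixelSensor(sensorId);
    Debug.Log($"Deferred sensor creation was successful: {wasCreated}");
}
```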

Get Sensor Stream Count

Sensors may support multiple data streams to provide different types of data, such as short-range and long-range sensing for a depth camera, or different resolutions and frame formats for a color camera.

Method Definition

The following API method retrieves the number of data streams supported by a specific sensor. Streams are indexed from 0.

public uint GetStreamCount(PixelSensorId sensorType)

Usage Example

This example demonstrates how to query the number of streams for a specific sensor type.

using UnityEngine;
using UnityEngine.XR.OpenXR;
using MagicLeap.OpenXR.Features.PixelSensors;

public class PixelSensorManager : MonoBehaviour
{
    private MagicLeapPixelSensorFeature pixelSensorFeature;

    void Start()
    {
        pixelSensorFeature = OpenXRSettings.Instance.GetFeature<MagicLeapPixelSensorFeature>();

        // Assuming the feature is initialized and enabled
        PixelSensorId sensorType = pixelSensorFeature.GetSupportedSensors().Find(s => s.SensorName == "Depth Center");
        uint numberOfStreams = pixelSensorFeature.GetStreamCount(sensorType);
        Debug.Log($"Number of streams for {sensorType.SensorName}: {numberOfStreams}");
    }
}

Enumerate Stream Capabilities

Use GetPixelSensorCapabilities to query the list of PixelSensorCapability values that can be configured for each stream. Each capability is identified by its type (PixelSensorCapabilityType), represented by a specific data type (PixelSensorCapabilityDataType), and defined by a range type (PixelSensorCapabilityRangeType).

public bool GetPixelSensorCapabilities(PixelSensorId sensorType, uint streamIndex, out PixelSensorCapability[] capabilities)

Capabilities

| Enum | Description |
| --- | --- |
| PixelSensorCapabilityType.UpdateRate | Data rate per second; must be specified by the application. Data type is PixelSensorCapabilityDataType.UnsignedInt32. |
| PixelSensorCapabilityType.Resolution | Resolution to configure; must be specified by the application. Data type is PixelSensorCapabilityDataType.Extent2D. |
| PixelSensorCapabilityType.Format | Data format; must be specified by the application. Data type is PixelSensorFrameFormat. |
| PixelSensorCapabilityType.Depth | Range of a depth sensor. Data type is PixelSensorCapabilityDataType.Float. |
| PixelSensorCapabilityType.MixedReality | Camera frame and digital content will be blended into a single frame. Data type is PixelSensorCapabilityDataType.PixelSensorRealityMode. |
| PixelSensorCapabilityType.ManualExposureTime | Exposure time in milliseconds; if not specified, the runtime must use auto exposure. Data type is PixelSensorCapabilityDataType.UnsignedInt32. |
| PixelSensorCapabilityType.AnalogGain | Analog gain. Higher gain is useful in low-light conditions but may introduce noise. Data type is PixelSensorCapabilityDataType.UnsignedInt32. |
| PixelSensorCapabilityType.DigitalGain | Digital gain. Higher gain is useful in low-light conditions but may introduce noise. Data type is PixelSensorCapabilityDataType.UnsignedInt32. |
| PixelSensorCapabilityType.AutoExposureMode | Auto exposure mode. Data type is PixelSensorCapabilityDataType.PixelSensorAutoExposureMode. |
| PixelSensorCapabilityType.AutoExposureTargetBrightness | Target brightness for auto exposure mode. Data type is PixelSensorCapabilityDataType.Float. |

Capability Data Types

The data type used to represent a certain capability. For example, PixelSensorCapabilityType.UpdateRate is represented as a PixelSensorCapabilityDataType.UnsignedInt32 while PixelSensorCapabilityType.Resolution is represented as a PixelSensorCapabilityDataType.Extent2D.

| Enum | Description |
| --- | --- |
| PixelSensorCapabilityDataType.Boolean | Capability is a bool value. |
| PixelSensorCapabilityDataType.UnsignedInt32 | Capability is an unsigned integer value. |
| PixelSensorCapabilityDataType.Float | Capability is a float value. |
| PixelSensorCapabilityDataType.Extent2D | Capability is a vector of two integers. |

Range Types

Defines the permissible range for capability values:

| Enum | Description |
| --- | --- |
| PixelSensorCapabilityRangeType.Boolean | Capability has only two valid states, true or false. |
| PixelSensorCapabilityRangeType.Continuous | Capability can take any value in the given [min, max] range. |
| PixelSensorCapabilityRangeType.Discrete | Capability can take any of the discrete values in the list. |

Frame Formats

Describes the format of data produced by the sensors:

| Enum | Description |
| --- | --- |
| PixelSensorFrameFormat.Grayscale | Each pixel is 1 byte and represents a grayscale value. Data type of the corresponding frame buffer is uint8_t. |
| PixelSensorFrameFormat.Rgba8888 | Each pixel is 4 bytes and represents the R, G, B, and A channels in that order. Data type of the corresponding frame buffer is uint8_t. |
| PixelSensorFrameFormat.Yuv420888 | Frame is represented in the YUV_420_888 planar format. Data type of the corresponding frame buffer is uint8_t. |
| PixelSensorFrameFormat.Jpeg | Frame is JPEG encoded. |
| PixelSensorFrameFormat.Depth32 | Represents depth: the radial distance (in meters) of the real-world location with respect to the depth camera. Data type is float. |
| PixelSensorFrameFormat.DepthRaw | Raw pixel data representing light captured by the sensor. For depth cameras that have a projector, this raw frame will include frames captured both when the projector is on and off. Refer to PixelSensorDepthFrameIlluminationType for more details. Data type is float. |
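As a rough guide to sizing frame buffers, the per-pixel sizes above imply the following byte counts per frame. This is an illustrative sketch only: it assumes tightly packed buffers with no row padding, Yuv420888 uses the standard 1.5 bytes-per-pixel 4:2:0 planar layout, and Jpeg is variable-length, so it is excluded.

```csharp
using MagicLeap.OpenXR.Features.PixelSensors;

public static class FrameBufferSizes
{
    // Approximate bytes needed for one frame of each fixed-size format.
    // Assumes tightly packed buffers with no row padding.
    public static long BytesPerFrame(PixelSensorFrameFormat format, int width, int height)
    {
        long pixels = (long)width * height;
        switch (format)
        {
            case PixelSensorFrameFormat.Grayscale: return pixels;          // 1 byte per pixel
            case PixelSensorFrameFormat.Rgba8888:  return pixels * 4;      // 4 bytes per pixel
            case PixelSensorFrameFormat.Yuv420888: return pixels * 3 / 2;  // planar YUV 4:2:0
            case PixelSensorFrameFormat.Depth32:   return pixels * 4;      // one float per pixel
            case PixelSensorFrameFormat.DepthRaw:  return pixels * 4;      // one float per pixel
            default: return -1; // Jpeg and unknown formats are variable-length
        }
    }
}
```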

Mixed Reality Modes

| Enum | Description |
| --- | --- |
| PixelSensorRealityMode.Mixed | Camera frame and digital content will be blended into a single frame. |
| PixelSensorRealityMode.Camera | Only the camera frame will be captured. |
| PixelSensorRealityMode.Virtual | Only virtual content will be captured. |

Auto Exposure Modes

| Enum | Description |
| --- | --- |
| PixelSensorAutoExposureMode.EnvironmentTracking | Exposure mode optimized for environment tracking. |
| PixelSensorAutoExposureMode.ProximityIrTracking | Exposure mode optimized for a close-proximity IR light source. |

Example Usage

This example demonstrates how to enumerate and log the capabilities for each stream of a specified sensor.

using UnityEngine;
using MagicLeap.OpenXR.Features.PixelSensors;

public class PixelSensorCapabilityLogger : MonoBehaviour
{
    private MagicLeapPixelSensorFeature pixelSensorFeature;
    private PixelSensorId sensorType;

    void Start()
    {
        // Assuming pixelSensorFeature is initialized and enabled
        uint totalStreamCount = pixelSensorFeature.GetStreamCount(sensorType);
        for (uint streamIndex = 0; streamIndex < totalStreamCount; streamIndex++)
        {
            if (pixelSensorFeature.GetPixelSensorCapabilities(sensorType, streamIndex, out PixelSensorCapability[] capabilities))
            {
                foreach (var capability in capabilities)
                {
                    Debug.Log($"Stream {streamIndex}: Capability Type: {capability.CapabilityType}, Data Type: {capability.CapabilityDataType}, Range Type: {capability.CapabilityRangeType}");
                }
            }
            else
            {
                Debug.LogWarning($"Failed to get capabilities for stream index {streamIndex}");
            }
        }
    }
}

Query Sensor Capability Ranges

This method allows querying the valid ranges of capabilities for each sensor stream. It is critical to understand the dependencies between different capabilities, as setting one may restrict the allowable ranges of others.

Method Definition

The valid range of the capabilities for each of the sensor streams may be queried using PixelSensorFeature.QueryPixelSensorCapability.

 public bool QueryPixelSensorCapability(PixelSensorId sensorType, PixelSensorCapabilityType capabilityType, uint streamIndex, out PixelSensorCapabilityRange capabilityRange)
Interdependencies and Iterative Configuration

Some stream capabilities may be interdependent. For instance, the choice of frame rate might influence the valid ranges for frame resolution and vice versa. In a scenario where a sensor supports multiple streams, configuring one stream might affect the allowable configurations of another. For example, if a sensor with two streams has an aggregate frame rate limit of 60fps, configuring one stream at 40fps would limit the second stream to a maximum of 20fps.

Setting up a sensor typically requires an iterative approach. Start by querying the valid range for one capability, select a value within that range, and use this value to query the next capability. Continue this process until all necessary capabilities are configured.

Capability ranges can be discrete sets of values, or continuous ranges with defined upper and lower bounds.
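The iterative flow described above can be sketched as follows. This is illustrative only, built from the query-and-apply calls shown elsewhere on this page; it assumes the feature, sensor, and stream are already set up, and simply applies each capability's default so that earlier choices constrain the ranges queried for later ones:

```csharp
using UnityEngine;
using MagicLeap.OpenXR.Features.PixelSensors;

public class IterativeConfigurator : MonoBehaviour
{
    private MagicLeapPixelSensorFeature pixelSensorFeature;

    // Illustrative sketch: configure capabilities one at a time, re-querying
    // ranges so each choice reflects constraints from earlier choices.
    private void ConfigureIteratively(PixelSensorId sensorType, uint streamIndex)
    {
        // Order matters: configure the capabilities your application cares about most first.
        var order = new[]
        {
            PixelSensorCapabilityType.UpdateRate,
            PixelSensorCapabilityType.Resolution,
            PixelSensorCapabilityType.Format,
        };

        foreach (var capabilityType in order)
        {
            // Query the currently valid range for this capability...
            if (!pixelSensorFeature.QueryPixelSensorCapability(sensorType, capabilityType,
                    streamIndex, out var range) || !range.IsValid)
            {
                continue;
            }

            // ...and apply a value from that range (here simply the default),
            // which may narrow the valid ranges of the capabilities that follow.
            PixelSensorConfigData configData = range.GetDefaultConfig(streamIndex);
            pixelSensorFeature.ApplySensorConfig(sensorType, configData);
        }
    }
}
```

In a real application, replace the default with a value selected from the queried range, as shown in the range-handling examples below.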

Get Default Range Configuration

Using the GetDefaultConfig function on a capability's range, developers can get a preset PixelSensorConfigData for the device. This is useful when setting parameters that have only one option or when the default value is desired.

using UnityEngine;
using MagicLeap.OpenXR.Features.PixelSensors;

public class PixelSensorScratchPad : MonoBehaviour
{
    private MagicLeapPixelSensorFeature pixelSensorFeature;
    private PixelSensorId sensorType;
    private uint streamIndex = 0;

    private void Example()
    {
        pixelSensorFeature.GetPixelSensorCapabilities(sensorType, streamIndex, out var capabilities);
        foreach (var pixelSensorCapability in capabilities)
        {
            if (pixelSensorFeature.QueryPixelSensorCapability(sensorType, pixelSensorCapability.CapabilityType,
                    streamIndex, out var range) && range.IsValid)
            {
                // Get the default configuration for this capability...
                PixelSensorConfigData configData = range.GetDefaultConfig(streamIndex);
                // ...then apply it
                pixelSensorFeature.ApplySensorConfig(sensorType, configData);
            }
        }
    }
}

Handling Different Range Types

The valid range for a capability can be a set of boolean states, discrete sets of values, or a continuous range with an upper and lower bound.

Example Implementation

using UnityEngine;
using MagicLeap.OpenXR.Features.PixelSensors;

public class PixelSensorScratchPad : MonoBehaviour
{
    private MagicLeapPixelSensorFeature pixelSensorFeature;
    private PixelSensorId sensorType;
    private uint streamIndex = 0;

    private void Example()
    {
        pixelSensorFeature.GetPixelSensorCapabilities(sensorType, streamIndex, out var capabilities);
        foreach (var pixelSensorCapability in capabilities)
        {
            if (pixelSensorFeature.QueryPixelSensorCapability(sensorType, pixelSensorCapability.CapabilityType,
                    streamIndex, out var range) && range.IsValid)
            {
                switch (range.RangeType)
                {
                    // See the examples below on how to handle each range type
                    case PixelSensorCapabilityRangeType.Boolean:
                        HandleBooleanRange(range);
                        break;
                    case PixelSensorCapabilityRangeType.Continuous:
                        HandleContinuousRange(range);
                        break;
                    case PixelSensorCapabilityRangeType.Discrete:
                        HandleDiscreteRange(range);
                        break;
                }
            }
        }
    }
}

Boolean Range

If the range type is Boolean and the data type is Boolean, the capability can be toggled (true/false); it has only two valid states.

private void HandleBooleanRange(PixelSensorCapabilityRange range)
{
    if (range.DataType == PixelSensorCapabilityDataType.Boolean)
    {
        Debug.Log("Capability can be toggled");
    }
}
Continuous Range

If the range type is continuous, the capability has a minimum and maximum value. These values can be a float or an unsigned integer.

private void HandleContinuousRange(PixelSensorCapabilityRange range)
{
    if (range.DataType == PixelSensorCapabilityDataType.Float)
    {
        Debug.Log($"Continuous float range: Min = {range.FloatRange.Min}, Max = {range.FloatRange.Max}");
    }
    else if (range.DataType == PixelSensorCapabilityDataType.UnsignedInt32)
    {
        Debug.Log($"Continuous integer range: Min = {range.IntRange.Min}, Max = {range.IntRange.Max}");
    }
}
Discrete Range

A discrete capability can take any of the values in the list: for example, enums such as frame formats, or set values such as resolutions or frame rates.

private void HandleDiscreteRange(PixelSensorCapabilityRange range)
{
    switch (range.DataType)
    {
        case PixelSensorCapabilityDataType.UnsignedInt32 when range.RealityModes != null:
        {
            foreach (PixelSensorRealityMode pixelSensorRealityMode in range.RealityModes)
            {
                Debug.Log($"Reality Modes: {pixelSensorRealityMode}");
            }
            break;
        }
        case PixelSensorCapabilityDataType.UnsignedInt32 when range.ExposureModes != null:
        {
            foreach (PixelSensorAutoExposureMode pixelSensorAutoExposureMode in range.ExposureModes)
            {
                Debug.Log($"Exposure Modes: {pixelSensorAutoExposureMode}");
            }
            break;
        }
        case PixelSensorCapabilityDataType.UnsignedInt32 when range.FrameFormats != null:
        {
            foreach (PixelSensorFrameFormat pixelSensorFrameFormat in range.FrameFormats)
            {
                Debug.Log($"Frame Formats: {pixelSensorFrameFormat}");
            }
            break;
        }
        case PixelSensorCapabilityDataType.UnsignedInt32:
        {
            if (range.IntValues != null)
            {
                foreach (uint intValue in range.IntValues)
                {
                    Debug.Log($"Int Options: {intValue}");
                }
            }
            break;
        }
        // Capability can be represented as a Vector2Int
        case PixelSensorCapabilityDataType.Extent2D:
        {
            foreach (Vector2Int extentValues in range.ExtentValues)
            {
                Debug.Log($"Vector2Int Options: {extentValues}");
            }
            break;
        }
    }
}

Configure Sensor Values

To configure sensor values, developers must first query the capabilities of the sensor using PixelSensorFeature.QueryPixelSensorCapability. Once the capabilities are determined, configurations can be applied using PixelSensorFeature.ApplySensorConfig.

Query and Apply Configuration

This example demonstrates how to query the sensor capabilities and apply a specific configuration based on the retrieved capabilities.

Example 1: Setting Resolution

uint streamIndex = 0;
PixelSensorCapabilityType capabilityType = PixelSensorCapabilityType.Resolution;

if (pixelSensorFeature.QueryPixelSensorCapability(sensorType, capabilityType, streamIndex, out var range) && range.IsValid)
{
    var configData = new PixelSensorConfigData(capabilityType, streamIndex);
    configData.VectorValue = range.ExtentValues[0]; // Assuming the desired resolution is the first in the list
    pixelSensorFeature.ApplySensorConfig(sensorType, configData);
}

Example 2: Setting Format, Resolution, and Update Rate

caution

This is a simplified example and can cause configuration errors. Pixel sensor capabilities and ranges should be queried before applying each configuration to ensure validity.

// Apply grayscale format
pixelSensorFeature.ApplySensorConfig(sensorType,
    PixelSensorCapabilityType.Format,
    (uint)PixelSensorFrameFormat.Grayscale, streamIndex);

// Apply resolution
pixelSensorFeature.ApplySensorConfig(sensorType,
    PixelSensorCapabilityType.Resolution, new Vector2Int(1016, 1016),
    streamIndex);

// Apply update rate
pixelSensorFeature.ApplySensorConfig(sensorType,
    PixelSensorCapabilityType.UpdateRate, 30,
    streamIndex);

Submit the Configuration

Custom Configuration

After configuring the sensor with PixelSensorFeature.ApplySensorConfig(), the configuration needs to be submitted to the sensor.

Example: Submit Custom Configuration

This example demonstrates submitting the sensor configuration asynchronously.

using System.Collections;
using UnityEngine;
using MagicLeap.OpenXR.Features.PixelSensors;

public class PixelSensorConfigurator : MonoBehaviour
{
    private MagicLeapPixelSensorFeature pixelSensorFeature;
    private PixelSensorId sensorType;
    private uint[] streamIndexes = new uint[] { 0 };

    private IEnumerator ConfigurePixelSensorCoroutine()
    {
        var pixelSensorOperation = pixelSensorFeature.ConfigureSensor(sensorType, streamIndexes);
        yield return pixelSensorOperation;
        if (pixelSensorOperation.DidOperationSucceed)
        {
            Debug.Log("Sensor configuration successful.");
        }
        else
        {
            Debug.LogError("Sensor configuration failed.");
        }
    }

    void Start()
    {
        StartCoroutine(ConfigurePixelSensorCoroutine());
    }
}

Default Configuration with Default Capabilities

Alternatively, sensors can be configured using default capabilities predefined by the system or platform.

Example: Submit Default Configuration

using System.Collections;
using UnityEngine;
using MagicLeap.OpenXR.Features.PixelSensors;

public class PixelSensorConfigurator : MonoBehaviour
{
    private MagicLeapPixelSensorFeature pixelSensorFeature;
    private PixelSensorId sensorType;
    private uint[] streamIndexes = new uint[] { 0 };

    private IEnumerator ConfigureSensorWithDefaultCapabilitiesCoroutine()
    {
        var pixelSensorOperation = pixelSensorFeature.ConfigureSensorWithDefaultCapabilities(sensorType, streamIndexes);
        yield return pixelSensorOperation;
        if (pixelSensorOperation.DidOperationSucceed)
        {
            Debug.Log("Sensor configured with default capabilities successfully.");
        }
        else
        {
            Debug.LogError("Configuration with default capabilities failed.");
        }
    }

    void Start()
    {
        StartCoroutine(ConfigureSensorWithDefaultCapabilitiesCoroutine());
    }
}

Metadata Configuration

note

A sensor must be configured before its supported metadata types can be retrieved.

Method Definition

Gets the available metadata types for a sensor's stream. Returns true if the supported metadata types were retrieved.

bool EnumeratePixelSensorMetaDataTypes(PixelSensorId sensorType, uint stream, out PixelSensorMetaDataType[] metaDataTypes)

Example Usage

using System.Collections.Generic;
using UnityEngine;
using MagicLeap.OpenXR.Features.PixelSensors;

public class PixelSensorManager : MonoBehaviour
{
    private MagicLeapPixelSensorFeature pixelSensorFeature;
    private PixelSensorId sensorType;
    private uint[] streamIndexes = new uint[] { 0 };
    private Dictionary<uint, PixelSensorMetaDataType[]> supportedMetadataTypes = new();

    private void FetchSupportedMetaData()
    {
        foreach (var stream in streamIndexes)
        {
            if (pixelSensorFeature.EnumeratePixelSensorMetaDataTypes(sensorType, stream, out var metaDataTypes))
            {
                supportedMetadataTypes[stream] = metaDataTypes;
                Debug.Log($"Metadata types for stream {stream} retrieved successfully.");
            }
            else
            {
                Debug.LogError($"Failed to retrieve metadata types for stream {stream}.");
            }
        }
    }
}

Start and Stop Sensor Streams

After configuring the sensor capabilities, use the following methods to start and stop streaming data from the sensor. These operations are asynchronous and can be managed within coroutines.

Start Sensor Stream

Method Definition

public PixelSensorAsyncOperationResult StartSensor(PixelSensorId sensorType, IEnumerable<uint> streams, Dictionary<uint, PixelSensorMetaDataType[]> metaDataTypes = null)

Example Usage

using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using MagicLeap.OpenXR.Features.PixelSensors;

public class PixelSensorManager : MonoBehaviour
{
    private MagicLeapPixelSensorFeature pixelSensorFeature;
    private PixelSensorId sensorType;
    private uint[] streamIndexes = new uint[] { 0 };
    private Dictionary<uint, PixelSensorMetaDataType[]> metaDataTypesByStream = new();

    private IEnumerator StartSensorCoroutine()
    {
        // Assume metadata types have been configured as shown in the FetchSupportedMetaData example
        var sensorStartAsyncResult = pixelSensorFeature.StartSensor(sensorType, streamIndexes, metaDataTypesByStream);
        yield return sensorStartAsyncResult;

        if (sensorStartAsyncResult.DidOperationSucceed)
        {
            Debug.Log("Sensor streaming started successfully.");
        }
        else
        {
            Debug.LogError("Failed to start sensor streaming.");
        }
    }
}

Stop Sensor Stream

Method Definition

public PixelSensorAsyncOperationResult StopSensor(PixelSensorId sensorType, IEnumerable<uint> streams)

Example Usage

using System.Collections;
using UnityEngine;
using MagicLeap.OpenXR.Features.PixelSensors;

public class PixelSensorManager : MonoBehaviour
{
    private MagicLeapPixelSensorFeature pixelSensorFeature;
    private PixelSensorId sensorType;
    private uint[] streamIndexes = new uint[] { 0 };

    private IEnumerator StopSensorCoroutine()
    {
        // This example assumes that the Pixel Sensor Feature and sensorType were initialized separately.
        var sensorStopAsyncResult = pixelSensorFeature.StopSensor(sensorType, streamIndexes);
        yield return sensorStopAsyncResult;

        if (sensorStopAsyncResult.DidOperationSucceed)
        {
            Debug.Log("Sensor streaming stopped successfully.");
        }
        else
        {
            Debug.LogError("Failed to stop sensor streaming.");
        }
    }
}

Query sensor data

Once the sensor capabilities have been configured and the necessary streams for the sensors are started, PixelSensorFeature.GetSensorData() can be used to retrieve sensor data.

Method Definition

public bool GetSensorData(
    PixelSensorId sensorType,
    uint streamIndex,
    out PixelSensorFrame frame,
    out PixelSensorMetaData[] metaData,
    Allocator allocator,
    long timeOut = 10,
    bool shouldFlipTexture = true
)

Example Usage

The following example demonstrates how to query sensor data for a specific sensor and process the metadata associated with each frame. It assumes the MagicLeapPixelSensorFeature is properly initialized and enabled.

using System.Linq;
using UnityEngine;
using Unity.Collections;
using UnityEngine.XR.OpenXR;
using MagicLeap.OpenXR.Features.PixelSensors;

public class PixelSensorDataManager : MonoBehaviour
{
    private MagicLeapPixelSensorFeature pixelSensorFeature;
    private PixelSensorId sensorType;
    private uint streamIndex = 0; // Example uses the first stream index

    void Start()
    {
        pixelSensorFeature = OpenXRSettings.Instance.GetFeature<MagicLeapPixelSensorFeature>();
        sensorType = pixelSensorFeature.GetSupportedSensors().FirstOrDefault(); // Simplified for example

        if (pixelSensorFeature.GetSensorData(sensorType, streamIndex, out PixelSensorFrame frame,
                out PixelSensorMetaData[] metaData, Allocator.Temp, shouldFlipTexture: true))
        {
            ProcessSensorData(frame, metaData);
        }
    }

    private void ProcessSensorData(PixelSensorFrame frame, PixelSensorMetaData[] metaData)
    {
        // Process the main pixel sensor frame as needed

        // Process the capture time; note it is returned as a long, so it needs to be converted
        // e.g.: DateTimeOffset captureTime = DateTimeOffset.FromUnixTimeMilliseconds(frame.CaptureTime / 1000);

        // Note: metadata can contain buffers such as depth confidence
        var confidenceMetadata = metaData.OfType<PixelSensorDepthConfidenceBuffer>().FirstOrDefault();
        if (confidenceMetadata != null)
        {
            Debug.Log("Processing Depth Confidence Data...");
        }

        // Note: metadata can also contain depth flags
        var flagMetadata = metaData.OfType<PixelSensorDepthFlagBuffer>().FirstOrDefault();
        if (flagMetadata != null)
        {
            Debug.Log("Processing Depth Flag Data...");
        }
    }
}

Additional Metadata

Pixel sensors may provide additional metadata for the captured frames. Applications can obtain this metadata by specifying which data to capture before starting the sensor.

  • PixelSensorExposureTime — Exposure time in milliseconds used to capture the frame.
  • PixelSensorAnalogGain — Analog gain used to capture the frame.
  • PixelSensorDigitalGain — Digital gain used to capture the frame.
  • PixelSensorPinholeIntrinsics — Specifies the camera intrinsics and distortion coefficients for a pinhole camera model.
  • PixelSensorFisheyeIntrinsics — Specifies the camera matrix and distortion coefficients for Magic Leap's fisheye camera model.
  • PixelSensorDepthFrameIllumination — Illumination type used for the depth frame.
  • PixelSensorDepthConfidenceBuffer — Confidence values for each pixel in the camera frame. The confidence score is derived from the sensor noise and is not normalized; the higher the value, the higher the confidence. Applications can choose a confidence threshold based on their use case. Data type is float.
  • PixelSensorDepthFlagBuffer — Flag bits for each pixel in the depth camera frame. Refer to the Depth Flags section for more details. Data type is uint32_t.
The following helper logs each supported metadata type (it requires using System.Text; for StringBuilder and using System.Linq; for Select):

private void LogAdditionalMetaData(PixelSensorMetaData[] frameMetaData)
{
    for (int i = 0; i < frameMetaData.Length; i++)
    {
        var builder = new StringBuilder();
        var metaData = frameMetaData[i];
        switch (metaData)
        {
            case PixelSensorAnalogGain analogGain:
                builder.AppendLine($"Analog Gain: {analogGain.AnalogGain}");
                break;
            case PixelSensorDigitalGain digitalGain:
                builder.AppendLine($"Digital Gain: {digitalGain.DigitalGain}");
                break;
            case PixelSensorExposureTime exposureTime:
                builder.AppendLine($"Exposure Time: {exposureTime.ExposureTime:F1}");
                break;
            case PixelSensorDepthFrameIllumination illumination:
                // Illumination on/off
                builder.AppendLine($"Illumination: {illumination.IlluminationType}");
                break;
            case PixelSensorFisheyeIntrinsics fisheyeIntrinsics:
            {
                builder.AppendLine($"FOV: {fisheyeIntrinsics.FOV}");
                builder.AppendLine($"Focal Length: {fisheyeIntrinsics.FocalLength}");
                builder.AppendLine($"Principal Point: {fisheyeIntrinsics.PrincipalPoint}");
                builder.AppendLine(
                    $"Radial Distortion: [{string.Join(',', fisheyeIntrinsics.RadialDistortion.Select(val => val.ToString("F1")))}]");
                builder.AppendLine(
                    $"Tangential Distortion: [{string.Join(',', fisheyeIntrinsics.TangentialDistortion.Select(val => val.ToString("F1")))}]");
                break;
            }
            case PixelSensorPinholeIntrinsics pinholeIntrinsics:
            {
                builder.AppendLine($"FOV: {pinholeIntrinsics.FOV}");
                builder.AppendLine($"Focal Length: {pinholeIntrinsics.FocalLength}");
                builder.AppendLine($"Principal Point: {pinholeIntrinsics.PrincipalPoint}");
                builder.AppendLine(
                    $"Distortion: [{string.Join(',', pinholeIntrinsics.Distortion.Select(val => val.ToString("F1")))}]");
                break;
            }
        }
        Debug.Log(builder.ToString());
    }
}

Depth Flags

Flag Descriptions

  • PixelSensorDepthFlags.Valid — Indicates that there is no additional flag data for this pixel.
  • PixelSensorDepthFlags.Invalid — This bit is set to one to indicate that one or more of the flags below have been set. Depending on the use case, the application can correlate the flag data with the corresponding pixel data to determine how to handle it.
  • PixelSensorDepthFlags.Saturated — The pixel intensity is below the minimum or above the maximum threshold value.
  • PixelSensorDepthFlags.Inconsistent — Inconsistent data was received when capturing frames. This can happen due to fast motion.
  • PixelSensorDepthFlags.LowSignal — The pixel has a very low signal-to-noise ratio. This can happen, for example, for pixels at the far end of the sensing range.
  • PixelSensorDepthFlags.FlyingPixel — Typically occurs when there is a step jump in the distance of adjoining pixels in the scene. Example: when looking through an open doorway into a room, pixels along the door's edges can be flagged as flying pixels.
  • PixelSensorDepthFlags.MaskedBit — If this bit is set, the corresponding pixel may not be within the illuminator's illumination cone.
  • PixelSensorDepthFlags.Sbi — This bit is set when there is high noise.
  • PixelSensorDepthFlags.StrayLight — Can occur when there is a light source other than the depth camera illuminator.
  • PixelSensorDepthFlags.ConnectedComponents — If a small group of PixelSensorDepthFlags.Valid pixels is surrounded by PixelSensorDepthFlags.Invalid pixels, this bit is set to 1.
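As a hedged sketch of how these flags might be consumed (this is not an SDK API): it assumes the per-pixel uint32 flag values are exposed as the first plane of a PixelSensorFrame carried by the PixelSensorDepthFlagBuffer metadata, and that PixelSensorDepthFlags is a bit-flag enum. Both assumptions may differ by SDK version.

```csharp
using Unity.Collections;
using MagicLeap.OpenXR.Features.PixelSensors;

public static class DepthFlagUtils
{
    // Counts pixels marked Invalid in a depth-flag frame.
    // Pass the frame carried by the PixelSensorDepthFlagBuffer metadata.
    // Plane stride/padding is ignored for simplicity.
    public static int CountInvalidPixels(in PixelSensorFrame flagFrame)
    {
        // Each pixel's flags are stored as a uint32_t; reinterpret the raw bytes.
        NativeArray<uint> flags = flagFrame.Planes[0].ByteData.Reinterpret<uint>(sizeof(byte));
        int invalidCount = 0;
        for (int i = 0; i < flags.Length; i++)
        {
            var pixelFlags = (PixelSensorDepthFlags)flags[i];
            if ((pixelFlags & PixelSensorDepthFlags.Invalid) != 0)
            {
                invalidCount++;
            }
        }
        return invalidCount;
    }
}
```

Pixels counted this way can then be cross-referenced against the depth frame to decide whether to discard or de-weight their depth values.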

Camera Models

Different camera sensors may support various camera models based on their optical characteristics and intended use. The Magic Leap Pixel Sensor extension supports both Pinhole and Fisheye camera models.

Pinhole Camera Model

The Pinhole camera model is a standard imaging model that approximates the geometry of a real camera by projecting points in a scene through a single point, known as the "pinhole," onto an image plane. This model is widely used due to its simplicity and accuracy under common shooting conditions.

public class PixelSensorPinholeIntrinsics : PixelSensorMetaData
{
    /// <summary>
    /// Focal length in pixels. Represents the distance between the pinhole and the image plane.
    /// </summary>
    public Vector2 FocalLength { get; internal set; }

    /// <summary>
    /// The principal point in pixels. This is the point where the optical axis intersects the image plane.
    /// </summary>
    public Vector2 PrincipalPoint { get; internal set; }

    /// <summary>
    /// The horizontal (x) and vertical (y) field of view in degrees.
    /// </summary>
    public Vector2 FOV { get; internal set; }

    /// <summary>
    /// Distortion coefficients, typically used to correct lens distortion. These coefficients are in the following order: [k1, k2, p1, p2, k3].
    /// </summary>
    public double[] Distortion { get; internal set; } = new double[5];

    public override PixelSensorMetaDataType MetaDataType => PixelSensorMetaDataType.PinholeCameraModel;
}
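The SDK does not ship a projection helper; the following plain C# sketch shows how the pinhole parameters above map a camera-space point to pixel coordinates, assuming the documented [k1, k2, p1, p2, k3] (Brown-Conrady) ordering for Distortion. The numeric values in the usage note below are hypothetical examples.

```csharp
using System;

public static class PinholeProjection
{
    // Projects a camera-space point (x, y, z) to pixel coordinates using
    // pinhole intrinsics and Brown-Conrady distortion [k1, k2, p1, p2, k3].
    public static (double u, double v) Project(
        double x, double y, double z,
        double fx, double fy,          // focal length in pixels
        double cx, double cy,          // principal point in pixels
        double[] d)                    // [k1, k2, p1, p2, k3]
    {
        double a = x / z, b = y / z;   // normalized image coordinates
        double r2 = a * a + b * b;
        double radial = 1 + d[0] * r2 + d[1] * r2 * r2 + d[4] * r2 * r2 * r2;
        double xd = a * radial + 2 * d[2] * a * b + d[3] * (r2 + 2 * a * a);
        double yd = b * radial + d[2] * (r2 + 2 * b * b) + 2 * d[3] * a * b;
        return (cx + fx * xd, cy + fy * yd);
    }
}
```

For example, with fx = fy = 600, cx = 320, cy = 240, and zero distortion, the point (0.1, -0.05, 1.0) projects to pixel (380, 210).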

Fisheye Camera Model

PixelSensorFisheyeIntrinsics specifies the camera matrix and distortion coefficients for Magic Leap's fisheye camera model. The Magic Leap fisheye model differs from conventional fisheye models by adding an additional tangential term on top of the existing method. Applications can use the intrinsics with the conventional OpenCV fisheye calibration library by dropping the tangential terms (p1 and p2 in the equations below), but this may result in lower accuracy.

Radial distortion coefficients: k1, k2, k3, k4
Tangential distortion coefficients: p1, p2

If P = [x, y, z] is a point in camera coordinates and a = x/z, b = y/z are the corresponding point locations in normalized image coordinates, this model will project and distort said point in the following way:

Conventional fisheye model

r = sqrt(a^2 + b^2)
θ = atan( r )
θ_rad = θ * (1 + k1 * θ^2 + k2 * θ^4 + k3 * θ^6 + k4 * θ^8)
x_rad = a * ( θ_rad / r )
y_rad = b * ( θ_rad / r )

Tangential term (can be omitted if reduced accuracy is acceptable)

r_rad_sq = x_rad^2 + y_rad^2
x_rad_tan = x_rad + 2 * p1 * x_rad * y_rad + p2 * (r_rad_sq + 2 * x_rad^2)
y_rad_tan = y_rad + p1 * (r_rad_sq + 2 * y_rad^2) + 2 * p2 * x_rad * y_rad
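The equations above can be transcribed directly into code. The following plain C# sketch (not an SDK API) applies the Magic Leap fisheye distortion, including the optional tangential term, to a point in camera coordinates:

```csharp
using System;

public static class FisheyeDistortion
{
    // Distorts a camera-space point (x, y, z) into normalized image
    // coordinates using the Magic Leap fisheye model:
    // radial coefficients k[0..3] = k1..k4, tangential coefficients p1, p2.
    public static (double xt, double yt) Distort(
        double x, double y, double z, double[] k, double p1, double p2)
    {
        double a = x / z, b = y / z;
        double r = Math.Sqrt(a * a + b * b);
        if (r < 1e-12)
        {
            return (a, b); // point on the optical axis: no distortion
        }
        double theta = Math.Atan(r);
        double t2 = theta * theta;
        double thetaRad = theta * (1 + k[0] * t2 + k[1] * t2 * t2
                                     + k[2] * t2 * t2 * t2 + k[3] * t2 * t2 * t2 * t2);
        double xr = a * (thetaRad / r);
        double yr = b * (thetaRad / r);
        // Tangential term (can be omitted if reduced accuracy is acceptable)
        double rr = xr * xr + yr * yr;
        double xt = xr + 2 * p1 * xr * yr + p2 * (rr + 2 * xr * xr);
        double yt = yr + p1 * (rr + 2 * yr * yr) + 2 * p2 * xr * yr;
        return (xt, yt);
    }
}
```

Dropping the last two lines (returning xr, yr instead) reproduces the conventional fisheye model.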

public class PixelSensorFisheyeIntrinsics : PixelSensorMetaData
{
    /// <summary>
    /// The focal length in pixels.
    /// </summary>
    public Vector2 FocalLength { get; internal set; }

    /// <summary>
    /// The principal point in pixels.
    /// </summary>
    public Vector2 PrincipalPoint { get; internal set; }

    /// <summary>
    /// The horizontal and vertical field of view in degrees.
    /// </summary>
    public Vector2 FOV { get; internal set; }

    /// <summary>
    /// The radial distortion coefficients. These coefficients are in the following order: [k1, k2, k3, k4].
    /// </summary>
    public double[] RadialDistortion { get; internal set; } = new double[4];

    /// <summary>
    /// The tangential distortion coefficients. These coefficients are in the following order: [p1, p2].
    /// </summary>
    public double[] TangentialDistortion { get; internal set; } = new double[2];

    public override PixelSensorMetaDataType MetaDataType => PixelSensorMetaDataType.FishEyeCameraModel;
}

Unity Frame Format

The PixelSensorFrame structure is used to store per-pixel data. The type of data stored for each pixel varies and depends on PixelSensorFrame.FrameType. The top-left corner of the frame is treated as the origin.

  • PixelSensorFrameType.Grayscale — Refers to UnityEngine.TextureFormat.R8
  • PixelSensorFrameType.Rgba8888 — Refers to UnityEngine.TextureFormat.RGBA32
  • PixelSensorFrameType.Yuv420888 — Refers to UnityEngine.TextureFormat.YUY2
  • PixelSensorFrameType.Jpeg — Refers to JPEG-encoded data
  • PixelSensorFrameType.Depth32 — Refers to UnityEngine.TextureFormat.RFloat
  • PixelSensorFrameType.DepthRaw — Refers to UnityEngine.TextureFormat.RFloat
  • PixelSensorFrameType.DepthConfidence — Refers to UnityEngine.TextureFormat.RFloat
  • PixelSensorFrameType.DepthFlags — Refers to UnityEngine.TextureFormat.RFloat

Example

The following example demonstrates how to process various types of frames and convert them into a usable format in Unity.

using System;
using UnityEngine;
using MagicLeap.OpenXR.Features.PixelSensors;

public class PixelSensorScratchPad : MonoBehaviour
{
    private Texture2D targetTexture;

    // Call this for each frame obtained from GetSensorData
    public void ProcessFrame(in PixelSensorFrame frame)
    {
        if (!frame.IsValid || frame.Planes.Length == 0)
        {
            return;
        }

        if (targetTexture == null)
        {
            ref var plane = ref frame.Planes[0];
            targetTexture = new Texture2D((int)plane.Width, (int)plane.Height, GetTextureFormat(frame.FrameType), false);
        }

        targetTexture.LoadRawTextureData(frame.Planes[0].ByteData);
        targetTexture.Apply();
    }

    private TextureFormat GetTextureFormat(PixelSensorFrameType frameType)
    {
        switch (frameType)
        {
            case PixelSensorFrameType.Grayscale:
                return TextureFormat.R8;
            case PixelSensorFrameType.Rgba8888:
                return TextureFormat.RGBA32;
            case PixelSensorFrameType.Yuv420888:
                return TextureFormat.YUY2;
            case PixelSensorFrameType.Depth32:
            case PixelSensorFrameType.DepthRaw:
            case PixelSensorFrameType.DepthConfidence:
            case PixelSensorFrameType.DepthFlags:
            default:
                return TextureFormat.RFloat;
        }
    }
}

Obtain Sensor Pose

For certain applications it is useful to have the exact pose of the sensor available. GetSensorPose obtains the latest pose for the sensor. Call this function as soon as you obtain data from the sensor to make sure the pose matches the frame obtained from the sensor.

/// <summary>
/// Get the sensor pose
/// </summary>
/// <param name="sensorType">The type of the sensor</param>
/// <param name="offset">The offset that should be considered for the pose calculation</param>
/// <returns>The pose of the sensor</returns>
public Pose GetSensorPose(PixelSensorId sensorType, Pose offset = default)
{
    if (!IsSensorConnected(sensorType, out var sensor))
    {
        return default;
    }

    return sensor.GetSensorPose(offset);
}

Example

The following example gets the pose of a sensor and converts it to world space relative to the XR Origin.

using System.Linq;
using Unity.XR.CoreUtils;
using UnityEngine;
using UnityEngine.XR.OpenXR;
using MagicLeap.OpenXR.Features.PixelSensors;

public class PixelSensorManager : MonoBehaviour
{
    private XROrigin xrOrigin;

    private MagicLeapPixelSensorFeature pixelSensorFeature;
    private PixelSensorId sensorType;

    void Start()
    {
        pixelSensorFeature = OpenXRSettings.Instance.GetFeature<MagicLeapPixelSensorFeature>();
        sensorType = pixelSensorFeature.GetSupportedSensors().FirstOrDefault(); // Simplified for example

        // Creates the sensor so its pose can be queried.
        // Assumes the sensor is then configured and started elsewhere...
        bool wasCreated = pixelSensorFeature.CreatePixelSensor(sensorType);
        if (wasCreated)
        {
            GetSensorPose();
        }
    }

    private void GetSensorPose()
    {
        // Get the sensor pose
        Pose sensorPose = pixelSensorFeature.GetSensorPose(sensorType);

        // Update the sensor pose to be relative to the XR Origin
        xrOrigin = FindAnyObjectByType<XROrigin>();
        if (xrOrigin != null)
        {
            Vector3 worldPosition = xrOrigin.CameraFloorOffsetObject.transform.TransformPoint(sensorPose.position);
            Quaternion worldRotation = xrOrigin.transform.rotation * sensorPose.rotation;
            // Update the existing pose
            sensorPose = new Pose(worldPosition, worldRotation);
        }

        Debug.Log("Sensor Pose: " + sensorPose);
    }
}

Obtain Frame Rotation

Some sensors, like the eye cameras, have a rotation applied to them. The Pixel Sensor API provides a function to read the rotation of the frame so that you can display it properly in your application.

/// <summary>
/// The sensor data stored can have differing orientations.
/// </summary>
/// <param name="sensorType">The type of the sensor to get the rotation for</param>
/// <returns>The rotation of the frame data</returns>
public Quaternion GetSensorFrameRotation(PixelSensorId sensorType)
{
    return sensorType.SensorName switch
    {
        "/pixelsensor/eye/nasal/left" or "/pixelsensor/eye/nasal/right" => Quaternion.Euler(0, 0, 90),
        "/pixelsensor/eye/temple/left" or "/pixelsensor/eye/temple/right" => Quaternion.Euler(0, 0, -90),
        _ => Quaternion.identity
    };
}

Example

using System.Linq;
using UnityEngine;
using UnityEngine.XR.OpenXR;
using MagicLeap.OpenXR.Features.PixelSensors;

public class PixelSensorManager : MonoBehaviour
{
    private MagicLeapPixelSensorFeature pixelSensorFeature;
    private PixelSensorId sensorType;

    void Start()
    {
        pixelSensorFeature = OpenXRSettings.Instance.GetFeature<MagicLeapPixelSensorFeature>();
        sensorType = pixelSensorFeature.GetSupportedSensors().FirstOrDefault(); // Simplified for example

        // Assumes the sensor was configured and started...
        GetFrameRotation();
    }

    private void GetFrameRotation()
    {
        Quaternion frameRotation = pixelSensorFeature.GetSensorFrameRotation(sensorType);
        Debug.Log("Sensor Frame Rotation: " + frameRotation);
    }
}