Version: 21 Aug 2024

Pixel Sensor Overview

The Magic Leap 2 OpenXR Unity SDK supports accessing the device's pixel sensors via the Magic Leap 2 Pixel Sensor feature. This feature must be enabled in Unity's OpenXR Settings (Edit > Project Settings > XR Plug-in Management > OpenXR).

caution

The documentation for the Pixel Sensor API is still being developed, and the API is in an experimental state. If you run into any issues, please reach out to us on the developer forum.

Namespace
using MagicLeap.OpenXR.Features.PixelSensors;
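Before using any of the sensor APIs, the feature instance can be retrieved from the OpenXR settings at runtime. The sketch below shows one way to do this; it assumes the feature class is named `MagicLeapPixelSensorFeature` (as referenced later on this page) and that the API surface may change while experimental.

```csharp
using UnityEngine;
using UnityEngine.XR.OpenXR;
using MagicLeap.OpenXR.Features.PixelSensors;

public class PixelSensorSetup : MonoBehaviour
{
    private MagicLeapPixelSensorFeature pixelSensorFeature;

    void Start()
    {
        // Retrieve the Pixel Sensor feature configured in the OpenXR settings.
        pixelSensorFeature = OpenXRSettings.Instance.GetFeature<MagicLeapPixelSensorFeature>();
        if (pixelSensorFeature == null || !pixelSensorFeature.enabled)
        {
            Debug.LogError("The Magic Leap 2 Pixel Sensor feature is not enabled.");
            enabled = false;
        }
    }
}
```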

Supported Sensors & Paths

This extension supports the following cameras:

| Sensor Name | XR Path | Description |
| --- | --- | --- |
| Picture Center | /pixelsensor/picture/center | RGB camera on the front of the headset. |
| World Left | /pixelsensor/world/left | Camera located on the left corner of the headset. |
| World Center | /pixelsensor/world/center | Camera located on the front of the headset. |
| World Right | /pixelsensor/world/right | Camera located on the right corner of the headset. |
| Depth Center | /pixelsensor/depth/center | Depth camera located on the front of the headset. |
| Eye Temple Left | /pixelsensor/eye/temple/left | Left temple eye camera. |
| Eye Nasal Left | /pixelsensor/eye/nasal/left | Left nasal eye camera. |
| Eye Nasal Right | /pixelsensor/eye/nasal/right | Right nasal eye camera. |
| Eye Temple Right | /pixelsensor/eye/temple/right | Right temple eye camera. |
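A sensor is typically selected by matching its XR path. The sketch below assumes a `pixelSensorFeature` reference obtained from the OpenXR settings; the `GetSupportedSensors` method and the `PixelSensorId` path accessor are assumptions based on this page's naming and may differ in the experimental API.

```csharp
private PixelSensorId? depthSensorId;

void FindDepthSensor()
{
    // Assumed enumeration call; verify the exact signature in the API reference.
    foreach (PixelSensorId sensorId in pixelSensorFeature.GetSupportedSensors())
    {
        // Match the depth camera by its XR path from the table above.
        if (sensorId.XrPathString == "/pixelsensor/depth/center")
        {
            depthSensorId = sensorId;
        }
    }
}
```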

Sensor Permissions

Sensors require permissions before the application can access the data from the sensor. These permissions can be enabled in your project's Permissions Settings (Edit > Project Settings > Magic Leap > Permissions) and then requested at runtime using the Android Permissions API.

| Permission | Sensor Id |
| --- | --- |
| com.magicleap.permission.DEPTH_CAMERA (protection level: dangerous) | /pixelsensor/depth/center |
| com.magicleap.permission.EYE_CAMERA (protection level: dangerous) | /pixelsensor/eye/temple/left, /pixelsensor/eye/nasal/left, /pixelsensor/eye/nasal/right, /pixelsensor/eye/temple/right |
| android.permission.CAMERA (protection level: dangerous) | /pixelsensor/world/left, /pixelsensor/world/center, /pixelsensor/world/right, /pixelsensor/picture/center |
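Since these permissions are marked dangerous, they must be requested at runtime. One way to do this with Unity's standard Android Permissions API (the permission string comes from the table above):

```csharp
using UnityEngine;
using UnityEngine.Android;

// Request the depth camera permission before accessing /pixelsensor/depth/center.
void RequestDepthPermission()
{
    const string depthPermission = "com.magicleap.permission.DEPTH_CAMERA";
    if (!Permission.HasUserAuthorizedPermission(depthPermission))
    {
        var callbacks = new PermissionCallbacks();
        callbacks.PermissionGranted += _ => Debug.Log("Depth camera permission granted.");
        callbacks.PermissionDenied += _ => Debug.LogWarning("Depth camera permission denied.");
        Permission.RequestUserPermission(depthPermission, callbacks);
    }
}
```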

Capabilities

Each sensor exposes a list of capabilities. The table below outlines all of the capabilities and their data types.

| Enum | Description |
| --- | --- |
| PixelSensorCapabilityType.UpdateRate | Data rate per second; must be specified by the application. Data type is PixelSensorCapabilityDataType.UnsignedInt32. |
| PixelSensorCapabilityType.Resolution | Resolution to configure; must be specified by the application. Data type is PixelSensorCapabilityDataType.Extent2D. |
| PixelSensorCapabilityType.Format | Data format; must be specified by the application. Data type is MagicLeapPixelSensorFeature.PixelSensorFrameFormat. |
| PixelSensorCapabilityType.Depth | Range of a depth sensor. Data type is PixelSensorCapabilityDataType.Float. |
| PixelSensorCapabilityType.MixedReality | Camera frame and digital content will be blended into a single frame. Data type is PixelSensorCapabilityDataType.PixelSensorRealityMode. |
| PixelSensorCapabilityType.ManualExposureTime | Exposure time in milliseconds; if not specified, the runtime must use auto exposure. Data type is PixelSensorCapabilityDataType.UnsignedInt32. |
| PixelSensorCapabilityType.AnalogGain | Higher gain is useful in low-light conditions but may introduce noise. Data type is PixelSensorCapabilityDataType.UnsignedInt32. |
| PixelSensorCapabilityType.DigitalGain | Higher gain is useful in low-light conditions but may introduce noise. Data type is PixelSensorCapabilityDataType.UnsignedInt32. |
| PixelSensorCapabilityType.AutoExposureMode | Auto exposure mode. Data type is PixelSensorCapabilityDataType.PixelSensorAutoExposureMode. |
| PixelSensorCapabilityType.AutoExposureTargetBrightness | Target brightness for auto exposure mode. Data type is PixelSensorCapabilityDataType.Float. |
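After locating a sensor, it is created and its capabilities configured before streaming starts. The sketch below is an assumed flow: the method names `CreatePixelSensor` and `ConfigureSensorWithDefaultCapabilities` are drawn from the experimental API's naming conventions and should be verified against the generated API reference.

```csharp
// Assumed flow; verify each call against the experimental API reference.
void CreateAndConfigure(PixelSensorId sensorId)
{
    if (pixelSensorFeature.CreatePixelSensor(sensorId))
    {
        // Stream 0; accept the runtime's defaults for UpdateRate,
        // Resolution, Format, and the other capabilities listed above.
        uint streamIndex = 0;
        pixelSensorFeature.ConfigureSensorWithDefaultCapabilities(sensorId, streamIndex);
    }
}
```

Applications that need specific values (e.g. a manual exposure time or a target resolution) would instead set the individual capabilities from the table before starting the sensor.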

Mixed Reality Modes

Sensors that support the PixelSensorCapabilityType.MixedReality capability can capture frames in three reality modes:

| Enum | Description |
| --- | --- |
| PixelSensorRealityMode.Mixed | Camera frame and digital content will be blended into a single frame. |
| PixelSensorRealityMode.Camera | Only the camera frame will be captured. |
| PixelSensorRealityMode.Virtual | Only virtual content will be captured. |

Auto Exposure Modes

The world cameras support two separate auto exposure modes:

| Enum | Description |
| --- | --- |
| PixelSensorAutoExposureMode.EnvironmentTracking | Exposure mode optimized for environment tracking. |
| PixelSensorAutoExposureMode.ProximityIrTracking | Exposure mode optimized for a close-proximity IR light source. |

Frame Formats

Describes the format of data produced by the sensors:

| Enum | Description |
| --- | --- |
| PixelSensorFrameFormat.Grayscale | Each pixel is 1 byte and represents a grayscale value. Data type of the corresponding frame buffer is uint8_t. |
| PixelSensorFrameFormat.Rgba8888 | Each pixel is 4 bytes and represents the R, G, B, and A channels in that order. Data type of the corresponding frame buffer is uint8_t. |
| PixelSensorFrameFormat.Yuv420888 | Frame is represented in the YUV_420_888 planar format. Data type of the corresponding frame buffer is uint8_t. |
| PixelSensorFrameFormat.Jpeg | Frame is JPEG encoded. |
| PixelSensorFrameFormat.Depth32 | Represents depth: the radial distance (in meters) of a real-world point with respect to the depth camera. Data type is float. |
| PixelSensorFrameFormat.DepthRaw | Raw pixel data representing the light captured by the sensor. For depth cameras that have a projector, this raw frame includes frames captured both when the projector is on and off. Refer to PixelSensorDepthFrameIlluminationType for more details. Data type is float. |

Unity Frame Formats

The PixelSensorFrame structure is used to store per-pixel data. The type of data stored for each pixel varies and depends on PixelSensorFrame.FrameType. The top-left corner of the frame is treated as the origin.

| Enum | Description |
| --- | --- |
| PixelSensorFrameType.Grayscale | Refers to UnityEngine.TextureFormat.R8 |
| PixelSensorFrameType.Rgba8888 | Refers to UnityEngine.TextureFormat.Rgba8888 |
| PixelSensorFrameType.Yuv420888 | Refers to UnityEngine.TextureFormat.YUY2 |
| PixelSensorFrameType.Depth32 | Refers to UnityEngine.TextureFormat.RFloat |
| PixelSensorFrameType.DepthRaw | Refers to UnityEngine.TextureFormat.RFloat |
| PixelSensorFrameType.DepthConfidence | Refers to UnityEngine.TextureFormat.RFloat |
| PixelSensorFrameType.DepthFlags | Refers to UnityEngine.TextureFormat.RFloat |
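Using the mapping above, a frame can be uploaded into a matching Texture2D for display. The sketch below handles the single-plane grayscale case; the PixelSensorFrame member names used here (Planes, Width, Height, ByteData) are assumptions about the experimental API and should be checked against the API reference.

```csharp
private Texture2D texture;

void UploadFrame(in PixelSensorFrame frame)
{
    // Only the single-plane grayscale case is handled in this sketch.
    if (frame.FrameType != PixelSensorFrameType.Grayscale)
        return;

    var plane = frame.Planes[0];
    if (texture == null)
    {
        // Grayscale maps to TextureFormat.R8 per the table above.
        texture = new Texture2D((int)plane.Width, (int)plane.Height, TextureFormat.R8, false);
    }
    texture.LoadRawTextureData(plane.ByteData);
    texture.Apply();
}
```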

Metadata

Pixel sensors may provide additional metadata for the captured frames. Applications can obtain this metadata by specifying which data to capture before starting the sensor.

| Enum | Description |
| --- | --- |
| PixelSensorExposureTime | Exposure time in milliseconds used to capture the frame. |
| PixelSensorAnalogGain | Analog gain used to capture the frame. |
| PixelSensorDigitalGain | Digital gain used to capture the frame. |
| PixelSensorPinholeIntrinsics | Specifies the camera intrinsics and distortion coefficients for a pinhole camera model. |
| PixelSensorFisheyeIntrinsics | Specifies the camera matrix and distortion coefficients for Magic Leap's fisheye camera model. |
| PixelSensorDepthFrameIllumination | Illumination type used for the depth frame. |
| PixelSensorDepthConfidenceBuffer | Confidence values for each pixel in the camera frame. The confidence score is derived from sensor noise and is not normalized; the higher the value, the higher the confidence. Applications can choose a confidence threshold based on their use case. Data type is float. |
| PixelSensorDepthFlagBuffer | Flag bits for each pixel in the depth camera frame. Refer to PixelSensorDepthFlagBuffer for more details. Data type is uint32_t. |
