Version: 12 Dec 2024

XR_ML_pixel_sensor

Experimental API

This API is still an experimental extension not included in the official OpenXR registry and is subject to change.

12.108. XR_ML_pixel_sensor

Name String

XR_ML_pixel_sensor

Extension Type

Instance extension

Registered Extension Number

476

Revision

1

Extension and Version Dependencies
Last Modified Date

2024-03-13

Contributors

Karthik Kadappan, Magic Leap
Ron Bessems, Magic Leap
Rafael Wiltz, Magic Leap
Philip Unger, Magic Leap

12.108.1. Overview

This extension enables applications to work with pixel sensors.

This section briefly covers the key concepts needed to understand how this extension works. Refer to the later sections for more details.

  • Pixel sensor: A sensor that collects data using a grid-based sensor array. The runtime may support multiple pixel sensors. For example, it may support one depth sensing camera, two eye sensing cameras, and two world sensing cameras.

  • Sensor permissions: A sensor may require permissions before the application can access its data. Refer to the section on Sensor Permissions for more details.

  • Sensor stream: A sensor may support multiple streams of frame data. For example, a depth pixel sensor can support two streams: one for short range sensing and one for long range sensing. Refer to the section on Sensor Streams for more details.

  • Sensor capability: A sensor may have capabilities that can be configured by the application. Some examples of sensor capabilities are frame rate, frame resolution, and sensor exposure time. Refer to the section on Sensor Capabilities for more details.

  • Sensor metadata: A sensor may provide metadata in addition to the frame data. Some examples of metadata are the exposure time used to capture the frame and the camera model. Refer to the section on Sensor Metadata for more details.

  • Camera models: Each pixel sensor may have a camera model that provides a mathematical representation of how it captures and transforms the 3D world into a 2D image. Refer to the section on Camera Models for more details.

The extension provides APIs to do the following:

12.108.2. Enumerate the sensors

The xrEnumeratePixelSensorsML function is used to enumerate all the available pixel sensors.

// Provided by XR_ML_pixel_sensor
XrResult xrEnumeratePixelSensorsML(
    XrSession                                   session,
    uint32_t                                    sensorCapacityInput,
    uint32_t*                                   sensorCountOutput,
    XrPath*                                     sensors);
Member Descriptions
  • session is a handle to an XrSession.

  • sensorCapacityInput is the capacity of the sensors array, or 0 to indicate a request to retrieve the required capacity.

  • sensorCountOutput is filled in by the runtime with the count of sensors written or the required capacity in the case that sensorCapacityInput is insufficient.

  • sensors is a pointer to an array of XrPath.

  • See Buffer Size Parameters chapter for a detailed description of retrieving the required sensors size.

Valid Usage (Implicit)
  • The XR_ML_pixel_sensor extension must be enabled prior to calling xrEnumeratePixelSensorsML

  • session must be a valid XrSession handle

  • sensorCountOutput must be a pointer to a uint32_t value

  • If sensorCapacityInput is not 0, sensors must be a pointer to an array of sensorCapacityInput XrPath values

Return Codes
Success
  • XR_SUCCESS

  • XR_SESSION_LOSS_PENDING

Failure
  • XR_ERROR_FUNCTION_UNSUPPORTED

  • XR_ERROR_VALIDATION_FAILURE

  • XR_ERROR_RUNTIME_FAILURE

  • XR_ERROR_HANDLE_INVALID

  • XR_ERROR_INSTANCE_LOST

  • XR_ERROR_SESSION_LOST
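A minimal usage sketch of the function above, using the standard OpenXR two-call idiom. This is illustrative only: it assumes a valid instance and session, that the extension was enabled at instance creation, and that the PFN_xrEnumeratePixelSensorsML function-pointer typedef follows the usual OpenXR header convention.

```c
// Load the extension function through xrGetInstanceProcAddr, as is
// required for all extension functions.
PFN_xrEnumeratePixelSensorsML xrEnumeratePixelSensorsML = NULL;
xrGetInstanceProcAddr(instance, "xrEnumeratePixelSensorsML",
                      (PFN_xrVoidFunction*)&xrEnumeratePixelSensorsML);

// First call: pass 0 capacity to query the required array size.
uint32_t sensorCount = 0;
xrEnumeratePixelSensorsML(session, 0, &sensorCount, NULL);

// Second call: fill the array with the available sensor paths.
XrPath* sensors = malloc(sensorCount * sizeof(XrPath));
xrEnumeratePixelSensorsML(session, sensorCount, &sensorCount, sensors);
```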

This extension supports the following cameras:

  • /pixelsensor/picture/center

  • /pixelsensor/world/left

  • /pixelsensor/world/center

  • /pixelsensor/world/right

  • /pixelsensor/depth/center

  • /pixelsensor/eye/temple/left

  • /pixelsensor/eye/nasal/left

  • /pixelsensor/eye/nasal/right

  • /pixelsensor/eye/temple/right

12.108.3. Sensor Permissions

Some sensors may require permissions before the application can access the data from the sensor.

Permissions

Android applications must have the required permission listed in their manifest to open the sensors listed below.

  • com.magicleap.permission.DEPTH_CAMERA (protection level: dangerous): /pixelsensor/depth/center

  • com.magicleap.permission.EYE_CAMERA (protection level: dangerous): /pixelsensor/eye/temple/left, /pixelsensor/eye/nasal/left, /pixelsensor/eye/nasal/right, /pixelsensor/eye/temple/right

  • android.permission.CAMERA (protection level: dangerous): /pixelsensor/world/left, /pixelsensor/world/center, /pixelsensor/world/right, /pixelsensor/picture/center

12.108.4. Create and destroy a sensor handle

Applications can create a sensor handle using the xrCreatePixelSensorML function. This provides an XrPixelSensorML handle.

Sensor availability may change during the lifecycle of the application. Listen for the XrEventDataPixelSensorAvailabilityChangedML event to be notified of these changes.

The XrPixelSensorML handle is defined as:

// Provided by XR_ML_pixel_sensor
XR_DEFINE_HANDLE(XrPixelSensorML)

The xrCreatePixelSensorML function is defined as:

// Provided by XR_ML_pixel_sensor
XrResult xrCreatePixelSensorML(
    XrSession                                   session,
    const XrPixelSensorCreateInfoML*            createInfo,
    XrPixelSensorML*                            sensor);
Member Descriptions
Valid Usage (Implicit)
Return Codes
Success
  • XR_SUCCESS

  • XR_SESSION_LOSS_PENDING

Failure
  • XR_ERROR_FUNCTION_UNSUPPORTED

  • XR_ERROR_VALIDATION_FAILURE

  • XR_ERROR_RUNTIME_FAILURE

  • XR_ERROR_HANDLE_INVALID

  • XR_ERROR_INSTANCE_LOST

  • XR_ERROR_SESSION_LOST

  • XR_ERROR_LIMIT_REACHED

  • XR_ERROR_OUT_OF_MEMORY

  • XR_ERROR_PIXEL_SENSOR_PERMISSION_DENIED_ML

  • XR_ERROR_PIXEL_SENSOR_NOT_SUPPORTED_ML

  • XR_ERROR_PIXEL_SENSOR_NOT_AVAILABLE_ML

  • XR_ERROR_PATH_UNSUPPORTED

  • XR_ERROR_PATH_INVALID

The XrPixelSensorCreateInfoML structure is defined as:

// Provided by XR_ML_pixel_sensor
typedef struct XrPixelSensorCreateInfoML {
    XrStructureType    type;
    const void*        next;
    XrPath             sensor;
} XrPixelSensorCreateInfoML;
Member Descriptions
  • type is the XrStructureType of this structure.

  • next is NULL or a pointer to the next structure in a structure chain.

  • sensor is the XrPath of the sensor to be opened.

Valid Usage (Implicit)
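A usage sketch for opening a sensor handle. This is illustrative only: it assumes a valid instance and session, that the required Android permission has been granted, and that the XR_TYPE_PIXEL_SENSOR_CREATE_INFO_ML constant name follows the usual OpenXR naming convention (it is not defined in this section).

```c
// Convert the sensor's path string into an XrPath.
XrPath depthSensorPath;
xrStringToPath(instance, "/pixelsensor/depth/center", &depthSensorPath);

XrPixelSensorCreateInfoML createInfo = {
    .type = XR_TYPE_PIXEL_SENSOR_CREATE_INFO_ML,  // assumed constant name
    .next = NULL,
    .sensor = depthSensorPath};

XrPixelSensorML sensor = XR_NULL_HANDLE;
XrResult result = xrCreatePixelSensorML(session, &createInfo, &sensor);
if (result == XR_ERROR_PIXEL_SENSOR_PERMISSION_DENIED_ML) {
    // The user has not granted com.magicleap.permission.DEPTH_CAMERA.
}
```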

To close the sensor and release all of its resources, call xrDestroyPixelSensorML.

// Provided by XR_ML_pixel_sensor
XrResult xrDestroyPixelSensorML(
    XrPixelSensorML                             sensor);
Parameter Descriptions
Valid Usage (Implicit)
Thread Safety
  • Access to sensor, and any child handles, must be externally synchronized

Return Codes
Success
  • XR_SUCCESS

Failure
  • XR_ERROR_HANDLE_INVALID

  • XR_ERROR_FUNCTION_UNSUPPORTED

  • XR_ERROR_RUNTIME_FAILURE

12.108.5. Sensor availability change events

The availability of a sensor is subject to change at any time. The application must be notified of availability changes via the XrEventDataPixelSensorAvailabilityChangedML event.

The XrEventDataPixelSensorAvailabilityChangedML structure is defined as:

// Provided by XR_ML_pixel_sensor
typedef struct XrEventDataPixelSensorAvailabilityChangedML {
    XrStructureType    type;
    const void*        next;
    XrPath             sensor;
    XrBool32           available;
    XrTime             changeTime;
} XrEventDataPixelSensorAvailabilityChangedML;
Member Descriptions
  • type is the XrStructureType of this structure.

  • next is NULL or a pointer to the next structure in a structure chain.

  • sensor is the sensor XrPath.

  • available indicates if this sensor is currently available.

  • changeTime is the time at which this change occurred.

Valid Usage (Implicit)
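A sketch of handling this event in the application's regular event loop. This is illustrative only: the XR_TYPE_EVENT_DATA_PIXEL_SENSOR_AVAILABILITY_CHANGED_ML constant name is assumed from the usual OpenXR naming convention.

```c
// Poll the event queue; xrPollEvent returns XR_SUCCESS while events
// remain and XR_EVENT_UNAVAILABLE when the queue is empty.
XrEventDataBuffer eventData = {.type = XR_TYPE_EVENT_DATA_BUFFER};
while (xrPollEvent(instance, &eventData) == XR_SUCCESS) {
    if (eventData.type ==
        XR_TYPE_EVENT_DATA_PIXEL_SENSOR_AVAILABILITY_CHANGED_ML) {
        const XrEventDataPixelSensorAvailabilityChangedML* changed =
            (const XrEventDataPixelSensorAvailabilityChangedML*)&eventData;
        if (!changed->available) {
            // The sensor is no longer available; stop using its handle
            // and release associated resources.
        }
    }
    // Reset the buffer type before the next poll.
    eventData.type = XR_TYPE_EVENT_DATA_BUFFER;
}
```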

12.108.6. Sensor streams

Sensors may have multiple data streams. For example, a depth camera sensor can support two streams: one for short range sensing and one for long range sensing. A color camera sensor can support two streams, each with different resolutions and frame formats.

The runtime does not make any guarantees about whether the different streams in a sensor are captured at the same time or not.

Use xrGetPixelSensorStreamCountML to get the number of streams supported by the sensor. Streams are indexed from 0 to n-1, where n is the number of streams reported by xrGetPixelSensorStreamCountML.

The xrGetPixelSensorStreamCountML function is defined as:

// Provided by XR_ML_pixel_sensor
XrResult xrGetPixelSensorStreamCountML(
    XrPixelSensorML                             sensor,
    uint32_t*                                   streamCount);
Member Descriptions
  • sensor is a handle to the sensor to enumerate.

  • streamCount is a pointer that stores the number of streams.

Valid Usage (Implicit)
Return Codes
Success
  • XR_SUCCESS

  • XR_SESSION_LOSS_PENDING

Failure
  • XR_ERROR_FUNCTION_UNSUPPORTED

  • XR_ERROR_VALIDATION_FAILURE

  • XR_ERROR_RUNTIME_FAILURE

  • XR_ERROR_HANDLE_INVALID

  • XR_ERROR_INSTANCE_LOST

  • XR_ERROR_SESSION_LOST

12.108.7. Enumerate stream capabilities

Use xrEnumeratePixelSensorCapabilitiesML to query the list of XrPixelSensorCapabilityML structures that can be configured for each stream. Each capability is identified by its name, given by XrPixelSensorCapabilityTypeML; has a data type, given by XrPixelSensorCapabilityDataTypeML; and has a range type, given by XrPixelSensorCapabilityRangeTypeML.

The xrEnumeratePixelSensorCapabilitiesML function is defined as:

// Provided by XR_ML_pixel_sensor
XrResult xrEnumeratePixelSensorCapabilitiesML(
    XrPixelSensorML                             sensor,
    uint32_t                                    stream,
    uint32_t                                    capabilityCapacityInput,
    uint32_t*                                   capabilityCountOutput,
    XrPixelSensorCapabilityML*                  capabilities);
Member Descriptions
  • sensor is a handle to the sensor.

  • stream is the stream index for which you want to query the capabilities.

  • capabilityCapacityInput is the capacity of the capabilities array, or 0 to indicate a request to retrieve the required capacity.

  • capabilityCountOutput is filled in by the runtime with the count of sensor capabilities written or the required capacity in the case that capabilityCapacityInput is insufficient.

  • capabilities is a pointer to an array of XrPixelSensorCapabilityML.

  • See Buffer Size Parameters chapter for a detailed description of retrieving the required capabilities size.

Valid Usage (Implicit)
Return Codes
Success
  • XR_SUCCESS

  • XR_SESSION_LOSS_PENDING

Failure
  • XR_ERROR_FUNCTION_UNSUPPORTED

  • XR_ERROR_VALIDATION_FAILURE

  • XR_ERROR_RUNTIME_FAILURE

  • XR_ERROR_HANDLE_INVALID

  • XR_ERROR_INSTANCE_LOST

  • XR_ERROR_SESSION_LOST

The XrPixelSensorCapabilityML structure is defined as:

// Provided by XR_ML_pixel_sensor
typedef struct XrPixelSensorCapabilityML {
    XrStructureType                       type;
    const void*                           next;
    XrPixelSensorCapabilityTypeML         capabilityType;
    XrPixelSensorCapabilityDataTypeML     capabilityDataType;
    XrPixelSensorCapabilityRangeTypeML    capabilityRangeType;
} XrPixelSensorCapabilityML;
Member Descriptions
  • type is the XrStructureType of this structure.

  • next is NULL or a pointer to the next structure in a structure chain.

  • capabilityType is the type of the capability.

  • capabilityDataType is the data type of the capability.

  • capabilityRangeType is the type of the range of the capability.

Valid Usage (Implicit)
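A sketch combining the stream count query with the capability enumeration above. This is illustrative only: it assumes `sensor` is a valid XrPixelSensorML handle, and the XR_TYPE_PIXEL_SENSOR_CAPABILITY_ML constant name is assumed from the usual OpenXR naming convention.

```c
// Find out how many streams the sensor exposes.
uint32_t streamCount = 0;
xrGetPixelSensorStreamCountML(sensor, &streamCount);

// Two-call idiom: query the capability count for stream 0, then fetch.
uint32_t capabilityCount = 0;
xrEnumeratePixelSensorCapabilitiesML(sensor, 0, 0, &capabilityCount, NULL);

XrPixelSensorCapabilityML* capabilities =
    malloc(capabilityCount * sizeof(XrPixelSensorCapabilityML));
for (uint32_t i = 0; i < capabilityCount; ++i) {
    capabilities[i].type = XR_TYPE_PIXEL_SENSOR_CAPABILITY_ML;  // assumed
    capabilities[i].next = NULL;
}
xrEnumeratePixelSensorCapabilitiesML(sensor, 0, capabilityCount,
                                     &capabilityCount, capabilities);
// Each entry now reports a capability's type, data type, and range type.
```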

// Provided by XR_ML_pixel_sensor
typedef enum XrPixelSensorCapabilityTypeML {
    XR_PIXEL_SENSOR_CAPABILITY_TYPE_UPDATE_RATE_ML = 0,
    XR_PIXEL_SENSOR_CAPABILITY_TYPE_RESOLUTION_ML = 1,
    XR_PIXEL_SENSOR_CAPABILITY_TYPE_FORMAT_ML = 3,
    XR_PIXEL_SENSOR_CAPABILITY_TYPE_DEPTH_ML = 4,
    XR_PIXEL_SENSOR_CAPABILITY_TYPE_REALITY_MODE_ML = 5,
    XR_PIXEL_SENSOR_CAPABILITY_TYPE_MANUAL_EXPOSURE_TIME_ML = 100,
    XR_PIXEL_SENSOR_CAPABILITY_TYPE_ANALOG_GAIN_ML = 101,
    XR_PIXEL_SENSOR_CAPABILITY_TYPE_DIGITAL_GAIN_ML = 102,
    XR_PIXEL_SENSOR_CAPABILITY_TYPE_AUTO_EXPOSURE_MODE_ML = 200,
    XR_PIXEL_SENSOR_CAPABILITY_TYPE_AUTO_EXPOSURE_TARGET_BRIGHTNESS_ML = 201,
    XR_PIXEL_SENSOR_CAPABILITY_TYPE_MAX_ENUM_ML = 0x7FFFFFFF
} XrPixelSensorCapabilityTypeML;
Enum Descriptions

XR_PIXEL_SENSOR_CAPABILITY_TYPE_UPDATE_RATE_ML

Data rate per second; must be specified by the application. Data type is uint32_t.

XR_PIXEL_SENSOR_CAPABILITY_TYPE_RESOLUTION_ML

Resolution to configure; must be specified by the application. Data type is XrExtent2Di.

XR_PIXEL_SENSOR_CAPABILITY_TYPE_FORMAT_ML

Data format; must be specified by the application. Data type is XrPixelSensorFrameFormatML.

XR_PIXEL_SENSOR_CAPABILITY_TYPE_DEPTH_ML

Range of a depth sensor. Data type is float.

XR_PIXEL_SENSOR_CAPABILITY_TYPE_REALITY_MODE_ML

Reality mode. Data type is XrPixelSensorRealityModeML.

XR_PIXEL_SENSOR_CAPABILITY_TYPE_MANUAL_EXPOSURE_TIME_ML

Exposure time in milliseconds; if not specified, the runtime must use auto exposure. Data type is float.

XR_PIXEL_SENSOR_CAPABILITY_TYPE_ANALOG_GAIN_ML

Higher gain is useful in low light conditions but may introduce noise. Data type is uint32_t.

XR_PIXEL_SENSOR_CAPABILITY_TYPE_DIGITAL_GAIN_ML

Higher gain is useful in low light conditions but may introduce noise. Data type is uint32_t.

XR_PIXEL_SENSOR_CAPABILITY_TYPE_AUTO_EXPOSURE_MODE_ML

Allowed auto exposure modes. Data type is XrPixelSensorAutoExposureModeML.

XR_PIXEL_SENSOR_CAPABILITY_TYPE_AUTO_EXPOSURE_TARGET_BRIGHTNESS_ML

Set target brightness for auto exposure mode. Data type is float.

// Provided by XR_ML_pixel_sensor
typedef enum XrPixelSensorCapabilityDataTypeML {
    XR_PIXEL_SENSOR_CAPABILITY_DATA_TYPE_XR_BOOL32_ML = 0,
    XR_PIXEL_SENSOR_CAPABILITY_DATA_TYPE_UINT32_ML = 100,
    XR_PIXEL_SENSOR_CAPABILITY_DATA_TYPE_FLOAT_ML = 101,
    XR_PIXEL_SENSOR_CAPABILITY_DATA_TYPE_XR_EXTENT_2DI_ML = 200,
    XR_PIXEL_SENSOR_CAPABILITY_DATA_TYPE_MAX_ENUM_ML = 0x7FFFFFFF
} XrPixelSensorCapabilityDataTypeML;
Enum Descriptions

XR_PIXEL_SENSOR_CAPABILITY_DATA_TYPE_XR_BOOL32_ML

Capability is a bool value.

XR_PIXEL_SENSOR_CAPABILITY_DATA_TYPE_UINT32_ML

Capability is an integer value.

XR_PIXEL_SENSOR_CAPABILITY_DATA_TYPE_FLOAT_ML

Capability is a float value.

XR_PIXEL_SENSOR_CAPABILITY_DATA_TYPE_XR_EXTENT_2DI_ML

Capability is a vector of two integers.

// Provided by XR_ML_pixel_sensor
typedef enum XrPixelSensorCapabilityRangeTypeML {
    XR_PIXEL_SENSOR_CAPABILITY_RANGE_TYPE_BOOL_ML = 0,
    XR_PIXEL_SENSOR_CAPABILITY_RANGE_TYPE_CONTINUOUS_ML = 1,
    XR_PIXEL_SENSOR_CAPABILITY_RANGE_TYPE_DISCRETE_ML = 2,
    XR_PIXEL_SENSOR_CAPABILITY_RANGE_TYPE_MAX_ENUM_ML = 0x7FFFFFFF
} XrPixelSensorCapabilityRangeTypeML;
Enum Descriptions

XR_PIXEL_SENSOR_CAPABILITY_RANGE_TYPE_BOOL_ML

Capability has only two valid states, true or false.

XR_PIXEL_SENSOR_CAPABILITY_RANGE_TYPE_CONTINUOUS_ML

Capability can take any value in the given [min, max] range.

XR_PIXEL_SENSOR_CAPABILITY_RANGE_TYPE_DISCRETE_ML

Capability can take any of the discrete values in the list.

// Provided by XR_ML_pixel_sensor
typedef enum XrPixelSensorFrameFormatML {
    XR_PIXEL_SENSOR_FRAME_FORMAT_GRAYSCALE_U8_ML = 0,
    XR_PIXEL_SENSOR_FRAME_FORMAT_RGBA_8888_ML = 1,
    XR_PIXEL_SENSOR_FRAME_FORMAT_YUV_420_888_ML = 2,
    XR_PIXEL_SENSOR_FRAME_FORMAT_JPEG_ML = 3,
    XR_PIXEL_SENSOR_FRAME_FORMAT_DEPTH_F32_ML = 4,
    XR_PIXEL_SENSOR_FRAME_FORMAT_DEPTH_RAW_ML = 5,
    XR_PIXEL_SENSOR_FRAME_FORMAT_MAX_ENUM_ML = 0x7FFFFFFF
} XrPixelSensorFrameFormatML;
Enum Descriptions

XR_PIXEL_SENSOR_FRAME_FORMAT_GRAYSCALE_U8_ML

Each pixel is 1 byte and represents a grayscale value. Datatype of the corresponding frame buffer is uint8_t.

XR_PIXEL_SENSOR_FRAME_FORMAT_RGBA_8888_ML

Each pixel is 4 bytes and represents R,G,B, and A channels in that order. Datatype of the corresponding frame buffer is uint8_t.

XR_PIXEL_SENSOR_FRAME_FORMAT_YUV_420_888_ML

Frame is represented in the YUV_420_888 planar format. Datatype of the corresponding frame buffer is uint8_t.

XR_PIXEL_SENSOR_FRAME_FORMAT_JPEG_ML

Frame is JPEG encoded.

XR_PIXEL_SENSOR_FRAME_FORMAT_DEPTH_F32_ML

Represents the depth. Depth is the radial distance (in meters) of the real world location with respect to the depth camera. Datatype is float.

XR_PIXEL_SENSOR_FRAME_FORMAT_DEPTH_RAW_ML

Raw pixel data representing light captured by the sensor. For depth cameras that have a projector this raw frame will include frames captured both when the projector is on and off. Refer to XrPixelSensorDepthFrameIlluminationML for more details. Data type is float.

// Provided by XR_ML_pixel_sensor
typedef enum XrPixelSensorRealityModeML {
    XR_PIXEL_SENSOR_REALITY_MODE_MIXED_ML = 0,
    XR_PIXEL_SENSOR_REALITY_MODE_CAMERA_ML = 1,
    XR_PIXEL_SENSOR_REALITY_MODE_VIRTUAL_ML = 2,
    XR_PIXEL_SENSOR_REALITY_MODE_MAX_ENUM_ML = 0x7FFFFFFF
} XrPixelSensorRealityModeML;
Enum Descriptions

XR_PIXEL_SENSOR_REALITY_MODE_MIXED_ML

Camera frame and digital content will be blended into a single frame.

XR_PIXEL_SENSOR_REALITY_MODE_CAMERA_ML

Only camera frame will be captured.

XR_PIXEL_SENSOR_REALITY_MODE_VIRTUAL_ML

Only virtual content will be captured.

// Provided by XR_ML_pixel_sensor
typedef enum XrPixelSensorAutoExposureModeML {
    XR_PIXEL_SENSOR_AUTO_EXPOSURE_MODE_ENVIRONMENT_TRACKING_ML = 0,
    XR_PIXEL_SENSOR_AUTO_EXPOSURE_MODE_CLOSE_PROXIMITY_IR_TRACKING_ML = 1,
    XR_PIXEL_SENSOR_AUTO_EXPOSURE_MODE_MAX_ENUM_ML = 0x7FFFFFFF
} XrPixelSensorAutoExposureModeML;
Enum Descriptions

XR_PIXEL_SENSOR_AUTO_EXPOSURE_MODE_ENVIRONMENT_TRACKING_ML

Exposure mode optimized for environment tracking.

XR_PIXEL_SENSOR_AUTO_EXPOSURE_MODE_CLOSE_PROXIMITY_IR_TRACKING_ML

Exposure mode optimized for close proximity IR light source.

12.108.8. Query sensor capability ranges

The valid range of the capabilities for each of the sensor streams may be queried using xrQueryPixelSensorCapabilityRangeML. Some stream capabilities may be limited by the configuration of other capabilities within the same stream, or within a different stream on the same sensor. For example, the choice of frame rate may affect the valid range of frame resolutions, and vice versa. As another example, a sensor with two streams may have an upper limit of 60fps across both streams: if one stream is configured for a frame rate above 30fps, the valid range of frame rates for the second stream is reduced so that the total frame rate summed across both streams does not exceed 60fps.

Creating a valid configuration of the various capabilities is an iterative process. The application starts by querying the valid range of the first capability. Before calling the query function a second time to get the valid range of the next capability, the application picks a value for the first capability and passes it back into the query function. This process continues until all the necessary capabilities are configured.

xrQueryPixelSensorCapabilityRangeML allows the application to detect valid combinations of XrPixelSensorCapabilityTypeML for all the streams in the sensor.

If a sensor does not support a queried capability the runtime must return XR_ERROR_PIXEL_SENSOR_CAPABILITY_NOT_SUPPORTED_ML.

The xrQueryPixelSensorCapabilityRangeML function is defined as:

// Provided by XR_ML_pixel_sensor
XrResult xrQueryPixelSensorCapabilityRangeML(
    XrPixelSensorML                             sensor,
    const XrPixelSensorCapabilityQueryInfoML*   queryInfo,
    const uint32_t                              configurationCount,
    const XrPixelSensorCapabilityConfigBaseHeaderML*const * configurations,
    XrPixelSensorCapabilityRangeBaseHeaderML*   capabilityRange);
Member Descriptions
  • sensor is a handle to the sensor to query.

  • queryInfo is the query information.

  • configurationCount is the number of configurations.

  • configurations is NULL or an array of configuration structures needed for setting up the streams as the query process proceeds.

  • capabilityRange is the valid capability range.

Valid Usage (Implicit)
Return Codes
Success
  • XR_SUCCESS

  • XR_SESSION_LOSS_PENDING

Failure
  • XR_ERROR_FUNCTION_UNSUPPORTED

  • XR_ERROR_VALIDATION_FAILURE

  • XR_ERROR_RUNTIME_FAILURE

  • XR_ERROR_HANDLE_INVALID

  • XR_ERROR_INSTANCE_LOST

  • XR_ERROR_SESSION_LOST

  • XR_ERROR_PIXEL_SENSOR_CAPABILITY_NOT_SUPPORTED_ML

The XrPixelSensorCapabilityQueryInfoML structure is defined as:

// Provided by XR_ML_pixel_sensor
typedef struct XrPixelSensorCapabilityQueryInfoML {
    XrStructureType                  type;
    const void*                      next;
    uint32_t                         stream;
    XrPixelSensorCapabilityTypeML    capabilityType;
} XrPixelSensorCapabilityQueryInfoML;
Member Descriptions
  • type is the XrStructureType of this structure.

  • next is NULL or a pointer to the next structure in a structure chain.

  • stream is the stream to query.

  • capabilityType is the type of the sensor capability that is being queried.

Valid Usage (Implicit)

The valid range for a capability can be a set of discrete values or a continuous range with an upper and lower bound. All the structures that define the valid range for a capability must be an extension of the XrPixelSensorCapabilityRangeBaseHeaderML base type.

The type of structure to use for getting the range of a capability can be inferred from the XrPixelSensorCapabilityML::capabilityDataType and XrPixelSensorCapabilityML::capabilityRangeType.

The XrPixelSensorCapabilityRangeBaseHeaderML structure is defined as:

// Provided by XR_ML_pixel_sensor
typedef struct XrPixelSensorCapabilityRangeBaseHeaderML {
    XrStructureType    type;
    void*              next;
    XrBool32           valid;
} XrPixelSensorCapabilityRangeBaseHeaderML;
Member Descriptions
  • type is the XrStructureType of this structure.

  • next is NULL or a pointer to the next structure in a structure chain.

  • valid is a flag that indicates if the capability can be configured given the current configuration chain described in xrQueryPixelSensorCapabilityRangeML::configurations.

Valid Usage (Implicit)
  • The XR_ML_pixel_sensor extension must be enabled prior to using XrPixelSensorCapabilityRangeBaseHeaderML

  • type must be one of the following XrStructureType values: XR_TYPE_PIXEL_SENSOR_CAPABILITY_RANGE_CONTINUOUS_FLOAT_ML, XR_TYPE_PIXEL_SENSOR_CAPABILITY_RANGE_CONTINUOUS_UINT32_ML, XR_TYPE_PIXEL_SENSOR_CAPABILITY_RANGE_DISCRETE_UINT32_ML, XR_TYPE_PIXEL_SENSOR_CAPABILITY_RANGE_DISCRETE_XR_BOOL32_ML, XR_TYPE_PIXEL_SENSOR_CAPABILITY_RANGE_DISCRETE_XR_EXTENT_2DI_ML

  • next must be NULL or a valid pointer to the next structure in a structure chain

XrPixelSensorCapabilityRangeContinuousFloatML must be used for a capability with data type of XR_PIXEL_SENSOR_CAPABILITY_DATA_TYPE_FLOAT_ML and range type of XR_PIXEL_SENSOR_CAPABILITY_RANGE_TYPE_CONTINUOUS_ML.

// Provided by XR_ML_pixel_sensor
typedef struct XrPixelSensorCapabilityRangeContinuousFloatML {
    XrStructureType    type;
    void*              next;
    XrBool32           valid;
    float              minValue;
    float              maxValue;
} XrPixelSensorCapabilityRangeContinuousFloatML;
Member Descriptions
  • type is the XrStructureType of this structure.

  • next is NULL or a pointer to the next structure in a structure chain.

  • valid is a flag that indicates if the capability can be configured given the current configuration chain described in xrQueryPixelSensorCapabilityRangeML::configurations.

  • minValue is the minimum value of the capability.

  • maxValue is the maximum value of the capability.

Valid Usage (Implicit)

XrPixelSensorCapabilityRangeContinuousUint32ML must be used for a capability with data type of XR_PIXEL_SENSOR_CAPABILITY_DATA_TYPE_UINT32_ML and range type of XR_PIXEL_SENSOR_CAPABILITY_RANGE_TYPE_CONTINUOUS_ML.

// Provided by XR_ML_pixel_sensor
typedef struct XrPixelSensorCapabilityRangeContinuousUint32ML {
    XrStructureType    type;
    void*              next;
    XrBool32           valid;
    uint32_t           minValue;
    uint32_t           maxValue;
} XrPixelSensorCapabilityRangeContinuousUint32ML;
Member Descriptions
  • type is the XrStructureType of this structure.

  • next is NULL or a pointer to the next structure in a structure chain.

  • valid is a flag that indicates if the capability can be configured given the current configuration chain described in xrQueryPixelSensorCapabilityRangeML::configurations.

  • minValue is the minimum value of the capability.

  • maxValue is the maximum value of the capability.

Valid Usage (Implicit)

XrPixelSensorCapabilityRangeDiscreteUint32ML must be used for a capability with data type of XR_PIXEL_SENSOR_CAPABILITY_DATA_TYPE_UINT32_ML and range type of XR_PIXEL_SENSOR_CAPABILITY_RANGE_TYPE_DISCRETE_ML.

// Provided by XR_ML_pixel_sensor
typedef struct XrPixelSensorCapabilityRangeDiscreteUint32ML {
    XrStructureType    type;
    void*              next;
    XrBool32           valid;
    uint32_t           valueCapacityInput;
    uint32_t           valueCountOutput;
    uint32_t*          values;
} XrPixelSensorCapabilityRangeDiscreteUint32ML;
Member Descriptions
  • type is the XrStructureType of this structure.

  • next is NULL or a pointer to the next structure in a structure chain.

  • valid is a flag that indicates if the capability can be configured given the current configuration chain described in xrQueryPixelSensorCapabilityRangeML::configurations.

  • valueCapacityInput is the capacity of the values array, or 0 to indicate a request to retrieve the required capacity.

  • valueCountOutput is filled in by the runtime with the count of sensor values written or the required capacity in the case that valueCapacityInput is insufficient.

  • values is a pointer to an array of uint32_t.

  • See Buffer Size Parameters chapter for a detailed description of retrieving the required values size.

Valid Usage (Implicit)
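A sketch of the range query for a discrete uint32_t capability, with no prior configuration applied. This is illustrative only: it assumes `sensor` is a valid handle, that this sensor reports a discrete range type for its update rate, and the XR_TYPE_PIXEL_SENSOR_CAPABILITY_QUERY_INFO_ML constant name is assumed from the usual OpenXR naming convention.

```c
// Ask for the valid update rates of stream 0.
XrPixelSensorCapabilityQueryInfoML queryInfo = {
    .type = XR_TYPE_PIXEL_SENSOR_CAPABILITY_QUERY_INFO_ML,  // assumed
    .next = NULL,
    .stream = 0,
    .capabilityType = XR_PIXEL_SENSOR_CAPABILITY_TYPE_UPDATE_RATE_ML};

// UPDATE_RATE has data type uint32_t; with a discrete range type, the
// discrete-uint32 range structure is the one to use.
uint32_t rates[16];
XrPixelSensorCapabilityRangeDiscreteUint32ML range = {
    .type = XR_TYPE_PIXEL_SENSOR_CAPABILITY_RANGE_DISCRETE_UINT32_ML,
    .next = NULL,
    .valueCapacityInput = 16,
    .values = rates};

// configurationCount is 0 here because no capability has been fixed yet;
// later iterations pass the already-chosen configurations instead.
xrQueryPixelSensorCapabilityRangeML(
    sensor, &queryInfo, 0, NULL,
    (XrPixelSensorCapabilityRangeBaseHeaderML*)&range);
// On success, range.valueCountOutput rates are written to `rates` and
// range.valid indicates whether the capability can be configured.
```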

XrPixelSensorCapabilityRangeDiscreteXrBool32ML must be used for a capability with data type of XR_PIXEL_SENSOR_CAPABILITY_DATA_TYPE_XR_BOOL32_ML and range type of XR_PIXEL_SENSOR_CAPABILITY_RANGE_TYPE_DISCRETE_ML.

// Provided by XR_ML_pixel_sensor
typedef struct XrPixelSensorCapabilityRangeDiscreteXrBool32ML {
    XrStructureType    type;
    void*              next;
    XrBool32           valid;
    uint32_t           valueCapacityInput;
    uint32_t           valueCountOutput;
    XrBool32*          values;
} XrPixelSensorCapabilityRangeDiscreteXrBool32ML;
Member Descriptions
  • type is the XrStructureType of this structure.

  • next is NULL or a pointer to the next structure in a structure chain.

  • valid is a flag that indicates if the capability can be configured given the current configuration chain described in xrQueryPixelSensorCapabilityRangeML::configurations.

  • valueCapacityInput is the capacity of the values array, or 0 to indicate a request to retrieve the required capacity.

  • valueCountOutput is filled in by the runtime with the count of sensor values written or the required capacity in the case that valueCapacityInput is insufficient.

  • values is a pointer to an array of XrBool32.

  • See Buffer Size Parameters chapter for a detailed description of retrieving the required values size.

Valid Usage (Implicit)

XrPixelSensorCapabilityRangeDiscreteXrExtent2DiML must be used for a capability with data type of XR_PIXEL_SENSOR_CAPABILITY_DATA_TYPE_XR_EXTENT_2DI_ML and range type of XR_PIXEL_SENSOR_CAPABILITY_RANGE_TYPE_DISCRETE_ML.

// Provided by XR_ML_pixel_sensor
typedef struct XrPixelSensorCapabilityRangeDiscreteXrExtent2DiML {
    XrStructureType    type;
    void*              next;
    XrBool32           valid;
    uint32_t           valueCapacityInput;
    uint32_t           valueCountOutput;
    XrExtent2Di*       values;
} XrPixelSensorCapabilityRangeDiscreteXrExtent2DiML;
Member Descriptions
  • type is the XrStructureType of this structure.

  • next is NULL or a pointer to the next structure in a structure chain.

  • valid is a flag that indicates if the capability can be configured given the current configuration chain described in xrQueryPixelSensorCapabilityRangeML::configurations.

  • valueCapacityInput is the capacity of the values array, or 0 to indicate a request to retrieve the required capacity.

  • valueCountOutput is filled in by the runtime with the count of sensor values written or the required capacity in the case that valueCapacityInput is insufficient.

  • values is a pointer to an array of XrExtent2Di.

  • See Buffer Size Parameters chapter for a detailed description of retrieving the required values size.

Valid Usage (Implicit)

xrQueryPixelSensorCapabilityRangeML and xrConfigurePixelSensorAsyncML take an array of structures to configure each of the capabilities. All the structures that define a configuration for a capability must be an extension of the XrPixelSensorCapabilityConfigBaseHeaderML base type.

The type of structure to use for configuring the capability can be inferred from the XrPixelSensorCapabilityML::capabilityDataType.

The XrPixelSensorCapabilityConfigBaseHeaderML structure is defined as:

// Provided by XR_ML_pixel_sensor
typedef struct XrPixelSensorCapabilityConfigBaseHeaderML {
    XrStructureType                  type;
    const void*                      next;
    uint32_t                         stream;
    XrPixelSensorCapabilityTypeML    capabilityType;
} XrPixelSensorCapabilityConfigBaseHeaderML;
Member Descriptions
  • type is the XrStructureType of this structure.

  • next is NULL or a pointer to the next structure in a structure chain.

  • stream is the stream to apply this configuration to.

  • capabilityType is the type of the sensor capability that is being configured.

Valid Usage (Implicit)

XrPixelSensorCapabilityConfigXrBool32ML must be used for a capability with data type of XR_PIXEL_SENSOR_CAPABILITY_DATA_TYPE_XR_BOOL32_ML.

The XrPixelSensorCapabilityConfigXrBool32ML structure is defined as:

// Provided by XR_ML_pixel_sensor
typedef struct XrPixelSensorCapabilityConfigXrBool32ML {
    XrStructureType                  type;
    const void*                      next;
    uint32_t                         stream;
    XrPixelSensorCapabilityTypeML    capabilityType;
    XrBool32                         value;
} XrPixelSensorCapabilityConfigXrBool32ML;
Member Descriptions
  • type is the XrStructureType of this structure.

  • next is NULL or a pointer to the next structure in a structure chain.

  • stream is the stream to apply this configuration to.

  • capabilityType is the type of the sensor capability that is being configured.

  • value is the value to set for the capabilityType.

Valid Usage (Implicit)

XrPixelSensorCapabilityConfigUint32ML must be used for a capability with data type of XR_PIXEL_SENSOR_CAPABILITY_DATA_TYPE_UINT32_ML.

The XrPixelSensorCapabilityConfigUint32ML structure is defined as:

// Provided by XR_ML_pixel_sensor
typedef struct XrPixelSensorCapabilityConfigUint32ML {
    XrStructureType                  type;
    const void*                      next;
    uint32_t                         stream;
    XrPixelSensorCapabilityTypeML    capabilityType;
    uint32_t                         value;
} XrPixelSensorCapabilityConfigUint32ML;
Member Descriptions
  • type is the XrStructureType of this structure.

  • next is NULL or a pointer to the next structure in a structure chain.

  • stream is the stream to apply this configuration to.

  • capabilityType is the type of the sensor capability that is being configured.

  • value is the value to set for the capabilityType.

Valid Usage (Implicit)

XrPixelSensorCapabilityConfigFloatML must be used for capabilities with a data type of XR_PIXEL_SENSOR_CAPABILITY_DATA_TYPE_FLOAT_ML.

The XrPixelSensorCapabilityConfigFloatML structure is defined as:

// Provided by XR_ML_pixel_sensor
typedef struct XrPixelSensorCapabilityConfigFloatML {
    XrStructureType                  type;
    const void*                      next;
    uint32_t                         stream;
    XrPixelSensorCapabilityTypeML    capabilityType;
    float                            value;
} XrPixelSensorCapabilityConfigFloatML;
Member Descriptions
  • type is the XrStructureType of this structure.

  • next is NULL or a pointer to the next structure in a structure chain.

  • stream is the stream to apply this configuration to.

  • capabilityType is the type of the sensor capability being configured.

  • value is the value to set for the capabilityType.

Valid Usage (Implicit)

XrPixelSensorCapabilityConfigXrExtent2DiML must be used for capabilities with a data type of XR_PIXEL_SENSOR_CAPABILITY_DATA_TYPE_XR_EXTENT_2DI_ML.

The XrPixelSensorCapabilityConfigXrExtent2DiML structure is defined as:

// Provided by XR_ML_pixel_sensor
typedef struct XrPixelSensorCapabilityConfigXrExtent2DiML {
    XrStructureType                  type;
    const void*                      next;
    uint32_t                         stream;
    XrPixelSensorCapabilityTypeML    capabilityType;
    XrExtent2Di                      value;
} XrPixelSensorCapabilityConfigXrExtent2DiML;
Member Descriptions
  • type is the XrStructureType of this structure.

  • next is NULL or a pointer to the next structure in a structure chain.

  • stream is the stream to apply this configuration to.

  • capabilityType is the type of the sensor capability being configured.

  • value is the value to set for the capabilityType.

Valid Usage (Implicit)

12.108.9. Configure sensor capabilities

Once the sensor capabilities have been queried using xrQueryPixelSensorCapabilityRangeML, xrConfigurePixelSensorAsyncML can be used to apply the chosen configuration in one step.

The runtime must return XR_ERROR_CALL_ORDER_INVALID if xrQueryPixelSensorCapabilityRangeML has not been called for this sensor handle for all of the XrPixelSensorCapabilityTypeML values requested.

The runtime must return XR_ERROR_VALIDATION_FAILURE if the sensor is still active. In this case the application must call xrStopPixelSensorAsyncML on all the streams before applying a new configuration.

The runtime must support reconfiguring a sensor with new capabilities without requiring the application to destroy and recreate the XrPixelSensorML handle.

The runtime must return XR_ERROR_VALIDATION_FAILURE if any of the capabilities are not valid.

The runtime must return XR_ERROR_VALIDATION_FAILURE if any of the required capabilities are not provided. See XrPixelSensorCapabilityTypeML for the list of capabilities that must be specified by the application.

The xrConfigurePixelSensorAsyncML function is defined as:

// Provided by XR_ML_pixel_sensor
XrResult xrConfigurePixelSensorAsyncML(
    XrPixelSensorML                             sensor,
    const XrPixelSensorConfigInfoML*            configInfo,
    XrFutureEXT*                                future);
Member Descriptions
  • sensor is a handle to the sensor.

  • configInfo is the configuration information.

  • future is a pointer to a runtime-created XrFutureEXT.

Valid Usage (Implicit)
Return Codes
Success
  • XR_SUCCESS

  • XR_SESSION_LOSS_PENDING

Failure
  • XR_ERROR_FUNCTION_UNSUPPORTED

  • XR_ERROR_VALIDATION_FAILURE

  • XR_ERROR_RUNTIME_FAILURE

  • XR_ERROR_HANDLE_INVALID

  • XR_ERROR_INSTANCE_LOST

  • XR_ERROR_SESSION_LOST

  • XR_ERROR_CALL_ORDER_INVALID

  • XR_ERROR_PIXEL_SENSOR_CAPABILITY_NOT_SUPPORTED_ML

The XrPixelSensorConfigInfoML structure is defined as:

// Provided by XR_ML_pixel_sensor
typedef struct XrPixelSensorConfigInfoML {
    XrStructureType                                            type;
    const void*                                                next;
    uint32_t                                                   streamCount;
    const uint32_t*                                            streams;
    uint32_t                                                   configurationCount;
    const XrPixelSensorCapabilityConfigBaseHeaderML*const *    configurations;
} XrPixelSensorConfigInfoML;
Member Descriptions
  • type is the XrStructureType of this structure.

  • next is NULL or a pointer to the next structure in a structure chain.

  • streamCount is the number of streams that need to be configured.

  • streams is the list of streams to configure.

  • configurationCount is the number of configurations.

  • configurations is NULL or an array of configuration structures.

Valid Usage (Implicit)

The xrConfigurePixelSensorCompleteML function is defined as:

// Provided by XR_ML_pixel_sensor
XrResult xrConfigurePixelSensorCompleteML(
    XrPixelSensorML                             sensor,
    XrFutureEXT                                 future,
    XrPixelSensorConfigureCompletionML*         completion);
Member Descriptions
  • sensor is a handle to the sensor.

  • future is the XrFutureEXT to complete.

  • completion is the completion structure containing the result of the operation.

Valid Usage (Implicit)
Return Codes
Success
  • XR_SUCCESS

  • XR_SESSION_LOSS_PENDING

Failure
  • XR_ERROR_FUNCTION_UNSUPPORTED

  • XR_ERROR_VALIDATION_FAILURE

  • XR_ERROR_RUNTIME_FAILURE

  • XR_ERROR_HANDLE_INVALID

  • XR_ERROR_INSTANCE_LOST

  • XR_ERROR_SESSION_LOST

  • XR_ERROR_FUTURE_PENDING_EXT

  • XR_ERROR_FUTURE_INVALID_EXT

The XrPixelSensorConfigureCompletionML structure is defined as:

// Provided by XR_ML_pixel_sensor
typedef struct XrPixelSensorConfigureCompletionML {
    XrStructureType    type;
    void*              next;
    XrResult           futureResult;
} XrPixelSensorConfigureCompletionML;
Member Descriptions
  • type is the XrStructureType of this structure.

  • next is NULL or a pointer to the next structure in a structure chain.

  • futureResult is the XrResult of the asynchronous operation associated with the future passed to the completion function.

Valid Usage (Implicit)
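
Putting the pieces together, the two-step configuration flow might look like the following sketch. This is illustrative rather than a definitive implementation: the XR_TYPE_* structure-type tokens and the capability-type token are assumptions based on the extension's naming convention, and in a real application the future would be polled (e.g. via XR_EXT_future) before the completion function is called.

```c
/* Sketch: configure one capability on stream 0 of a pixel sensor.
 * All XR_TYPE_* and capability-type tokens below are assumed, not quoted
 * from this excerpt. */
XrPixelSensorCapabilityConfigUint32ML fpsConfig = {
    .type = XR_TYPE_PIXEL_SENSOR_CAPABILITY_CONFIG_UINT32_ML,  /* assumed token */
    .next = NULL,
    .stream = 0,
    .capabilityType = XR_PIXEL_SENSOR_CAPABILITY_TYPE_UPDATE_RATE_ML, /* assumed token */
    .value = 30,
};
const XrPixelSensorCapabilityConfigBaseHeaderML* configs[] = {
    (const XrPixelSensorCapabilityConfigBaseHeaderML*)&fpsConfig,
};
uint32_t streams[] = { 0 };
XrPixelSensorConfigInfoML configInfo = {
    .type = XR_TYPE_PIXEL_SENSOR_CONFIG_INFO_ML,               /* assumed token */
    .next = NULL,
    .streamCount = 1,
    .streams = streams,
    .configurationCount = 1,
    .configurations = configs,
};
XrFutureEXT future = XR_NULL_FUTURE_EXT;
XrResult result = xrConfigurePixelSensorAsyncML(sensor, &configInfo, &future);
/* ...poll the future (e.g. with xrPollFutureEXT) until it is ready... */
XrPixelSensorConfigureCompletionML completion = {
    .type = XR_TYPE_PIXEL_SENSOR_CONFIGURE_COMPLETION_ML,      /* assumed token */
};
result = xrConfigurePixelSensorCompleteML(sensor, future, &completion);
if (result == XR_SUCCESS && completion.futureResult == XR_SUCCESS) {
    /* The sensor is now configured and its streams can be started. */
}
```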

12.108.10. Allocate memory for sensor data

The application is responsible for allocating the memory to store the sensor data. xrGetPixelSensorBufferPropertiesML must provide the size of the buffer to allocate.

xrGetPixelSensorBufferPropertiesML must return XR_ERROR_CALL_ORDER_INVALID if xrConfigurePixelSensorCompleteML was not yet called for this XrPixelSensorML.

The xrGetPixelSensorBufferPropertiesML function is defined as:

// Provided by XR_ML_pixel_sensor
XrResult xrGetPixelSensorBufferPropertiesML(
    XrPixelSensorML                             sensor,
    const XrPixelSensorBufferPropertiesInfoML*  propertiesInfo,
    XrPixelSensorBufferPropertiesML*            properties);
Member Descriptions
  • sensor is a handle to the sensor.

  • propertiesInfo is a pointer to the XrPixelSensorBufferPropertiesInfoML structure.

  • properties will be filled by the runtime with the buffer information.

Valid Usage (Implicit)
Return Codes
Success
  • XR_SUCCESS

  • XR_SESSION_LOSS_PENDING

Failure
  • XR_ERROR_FUNCTION_UNSUPPORTED

  • XR_ERROR_VALIDATION_FAILURE

  • XR_ERROR_RUNTIME_FAILURE

  • XR_ERROR_HANDLE_INVALID

  • XR_ERROR_INSTANCE_LOST

  • XR_ERROR_SESSION_LOST

  • XR_ERROR_CALL_ORDER_INVALID

The XrPixelSensorBufferPropertiesInfoML structure is defined as:

// Provided by XR_ML_pixel_sensor
typedef struct XrPixelSensorBufferPropertiesInfoML {
    XrStructureType                   type;
    const void*                       next;
    uint32_t                          stream;
    uint32_t                          metadataCount;
    const XrPixelSensorMetadataML*    metadatas;
} XrPixelSensorBufferPropertiesInfoML;
Member Descriptions
  • type is the XrStructureType of this structure.

  • next is NULL or a pointer to the next structure in a structure chain.

  • stream is the index of the stream for which the buffer size is needed.

  • metadataCount is the number of elements in the metadatas array.

  • metadatas is a pointer to an array of XrPixelSensorMetadataML.

Valid Usage (Implicit)

The XrPixelSensorBufferPropertiesML structure is defined as:

// Provided by XR_ML_pixel_sensor
typedef struct XrPixelSensorBufferPropertiesML {
    XrStructureType    type;
    void*              next;
    uint32_t           stream;
    uint32_t           bufferSize;
} XrPixelSensorBufferPropertiesML;
Member Descriptions
  • type is the XrStructureType of this structure.

  • next is NULL or a pointer to the next structure in a structure chain.

  • stream is the index of the stream for which bufferSize is specified.

  • bufferSize is the size of the buffer in bytes.

Valid Usage (Implicit)
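
The allocation step described above might look like the following sketch (the XR_TYPE_* tokens are assumptions based on the extension's naming convention, not quoted from this excerpt):

```c
/* Sketch: query the required buffer size for stream 0 and allocate it. */
XrPixelSensorBufferPropertiesInfoML propsInfo = {
    .type = XR_TYPE_PIXEL_SENSOR_BUFFER_PROPERTIES_INFO_ML, /* assumed token */
    .next = NULL,
    .stream = 0,
    .metadataCount = 0,   /* no metadata requested in this sketch */
    .metadatas = NULL,
};
XrPixelSensorBufferPropertiesML props = {
    .type = XR_TYPE_PIXEL_SENSOR_BUFFER_PROPERTIES_ML,      /* assumed token */
};
XrResult result = xrGetPixelSensorBufferPropertiesML(sensor, &propsInfo, &props);
void* storage = NULL;
if (result == XR_SUCCESS) {
    storage = malloc(props.bufferSize);  /* application-owned frame storage */
}
```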

12.108.11. Start and stop sensor streams

Once the sensor capabilities have been configured, xrStartPixelSensorAsyncML can be used to start streaming data from the sensor.

The runtime must return XR_ERROR_CALL_ORDER_INVALID if xrConfigurePixelSensorCompleteML has not been called for this sensor handle.

A sensor stream must be started after the sensor configuration has completed and before new frames are requested via xrGetPixelSensorDataML.

The xrStartPixelSensorAsyncML function is defined as:

// Provided by XR_ML_pixel_sensor
XrResult xrStartPixelSensorAsyncML(
    XrPixelSensorML                             sensor,
    const XrPixelSensorStartInfoML*             startInfo,
    XrFutureEXT*                                future);
Member Descriptions
  • sensor is a handle to the sensor.

  • startInfo is a pointer to the XrPixelSensorStartInfoML structure describing which streams to start.

  • future is a pointer to a runtime-created XrFutureEXT.

Valid Usage (Implicit)
Return Codes
Success
  • XR_SUCCESS

  • XR_SESSION_LOSS_PENDING

Failure
  • XR_ERROR_FUNCTION_UNSUPPORTED

  • XR_ERROR_VALIDATION_FAILURE

  • XR_ERROR_RUNTIME_FAILURE

  • XR_ERROR_HANDLE_INVALID

  • XR_ERROR_INSTANCE_LOST

  • XR_ERROR_SESSION_LOST

  • XR_ERROR_CALL_ORDER_INVALID

The XrPixelSensorStartInfoML structure is defined as:

// Provided by XR_ML_pixel_sensor
typedef struct XrPixelSensorStartInfoML {
    XrStructureType    type;
    const void*        next;
    uint32_t           streamCount;
    const uint32_t*    streams;
} XrPixelSensorStartInfoML;
Member Descriptions
  • type is the XrStructureType of this structure.

  • next is NULL or a pointer to the next structure in a structure chain.

  • streamCount is the number of streams that need to be started.

  • streams is the list of streams to start.

Valid Usage (Implicit)

The xrStartPixelSensorCompleteML function is defined as:

// Provided by XR_ML_pixel_sensor
XrResult xrStartPixelSensorCompleteML(
    XrPixelSensorML                             sensor,
    XrFutureEXT                                 future,
    XrPixelSensorStartCompletionML*             completion);
Member Descriptions
  • sensor is a handle to the sensor.

  • future is the XrFutureEXT to complete.

  • completion is the completion structure containing the result of the operation.

Valid Usage (Implicit)
Return Codes
Success
  • XR_SUCCESS

  • XR_SESSION_LOSS_PENDING

Failure
  • XR_ERROR_FUNCTION_UNSUPPORTED

  • XR_ERROR_VALIDATION_FAILURE

  • XR_ERROR_RUNTIME_FAILURE

  • XR_ERROR_HANDLE_INVALID

  • XR_ERROR_INSTANCE_LOST

  • XR_ERROR_SESSION_LOST

  • XR_ERROR_FUTURE_PENDING_EXT

  • XR_ERROR_FUTURE_INVALID_EXT

The XrPixelSensorStartCompletionML structure is defined as:

// Provided by XR_ML_pixel_sensor
typedef struct XrPixelSensorStartCompletionML {
    XrStructureType    type;
    void*              next;
    XrResult           futureResult;
} XrPixelSensorStartCompletionML;
Member Descriptions
  • type is the XrStructureType of this structure.

  • next is NULL or a pointer to the next structure in a structure chain.

  • futureResult is the XrResult of the asynchronous operation associated with the future passed to the completion function.

Valid Usage (Implicit)

Call xrStopPixelSensorAsyncML to stop the sensor streams without having to destroy the sensor handle. The runtime may take advantage of the start and stop APIs to save on system resources. However, the runtime must retain the state of the sensors (i.e. the most recent sensor configuration) and use it when the application calls xrStartPixelSensorAsyncML.

The xrStopPixelSensorAsyncML function is defined as:

// Provided by XR_ML_pixel_sensor
XrResult xrStopPixelSensorAsyncML(
    XrPixelSensorML                             sensor,
    const XrPixelSensorStopInfoML*              stopInfo,
    XrFutureEXT*                                future);
Member Descriptions
  • sensor is a handle to the sensor.

  • stopInfo is a pointer to the XrPixelSensorStopInfoML structure describing which streams to stop.

  • future is a pointer to a runtime-created XrFutureEXT.

Valid Usage (Implicit)
Return Codes
Success
  • XR_SUCCESS

  • XR_SESSION_LOSS_PENDING

Failure
  • XR_ERROR_FUNCTION_UNSUPPORTED

  • XR_ERROR_VALIDATION_FAILURE

  • XR_ERROR_RUNTIME_FAILURE

  • XR_ERROR_HANDLE_INVALID

  • XR_ERROR_INSTANCE_LOST

  • XR_ERROR_SESSION_LOST

  • XR_ERROR_CALL_ORDER_INVALID

The XrPixelSensorStopInfoML structure is defined as:

// Provided by XR_ML_pixel_sensor
typedef struct XrPixelSensorStopInfoML {
    XrStructureType    type;
    const void*        next;
    uint32_t           streamCount;
    const uint32_t*    streams;
} XrPixelSensorStopInfoML;
Member Descriptions
  • type is the XrStructureType of this structure.

  • next is NULL or a pointer to the next structure in a structure chain.

  • streamCount is the number of streams that need to be stopped.

  • streams is the list of streams to stop.

Valid Usage (Implicit)

The xrStopPixelSensorCompleteML function is defined as:

// Provided by XR_ML_pixel_sensor
XrResult xrStopPixelSensorCompleteML(
    XrPixelSensorML                             sensor,
    XrFutureEXT                                 future,
    XrPixelSensorStopCompletionML*              completion);
Member Descriptions
  • sensor is a handle to the sensor.

  • future is the XrFutureEXT to complete.

  • completion is the completion structure containing the result of the operation.

Valid Usage (Implicit)
Return Codes
Success
  • XR_SUCCESS

  • XR_SESSION_LOSS_PENDING

Failure
  • XR_ERROR_FUNCTION_UNSUPPORTED

  • XR_ERROR_VALIDATION_FAILURE

  • XR_ERROR_RUNTIME_FAILURE

  • XR_ERROR_HANDLE_INVALID

  • XR_ERROR_INSTANCE_LOST

  • XR_ERROR_SESSION_LOST

  • XR_ERROR_FUTURE_PENDING_EXT

  • XR_ERROR_FUTURE_INVALID_EXT

The XrPixelSensorStopCompletionML structure is defined as:

// Provided by XR_ML_pixel_sensor
typedef struct XrPixelSensorStopCompletionML {
    XrStructureType    type;
    void*              next;
    XrResult           futureResult;
} XrPixelSensorStopCompletionML;
Member Descriptions
  • type is the XrStructureType of this structure.

  • next is NULL or a pointer to the next structure in a structure chain.

  • futureResult is the XrResult of the asynchronous operation associated with the future passed to the completion function.

Valid Usage (Implicit)

12.108.12. Query sensor data

Once the sensor capabilities have been configured and the necessary streams for the sensors are started, xrGetPixelSensorDataML can be used to get sensor data.

The runtime must return XR_ERROR_CALL_ORDER_INVALID if xrStartPixelSensorCompleteML has not been called for this sensor handle.

The xrGetPixelSensorDataML function is defined as:

// Provided by XR_ML_pixel_sensor
XrResult xrGetPixelSensorDataML(
    XrPixelSensorML                             sensor,
    const XrPixelSensorDataGetInfoML*           info,
    XrPixelSensorBufferML*                      buffer,
    XrPixelSensorDataML*                        data);
Member Descriptions
  • sensor is a handle to the sensor to query.

  • info is a pointer to a structure containing the get operation details.

  • buffer is the buffer to use for storage of the data.

  • data is the frame data. Structures may be chained to it to obtain additional metadata.

Valid Usage (Implicit)
Return Codes
Success
  • XR_SUCCESS

  • XR_SESSION_LOSS_PENDING

  • XR_TIMEOUT_EXPIRED

Failure
  • XR_ERROR_FUNCTION_UNSUPPORTED

  • XR_ERROR_VALIDATION_FAILURE

  • XR_ERROR_RUNTIME_FAILURE

  • XR_ERROR_HANDLE_INVALID

  • XR_ERROR_INSTANCE_LOST

  • XR_ERROR_SESSION_LOST

  • XR_ERROR_TIME_INVALID

  • XR_ERROR_CALL_ORDER_INVALID

xrGetPixelSensorDataML must return the latest frame data available after XrPixelSensorDataGetInfoML::lastCaptureTime. If multiple frames arrived between the current time and XrPixelSensorDataGetInfoML::lastCaptureTime, only the newest frame is returned.

xrGetPixelSensorDataML must wait up to XrPixelSensorDataGetInfoML::timeout for image data to arrive. If no image data arrives before the XrPixelSensorDataGetInfoML::timeout expires, the function must return XR_TIMEOUT_EXPIRED.

The application must call xrGetPixelSensorBufferPropertiesML at least once per XrPixelSensorML handle. If it has not, the runtime must return XR_ERROR_CALL_ORDER_INVALID from xrGetPixelSensorDataML.

The application may reuse the same XrPixelSensorDataML for subsequent calls to xrGetPixelSensorDataML if it no longer needs the data from the previous call.

The XrPixelSensorDataGetInfoML structure is defined as:

// Provided by XR_ML_pixel_sensor
typedef struct XrPixelSensorDataGetInfoML {
    XrStructureType    type;
    const void*        next;
    uint32_t           stream;
    XrTime             lastCaptureTime;
    XrDuration         timeout;
} XrPixelSensorDataGetInfoML;
Member Descriptions
  • type is the XrStructureType of this structure.

  • next is NULL or a pointer to the next structure in a structure chain.

  • stream is the index of the stream to get the data from.

  • lastCaptureTime is the timestamp of the previous capture or 0.

  • timeout is the maximum duration that xrGetPixelSensorDataML must wait when no frame data newer than lastCaptureTime is available.

Valid Usage (Implicit)

The XrPixelSensorBufferML structure is defined as:

// Provided by XR_ML_pixel_sensor
typedef struct XrPixelSensorBufferML {
    XrStructureType    type;
    void*              next;
    uint32_t           bufferSize;
    void*              buffer;
} XrPixelSensorBufferML;
Member Descriptions
  • type is the XrStructureType of this structure.

  • next is NULL or a pointer to the next structure in a structure chain.

  • bufferSize is the size of the buffer in bytes.

  • buffer is the buffer that the runtime will fill with sensor data.

Valid Usage (Implicit)
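
A polling loop built on these semantics might look like the following sketch. The XR_TYPE_* tokens are assumptions based on the extension's naming convention, and storage/storageSize stand for the application-owned buffer sized earlier via xrGetPixelSensorBufferPropertiesML.

```c
/* Sketch: poll stream 0 for a new frame. */
XrPixelSensorDataGetInfoML getInfo = {
    .type = XR_TYPE_PIXEL_SENSOR_DATA_GET_INFO_ML, /* assumed token */
    .next = NULL,
    .stream = 0,
    .lastCaptureTime = 0,      /* 0 on the first call */
    .timeout = 100000000LL,    /* wait up to 100 ms (XrDuration is in ns) */
};
XrPixelSensorBufferML sensorBuffer = {
    .type = XR_TYPE_PIXEL_SENSOR_BUFFER_ML,        /* assumed token */
    .next = NULL,
    .bufferSize = storageSize,  /* from xrGetPixelSensorBufferPropertiesML */
    .buffer = storage,          /* application-allocated */
};
XrPixelSensorDataML data = {
    .type = XR_TYPE_PIXEL_SENSOR_DATA_ML,          /* assumed token */
};
XrResult result = xrGetPixelSensorDataML(sensor, &getInfo, &sensorBuffer, &data);
if (result == XR_SUCCESS) {
    /* Only frames newer than this timestamp are returned on the next call. */
    getInfo.lastCaptureTime = data.captureTime;
    /* data.frame->planes[] now reference pixel data in the buffer. */
} else if (result == XR_TIMEOUT_EXPIRED) {
    /* No frame newer than lastCaptureTime arrived within the timeout. */
}
```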

All pixel sensors must provide the XrPixelSensorDataML frame data via xrGetPixelSensorDataML. In addition, pixel sensors may provide additional metadata as described in later sections.

The XrPixelSensorDataML structure is defined as:

// Provided by XR_ML_pixel_sensor
typedef struct XrPixelSensorDataML {
    XrStructureType          type;
    void*                    next;
    XrTime                   captureTime;
    XrPixelSensorFrameML*    frame;
} XrPixelSensorDataML;
Member Descriptions
  • type is the XrStructureType of this structure.

  • next is NULL or a pointer to the next structure in a structure chain.

  • captureTime is the timestamp of when this data was captured.

  • frame is a pointer to the frame data.

The XrPixelSensorFrameML structure is used to store per-pixel data. The type of data stored for each pixel varies and depends on the XrPixelSensorFrameTypeML. The top-left corner of the frame is treated as the origin.

The XrPixelSensorFrameML structure is defined as:

// Provided by XR_ML_pixel_sensor
typedef struct XrPixelSensorFrameML {
    XrStructureType             type;
    void*                       next;
    XrPixelSensorFrameTypeML    frameType;
    uint32_t                    planeCount;
    XrPixelSensorPlaneML*       planes;
} XrPixelSensorFrameML;
Member Descriptions
  • type is the XrStructureType of this structure.

  • next is NULL or a pointer to the next structure in a structure chain.

  • frameType is the type of data stored in the frame.

  • planeCount is the number of planes.

  • planes is an array of planes containing the pixel data.

Valid Usage (Implicit)

// Provided by XR_ML_pixel_sensor
typedef enum XrPixelSensorFrameTypeML {
    XR_PIXEL_SENSOR_FRAME_TYPE_GRAYSCALE_U8_ML = 0,
    XR_PIXEL_SENSOR_FRAME_TYPE_RGBA_8888_ML = 1,
    XR_PIXEL_SENSOR_FRAME_TYPE_YUV_420_888_ML = 2,
    XR_PIXEL_SENSOR_FRAME_TYPE_JPEG_ML = 3,
    XR_PIXEL_SENSOR_FRAME_TYPE_DEPTH_32_ML = 4,
    XR_PIXEL_SENSOR_FRAME_TYPE_DEPTH_RAW_ML = 5,
    XR_PIXEL_SENSOR_FRAME_TYPE_DEPTH_CONFIDENCE_ML = 6,
    XR_PIXEL_SENSOR_FRAME_TYPE_DEPTH_FLAGS_ML = 7,
    XR_PIXEL_SENSOR_FRAME_TYPE_MAX_ENUM_ML = 0x7FFFFFFF
} XrPixelSensorFrameTypeML;
EnumDescription

XR_PIXEL_SENSOR_FRAME_TYPE_GRAYSCALE_U8_ML

Refer to XR_PIXEL_SENSOR_FRAME_FORMAT_GRAYSCALE_U8_ML for more details.

XR_PIXEL_SENSOR_FRAME_TYPE_RGBA_8888_ML

Refer to XR_PIXEL_SENSOR_FRAME_FORMAT_RGBA_8888_ML for more details.

XR_PIXEL_SENSOR_FRAME_TYPE_YUV_420_888_ML

Refer to XR_PIXEL_SENSOR_FRAME_FORMAT_YUV_420_888_ML for more details.

XR_PIXEL_SENSOR_FRAME_TYPE_JPEG_ML

Refer to XR_PIXEL_SENSOR_FRAME_FORMAT_JPEG_ML for more details.

XR_PIXEL_SENSOR_FRAME_TYPE_DEPTH_32_ML

Refer to XR_PIXEL_SENSOR_FRAME_FORMAT_DEPTH_32_ML for more details.

XR_PIXEL_SENSOR_FRAME_TYPE_DEPTH_RAW_ML

Refer to XR_PIXEL_SENSOR_FRAME_FORMAT_DEPTH_RAW_ML for more details.

XR_PIXEL_SENSOR_FRAME_TYPE_DEPTH_CONFIDENCE_ML

Refer to XR_PIXEL_SENSOR_CAPABILITY_TYPE_METADATA_DEPTH_CONFIDENCE_BUFFER_ML for more details.

XR_PIXEL_SENSOR_FRAME_TYPE_DEPTH_FLAGS_ML

Refer to XR_PIXEL_SENSOR_CAPABILITY_TYPE_METADATA_DEPTH_FLAG_BUFFER_ML.

The XrPixelSensorPlaneML structure is defined as:

// Provided by XR_ML_pixel_sensor
typedef struct XrPixelSensorPlaneML {
    XrStructureType    type;
    void*              next;
    uint32_t           width;
    uint32_t           height;
    uint32_t           stride;
    uint32_t           bytesPerPixel;
    uint32_t           pixelStride;
    uint32_t           bufferSize;
    void*              buffer;
} XrPixelSensorPlaneML;
Member Descriptions
  • type is the XrStructureType of this structure.

  • next is NULL or a pointer to the next structure in a structure chain.

  • width is the width in pixels.

  • height is the height in pixels.

  • stride is the line stride in bytes.

  • bytesPerPixel is the number of bytes per pixel.

  • pixelStride is the distance between two consecutive pixels in bytes.

  • bufferSize is the size of the whole image in bytes.

  • buffer is a pointer to the data buffer.

Valid Usage (Implicit)

12.108.13. Sensor Metadata

Pixel sensors may provide additional metadata for the captured frames. The application can obtain this metadata by chaining structures to the XrPixelSensorDataML::next pointer. XrPixelSensorMetadataML represents the list of metadata that a sensor may support.

The XrPixelSensorMetadataML is defined as:

// Provided by XR_ML_pixel_sensor
typedef enum XrPixelSensorMetadataML {
    XR_PIXEL_SENSOR_METADATA_EXPOSURE_TIME_ML = 0,
    XR_PIXEL_SENSOR_METADATA_ANALOG_GAIN_ML = 1,
    XR_PIXEL_SENSOR_METADATA_DIGITAL_GAIN_ML = 2,
    XR_PIXEL_SENSOR_METADATA_PINHOLE_CAMERA_MODEL_ML = 3,
    XR_PIXEL_SENSOR_METADATA_FISHEYE_CAMERA_MODEL_ML = 4,
    XR_PIXEL_SENSOR_METADATA_DEPTH_FRAME_ILLUMINATION_ML = 5,
    XR_PIXEL_SENSOR_METADATA_DEPTH_CONFIDENCE_BUFFER_ML = 6,
    XR_PIXEL_SENSOR_METADATA_DEPTH_FLAG_BUFFER_ML = 7,
    XR_PIXEL_SENSOR_METADATA_MAX_ENUM_ML = 0x7FFFFFFF
} XrPixelSensorMetadataML;
EnumDescription

XR_PIXEL_SENSOR_METADATA_EXPOSURE_TIME_ML

Exposure time in milliseconds used to capture the frame. Refer to XrPixelSensorExposureTimeML for more details.

XR_PIXEL_SENSOR_METADATA_ANALOG_GAIN_ML

Analog gain used to capture the frame. Refer to XrPixelSensorAnalogGainML for more details.

XR_PIXEL_SENSOR_METADATA_DIGITAL_GAIN_ML

Digital gain used to capture the frame. Refer to XrPixelSensorDigitalGainML for more details.

XR_PIXEL_SENSOR_METADATA_PINHOLE_CAMERA_MODEL_ML

Pinhole camera model. Refer to XrPixelSensorPinholeIntrinsicsML for more details.

XR_PIXEL_SENSOR_METADATA_FISHEYE_CAMERA_MODEL_ML

Fisheye camera model. Refer to XrPixelSensorFisheyeIntrinsicsML for more details.

XR_PIXEL_SENSOR_METADATA_DEPTH_FRAME_ILLUMINATION_ML

Illumination type used for the depth frame. Refer to XrPixelSensorDepthFrameIlluminationML for more details.

XR_PIXEL_SENSOR_METADATA_DEPTH_CONFIDENCE_BUFFER_ML

Confidence values for each pixel in the camera frame. The confidence score is derived from the sensor noise and it is not normalized. The higher the value the higher the confidence. Applications can determine what confidence threshold to use based on their use case. Data type is float.

XR_PIXEL_SENSOR_METADATA_DEPTH_FLAG_BUFFER_ML

Flag bits for each pixel in the depth camera frame. Refer to XrPixelSensorDepthFlagsML for more details. Data type is uint32_t.

The type of metadata supported depends both on the sensor and how it is configured. Use xrEnumeratePixelSensorMetadataML to get the list of metadata supported by the sensor streams.

xrEnumeratePixelSensorMetadataML must return XR_ERROR_CALL_ORDER_INVALID if xrConfigurePixelSensorCompleteML was not yet called for this XrPixelSensorML.

Some of the metadata (e.g. XR_PIXEL_SENSOR_METADATA_PINHOLE_CAMERA_MODEL_ML) might be the same for all the streams of a given sensor. In such cases the application may choose to configure only one of the streams to return the metadata.

The xrEnumeratePixelSensorMetadataML function is defined as:

// Provided by XR_ML_pixel_sensor
XrResult xrEnumeratePixelSensorMetadataML(
    XrPixelSensorML                             sensor,
    uint32_t                                    stream,
    uint32_t                                    metadataCapacityInput,
    uint32_t*                                   metadataCountOutput,
    XrPixelSensorMetadataML*                    metadatas);
Member Descriptions
  • sensor is a handle to the sensor to enumerate.

  • stream is the stream for which to query the metadata.

  • metadataCapacityInput is the capacity of the metadatas array, or 0 to indicate a request to retrieve the required capacity.

  • metadataCountOutput is filled in by the runtime with the count of sensor metadata written, or the required capacity in the case that metadataCapacityInput is insufficient.

  • metadatas is a pointer to an array of XrPixelSensorMetadataML.

  • See the Buffer Size Parameters chapter for a detailed description of retrieving the required metadatas size.

Valid Usage (Implicit)
Return Codes
Success
  • XR_SUCCESS

  • XR_SESSION_LOSS_PENDING

Failure
  • XR_ERROR_FUNCTION_UNSUPPORTED

  • XR_ERROR_VALIDATION_FAILURE

  • XR_ERROR_RUNTIME_FAILURE

  • XR_ERROR_HANDLE_INVALID

  • XR_ERROR_CALL_ORDER_INVALID

  • XR_ERROR_INSTANCE_LOST

  • XR_ERROR_SESSION_LOST
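
The function follows the standard OpenXR two-call idiom, which might be used as in this sketch (illustrative only; error handling is elided):

```c
/* Sketch: first call retrieves the required capacity, second call fills
 * the array, for stream 0 of a configured sensor. */
uint32_t count = 0;
XrResult result = xrEnumeratePixelSensorMetadataML(sensor, 0, 0, &count, NULL);
XrPixelSensorMetadataML* metadata = NULL;
if (result == XR_SUCCESS && count > 0) {
    metadata = malloc(count * sizeof(XrPixelSensorMetadataML));
    result = xrEnumeratePixelSensorMetadataML(sensor, 0, count, &count, metadata);
    /* metadata[0..count-1] now lists what this stream can provide. */
}
```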

The XrPixelSensorExposureTimeML structure is defined as:

// Provided by XR_ML_pixel_sensor
typedef struct XrPixelSensorExposureTimeML {
    XrStructureType    type;
    void*              next;
    float              exposureTime;
} XrPixelSensorExposureTimeML;
Member Descriptions
  • type is the XrStructureType of this structure.

  • next is NULL or a pointer to the next structure in a structure chain.

  • exposureTime is the exposure time in milliseconds used to capture the frame.

Valid Usage (Implicit)

The XrPixelSensorAnalogGainML structure is defined as:

// Provided by XR_ML_pixel_sensor
typedef struct XrPixelSensorAnalogGainML {
    XrStructureType    type;
    void*              next;
    uint32_t           analogGain;
} XrPixelSensorAnalogGainML;
Member Descriptions
  • type is the XrStructureType of this structure.

  • next is NULL or a pointer to the next structure in a structure chain.

  • analogGain is the analog gain used to capture the frame.

Valid Usage (Implicit)

The XrPixelSensorDigitalGainML structure is defined as:

// Provided by XR_ML_pixel_sensor
typedef struct XrPixelSensorDigitalGainML {
    XrStructureType    type;
    void*              next;
    uint32_t           digitalGain;
} XrPixelSensorDigitalGainML;
Member Descriptions
  • type is the XrStructureType of this structure.

  • next is NULL or a pointer to the next structure in a structure chain.

  • digitalGain is the digital gain used to capture the frame.

Valid Usage (Implicit)

The XrPixelSensorDepthFrameIlluminationML structure is defined as:

// Provided by XR_ML_pixel_sensor
typedef struct XrPixelSensorDepthFrameIlluminationML {
    XrStructureType                              type;
    void*                                        next;
    XrPixelSensorDepthFrameIlluminationTypeML    illuminationType;
} XrPixelSensorDepthFrameIlluminationML;
Member Descriptions
  • type is the XrStructureType of this structure.

  • next is NULL or a pointer to the next structure in a structure chain.

  • illuminationType is the illumination type used to capture the frame.

Valid Usage (Implicit)

// Provided by XR_ML_pixel_sensor
typedef enum XrPixelSensorDepthFrameIlluminationTypeML {
    XR_PIXEL_SENSOR_DEPTH_FRAME_ILLUMINATION_TYPE_ON_ML = 0,
    XR_PIXEL_SENSOR_DEPTH_FRAME_ILLUMINATION_TYPE_OFF_ML = 1,
    XR_PIXEL_SENSOR_DEPTH_FRAME_ILLUMINATION_TYPE_MAX_ENUM_ML = 0x7FFFFFFF
} XrPixelSensorDepthFrameIlluminationTypeML;
EnumDescription

XR_PIXEL_SENSOR_DEPTH_FRAME_ILLUMINATION_TYPE_ON_ML

Depth camera frame projector is on.

XR_PIXEL_SENSOR_DEPTH_FRAME_ILLUMINATION_TYPE_OFF_ML

Depth camera frame projector is off.

The XrPixelSensorDepthConfidenceBufferML structure is defined as:

// Provided by XR_ML_pixel_sensor
typedef struct XrPixelSensorDepthConfidenceBufferML {
    XrStructureType          type;
    void*                    next;
    XrPixelSensorFrameML*    frame;
} XrPixelSensorDepthConfidenceBufferML;
Member Descriptions
  • type is the XrStructureType of this structure.

  • next is NULL or a pointer to the next structure in a structure chain.

  • frame is a pointer to the frame data. See XR_PIXEL_SENSOR_METADATA_DEPTH_CONFIDENCE_BUFFER_ML for more details.

Valid Usage (Implicit)

The XrPixelSensorDepthFlagBufferML structure is defined as:

// Provided by XR_ML_pixel_sensor
typedef struct XrPixelSensorDepthFlagBufferML {
    XrStructureType          type;
    void*                    next;
    XrPixelSensorFrameML*    frame;
} XrPixelSensorDepthFlagBufferML;
Member Descriptions
  • type is the XrStructureType of this structure.

  • next is NULL or a pointer to the next structure in a structure chain.

  • frame is a pointer to the frame data. See XR_PIXEL_SENSOR_METADATA_DEPTH_FLAG_BUFFER_ML for more details.

Valid Usage (Implicit)

// Provided by XR_ML_pixel_sensor
typedef XrFlags64 XrPixelSensorDepthFlagsML;

// Provided by XR_ML_pixel_sensor
// Flag bits for XrPixelSensorDepthFlagsML
static const XrPixelSensorDepthFlagsML XR_PIXEL_SENSOR_DEPTH_INVALID_BIT_ML = 0x00000001;
static const XrPixelSensorDepthFlagsML XR_PIXEL_SENSOR_DEPTH_SATURATED_BIT_ML = 0x00000002;
static const XrPixelSensorDepthFlagsML XR_PIXEL_SENSOR_DEPTH_INCONSISTENT_BIT_ML = 0x00000004;
static const XrPixelSensorDepthFlagsML XR_PIXEL_SENSOR_DEPTH_LOW_SIGNAL_BIT_ML = 0x00000008;
static const XrPixelSensorDepthFlagsML XR_PIXEL_SENSOR_DEPTH_FLYING_PIXEL_BIT_ML = 0x00000010;
static const XrPixelSensorDepthFlagsML XR_PIXEL_SENSOR_DEPTH_MASKED_BIT_ML = 0x00000020;
static const XrPixelSensorDepthFlagsML XR_PIXEL_SENSOR_DEPTH_SBI_BIT_ML = 0x00000100;
static const XrPixelSensorDepthFlagsML XR_PIXEL_SENSOR_DEPTH_STRAY_LIGHT_BIT_ML = 0x00000200;
static const XrPixelSensorDepthFlagsML XR_PIXEL_SENSOR_DEPTH_CONNECTED_COMPONENTS_BIT_ML = 0x00000400;
Flag Descriptions
  • XR_PIXEL_SENSOR_DEPTH_INVALID_BIT_ML  — This bit is set to one to indicate that one or more flags from below have been set. Depending on the use case the application can correlate the flag data and corresponding pixel data to determine how to handle the pixel data.

  • XR_PIXEL_SENSOR_DEPTH_SATURATED_BIT_ML  — The pixel intensity is either below the min or the max threshold value.

  • XR_PIXEL_SENSOR_DEPTH_INCONSISTENT_BIT_ML  — Inconsistent data received when capturing frames. This can happen due to fast motion.

  • XR_PIXEL_SENSOR_DEPTH_LOW_SIGNAL_BIT_ML  — Pixel has very low signal to noise ratio. One example of when this can happen is for pixels in far end of the range.

  • XR_PIXEL_SENSOR_DEPTH_FLYING_PIXEL_BIT_ML  — This typically happens when there is a step jump in the distance of adjoining pixels in the scene. Example: when looking through an open door into a room, pixels along the door’s edges can be flying pixels.

  • XR_PIXEL_SENSOR_DEPTH_MASKED_BIT_ML  — If this bit is on it indicates that the corresponding pixel may not be within the illuminator’s illumination cone.

  • XR_PIXEL_SENSOR_DEPTH_SBI_BIT_ML  — This bit will be set when there is high noise.

  • XR_PIXEL_SENSOR_DEPTH_STRAY_LIGHT_BIT_ML  — This could happen when there is another light source apart from the depth camera illuminator.

  • XR_PIXEL_SENSOR_DEPTH_CONNECTED_COMPONENTS_BIT_ML  — If a small group of valid pixels (pixels without XR_PIXEL_SENSOR_DEPTH_INVALID_BIT_ML set) is surrounded by invalid pixels, then this bit will be set to 1.

12.108.14. Camera Models

Different cameras may support different camera models. Applications can use xrEnumeratePixelSensorMetadataML to determine what camera models are supported by each one of the sensors.

XrPixelSensorPinholeIntrinsicsML specifies the camera intrinsics and distortion coefficients for a pinhole camera model.

The XrPixelSensorPinholeIntrinsicsML structure is defined as:

// Provided by XR_ML_pixel_sensor
typedef struct XrPixelSensorPinholeIntrinsicsML {
    XrStructureType    type;
    void*              next;
    XrVector2f         focalLength;
    XrVector2f         principalPoint;
    XrVector2f         fov;
    double             distortion[5];
} XrPixelSensorPinholeIntrinsicsML;
Member Descriptions
  • type is the XrStructureType of this structure.

  • next is NULL or a pointer to the next structure in a structure chain.

  • focalLength is the focal length in pixels.

  • principalPoint is the principal point in pixels.

  • fov is the horizontal (x) and vertical (y) field of view in degrees.

  • distortion is distortion coefficients. These coefficients are in the following order: [k1, k2, p1, p2, k3].

Valid Usage (Implicit)
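
As a sketch of how these intrinsics might be applied, the stand-alone code below projects a camera-space point to pixel coordinates, assuming the conventional OpenCV-style (Brown-Conrady) model implied by the [k1, k2, p1, p2, k3] coefficient order. The struct is a local stand-in for XrPixelSensorPinholeIntrinsicsML, not the OpenXR type.

```cpp
#include <array>
#include <cmath>

// Local stand-in mirroring the fields of XrPixelSensorPinholeIntrinsicsML.
struct PinholeIntrinsics {
  double fx, fy;            // focalLength in pixels
  double cx, cy;            // principalPoint in pixels
  std::array<double, 5> d;  // distortion: [k1, k2, p1, p2, k3]
};

struct Pixel { double u, v; };

// Projects a point (x, y, z) in camera coordinates to pixel coordinates.
Pixel ProjectPinhole(const PinholeIntrinsics& in, double x, double y, double z) {
  // Normalized image coordinates.
  const double a = x / z, b = y / z;
  const double r2 = a * a + b * b;
  const double k1 = in.d[0], k2 = in.d[1], p1 = in.d[2], p2 = in.d[3], k3 = in.d[4];
  // Radial distortion term.
  const double radial = 1.0 + k1 * r2 + k2 * r2 * r2 + k3 * r2 * r2 * r2;
  // Tangential distortion term.
  const double xd = a * radial + 2.0 * p1 * a * b + p2 * (r2 + 2.0 * a * a);
  const double yd = b * radial + p1 * (r2 + 2.0 * b * b) + 2.0 * p2 * a * b;
  // Apply focal length and principal point.
  return {in.fx * xd + in.cx, in.fy * yd + in.cy};
}
```

With all coefficients zero this reduces to the ideal pinhole projection, so a point on the optical axis lands on the principal point.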

XrPixelSensorFisheyeIntrinsicsML specifies the camera matrix and distortion coefficients for Magic Leap’s fisheye camera model. This fisheye model differs from conventional fisheye models (see here) by adding an additional tangential term on top of the existing method. Applications can use the intrinsics with the conventional OpenCV fisheye calibration library (see here) by dropping the tangential terms (p1 and p2 in the equations below), but this may result in lower accuracy.

Radial distortion coefficients: k1, k2, k3, k4
Tangential distortion coefficients: p1, p2

If P = [x, y, z] is a point in camera coordinates and a = x/z, b = y/z are the corresponding point locations in normalized image coordinates, this model will project and distort said point in the following way:

Conventional fisheye model

r = sqrt(a^2 + b^2)
θ = atan( r )
θ_rad = θ * (1 + k1 * θ^2 + k2 * θ^4 + k3 * θ^6 + k4 * θ^8)
x_rad = a * ( θ_rad / r )
y_rad = b * ( θ_rad / r )

Tangential term (can be omitted if reduced accuracy is acceptable)

r_rad_sq = x_rad^2 + y_rad^2
x_rad_tan = x_rad + 2 * p1 * x_rad * y_rad + p2 * (r_rad_sq + 2 * x_rad^2)
y_rad_tan = y_rad + p1 * (r_rad_sq + 2 * y_rad^2) + 2 * p2 * x_rad * y_rad
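
The projection equations above can be transcribed directly into code. The following stand-alone sketch implements them; the function name and the guard for r = 0 (an on-axis point, where the θ/r factor is taken as its limit) are implementation details added here.

```cpp
#include <cmath>
#include <utility>

// Distorts a point in normalized image coordinates (a = x/z, b = y/z)
// using the fisheye equations above: radial coefficients k1..k4 plus the
// additional tangential coefficients p1, p2.
std::pair<double, double> DistortFisheye(double a, double b,
                                         double k1, double k2, double k3, double k4,
                                         double p1, double p2) {
  const double r = std::sqrt(a * a + b * b);
  if (r == 0.0) return {0.0, 0.0};  // On-axis point: projects to the center.

  // Conventional fisheye model.
  const double theta = std::atan(r);
  const double t2 = theta * theta;
  const double theta_rad =
      theta * (1.0 + k1 * t2 + k2 * t2 * t2 + k3 * t2 * t2 * t2 + k4 * t2 * t2 * t2 * t2);
  const double x_rad = a * (theta_rad / r);
  const double y_rad = b * (theta_rad / r);

  // Tangential term (can be omitted if reduced accuracy is acceptable).
  const double r_rad_sq = x_rad * x_rad + y_rad * y_rad;
  const double x_rad_tan = x_rad + 2.0 * p1 * x_rad * y_rad + p2 * (r_rad_sq + 2.0 * x_rad * x_rad);
  const double y_rad_tan = y_rad + p1 * (r_rad_sq + 2.0 * y_rad * y_rad) + 2.0 * p2 * x_rad * y_rad;
  return {x_rad_tan, y_rad_tan};
}
```

The resulting (x_rad_tan, y_rad_tan) pair would then be scaled by the focal length and offset by the principal point to obtain pixel coordinates.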

The XrPixelSensorFisheyeIntrinsicsML structure is defined as:

// Provided by XR_ML_pixel_sensor
typedef struct XrPixelSensorFisheyeIntrinsicsML {
    XrStructureType    type;
    void*              next;
    XrVector2f         focalLength;
    XrVector2f         principalPoint;
    XrVector2f         fov;
    double             radialDistortion[4];
    double             tangentialDistortion[2];
} XrPixelSensorFisheyeIntrinsicsML;
Member Descriptions
  • type is the XrStructureType of this structure.

  • next is NULL or a pointer to the next structure in a structure chain.

  • focalLength is the focal length in pixels.

  • principalPoint is the principal point in pixels.

  • fov is the horizontal and vertical field of view in degrees.

  • radialDistortion is the radial distortion coefficients. These coefficients are in the following order: [k1, k2, k3, k4].

  • tangentialDistortion is the tangential distortion coefficients. These coefficients are in the following order: [p1, p2].

Valid Usage (Implicit)

12.108.15. Create Sensor pose

For certain applications it is useful to have the exact pose of the sensor available. xrCreatePixelSensorSpaceML creates an XrSpace for the sensor. The runtime must return XR_ERROR_PIXEL_SENSOR_SPACE_NOT_SUPPORTED_ML if it cannot provide pose information for the requested sensor. Otherwise, the created space must be valid and must yield valid pose information when using xrLocateSpace.

The xrCreatePixelSensorSpaceML function is defined as:

// Provided by XR_ML_pixel_sensor
XrResult xrCreatePixelSensorSpaceML(
    XrSession                                   session,
    const XrPixelSensorCreateSpaceInfoML*       info,
    XrSpace*                                    space);
Parameter Descriptions
  • session is the session to use.

  • info is a pointer to the creation information.

  • space is a pointer to an XrSpace handle to be filled in by the runtime.

Valid Usage (Implicit)
Return Codes
Success
  • XR_SUCCESS

  • XR_SESSION_LOSS_PENDING

Failure
  • XR_ERROR_FUNCTION_UNSUPPORTED

  • XR_ERROR_VALIDATION_FAILURE

  • XR_ERROR_RUNTIME_FAILURE

  • XR_ERROR_HANDLE_INVALID

  • XR_ERROR_INSTANCE_LOST

  • XR_ERROR_SESSION_LOST

  • XR_ERROR_POSE_INVALID

  • XR_ERROR_LIMIT_REACHED

  • XR_ERROR_OUT_OF_MEMORY

  • XR_ERROR_PIXEL_SENSOR_SPACE_NOT_SUPPORTED_ML

The XrPixelSensorCreateSpaceInfoML structure is defined as:

// Provided by XR_ML_pixel_sensor
typedef struct XrPixelSensorCreateSpaceInfoML {
    XrStructureType    type;
    const void*        next;
    XrPixelSensorML    sensor;
    XrPosef            offset;
} XrPixelSensorCreateSpaceInfoML;
Member Descriptions
  • type is the XrStructureType of this structure.

  • next is NULL or a pointer to the next structure in a structure chain.

  • sensor is the sensor for which to create the space.

  • offset is the offset to apply to the created space relative to the sensor.

Valid Usage (Implicit)

12.108.16. Example Sensor pose

Some applications need to know exactly where a sensor is at a given time. xrCreatePixelSensorSpaceML allows the application to retrieve an XrSpace, which can then be used to get the location of the sensor.

Typically this would be combined with sensor data extraction.

class Sensor {
 public:

  bool Start() {
    XrPath sensorPath;
    if (XR_SUCCESS != xrStringToPath(m_Instance,"pixelsensor/world/center",&sensorPath)) {
      return false;
    }
    if (!CreateSensor(sensorPath)) { return false; }

    XrPixelSensorCreateSpaceInfoML info{XR_TYPE_PIXEL_SENSOR_CREATE_SPACE_INFO_ML};
    info.sensor = m_Sensor;
    info.offset.orientation.w = 1.0f;
    CHK_XR(xrCreatePixelSensorSpaceML(m_Session, &info, &m_SensorSpace));
    return true;
  }

  void OnGameTick(XrTime displayTime) {
    if (m_SensorSpace==XR_NULL_HANDLE) {
      return;
    }

    XrSpaceLocation location{XR_TYPE_SPACE_LOCATION};

    // Instead of displayTime, XrPixelSensorDataML::captureTime can be used
    // to get the pose of the camera at capture time.
    CHK_XR(xrLocateSpace(m_SensorSpace, m_LocalSpace, displayTime, &location));

    // do something with the pose of the sensor.
  }

  void Stop() {
    // Clean up the space.
    xrDestroySpace(m_SensorSpace);
    m_SensorSpace = XR_NULL_HANDLE;

    // Clean up the sensor.
    xrDestroyPixelSensorML(m_Sensor);
    m_Sensor = XR_NULL_HANDLE;
  }

 protected:
  // Previously initialized.
  XrInstance m_Instance;
  XrSession m_Session;
  XrSpace m_LocalSpace;

  XrPixelSensorML m_Sensor{XR_NULL_HANDLE};
  XrSpace m_SensorSpace{XR_NULL_HANDLE};

  bool CreateSensor(XrPath sensor) {

    // Enumerate all the available sensors.
    uint32_t count;
    CHK_XR(xrEnumeratePixelSensorsML(m_Session, 0, &count, nullptr ));
    std::vector<XrPath> sensors(count);
    CHK_XR(xrEnumeratePixelSensorsML(m_Session, count, &count, sensors.data() ));

    // Check if the requested sensor is currently available.
    if (std::find(sensors.begin(), sensors.end(), sensor) == sensors.end()){
      return false;
    }

    // Create the handle.
    XrPixelSensorCreateInfoML createInfo{XR_TYPE_PIXEL_SENSOR_CREATE_INFO_ML, nullptr, sensor};
    CHK_XR(xrCreatePixelSensorML(m_Session, &createInfo, &m_Sensor));
    return true;
  }
};

12.108.17. Example Sensor Configuration

This example builds on the previous example and adds sensor stream capability negotiation.

class SensorNegotiate : public Sensor {
 public:

  bool Start() {
    XrPath sensorPath;
    if (XR_SUCCESS != xrStringToPath(m_Instance,"pixelsensor/world/center",&sensorPath)) {
      return false;
    }
    if (!CreateSensor(sensorPath)) { return false; }
    if (!NegotiateSensorStreamCapabilities()) { return false; }
    return true;
  }

  void Stop() {
    // cleanup the configuration.
    DeleteConfigs();

    // clean up the sensor.
    xrDestroyPixelSensorML(m_Sensor);
    m_Sensor = XR_NULL_HANDLE;
  }

 protected:

  // Number of sensor streams configured in this example.
  static constexpr uint32_t m_StreamCount = 2;

  std::array<int, m_StreamCount>  m_FrameRate {30, 30};
  std::array<XrExtent2Di, m_StreamCount> m_FrameResolution { {{640,480}, {640,480}} };
  std::array<XrPixelSensorFrameFormatML, m_StreamCount>  m_FrameFormat {XR_PIXEL_SENSOR_FRAME_FORMAT_GRAYSCALE_U8_ML, XR_PIXEL_SENSOR_FRAME_FORMAT_GRAYSCALE_U8_ML};
  std::array<float, m_StreamCount>  m_ExposureUs {8.5f, 25.5f};
  std::vector<XrPixelSensorCapabilityTypeML> requiredCapabilities {XR_PIXEL_SENSOR_CAPABILITY_TYPE_UPDATE_RATE_ML, XR_PIXEL_SENSOR_CAPABILITY_TYPE_RESOLUTION_ML,
                                                               XR_PIXEL_SENSOR_CAPABILITY_TYPE_FORMAT_ML, XR_PIXEL_SENSOR_CAPABILITY_TYPE_MANUAL_EXPOSURE_TIME_ML};

  std::vector<XrPixelSensorCapabilityConfigBaseHeaderML*> m_ConfigChain;

  void AppendConfig(XrPixelSensorCapabilityConfigBaseHeaderML* config) {
    m_ConfigChain.push_back(config);
  }

  void DeleteConfigs() {
    for (auto & config: m_ConfigChain) {
      switch ( config->type) {
        case XR_TYPE_PIXEL_SENSOR_CAPABILITY_CONFIG_UINT32_ML:
          delete reinterpret_cast<XrPixelSensorCapabilityConfigUint32ML*>(config);
          break;
        case XR_TYPE_PIXEL_SENSOR_CAPABILITY_CONFIG_FLOAT_ML:
          delete reinterpret_cast<XrPixelSensorCapabilityConfigFloatML*>(config);
          break;
        case XR_TYPE_PIXEL_SENSOR_CAPABILITY_CONFIG_XR_EXTENT_2DI_ML:
          delete reinterpret_cast<XrPixelSensorCapabilityConfigXrExtent2DiML*>(config);
          break;
        default:
          // If we reach this we missed one of the types.
          break;
      };
    }
    m_ConfigChain.clear();
  }

  XrPixelSensorCapabilityConfigBaseHeaderML *const * Config() const {
    return m_ConfigChain.size() > 0 ? &m_ConfigChain[0] : nullptr;
  }


  bool PickFormat(uint32_t stream, XrPixelSensorFrameFormatML desiredFormat) {
    XrPixelSensorCapabilityQueryInfoML queryInfo{XR_TYPE_PIXEL_SENSOR_CAPABILITY_QUERY_INFO_ML};
    queryInfo.stream = stream;
    queryInfo.capabilityType = XR_PIXEL_SENSOR_CAPABILITY_TYPE_FORMAT_ML;

    // Use XrPixelSensorCapabilityML.capabilityRangeType to determine the range type to use here.
    XrPixelSensorCapabilityRangeDiscreteUint32ML capValues{XR_TYPE_PIXEL_SENSOR_CAPABILITY_RANGE_DISCRETE_UINT32_ML};
    capValues.valueCapacityInput = 0;
    capValues.values = nullptr;

    CHK_XR(xrQueryPixelSensorCapabilityRangeML(m_Sensor, &queryInfo, m_ConfigChain.size(), Config(), reinterpret_cast<XrPixelSensorCapabilityRangeBaseHeaderML*>(&capValues)));
    if (capValues.valueCountOutput==0) {
      return false;   // No formats available.
    }

    std::vector<XrPixelSensorFrameFormatML> values(capValues.valueCountOutput);
    capValues.values = reinterpret_cast<uint32_t*>(values.data());
    CHK_XR(xrQueryPixelSensorCapabilityRangeML(m_Sensor, &queryInfo, m_ConfigChain.size(), Config(), reinterpret_cast<XrPixelSensorCapabilityRangeBaseHeaderML*>(&capValues)));

    // Check if the required format is supported.
    auto it = std::find(values.begin(), values.end(), desiredFormat);
    if (it==values.end()) {
      return false;
    }

    // Configure the sensor with the requested format.
    XrPixelSensorCapabilityConfigUint32ML * formatConfig = new XrPixelSensorCapabilityConfigUint32ML();
    formatConfig->type = XR_TYPE_PIXEL_SENSOR_CAPABILITY_CONFIG_UINT32_ML;
    formatConfig->next = nullptr;
    formatConfig->stream = queryInfo.stream;
    formatConfig->capabilityType = queryInfo.capabilityType;
    formatConfig->value = desiredFormat;
    AppendConfig(reinterpret_cast<XrPixelSensorCapabilityConfigBaseHeaderML*>(formatConfig));
    return true;
  }


  bool PickResolution(uint32_t stream, XrExtent2Di desiredResolution) {
    XrPixelSensorCapabilityQueryInfoML queryInfo{XR_TYPE_PIXEL_SENSOR_CAPABILITY_QUERY_INFO_ML};
    queryInfo.stream = stream;
    queryInfo.capabilityType = XR_PIXEL_SENSOR_CAPABILITY_TYPE_RESOLUTION_ML;

    // Use XrPixelSensorCapabilityML.capabilityRangeType to determine the range type to use here.
    XrPixelSensorCapabilityRangeDiscreteXrExtent2DiML capResolution{XR_TYPE_PIXEL_SENSOR_CAPABILITY_RANGE_DISCRETE_XR_EXTENT_2DI_ML};
    capResolution.valueCapacityInput = 0;
    capResolution.values = nullptr;
    CHK_XR(xrQueryPixelSensorCapabilityRangeML(m_Sensor, &queryInfo, m_ConfigChain.size(), Config(), reinterpret_cast<XrPixelSensorCapabilityRangeBaseHeaderML*>(&capResolution)));
    if (capResolution.valueCountOutput==0) {
      return false;      // No resolutions available.
    }
    std::vector<XrExtent2Di> resolutions(capResolution.valueCountOutput);
    capResolution.values = resolutions.data();
    CHK_XR(xrQueryPixelSensorCapabilityRangeML(m_Sensor, &queryInfo, m_ConfigChain.size(), Config(), reinterpret_cast<XrPixelSensorCapabilityRangeBaseHeaderML*>(&capResolution)));

    // Check if the required resolution is supported.
    bool found = false;
    for(const auto& resolution : resolutions) {
      if(resolution.width == desiredResolution.width && resolution.height == desiredResolution.height) {
        found = true;
        break;
      }
    }
    if (found == false) {
      // As a fallback if the desired resolution is not available then pick
      // the first available resolution. Other heuristics (i.e. closest) can
      // be used to pick resolution as well.
      desiredResolution = resolutions[0];
    }

    // Configure the sensor with the resolution.
    XrPixelSensorCapabilityConfigXrExtent2DiML * resolutionConfig = new XrPixelSensorCapabilityConfigXrExtent2DiML();
    resolutionConfig->type = XR_TYPE_PIXEL_SENSOR_CAPABILITY_CONFIG_XR_EXTENT_2DI_ML;
    resolutionConfig->next = nullptr;
    resolutionConfig->stream = queryInfo.stream;
    resolutionConfig->capabilityType = queryInfo.capabilityType;
    resolutionConfig->value = desiredResolution;
    AppendConfig(reinterpret_cast<XrPixelSensorCapabilityConfigBaseHeaderML*>(resolutionConfig));
    return true;
  }

  bool PickFrameRate(uint32_t stream, int desiredFrameRate) {
    XrPixelSensorCapabilityQueryInfoML queryInfo{XR_TYPE_PIXEL_SENSOR_CAPABILITY_QUERY_INFO_ML};
    queryInfo.stream = stream;
    queryInfo.capabilityType = XR_PIXEL_SENSOR_CAPABILITY_TYPE_UPDATE_RATE_ML;

    // Use XrPixelSensorCapabilityML.capabilityRangeType to determine the range type to use here.
    XrPixelSensorCapabilityRangeDiscreteUint32ML capValues{XR_TYPE_PIXEL_SENSOR_CAPABILITY_RANGE_DISCRETE_UINT32_ML};
    capValues.valueCapacityInput = 0;
    capValues.values = nullptr;

    CHK_XR(xrQueryPixelSensorCapabilityRangeML(m_Sensor, &queryInfo, m_ConfigChain.size(), Config(), reinterpret_cast<XrPixelSensorCapabilityRangeBaseHeaderML*>(&capValues)));
    if (capValues.valueCountOutput==0) {
      return false;   // No frame rates available.
    }

    std::vector<uint32_t> values(capValues.valueCountOutput);
    capValues.values = values.data();
    CHK_XR(xrQueryPixelSensorCapabilityRangeML(m_Sensor, &queryInfo, m_ConfigChain.size(), Config(), reinterpret_cast<XrPixelSensorCapabilityRangeBaseHeaderML*>(&capValues)));

    // Check if the required frame rate is supported.
    // If not pick the closest available frame rate.
    auto it = std::min_element(values.begin(), values.end(),
                              [desiredFrameRate](int a, int b) {
                                  return std::abs(a - desiredFrameRate) < std::abs(b - desiredFrameRate);
                              });
    if (it==values.end()) {
      return false;
    }

    // Configure the sensor with the requested frame rate.
    XrPixelSensorCapabilityConfigUint32ML * frameRateConfig = new XrPixelSensorCapabilityConfigUint32ML();
    frameRateConfig->type = XR_TYPE_PIXEL_SENSOR_CAPABILITY_CONFIG_UINT32_ML;
    frameRateConfig->next = nullptr;
    frameRateConfig->stream = queryInfo.stream;
    frameRateConfig->capabilityType = queryInfo.capabilityType;
    frameRateConfig->value = *it;
    AppendConfig(reinterpret_cast<XrPixelSensorCapabilityConfigBaseHeaderML*>(frameRateConfig));
    return true;
  }

  bool PickExposure(uint32_t stream,  float desiredExposureUs) {
    XrPixelSensorCapabilityQueryInfoML queryInfo{XR_TYPE_PIXEL_SENSOR_CAPABILITY_QUERY_INFO_ML};
    queryInfo.stream = stream;
    queryInfo.capabilityType = XR_PIXEL_SENSOR_CAPABILITY_TYPE_MANUAL_EXPOSURE_TIME_ML;

    // Use XrPixelSensorCapabilityML.capabilityRangeType to determine the range type to use here.
    XrPixelSensorCapabilityRangeContinuousFloatML range{XR_TYPE_PIXEL_SENSOR_CAPABILITY_RANGE_CONTINUOUS_FLOAT_ML};
    CHK_XR(xrQueryPixelSensorCapabilityRangeML(m_Sensor, &queryInfo, m_ConfigChain.size(), Config(), reinterpret_cast<XrPixelSensorCapabilityRangeBaseHeaderML*>(&range)));

    float selectedExposureUs = desiredExposureUs;
    if ( selectedExposureUs < range.minValue ) {
      selectedExposureUs = range.minValue;
    } else if ( selectedExposureUs > range.maxValue ) {
      selectedExposureUs = range.maxValue;
    }

    // Configure the sensor with the exposure time.
    XrPixelSensorCapabilityConfigFloatML * floatConfig = new XrPixelSensorCapabilityConfigFloatML();
    floatConfig->type = XR_TYPE_PIXEL_SENSOR_CAPABILITY_CONFIG_FLOAT_ML;
    floatConfig->next = nullptr;
    floatConfig->stream = queryInfo.stream;
    floatConfig->capabilityType = queryInfo.capabilityType;
    floatConfig->value = selectedExposureUs;
    AppendConfig(reinterpret_cast<XrPixelSensorCapabilityConfigBaseHeaderML*>(floatConfig));
    return true;
  }

  // Pick closest fps, resolution, and exposure times.
  bool NegotiateSensorStreamCapabilities() {
    if (m_Sensor==XR_NULL_HANDLE){
      return false;
    }

    // Order these capability discovery functions in the way that prioritizes
    // capabilities that are important to the use case of the application.
    for (uint32_t i=0;i<m_StreamCount;i++) {

      uint32_t count;
      CHK_XR(xrEnumeratePixelSensorCapabilitiesML(m_Sensor, i, 0, &count, nullptr));
      std::vector<XrPixelSensorCapabilityML> availableCapabilities(count);
      CHK_XR(xrEnumeratePixelSensorCapabilitiesML(m_Sensor, i, count, &count, availableCapabilities.data()));

      // Check if all the required capabilities are present.
      for(auto& capabilityType : requiredCapabilities) {
        bool found = false;
        for(auto& availableCapability : availableCapabilities) {
          if(availableCapability.capabilityType == capabilityType) {
            found = true;
            break;
          }
        }
        if(found == false) {
          return false;
        }
      }

      if (!PickFormat(i, m_FrameFormat[i])) {
        return false;
      }

      if (!PickResolution(i, m_FrameResolution[i])) {
        return false;
      }

      if (!PickFrameRate(i, m_FrameRate[i])) {
        return false;
      }

      if (!PickExposure(i, m_ExposureUs[i])) {
        return false;
      }
    }
    return true;
  }
};

12.108.18. Example Frame Data

This last example shows how image data can be captured after the sensor has been set up as in the previous examples.

class SensorData : public SensorNegotiate {
  XrFutureEXT m_ConfigFuture{XR_NULL_FUTURE_EXT};
  XrFutureEXT m_StartFuture{XR_NULL_FUTURE_EXT};
  XrFutureEXT m_StopFuture{XR_NULL_FUTURE_EXT};
  enum State { IDLE, CONFIGURING, STARTING, STREAMING, STOPPING };
  State m_State{IDLE};

  struct StreamInfo {
    XrPixelSensorBufferML buffer{XR_TYPE_PIXEL_SENSOR_BUFFER_ML};
    XrPixelSensorFrameML* frame;
    float exposureMs{0};
    XrPixelSensorFisheyeIntrinsicsML intrinsics;
    XrTime lastCaptureTime{0};
  };

  std::vector<StreamInfo> m_StreamInfo;

 public:

  bool Start() {

    XrPath sensorPath;
    if (XR_SUCCESS != xrStringToPath(m_Instance,"pixelsensor/world/center",&sensorPath)) {
      return false;
    }
    if (!CreateSensor(sensorPath)) { return false; }

    if (!NegotiateSensorStreamCapabilities()) {
      return false;
    }

    if (!ConfigureStreams()) {
      return false;
    }

    return true;
  }

  void OnGameTick() {
    // Check for data.
    switch (m_State) {
      case IDLE:
        break;
      case CONFIGURING:
        CheckStreamConfig();
        break;
      case STARTING:
        CheckStartSensor();
        break;
      case STREAMING:
        HandleData();
        break;
      case STOPPING:
        CheckStopSensor();
        break;
    }
  }

  bool Stop() {
    m_State = IDLE;
    std::vector<uint32_t> streams(m_StreamCount);
    std::iota(streams.begin(), streams.end(), 0);
    XrPixelSensorStopInfoML stopInfo {XR_TYPE_PIXEL_SENSOR_STOP_INFO_ML, nullptr, m_StreamCount, streams.data()};
    bool ok = xrStopPixelSensorAsyncML(m_Sensor, &stopInfo, &m_StopFuture) == XR_SUCCESS;
    if ( ok ) {
      m_State = STOPPING;
      return true;
    }
    // Log error
    return false;
  }

 private:

  bool ConfigureStreams() {
    if ( m_State != IDLE ) {
      return false;
    }

    // Notice that we pass the configuration chain that was built in the
    // 2nd example above to the configure function.

    std::vector<uint32_t> streams(m_StreamCount);
    std::iota(streams.begin(), streams.end(), 0);
    XrPixelSensorConfigInfoML configInfo {XR_TYPE_PIXEL_SENSOR_CONFIG_INFO_ML};
    configInfo.streamCount = streams.size();
    configInfo.streams = streams.data();
    configInfo.configurationCount = m_ConfigChain.size();
    configInfo.configurations = Config();

    bool ok = xrConfigurePixelSensorAsyncML(m_Sensor, &configInfo, &m_ConfigFuture) == XR_SUCCESS;
    if ( ok ) {
      m_State = CONFIGURING;
      return true;
    }
    // Log an error
    return false;
  }

  // Since configuring camera sensors can take significant time
  // the XR_EXT_future pattern is used here.
  void CheckStreamConfig() {
    XrFuturePollResultEXT futureResult{XR_TYPE_FUTURE_POLL_RESULT_EXT};
    XrFuturePollInfoEXT pollInfo{XR_TYPE_FUTURE_POLL_INFO_EXT};
    pollInfo.future = m_ConfigFuture;

    CHK_XR(xrPollFutureEXT(m_Instance, &pollInfo, &futureResult));

    switch (futureResult.state) {
      case XR_FUTURE_STATE_PENDING_EXT:
        break;
      case XR_FUTURE_STATE_READY_EXT: {
        XrPixelSensorConfigureCompletionML completion{XR_TYPE_PIXEL_SENSOR_CONFIGURE_COMPLETION_ML};
        CHK_XR(xrConfigurePixelSensorCompleteML(m_Sensor, m_ConfigFuture, &completion));
        if (completion.futureResult==XR_SUCCESS) {
          if(SetupStreamInfo() == false) {
            // Log an error.
            m_State = IDLE;
            return;
          }
          // Configuration successful, start the sensor streams.
          std::vector<uint32_t> streams(m_StreamCount);
          std::iota(streams.begin(), streams.end(), 0);
          XrPixelSensorStartInfoML startInfo {XR_TYPE_PIXEL_SENSOR_START_INFO_ML, nullptr, m_StreamCount, streams.data()};
          bool ok = xrStartPixelSensorAsyncML(m_Sensor, &startInfo, &m_StartFuture) == XR_SUCCESS;
          if ( ok ) {
            m_State = STARTING;
            return;
          } else {
            // Log an error.
            m_State = IDLE;
          }
        } else {
          // Log an error.
          m_State = IDLE;
        }
      }
    }
  }

  void CheckStartSensor() {
    XrFuturePollResultEXT futureResult{XR_TYPE_FUTURE_POLL_RESULT_EXT};
    XrFuturePollInfoEXT pollInfo{XR_TYPE_FUTURE_POLL_INFO_EXT};
    pollInfo.future = m_StartFuture;

    CHK_XR(xrPollFutureEXT(m_Instance, &pollInfo, &futureResult));

    switch (futureResult.state) {
      case XR_FUTURE_STATE_PENDING_EXT:
        break;
      case XR_FUTURE_STATE_READY_EXT: {
        XrPixelSensorStartCompletionML completion{XR_TYPE_PIXEL_SENSOR_START_COMPLETION_ML};
        CHK_XR(xrStartPixelSensorCompleteML(m_Sensor, m_StartFuture, &completion));
        if (completion.futureResult==XR_SUCCESS) {
          m_State = STREAMING;
        } else {
          // Log an error.
          m_State = IDLE;
        }
      }
    }
  }

  void CheckStopSensor() {
    XrFuturePollResultEXT futureResult{XR_TYPE_FUTURE_POLL_RESULT_EXT};
    XrFuturePollInfoEXT pollInfo{XR_TYPE_FUTURE_POLL_INFO_EXT};
    pollInfo.future = m_StopFuture;

    CHK_XR(xrPollFutureEXT(m_Instance, &pollInfo, &futureResult));

    switch (futureResult.state) {
      case XR_FUTURE_STATE_PENDING_EXT:
        break;
      case XR_FUTURE_STATE_READY_EXT: {
        XrPixelSensorStopCompletionML completion{XR_TYPE_PIXEL_SENSOR_STOP_COMPLETION_ML};
        CHK_XR(xrStopPixelSensorCompleteML(m_Sensor, m_StopFuture, &completion));
        if (completion.futureResult==XR_SUCCESS) {
          ClearStreamInfo();
          if (m_Sensor!=XR_NULL_HANDLE) {
            xrDestroyPixelSensorML(m_Sensor);
            m_Sensor = XR_NULL_HANDLE;
          }
          DeleteConfigs();
          m_State = IDLE;
        } else {
          // Log an error.
          m_State = IDLE;
        }
      }
    }
  }

  void ClearStreamInfo() {
    for (auto & streamInfo: m_StreamInfo) {
      delete[] (char*)(streamInfo.buffer.buffer);
      delete streamInfo.frame;
    }
    m_StreamInfo.clear();
  }

  bool SetupStreamInfo() {

    ClearStreamInfo();
    m_StreamInfo.resize(m_StreamCount);

    std::vector<XrPixelSensorMetadataML> requiredMetadatas {XR_PIXEL_SENSOR_METADATA_EXPOSURE_TIME_ML, XR_PIXEL_SENSOR_METADATA_FISHEYE_CAMERA_MODEL_ML};

    for (uint32_t i=0;i<m_StreamCount;i++) {

      uint32_t count;
      CHK_XR(xrEnumeratePixelSensorMetadataML(m_Sensor, i, 0, &count, nullptr));
      std::vector<XrPixelSensorMetadataML> availableMetadatas(count);
      CHK_XR(xrEnumeratePixelSensorMetadataML(m_Sensor, i, count, &count, availableMetadatas.data()));

      // Check if all the required metadata is present.
      bool allMetatadataPresent = std::all_of(requiredMetadatas.begin(), requiredMetadatas.end(),
                                             [&availableMetadatas](XrPixelSensorMetadataML metadata) {
                                               return std::find(availableMetadatas.begin(), availableMetadatas.end(), metadata) != availableMetadatas.end();
                                             });
      if(!allMetatadataPresent) {
        return false;
      }

      XrPixelSensorBufferPropertiesInfoML propertiesInfo{XR_TYPE_PIXEL_SENSOR_BUFFER_PROPERTIES_INFO_ML, nullptr, i,
                                                         static_cast<uint32_t>(requiredMetadatas.size()), requiredMetadatas.data()};
      XrPixelSensorBufferPropertiesML properties{XR_TYPE_PIXEL_SENSOR_BUFFER_PROPERTIES_ML};
      CHK_XR(xrGetPixelSensorBufferPropertiesML(m_Sensor, &propertiesInfo, &properties));

      m_StreamInfo[i].lastCaptureTime = 0;
      m_StreamInfo[i].buffer.buffer = new char[properties.bufferSize];
      m_StreamInfo[i].buffer.bufferSize = properties.bufferSize;
    }
    return true;
  }

  void HandleData() {

    XrPixelSensorDataGetInfoML info{XR_TYPE_PIXEL_SENSOR_DATA_GET_INFO_ML};
    XrPixelSensorDataML sensorData{XR_TYPE_PIXEL_SENSOR_DATA_ML};
    XrPixelSensorExposureTimeML exposureMetadata{XR_TYPE_PIXEL_SENSOR_EXPOSURE_TIME_ML};
    XrPixelSensorFisheyeIntrinsicsML fisheyeMetadata{XR_TYPE_PIXEL_SENSOR_FISHEYE_INTRINSICS_ML};
    sensorData.next = &exposureMetadata;
    exposureMetadata.next = &fisheyeMetadata;

    for (uint32_t i=0;i<m_StreamCount;i++) {
      info.stream = i;
      info.lastCaptureTime = m_StreamInfo[i].lastCaptureTime;
      info.timeout = 0;

      switch(xrGetPixelSensorDataML(m_Sensor, &info, &m_StreamInfo[i].buffer, &sensorData)) {
        case XR_SUCCESS:

          // ***************
          // DATA AVAILABLE!
          // ***************
          m_StreamInfo[i].lastCaptureTime = sensorData.captureTime;
          m_StreamInfo[i].frame = sensorData.frame;
          m_StreamInfo[i].exposureMs = exposureMetadata.exposureTime;
          m_StreamInfo[i].intrinsics = fisheyeMetadata;
          break;
        case XR_TIMEOUT_EXPIRED:
          // Nothing yet.
          break;
        default:
          // Log an error.
          break;
      }
    }
  }
};

12.108.19. New Handles

  • XrPixelSensorML

12.108.20. New Enums

XrResult enumeration is extended with:

  • XR_ERROR_PIXEL_SENSOR_PERMISSION_DENIED_ML

  • XR_ERROR_PIXEL_SENSOR_NOT_SUPPORTED_ML

  • XR_ERROR_PIXEL_SENSOR_CAPABILITY_NOT_SUPPORTED_ML

  • XR_ERROR_PIXEL_SENSOR_SPACE_NOT_SUPPORTED_ML

XrStructureType enumeration is extended with:

  • XR_TYPE_PIXEL_SENSOR_CREATE_INFO_ML

  • XR_TYPE_EVENT_DATA_PIXEL_SENSOR_AVAILABILITY_CHANGED_ML

  • XR_TYPE_PIXEL_SENSOR_CAPABILITY_ML

  • XR_TYPE_PIXEL_SENSOR_CAPABILITY_QUERY_INFO_ML

  • XR_TYPE_PIXEL_SENSOR_CAPABILITY_CONFIG_UINT32_ML

  • XR_TYPE_PIXEL_SENSOR_CAPABILITY_CONFIG_FLOAT_ML

  • XR_TYPE_PIXEL_SENSOR_CAPABILITY_CONFIG_XR_EXTENT_2DI_ML

  • XR_TYPE_PIXEL_SENSOR_CAPABILITY_RANGE_CONTINUOUS_FLOAT_ML

  • XR_TYPE_PIXEL_SENSOR_CAPABILITY_RANGE_CONTINUOUS_UINT32_ML

  • XR_TYPE_PIXEL_SENSOR_CAPABILITY_RANGE_DISCRETE_XR_BOOL32_ML

  • XR_TYPE_PIXEL_SENSOR_CAPABILITY_RANGE_DISCRETE_XR_EXTENT_2DI_ML

  • XR_TYPE_PIXEL_SENSOR_CAPABILITY_RANGE_DISCRETE_UINT32_ML

  • XR_TYPE_PIXEL_SENSOR_EXPOSURE_TIME_ML

  • XR_TYPE_PIXEL_SENSOR_PINHOLE_INTRINSICS_ML

  • XR_TYPE_PIXEL_SENSOR_FISHEYE_INTRINSICS_ML

  • XR_TYPE_PIXEL_SENSOR_DEPTH_FRAME_ILLUMINATION_ML

  • XR_TYPE_PIXEL_SENSOR_DEPTH_CONFIDENCE_BUFFER_ML

  • XR_TYPE_PIXEL_SENSOR_DEPTH_FLAG_BUFFER_ML