This API is still an experimental extension not included in the official OpenXR registry and is subject to change.
12.108. XR_ML_pixel_sensor
- Name String
XR_ML_pixel_sensor
- Extension Type
Instance extension
- Registered Extension Number
476
- Revision
1
- Extension and Version Dependencies
XR_EXT_future
- Last Modified Date
2024-03-13
- Contributors
Karthik Kadappan, Magic Leap
Ron Bessems, Magic Leap
Rafael Wiltz, Magic Leap
Philip Unger, Magic Leap
12.108.1. Overview
This extension enables applications to work with pixel sensors.
This section will briefly cover some of the key concepts needed to understand how this extension works. Refer to the later sections for more details.
Pixel sensor: A sensor that captures data as a grid of pixels. A runtime may support multiple pixel sensors. For example, it may support one depth sensing camera, two eye sensing cameras, and two world sensing cameras.
Sensor permissions: A sensor may require permissions before the application can access the data from the sensor. Refer to the section on Sensor Permissions for more details.
Sensor stream: A sensor may support multiple streams of frame data. For example, a depth pixel sensor can support two streams: one for short range sensing and one for long range sensing. Refer to the section on Sensor Streams for more details.
Sensor capability: A sensor may have capabilities that can be configured by the application. Some examples of sensor capabilities are frame rate, frame resolution, and sensor exposure time. Refer to the section on Sensor Capabilities for more details.
Sensor metadata: A sensor may provide metadata in addition to the frame data. Some examples of metadata are the exposure time used to capture the frame and the camera model. Refer to the section on Sensor Metadata for more details.
Camera models: Each of the pixel sensors may have a camera model that provides a mathematical representation of how it captures and transforms the 3D world into a 2D image. Refer to the section on Camera Models for more details.
The extension provides APIs to do the following:
12.108.2. Enumerate the sensors
The xrEnumeratePixelSensorsML function is used to enumerate all the available pixel sensors.
// Provided by XR_ML_pixel_sensor
XrResult xrEnumeratePixelSensorsML(
XrSession session,
uint32_t sensorCapacityInput,
uint32_t* sensorCountOutput,
XrPath* sensors);
This extension supports the following cameras:
/pixelsensor/picture/center
/pixelsensor/world/left
/pixelsensor/world/center
/pixelsensor/world/right
/pixelsensor/depth/center
/pixelsensor/eye/temple/left
/pixelsensor/eye/nasal/left
/pixelsensor/eye/nasal/right
/pixelsensor/eye/temple/right
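For illustration, the following informative sketch uses the usual OpenXR two-call idiom to check whether the world center camera is currently exposed; the instance and session are assumed to have been created already, and error handling is elided:
// Informative sketch: enumerate the available pixel sensors and check
// for the world center camera.
uint32_t count = 0;
xrEnumeratePixelSensorsML(session, 0, &count, nullptr);
std::vector<XrPath> sensors(count);
xrEnumeratePixelSensorsML(session, count, &count, sensors.data());

XrPath worldCenter;
xrStringToPath(instance, "/pixelsensor/world/center", &worldCenter);
const bool available =
    std::find(sensors.begin(), sensors.end(), worldCenter) != sensors.end();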
12.108.3. Sensor Permissions
Some sensors may require permissions before the application can access the data from the sensor.
Android applications must have the required permission listed in their manifest to open the sensors listed in the table below.
Permission | Sensor Id |
---|---|
com.magicleap.permission.DEPTH_CAMERA (protection level: dangerous) | /pixelsensor/depth/center |
com.magicleap.permission.EYE_CAMERA (protection level: dangerous) | /pixelsensor/eye/temple/left, /pixelsensor/eye/nasal/left, /pixelsensor/eye/nasal/right, /pixelsensor/eye/temple/right |
android.permission.CAMERA (protection level: dangerous) | /pixelsensor/world/left, /pixelsensor/world/center, /pixelsensor/world/right, /pixelsensor/picture/center |
12.108.4. Create and destroy a sensor handle
Applications can create a sensor handle using the xrCreatePixelSensorML function. This will provide an XrPixelSensorML handle.
Sensor availability may change during the lifecycle of the application. Listen for the XrEventDataPixelSensorAvailabilityChangedML event to be notified of these changes.
The XrPixelSensorML handle is defined as:
// Provided by XR_ML_pixel_sensor
XR_DEFINE_HANDLE(XrPixelSensorML)
The xrCreatePixelSensorML function is defined as:
// Provided by XR_ML_pixel_sensor
XrResult xrCreatePixelSensorML(
XrSession session,
const XrPixelSensorCreateInfoML* createInfo,
XrPixelSensorML* sensor);
The XrPixelSensorCreateInfoML structure is defined as:
// Provided by XR_ML_pixel_sensor
typedef struct XrPixelSensorCreateInfoML {
XrStructureType type;
const void* next;
XrPath sensor;
} XrPixelSensorCreateInfoML;
To close the sensor and release all the resources for that sensor, call xrDestroyPixelSensorML.
// Provided by XR_ML_pixel_sensor
XrResult xrDestroyPixelSensorML(
XrPixelSensorML sensor);
12.108.5. Sensor availability change events
The availability of a sensor is subject to change at any time. The application must be notified of availability changes via the XrEventDataPixelSensorAvailabilityChangedML event.
The XrEventDataPixelSensorAvailabilityChangedML structure is defined as:
// Provided by XR_ML_pixel_sensor
typedef struct XrEventDataPixelSensorAvailabilityChangedML {
XrStructureType type;
const void* next;
XrPath sensor;
XrBool32 available;
XrTime changeTime;
} XrEventDataPixelSensorAvailabilityChangedML;
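For illustration, an informative sketch of reacting to this event in the application's event loop; the instance is assumed to have been created already:
// Informative sketch: handle sensor availability changes while pumping
// the event queue.
XrEventDataBuffer event{XR_TYPE_EVENT_DATA_BUFFER};
while (xrPollEvent(instance, &event) == XR_SUCCESS) {
    if (event.type == XR_TYPE_EVENT_DATA_PIXEL_SENSOR_AVAILABILITY_CHANGED_ML) {
        const auto& changed =
            *reinterpret_cast<const XrEventDataPixelSensorAvailabilityChangedML*>(&event);
        if (!changed.available) {
            // The sensor identified by changed.sensor went away; stop its
            // streams and destroy any handle created for it.
        }
    }
    event = {XR_TYPE_EVENT_DATA_BUFFER};
}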
12.108.6. Sensor streams
Sensors may have multiple data streams. For example a depth camera sensor can support two streams: one for short range sensing and one for long range sensing. A color camera sensor can support two streams, each with different resolutions and frame formats.
The runtime does not make any guarantees about whether the different streams in a sensor are captured at the same time or not.
Use xrGetPixelSensorStreamCountML to get the number of streams supported by the sensor. Streams are indexed from 0 to n-1, where n is the number of streams reported by xrGetPixelSensorStreamCountML.
The xrGetPixelSensorStreamCountML function is defined as:
// Provided by XR_ML_pixel_sensor
XrResult xrGetPixelSensorStreamCountML(
XrPixelSensorML sensor,
uint32_t* streamCount);
12.108.7. Enumerate stream capabilities
Use xrEnumeratePixelSensorCapabilitiesML to query the list of XrPixelSensorCapabilityML structures that can be configured for each stream. Each capability is identified by its name, given by XrPixelSensorCapabilityTypeML, has a data type given by XrPixelSensorCapabilityDataTypeML, and a range type given by XrPixelSensorCapabilityRangeTypeML.
The xrEnumeratePixelSensorCapabilitiesML function is defined as:
// Provided by XR_ML_pixel_sensor
XrResult xrEnumeratePixelSensorCapabilitiesML(
XrPixelSensorML sensor,
uint32_t stream,
uint32_t capabilityCapacityInput,
uint32_t* capabilityCountOutput,
XrPixelSensorCapabilityML* capabilities);
The XrPixelSensorCapabilityML structure is defined as:
// Provided by XR_ML_pixel_sensor
typedef struct XrPixelSensorCapabilityML {
XrStructureType type;
const void* next;
XrPixelSensorCapabilityTypeML capabilityType;
XrPixelSensorCapabilityDataTypeML capabilityDataType;
XrPixelSensorCapabilityRangeTypeML capabilityRangeType;
} XrPixelSensorCapabilityML;
// Provided by XR_ML_pixel_sensor
typedef enum XrPixelSensorCapabilityTypeML {
XR_PIXEL_SENSOR_CAPABILITY_TYPE_UPDATE_RATE_ML = 0,
XR_PIXEL_SENSOR_CAPABILITY_TYPE_RESOLUTION_ML = 1,
XR_PIXEL_SENSOR_CAPABILITY_TYPE_FORMAT_ML = 3,
XR_PIXEL_SENSOR_CAPABILITY_TYPE_DEPTH_ML = 4,
XR_PIXEL_SENSOR_CAPABILITY_TYPE_REALITY_MODE_ML = 5,
XR_PIXEL_SENSOR_CAPABILITY_TYPE_MANUAL_EXPOSURE_TIME_ML = 100,
XR_PIXEL_SENSOR_CAPABILITY_TYPE_ANALOG_GAIN_ML = 101,
XR_PIXEL_SENSOR_CAPABILITY_TYPE_DIGITAL_GAIN_ML = 102,
XR_PIXEL_SENSOR_CAPABILITY_TYPE_AUTO_EXPOSURE_MODE_ML = 200,
XR_PIXEL_SENSOR_CAPABILITY_TYPE_AUTO_EXPOSURE_TARGET_BRIGHTNESS_ML = 201,
XR_PIXEL_SENSOR_CAPABILITY_TYPE_MAX_ENUM_ML = 0x7FFFFFFF
} XrPixelSensorCapabilityTypeML;
Enum | Description |
---|---|
XR_PIXEL_SENSOR_CAPABILITY_TYPE_UPDATE_RATE_ML | Data rate per second, must be specified by application. Data type is uint32_t. |
XR_PIXEL_SENSOR_CAPABILITY_TYPE_RESOLUTION_ML | Resolution to configure, must be specified by application. Data type is XrExtent2Di. |
XR_PIXEL_SENSOR_CAPABILITY_TYPE_FORMAT_ML | Data format, must be specified by application. Data type is XrPixelSensorFrameFormatML. |
XR_PIXEL_SENSOR_CAPABILITY_TYPE_DEPTH_ML | Range of a depth sensor. Data type is float. |
XR_PIXEL_SENSOR_CAPABILITY_TYPE_REALITY_MODE_ML | Reality mode. Data type is XrPixelSensorRealityModeML. |
XR_PIXEL_SENSOR_CAPABILITY_TYPE_MANUAL_EXPOSURE_TIME_ML | Exposure time in milliseconds, if not specified runtime must use AUTO exposure. Data type is float. |
XR_PIXEL_SENSOR_CAPABILITY_TYPE_ANALOG_GAIN_ML | Higher gain is useful in low light conditions but may introduce noise. Data type is uint32_t. |
XR_PIXEL_SENSOR_CAPABILITY_TYPE_DIGITAL_GAIN_ML | Higher gain is useful in low light conditions but may introduce noise. Data type is uint32_t. |
XR_PIXEL_SENSOR_CAPABILITY_TYPE_AUTO_EXPOSURE_MODE_ML | Allowed auto exposure modes. Data type is XrPixelSensorAutoExposureModeML. |
XR_PIXEL_SENSOR_CAPABILITY_TYPE_AUTO_EXPOSURE_TARGET_BRIGHTNESS_ML | Set target brightness for auto exposure mode. Data type is float. |
// Provided by XR_ML_pixel_sensor
typedef enum XrPixelSensorCapabilityDataTypeML {
XR_PIXEL_SENSOR_CAPABILITY_DATA_TYPE_XR_BOOL32_ML = 0,
XR_PIXEL_SENSOR_CAPABILITY_DATA_TYPE_UINT32_ML = 100,
XR_PIXEL_SENSOR_CAPABILITY_DATA_TYPE_FLOAT_ML = 101,
XR_PIXEL_SENSOR_CAPABILITY_DATA_TYPE_XR_EXTENT_2DI_ML = 200,
XR_PIXEL_SENSOR_CAPABILITY_DATA_TYPE_MAX_ENUM_ML = 0x7FFFFFFF
} XrPixelSensorCapabilityDataTypeML;
Enum | Description |
---|---|
XR_PIXEL_SENSOR_CAPABILITY_DATA_TYPE_XR_BOOL32_ML | Capability is a bool value. |
XR_PIXEL_SENSOR_CAPABILITY_DATA_TYPE_UINT32_ML | Capability is an integer value. |
XR_PIXEL_SENSOR_CAPABILITY_DATA_TYPE_FLOAT_ML | Capability is a float value. |
XR_PIXEL_SENSOR_CAPABILITY_DATA_TYPE_XR_EXTENT_2DI_ML | Capability is a vector of two integers. |
// Provided by XR_ML_pixel_sensor
typedef enum XrPixelSensorCapabilityRangeTypeML {
XR_PIXEL_SENSOR_CAPABILITY_RANGE_TYPE_BOOL_ML = 0,
XR_PIXEL_SENSOR_CAPABILITY_RANGE_TYPE_CONTINUOUS_ML = 1,
XR_PIXEL_SENSOR_CAPABILITY_RANGE_TYPE_DISCRETE_ML = 2,
XR_PIXEL_SENSOR_CAPABILITY_RANGE_TYPE_MAX_ENUM_ML = 0x7FFFFFFF
} XrPixelSensorCapabilityRangeTypeML;
Enum | Description |
---|---|
XR_PIXEL_SENSOR_CAPABILITY_RANGE_TYPE_BOOL_ML | Capability has only two valid states, true or false. |
XR_PIXEL_SENSOR_CAPABILITY_RANGE_TYPE_CONTINUOUS_ML | Capability can take any value in the given [min, max] range. |
XR_PIXEL_SENSOR_CAPABILITY_RANGE_TYPE_DISCRETE_ML | Capability can take any of the discrete values in the list. |
// Provided by XR_ML_pixel_sensor
typedef enum XrPixelSensorFrameFormatML {
XR_PIXEL_SENSOR_FRAME_FORMAT_GRAYSCALE_U8_ML = 0,
XR_PIXEL_SENSOR_FRAME_FORMAT_RGBA_8888_ML = 1,
XR_PIXEL_SENSOR_FRAME_FORMAT_YUV_420_888_ML = 2,
XR_PIXEL_SENSOR_FRAME_FORMAT_JPEG_ML = 3,
XR_PIXEL_SENSOR_FRAME_FORMAT_DEPTH_F32_ML = 4,
XR_PIXEL_SENSOR_FRAME_FORMAT_DEPTH_RAW_ML = 5,
XR_PIXEL_SENSOR_FRAME_FORMAT_MAX_ENUM_ML = 0x7FFFFFFF
} XrPixelSensorFrameFormatML;
Enum | Description |
---|---|
XR_PIXEL_SENSOR_FRAME_FORMAT_GRAYSCALE_U8_ML | Each pixel is 1 byte and represents a grayscale value. Datatype of the corresponding frame buffer is uint8_t. |
XR_PIXEL_SENSOR_FRAME_FORMAT_RGBA_8888_ML | Each pixel is 4 bytes and represents R, G, B, and A channels in that order. Datatype of the corresponding frame buffer is uint8_t. |
XR_PIXEL_SENSOR_FRAME_FORMAT_YUV_420_888_ML | Frame is represented in the YUV_420_888 planar format. Datatype of the corresponding frame buffer is uint8_t. |
XR_PIXEL_SENSOR_FRAME_FORMAT_JPEG_ML | Frame is JPEG encoded. |
XR_PIXEL_SENSOR_FRAME_FORMAT_DEPTH_F32_ML | Represents the depth. Depth is the radial distance (in meters) of the real world location with respect to the depth camera. Datatype is float. |
XR_PIXEL_SENSOR_FRAME_FORMAT_DEPTH_RAW_ML | Raw pixel data representing light captured by the sensor. For depth cameras that have a projector this raw frame will include frames captured both when the projector is on and off. Refer to XrPixelSensorDepthFrameIlluminationML for more details. Data type is float. |
// Provided by XR_ML_pixel_sensor
typedef enum XrPixelSensorRealityModeML {
XR_PIXEL_SENSOR_REALITY_MODE_MIXED_ML = 0,
XR_PIXEL_SENSOR_REALITY_MODE_CAMERA_ML = 1,
XR_PIXEL_SENSOR_REALITY_MODE_VIRTUAL_ML = 2,
XR_PIXEL_SENSOR_REALITY_MODE_MAX_ENUM_ML = 0x7FFFFFFF
} XrPixelSensorRealityModeML;
Enum | Description |
---|---|
XR_PIXEL_SENSOR_REALITY_MODE_MIXED_ML | Camera frame and digital content will be blended into a single frame. |
XR_PIXEL_SENSOR_REALITY_MODE_CAMERA_ML | Only camera frame will be captured. |
XR_PIXEL_SENSOR_REALITY_MODE_VIRTUAL_ML | Only virtual content will be captured. |
// Provided by XR_ML_pixel_sensor
typedef enum XrPixelSensorAutoExposureModeML {
XR_PIXEL_SENSOR_AUTO_EXPOSURE_MODE_ENVIRONMENT_TRACKING_ML = 0,
XR_PIXEL_SENSOR_AUTO_EXPOSURE_MODE_CLOSE_PROXIMITY_IR_TRACKING_ML = 1,
XR_PIXEL_SENSOR_AUTO_EXPOSURE_MODE_MAX_ENUM_ML = 0x7FFFFFFF
} XrPixelSensorAutoExposureModeML;
Enum | Description |
---|---|
XR_PIXEL_SENSOR_AUTO_EXPOSURE_MODE_ENVIRONMENT_TRACKING_ML | Exposure mode optimized for environment tracking. |
XR_PIXEL_SENSOR_AUTO_EXPOSURE_MODE_CLOSE_PROXIMITY_IR_TRACKING_ML | Exposure mode optimized for close proximity IR light source. |
12.108.8. Query sensor capability ranges
The valid range of the capabilities for each of the sensor streams may be queried using xrQueryPixelSensorCapabilityRangeML. Some stream capabilities may be limited by the configuration of other capabilities within the same stream or a different stream on the same sensor. For example, the choice of frame rate may affect the valid ranges for frame resolution and vice versa. Another example is a sensor with two streams that has an upper limit of 60fps across the two streams: if one stream is configured with a frame rate of more than 30fps, the valid range of frame rates for the second stream is reduced so that the total frame rate summed across both streams does not exceed 60fps.
Creating a valid configuration of the various capabilities is an iterative process. The application starts by querying the valid range for the first capability. Before calling the query function a second time to get the valid range for the next capability, the application picks a value for the first capability and passes it back into the query function. This process continues until all the necessary capabilities are configured.
xrQueryPixelSensorCapabilityRangeML allows the application to detect valid combinations of XrPixelSensorCapabilityTypeML for all the streams in the sensor.
If a sensor does not support a queried capability the runtime must return XR_ERROR_PIXEL_SENSOR_CAPABILITY_NOT_SUPPORTED_ML.
The xrQueryPixelSensorCapabilityRangeML function is defined as:
// Provided by XR_ML_pixel_sensor
XrResult xrQueryPixelSensorCapabilityRangeML(
XrPixelSensorML sensor,
const XrPixelSensorCapabilityQueryInfoML* queryInfo,
const uint32_t configurationCount,
const XrPixelSensorCapabilityConfigBaseHeaderML*const * configurations,
XrPixelSensorCapabilityRangeBaseHeaderML* capabilityRange);
The XrPixelSensorCapabilityQueryInfoML structure is defined as:
// Provided by XR_ML_pixel_sensor
typedef struct XrPixelSensorCapabilityQueryInfoML {
XrStructureType type;
const void* next;
uint32_t stream;
XrPixelSensorCapabilityTypeML capabilityType;
} XrPixelSensorCapabilityQueryInfoML;
The valid range for a capability can be a set of discrete values or a continuous range with an upper and lower bound. All the structures that define the valid range for a capability must be an extension of the XrPixelSensorCapabilityRangeBaseHeaderML base type.
The type of structure to use for getting the range of a capability can be inferred from XrPixelSensorCapabilityML::capabilityDataType and XrPixelSensorCapabilityML::capabilityRangeType.
The XrPixelSensorCapabilityRangeBaseHeaderML structure is defined as:
// Provided by XR_ML_pixel_sensor
typedef struct XrPixelSensorCapabilityRangeBaseHeaderML {
XrStructureType type;
void* next;
XrBool32 valid;
} XrPixelSensorCapabilityRangeBaseHeaderML;
XrPixelSensorCapabilityRangeContinuousFloatML must be used for a capability with data type of XR_PIXEL_SENSOR_CAPABILITY_DATA_TYPE_FLOAT_ML and range type of XR_PIXEL_SENSOR_CAPABILITY_RANGE_TYPE_CONTINUOUS_ML.
The XrPixelSensorCapabilityRangeContinuousFloatML structure is defined as:
// Provided by XR_ML_pixel_sensor
typedef struct XrPixelSensorCapabilityRangeContinuousFloatML {
XrStructureType type;
void* next;
XrBool32 valid;
float minValue;
float maxValue;
} XrPixelSensorCapabilityRangeContinuousFloatML;
XrPixelSensorCapabilityRangeContinuousUint32ML must be used for a capability with data type of XR_PIXEL_SENSOR_CAPABILITY_DATA_TYPE_UINT32_ML and range type of XR_PIXEL_SENSOR_CAPABILITY_RANGE_TYPE_CONTINUOUS_ML.
The XrPixelSensorCapabilityRangeContinuousUint32ML structure is defined as:
// Provided by XR_ML_pixel_sensor
typedef struct XrPixelSensorCapabilityRangeContinuousUint32ML {
XrStructureType type;
void* next;
XrBool32 valid;
uint32_t minValue;
uint32_t maxValue;
} XrPixelSensorCapabilityRangeContinuousUint32ML;
XrPixelSensorCapabilityRangeDiscreteUint32ML must be used for a capability with data type of XR_PIXEL_SENSOR_CAPABILITY_DATA_TYPE_UINT32_ML and range type of XR_PIXEL_SENSOR_CAPABILITY_RANGE_TYPE_DISCRETE_ML.
The XrPixelSensorCapabilityRangeDiscreteUint32ML structure is defined as:
// Provided by XR_ML_pixel_sensor
typedef struct XrPixelSensorCapabilityRangeDiscreteUint32ML {
XrStructureType type;
void* next;
XrBool32 valid;
uint32_t valueCapacityInput;
uint32_t valueCountOutput;
uint32_t* values;
} XrPixelSensorCapabilityRangeDiscreteUint32ML;
XrPixelSensorCapabilityRangeDiscreteXrBool32ML must be used for a capability with data type of XR_PIXEL_SENSOR_CAPABILITY_DATA_TYPE_XR_BOOL32_ML and range type of XR_PIXEL_SENSOR_CAPABILITY_RANGE_TYPE_DISCRETE_ML.
The XrPixelSensorCapabilityRangeDiscreteXrBool32ML structure is defined as:
// Provided by XR_ML_pixel_sensor
typedef struct XrPixelSensorCapabilityRangeDiscreteXrBool32ML {
XrStructureType type;
void* next;
XrBool32 valid;
uint32_t valueCapacityInput;
uint32_t valueCountOutput;
XrBool32* values;
} XrPixelSensorCapabilityRangeDiscreteXrBool32ML;
XrPixelSensorCapabilityRangeDiscreteXrExtent2DiML must be used for a capability with data type of XR_PIXEL_SENSOR_CAPABILITY_DATA_TYPE_XR_EXTENT_2DI_ML and range type of XR_PIXEL_SENSOR_CAPABILITY_RANGE_TYPE_DISCRETE_ML.
The XrPixelSensorCapabilityRangeDiscreteXrExtent2DiML structure is defined as:
// Provided by XR_ML_pixel_sensor
typedef struct XrPixelSensorCapabilityRangeDiscreteXrExtent2DiML {
XrStructureType type;
void* next;
XrBool32 valid;
uint32_t valueCapacityInput;
uint32_t valueCountOutput;
XrExtent2Di* values;
} XrPixelSensorCapabilityRangeDiscreteXrExtent2DiML;
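For illustration, an informative sketch of a single range query: retrieving the continuous [min, max] range of the manual exposure time for stream 0, passing no prior capability configurations. The full iterative negotiation is shown in the Example Sensor Configuration section below.
// Informative sketch: query the continuous range for the manual exposure
// time on stream 0 before any other capability has been pinned down.
XrPixelSensorCapabilityQueryInfoML queryInfo{XR_TYPE_PIXEL_SENSOR_CAPABILITY_QUERY_INFO_ML};
queryInfo.stream = 0;
queryInfo.capabilityType = XR_PIXEL_SENSOR_CAPABILITY_TYPE_MANUAL_EXPOSURE_TIME_ML;

XrPixelSensorCapabilityRangeContinuousFloatML range{XR_TYPE_PIXEL_SENSOR_CAPABILITY_RANGE_CONTINUOUS_FLOAT_ML};
XrResult result = xrQueryPixelSensorCapabilityRangeML(
    sensor, &queryInfo, 0, nullptr,
    reinterpret_cast<XrPixelSensorCapabilityRangeBaseHeaderML*>(&range));
if (result == XR_SUCCESS && range.valid) {
    // Any exposure time in [range.minValue, range.maxValue] can be configured.
}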
xrQueryPixelSensorCapabilityRangeML and xrConfigurePixelSensorAsyncML take an array of structures to configure each of the capabilities. All the structures that define a configuration for a capability must be an extension of the XrPixelSensorCapabilityConfigBaseHeaderML base type.
The type of structure to use for configuring a capability can be inferred from XrPixelSensorCapabilityML::capabilityDataType.
The XrPixelSensorCapabilityConfigBaseHeaderML structure is defined as:
// Provided by XR_ML_pixel_sensor
typedef struct XrPixelSensorCapabilityConfigBaseHeaderML {
XrStructureType type;
const void* next;
uint32_t stream;
XrPixelSensorCapabilityTypeML capabilityType;
} XrPixelSensorCapabilityConfigBaseHeaderML;
XrPixelSensorCapabilityConfigXrBool32ML must be used for a capability with data type of XR_PIXEL_SENSOR_CAPABILITY_DATA_TYPE_XR_BOOL32_ML.
The XrPixelSensorCapabilityConfigXrBool32ML structure is defined as:
// Provided by XR_ML_pixel_sensor
typedef struct XrPixelSensorCapabilityConfigXrBool32ML {
XrStructureType type;
const void* next;
uint32_t stream;
XrPixelSensorCapabilityTypeML capabilityType;
XrBool32 value;
} XrPixelSensorCapabilityConfigXrBool32ML;
XrPixelSensorCapabilityConfigUint32ML must be used for a capability with data type of XR_PIXEL_SENSOR_CAPABILITY_DATA_TYPE_UINT32_ML.
The XrPixelSensorCapabilityConfigUint32ML structure is defined as:
// Provided by XR_ML_pixel_sensor
typedef struct XrPixelSensorCapabilityConfigUint32ML {
XrStructureType type;
const void* next;
uint32_t stream;
XrPixelSensorCapabilityTypeML capabilityType;
uint32_t value;
} XrPixelSensorCapabilityConfigUint32ML;
XrPixelSensorCapabilityConfigFloatML must be used for a capability with data type of XR_PIXEL_SENSOR_CAPABILITY_DATA_TYPE_FLOAT_ML.
The XrPixelSensorCapabilityConfigFloatML structure is defined as:
// Provided by XR_ML_pixel_sensor
typedef struct XrPixelSensorCapabilityConfigFloatML {
XrStructureType type;
const void* next;
uint32_t stream;
XrPixelSensorCapabilityTypeML capabilityType;
float value;
} XrPixelSensorCapabilityConfigFloatML;
XrPixelSensorCapabilityConfigXrExtent2DiML must be used for a capability with data type of XR_PIXEL_SENSOR_CAPABILITY_DATA_TYPE_XR_EXTENT_2DI_ML.
The XrPixelSensorCapabilityConfigXrExtent2DiML structure is defined as:
// Provided by XR_ML_pixel_sensor
typedef struct XrPixelSensorCapabilityConfigXrExtent2DiML {
XrStructureType type;
const void* next;
uint32_t stream;
XrPixelSensorCapabilityTypeML capabilityType;
XrExtent2Di value;
} XrPixelSensorCapabilityConfigXrExtent2DiML;
12.108.9. Configure sensor capabilities
Once the sensor capabilities have been queried using xrQueryPixelSensorCapabilityRangeML, xrConfigurePixelSensorAsyncML can be used to apply this configuration in one step.
The runtime must return XR_ERROR_CALL_ORDER_INVALID if xrQueryPixelSensorCapabilityRangeML has not been called for this sensor handle for all XrPixelSensorCapabilityTypeML requested.
The runtime must return XR_ERROR_VALIDATION_FAILURE if the sensor is still active. In this case the application must call xrStopPixelSensorAsyncML on all the streams before applying a new configuration.
The runtime must support reconfiguring a sensor with new capabilities without requiring the application to destroy and recreate an XrPixelSensorML.
The runtime must return XR_ERROR_VALIDATION_FAILURE if any of the capabilities are not valid.
The runtime must return XR_ERROR_VALIDATION_FAILURE if any of the required capabilities are not provided. See XrPixelSensorCapabilityTypeML for the list of capabilities that must be specified by the application.
The xrConfigurePixelSensorAsyncML function is defined as:
// Provided by XR_ML_pixel_sensor
XrResult xrConfigurePixelSensorAsyncML(
XrPixelSensorML sensor,
const XrPixelSensorConfigInfoML* configInfo,
XrFutureEXT* future);
The XrPixelSensorConfigInfoML structure is defined as:
// Provided by XR_ML_pixel_sensor
typedef struct XrPixelSensorConfigInfoML {
XrStructureType type;
const void* next;
uint32_t streamCount;
const uint32_t* streams;
uint32_t configurationCount;
const XrPixelSensorCapabilityConfigBaseHeaderML*const * configurations;
} XrPixelSensorConfigInfoML;
The xrConfigurePixelSensorCompleteML function is defined as:
// Provided by XR_ML_pixel_sensor
XrResult xrConfigurePixelSensorCompleteML(
XrPixelSensorML sensor,
XrFutureEXT future,
XrPixelSensorConfigureCompletionML* completion);
The XrPixelSensorConfigureCompletionML structure is defined as:
// Provided by XR_ML_pixel_sensor
typedef struct XrPixelSensorConfigureCompletionML {
XrStructureType type;
void* next;
XrResult futureResult;
} XrPixelSensorConfigureCompletionML;
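For illustration, an informative sketch of the asynchronous flow using the XR_EXT_future polling pattern; configInfo is assumed to have been populated as described above, and error handling is elided:
// Informative sketch: kick off an asynchronous configuration, poll the
// future until it is ready, then complete it.
XrFutureEXT future = XR_NULL_FUTURE_EXT;
xrConfigurePixelSensorAsyncML(sensor, &configInfo, &future);

XrFuturePollInfoEXT pollInfo{XR_TYPE_FUTURE_POLL_INFO_EXT};
pollInfo.future = future;
XrFuturePollResultEXT pollResult{XR_TYPE_FUTURE_POLL_RESULT_EXT};
do {
    xrPollFutureEXT(instance, &pollInfo, &pollResult);  // typically once per frame
} while (pollResult.state == XR_FUTURE_STATE_PENDING_EXT);

XrPixelSensorConfigureCompletionML completion{XR_TYPE_PIXEL_SENSOR_CONFIGURE_COMPLETION_ML};
xrConfigurePixelSensorCompleteML(sensor, future, &completion);
// completion.futureResult holds the result of the configuration.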
12.108.10. Allocate memory for sensor data
The application is responsible for allocating the memory to store the sensor data. xrGetPixelSensorBufferPropertiesML must provide the size of the buffer to allocate.
xrGetPixelSensorBufferPropertiesML must return XR_ERROR_CALL_ORDER_INVALID if xrConfigurePixelSensorCompleteML was not yet called for this XrPixelSensorML.
The xrGetPixelSensorBufferPropertiesML function is defined as:
// Provided by XR_ML_pixel_sensor
XrResult xrGetPixelSensorBufferPropertiesML(
XrPixelSensorML sensor,
const XrPixelSensorBufferPropertiesInfoML* propertiesInfo,
XrPixelSensorBufferPropertiesML* properties);
The XrPixelSensorBufferPropertiesInfoML structure is defined as:
// Provided by XR_ML_pixel_sensor
typedef struct XrPixelSensorBufferPropertiesInfoML {
XrStructureType type;
const void* next;
uint32_t stream;
uint32_t metadataCount;
const XrPixelSensorMetadataML* metadatas;
} XrPixelSensorBufferPropertiesInfoML;
The XrPixelSensorBufferPropertiesML structure is defined as:
// Provided by XR_ML_pixel_sensor
typedef struct XrPixelSensorBufferPropertiesML {
XrStructureType type;
void* next;
uint32_t stream;
uint32_t bufferSize;
} XrPixelSensorBufferPropertiesML;
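For illustration, an informative sketch of sizing and allocating the buffer for stream 0 with no additional metadata requested; the XrPixelSensorBufferML structure that wraps the allocation is described under Query sensor data below:
// Informative sketch: query the required buffer size for stream 0 and
// allocate application-owned storage for it.
XrPixelSensorBufferPropertiesInfoML propertiesInfo{XR_TYPE_PIXEL_SENSOR_BUFFER_PROPERTIES_INFO_ML};
propertiesInfo.stream = 0;
propertiesInfo.metadataCount = 0;   // no additional metadata
propertiesInfo.metadatas = nullptr;

XrPixelSensorBufferPropertiesML properties{XR_TYPE_PIXEL_SENSOR_BUFFER_PROPERTIES_ML};
xrGetPixelSensorBufferPropertiesML(sensor, &propertiesInfo, &properties);

std::vector<char> storage(properties.bufferSize);
XrPixelSensorBufferML buffer{XR_TYPE_PIXEL_SENSOR_BUFFER_ML};
buffer.bufferSize = properties.bufferSize;
buffer.buffer = storage.data();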
12.108.11. Start and stop sensor streams
Once the sensor capabilities have been configured, xrStartPixelSensorAsyncML can be used to start streaming data from the sensor.
The runtime must return XR_ERROR_CALL_ORDER_INVALID if xrConfigurePixelSensorCompleteML has not been called for this sensor handle.
The sensor streams must be started after the sensor configuration has completed and before requesting new frames via xrGetPixelSensorDataML.
The xrStartPixelSensorAsyncML function is defined as:
// Provided by XR_ML_pixel_sensor
XrResult xrStartPixelSensorAsyncML(
XrPixelSensorML sensor,
const XrPixelSensorStartInfoML* startInfo,
XrFutureEXT* future);
The XrPixelSensorStartInfoML structure is defined as:
// Provided by XR_ML_pixel_sensor
typedef struct XrPixelSensorStartInfoML {
XrStructureType type;
const void* next;
uint32_t streamCount;
const uint32_t* streams;
} XrPixelSensorStartInfoML;
The xrStartPixelSensorCompleteML function is defined as:
// Provided by XR_ML_pixel_sensor
XrResult xrStartPixelSensorCompleteML(
XrPixelSensorML sensor,
XrFutureEXT future,
XrPixelSensorStartCompletionML* completion);
The XrPixelSensorStartCompletionML structure is defined as:
// Provided by XR_ML_pixel_sensor
typedef struct XrPixelSensorStartCompletionML {
XrStructureType type;
void* next;
XrResult futureResult;
} XrPixelSensorStartCompletionML;
Call xrStopPixelSensorAsyncML to stop the sensor streams without having to destroy the sensor handle. The runtime may take advantage of the start and stop APIs to save on system resources. However, the runtime must retain the state of the sensors (i.e. the most recent sensor configuration) and use it when the application calls xrStartPixelSensorAsyncML.
The xrStopPixelSensorAsyncML function is defined as:
// Provided by XR_ML_pixel_sensor
XrResult xrStopPixelSensorAsyncML(
XrPixelSensorML sensor,
const XrPixelSensorStopInfoML* stopInfo,
XrFutureEXT* future);
The XrPixelSensorStopInfoML structure is defined as:
// Provided by XR_ML_pixel_sensor
typedef struct XrPixelSensorStopInfoML {
XrStructureType type;
const void* next;
uint32_t streamCount;
const uint32_t* streams;
} XrPixelSensorStopInfoML;
The xrStopPixelSensorCompleteML function is defined as:
// Provided by XR_ML_pixel_sensor
XrResult xrStopPixelSensorCompleteML(
XrPixelSensorML sensor,
XrFutureEXT future,
XrPixelSensorStopCompletionML* completion);
The XrPixelSensorStopCompletionML structure is defined as:
// Provided by XR_ML_pixel_sensor
typedef struct XrPixelSensorStopCompletionML {
XrStructureType type;
void* next;
XrResult futureResult;
} XrPixelSensorStopCompletionML;
12.108.12. Query sensor data
Once the sensor capabilities have been configured and the necessary streams for the sensors are started xrGetPixelSensorDataML can be used to get sensor data.
The runtime must return XR_ERROR_CALL_ORDER_INVALID if xrStartPixelSensorCompleteML has not been called for this sensor handle.
The xrGetPixelSensorDataML function is defined as:
// Provided by XR_ML_pixel_sensor
XrResult xrGetPixelSensorDataML(
XrPixelSensorML sensor,
const XrPixelSensorDataGetInfoML* info,
XrPixelSensorBufferML* buffer,
XrPixelSensorDataML* data);
xrGetPixelSensorDataML must return the latest frame data available after XrPixelSensorDataGetInfoML::lastCaptureTime. If multiple frames arrived between the current time and XrPixelSensorDataGetInfoML::lastCaptureTime, only the newest frame will be returned.
xrGetPixelSensorDataML must wait up to XrPixelSensorDataGetInfoML::timeout for image data to arrive. If no image data arrives before the XrPixelSensorDataGetInfoML::timeout expires the function must return XR_TIMEOUT_EXPIRED.
The application must call xrGetPixelSensorBufferPropertiesML at least once per XrPixelSensorML handle. If not, the runtime must return XR_ERROR_CALL_ORDER_INVALID from xrGetPixelSensorDataML.
The application may reuse the same XrPixelSensorDataML for subsequent calls to xrGetPixelSensorDataML if it no longer needs the data from the previous call.
The XrPixelSensorDataGetInfoML structure is defined as:
// Provided by XR_ML_pixel_sensor
typedef struct XrPixelSensorDataGetInfoML {
XrStructureType type;
const void* next;
uint32_t stream;
XrTime lastCaptureTime;
XrDuration timeout;
} XrPixelSensorDataGetInfoML;
The XrPixelSensorBufferML structure is defined as:
// Provided by XR_ML_pixel_sensor
typedef struct XrPixelSensorBufferML {
XrStructureType type;
void* next;
uint32_t bufferSize;
void* buffer;
} XrPixelSensorBufferML;
All pixel sensors must provide the XrPixelSensorDataML frame data via xrGetPixelSensorDataML. In addition, pixel sensors may provide additional metadata as described in later sections.
The XrPixelSensorDataML structure is defined as:
// Provided by XR_ML_pixel_sensor
typedef struct XrPixelSensorDataML {
XrStructureType type;
void* next;
XrTime captureTime;
XrPixelSensorFrameML* frame;
} XrPixelSensorDataML;
The XrPixelSensorFrameML structure is used to store per-pixel data. The type of data stored for each pixel varies and depends on the XrPixelSensorFrameTypeML. The top left corner of the frame is treated as the origin.
The XrPixelSensorFrameML structure is defined as:
// Provided by XR_ML_pixel_sensor
typedef struct XrPixelSensorFrameML {
XrStructureType type;
void* next;
XrPixelSensorFrameTypeML frameType;
uint32_t planeCount;
XrPixelSensorPlaneML* planes;
} XrPixelSensorFrameML;
// Provided by XR_ML_pixel_sensor
typedef enum XrPixelSensorFrameTypeML {
XR_PIXEL_SENSOR_FRAME_TYPE_GRAYSCALE_U8_ML = 0,
XR_PIXEL_SENSOR_FRAME_TYPE_RGBA_8888_ML = 1,
XR_PIXEL_SENSOR_FRAME_TYPE_YUV_420_888_ML = 2,
XR_PIXEL_SENSOR_FRAME_TYPE_JPEG_ML = 3,
XR_PIXEL_SENSOR_FRAME_TYPE_DEPTH_32_ML = 4,
XR_PIXEL_SENSOR_FRAME_TYPE_DEPTH_RAW_ML = 5,
XR_PIXEL_SENSOR_FRAME_TYPE_DEPTH_CONFIDENCE_ML = 6,
XR_PIXEL_SENSOR_FRAME_TYPE_DEPTH_FLAGS_ML = 7,
XR_PIXEL_SENSOR_FRAME_TYPE_MAX_ENUM_ML = 0x7FFFFFFF
} XrPixelSensorFrameTypeML;
Enum | Description |
---|---|
XR_PIXEL_SENSOR_FRAME_TYPE_GRAYSCALE_U8_ML | Refer to XR_PIXEL_SENSOR_FRAME_FORMAT_GRAYSCALE_U8_ML for more details. |
XR_PIXEL_SENSOR_FRAME_TYPE_RGBA_8888_ML | Refer to XR_PIXEL_SENSOR_FRAME_FORMAT_RGBA_8888_ML for more details. |
XR_PIXEL_SENSOR_FRAME_TYPE_YUV_420_888_ML | Refer to XR_PIXEL_SENSOR_FRAME_FORMAT_YUV_420_888_ML for more details. |
XR_PIXEL_SENSOR_FRAME_TYPE_JPEG_ML | Refer to XR_PIXEL_SENSOR_FRAME_FORMAT_JPEG_ML for more details. |
XR_PIXEL_SENSOR_FRAME_TYPE_DEPTH_32_ML | Refer to XR_PIXEL_SENSOR_FRAME_FORMAT_DEPTH_F32_ML for more details. |
XR_PIXEL_SENSOR_FRAME_TYPE_DEPTH_RAW_ML | Refer to XR_PIXEL_SENSOR_FRAME_FORMAT_DEPTH_RAW_ML for more details. |
XR_PIXEL_SENSOR_FRAME_TYPE_DEPTH_CONFIDENCE_ML | Refer to XR_PIXEL_SENSOR_METADATA_DEPTH_CONFIDENCE_BUFFER_ML for more details. |
XR_PIXEL_SENSOR_FRAME_TYPE_DEPTH_FLAGS_ML | Refer to XR_PIXEL_SENSOR_METADATA_DEPTH_FLAG_BUFFER_ML for more details. |
The XrPixelSensorPlaneML structure is defined as:
// Provided by XR_ML_pixel_sensor
typedef struct XrPixelSensorPlaneML {
XrStructureType type;
void* next;
uint32_t width;
uint32_t height;
uint32_t stride;
uint32_t bytesPerPixel;
uint32_t pixelStride;
uint32_t bufferSize;
void* buffer;
} XrPixelSensorPlaneML;
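Because each plane carries explicit stride and pixelStride values, per-pixel access is straightforward. The following informative sketch reads the depth value at pixel (x, y) from the first plane of an XR_PIXEL_SENSOR_FRAME_TYPE_DEPTH_32_ML frame; DepthAt is a hypothetical helper, and it assumes stride is the row pitch in bytes and pixelStride is the byte offset between adjacent pixels in a row:
// Informative sketch: read one depth sample (in meters) from a depth frame.
// The top left corner of the frame is the origin.
float DepthAt(const XrPixelSensorFrameML* frame, uint32_t x, uint32_t y) {
    const XrPixelSensorPlaneML& plane = frame->planes[0];
    const char* base = static_cast<const char*>(plane.buffer);
    const char* pixel = base + y * plane.stride + x * plane.pixelStride;
    return *reinterpret_cast<const float*>(pixel);
}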
12.108.13. Sensor Metadata
Pixel sensors may provide additional metadata for the captured frames. Applications can obtain this metadata by chaining the corresponding structures on the XrPixelSensorDataML::next pointer. XrPixelSensorMetadataML represents the list of metadata that a sensor may support.
The XrPixelSensorMetadataML enumeration is defined as:
// Provided by XR_ML_pixel_sensor
typedef enum XrPixelSensorMetadataML {
XR_PIXEL_SENSOR_METADATA_EXPOSURE_TIME_ML = 0,
XR_PIXEL_SENSOR_METADATA_ANALOG_GAIN_ML = 1,
XR_PIXEL_SENSOR_METADATA_DIGITAL_GAIN_ML = 2,
XR_PIXEL_SENSOR_METADATA_PINHOLE_CAMERA_MODEL_ML = 3,
XR_PIXEL_SENSOR_METADATA_FISHEYE_CAMERA_MODEL_ML = 4,
XR_PIXEL_SENSOR_METADATA_DEPTH_FRAME_ILLUMINATION_ML = 5,
XR_PIXEL_SENSOR_METADATA_DEPTH_CONFIDENCE_BUFFER_ML = 6,
XR_PIXEL_SENSOR_METADATA_DEPTH_FLAG_BUFFER_ML = 7,
XR_PIXEL_SENSOR_METADATA_MAX_ENUM_ML = 0x7FFFFFFF
} XrPixelSensorMetadataML;
Enum | Description |
---|---|
XR_PIXEL_SENSOR_METADATA_EXPOSURE_TIME_ML | Exposure time in milliseconds used to capture the frame. Refer to XrPixelSensorExposureTimeML for more details. |
XR_PIXEL_SENSOR_METADATA_ANALOG_GAIN_ML | Analog gain used to capture the frame. Refer to XrPixelSensorAnalogGainML for more details. |
XR_PIXEL_SENSOR_METADATA_DIGITAL_GAIN_ML | Digital gain used to capture the frame. Refer to XrPixelSensorDigitalGainML for more details. |
XR_PIXEL_SENSOR_METADATA_PINHOLE_CAMERA_MODEL_ML | Pinhole camera model. Refer to XrPixelSensorPinholeIntrinsicsML for more details. |
XR_PIXEL_SENSOR_METADATA_FISHEYE_CAMERA_MODEL_ML | Fisheye camera model. Refer to XrPixelSensorFisheyeIntrinsicsML for more details. |
XR_PIXEL_SENSOR_METADATA_DEPTH_FRAME_ILLUMINATION_ML | Illumination type used for the depth frame. Refer to XrPixelSensorDepthFrameIlluminationML for more details. |
XR_PIXEL_SENSOR_METADATA_DEPTH_CONFIDENCE_BUFFER_ML | Confidence values for each pixel in the camera frame. The confidence score is derived from the sensor noise and it is not normalized. The higher the value the higher the confidence. Applications can determine what confidence threshold to use based on their use case. Data type is float. |
XR_PIXEL_SENSOR_METADATA_DEPTH_FLAG_BUFFER_ML | Flag bits for each pixel in the depth camera frame. Refer to XrPixelSensorDepthFlagsML for more details. Data type is uint32_t. |
The type of metadata supported depends both on the sensor and how it is configured. Use xrEnumeratePixelSensorMetadataML to get the list of metadata supported by the sensor streams.
xrEnumeratePixelSensorMetadataML must return XR_ERROR_CALL_ORDER_INVALID if xrConfigurePixelSensorCompleteML was not yet called for this XrPixelSensorML.
Some of the metadata (e.g. XR_PIXEL_SENSOR_METADATA_PINHOLE_CAMERA_MODEL_ML) might be the same for all the streams in a given sensor. In such cases applications may choose to configure only one of the streams to return the metadata.
The xrEnumeratePixelSensorMetadataML function is defined as:
// Provided by XR_ML_pixel_sensor
XrResult xrEnumeratePixelSensorMetadataML(
XrPixelSensorML sensor,
uint32_t stream,
uint32_t metadataCapacityInput,
uint32_t* metadataCountOutput,
XrPixelSensorMetadataML* metadatas);
The XrPixelSensorExposureTimeML structure is defined as:
// Provided by XR_ML_pixel_sensor
typedef struct XrPixelSensorExposureTimeML {
XrStructureType type;
void* next;
float exposureTime;
} XrPixelSensorExposureTimeML;
The XrPixelSensorAnalogGainML structure is defined as:
// Provided by XR_ML_pixel_sensor
typedef struct XrPixelSensorAnalogGainML {
XrStructureType type;
void* next;
uint32_t analogGain;
} XrPixelSensorAnalogGainML;
The XrPixelSensorDigitalGainML structure is defined as:
// Provided by XR_ML_pixel_sensor
typedef struct XrPixelSensorDigitalGainML {
XrStructureType type;
void* next;
uint32_t digitalGain;
} XrPixelSensorDigitalGainML;
The XrPixelSensorDepthFrameIlluminationML structure is defined as:
// Provided by XR_ML_pixel_sensor
typedef struct XrPixelSensorDepthFrameIlluminationML {
XrStructureType type;
void* next;
XrPixelSensorDepthFrameIlluminationTypeML illuminationType;
} XrPixelSensorDepthFrameIlluminationML;
// Provided by XR_ML_pixel_sensor
typedef enum XrPixelSensorDepthFrameIlluminationTypeML {
XR_PIXEL_SENSOR_DEPTH_FRAME_ILLUMINATION_TYPE_ON_ML = 0,
XR_PIXEL_SENSOR_DEPTH_FRAME_ILLUMINATION_TYPE_OFF_ML = 1,
XR_PIXEL_SENSOR_DEPTH_FRAME_ILLUMINATION_TYPE_MAX_ENUM_ML = 0x7FFFFFFF
} XrPixelSensorDepthFrameIlluminationTypeML;
Enum | Description |
---|---|
XR_PIXEL_SENSOR_DEPTH_FRAME_ILLUMINATION_TYPE_ON_ML | Depth camera frame projector is on. |
XR_PIXEL_SENSOR_DEPTH_FRAME_ILLUMINATION_TYPE_OFF_ML | Depth camera frame projector is off. |
The XrPixelSensorDepthConfidenceBufferML structure is defined as:
// Provided by XR_ML_pixel_sensor
typedef struct XrPixelSensorDepthConfidenceBufferML {
XrStructureType type;
void* next;
XrPixelSensorFrameML* frame;
} XrPixelSensorDepthConfidenceBufferML;
The XrPixelSensorDepthFlagBufferML structure is defined as:
// Provided by XR_ML_pixel_sensor
typedef struct XrPixelSensorDepthFlagBufferML {
XrStructureType type;
void* next;
XrPixelSensorFrameML* frame;
} XrPixelSensorDepthFlagBufferML;
// Provided by XR_ML_pixel_sensor
typedef XrFlags64 XrPixelSensorDepthFlagsML;
// Provided by XR_ML_pixel_sensor
// Flag bits for XrPixelSensorDepthFlagsML
static const XrPixelSensorDepthFlagsML XR_PIXEL_SENSOR_DEPTH_INVALID_BIT_ML = 0x00000001;
static const XrPixelSensorDepthFlagsML XR_PIXEL_SENSOR_DEPTH_SATURATED_BIT_ML = 0x00000002;
static const XrPixelSensorDepthFlagsML XR_PIXEL_SENSOR_DEPTH_INCONSISTENT_BIT_ML = 0x00000004;
static const XrPixelSensorDepthFlagsML XR_PIXEL_SENSOR_DEPTH_LOW_SIGNAL_BIT_ML = 0x00000008;
static const XrPixelSensorDepthFlagsML XR_PIXEL_SENSOR_DEPTH_FLYING_PIXEL_BIT_ML = 0x00000010;
static const XrPixelSensorDepthFlagsML XR_PIXEL_SENSOR_DEPTH_MASKED_BIT_ML = 0x00000020;
static const XrPixelSensorDepthFlagsML XR_PIXEL_SENSOR_DEPTH_SBI_BIT_ML = 0x00000100;
static const XrPixelSensorDepthFlagsML XR_PIXEL_SENSOR_DEPTH_STRAY_LIGHT_BIT_ML = 0x00000200;
static const XrPixelSensorDepthFlagsML XR_PIXEL_SENSOR_DEPTH_CONNECTED_COMPONENTS_BIT_ML = 0x00000400;
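As an informative illustration, the flag buffer can be combined with the depth frame to reject unusable samples. IsDepthUsable and the particular bits chosen for the mask are illustrative assumptions; the flag plane is assumed to have the same layout as the depth plane, with one uint32_t per pixel as described in the metadata table above:
// Informative sketch: a depth sample is treated as unusable when any of
// the selected per-pixel flag bits is set in the depth flag buffer.
bool IsDepthUsable(const XrPixelSensorFrameML* flagFrame, uint32_t x, uint32_t y) {
    const XrPixelSensorPlaneML& plane = flagFrame->planes[0];
    const char* base = static_cast<const char*>(plane.buffer);
    const uint32_t flags =
        *reinterpret_cast<const uint32_t*>(base + y * plane.stride + x * plane.pixelStride);
    const XrPixelSensorDepthFlagsML rejectMask =
        XR_PIXEL_SENSOR_DEPTH_INVALID_BIT_ML |
        XR_PIXEL_SENSOR_DEPTH_SATURATED_BIT_ML |
        XR_PIXEL_SENSOR_DEPTH_FLYING_PIXEL_BIT_ML;
    return (flags & rejectMask) == 0;
}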
12.108.14. Camera Models
Different cameras may support different camera models. Applications can use xrEnumeratePixelSensorMetadataML to determine what camera models are supported by each one of the sensors.
XrPixelSensorPinholeIntrinsicsML specifies the camera intrinsics and distortion coefficients for a pinhole camera model.
The XrPixelSensorPinholeIntrinsicsML structure is defined as:
// Provided by XR_ML_pixel_sensor
typedef struct XrPixelSensorPinholeIntrinsicsML {
XrStructureType type;
void* next;
XrVector2f focalLength;
XrVector2f principalPoint;
XrVector2f fov;
double distortion[5];
} XrPixelSensorPinholeIntrinsicsML;
XrPixelSensorFisheyeIntrinsicsML specifies the camera matrix and distortion coefficients for Magic Leap's fisheye camera model. This fisheye model differentiates itself from conventional fisheye models by adding an additional tangential term on top of the existing method. Applications can use the intrinsics with the conventional OpenCV fisheye calibration library by dropping the tangential terms (p1 and p2 in the equations below), but this may result in lower accuracy.
Radial distortion coefficients: k1, k2, k3, k4
Tangential distortion coefficients: p1, p2
If P = [x, y, z] is a point in camera coordinates and a = x/z, b = y/z are the corresponding point locations in normalized image coordinates, this model will project and distort said point in the following way:
Conventional fisheye model
r = sqrt(a^2 + b^2)
θ = atan( r )
θ_rad = θ * (1 + k1 * θ^2 + k2 * θ^4 + k3 * θ^6 + k4 * θ^8)
x_rad = a * ( θ_rad / r )
y_rad = b * ( θ_rad / r )
Tangential term (can be omitted if reduced accuracy is acceptable)
r_rad_sq = x_rad^2 + y_rad^2
x_rad_tan = x_rad + 2 * p1 * x_rad * y_rad + p2 * (r_rad_sq + 2 * x_rad^2)
y_rad_tan = y_rad + p1 * (r_rad_sq + 2 * y_rad^2) + 2 * p2 * x_rad * y_rad
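The projection above translates directly into code. The following informative sketch maps a point in camera coordinates to distorted normalized image coordinates; ProjectFisheyeML and the guard for points on the optical axis (r near zero) are illustrative additions. Pixel coordinates are then obtained from the returned normalized coordinates using the focal length and principal point in the usual way.
// Informative sketch of the projection above. k holds the radial
// coefficients k1..k4 and p1, p2 are the tangential coefficients from
// XrPixelSensorFisheyeIntrinsicsML.
#include <cmath>

struct NormalizedPoint { double x, y; };

NormalizedPoint ProjectFisheyeML(double x, double y, double z,
                                 const double k[4], double p1, double p2) {
    const double a = x / z, b = y / z;
    const double r = std::sqrt(a * a + b * b);
    if (r < 1e-12) { return {a, b}; }  // on the optical axis; no distortion
    const double theta = std::atan(r);
    const double t2 = theta * theta;
    const double thetaRad =
        theta * (1 + k[0] * t2 + k[1] * t2 * t2 + k[2] * t2 * t2 * t2 +
                 k[3] * t2 * t2 * t2 * t2);
    const double xRad = a * (thetaRad / r);
    const double yRad = b * (thetaRad / r);
    // Tangential term; drop the p1/p2 contributions to match the
    // conventional OpenCV fisheye model at reduced accuracy.
    const double rRadSq = xRad * xRad + yRad * yRad;
    const double xRadTan = xRad + 2 * p1 * xRad * yRad + p2 * (rRadSq + 2 * xRad * xRad);
    const double yRadTan = yRad + p1 * (rRadSq + 2 * yRad * yRad) + 2 * p2 * xRad * yRad;
    return {xRadTan, yRadTan};
}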
The XrPixelSensorFisheyeIntrinsicsML structure is defined as:
// Provided by XR_ML_pixel_sensor
typedef struct XrPixelSensorFisheyeIntrinsicsML {
XrStructureType type;
void* next;
XrVector2f focalLength;
XrVector2f principalPoint;
XrVector2f fov;
double radialDistortion[4];
double tangentialDistortion[2];
} XrPixelSensorFisheyeIntrinsicsML;
12.108.15. Create Sensor Pose
For certain applications it is useful to have the exact pose of the sensor available. xrCreatePixelSensorSpaceML creates an XrSpace for the sensor. The runtime must return XR_ERROR_PIXEL_SENSOR_SPACE_NOT_SUPPORTED_ML if it cannot provide pose information for the requested sensor. Otherwise, the created space must be valid and must yield valid pose information when using xrLocateSpace.
The xrCreatePixelSensorSpaceML function is defined as:
// Provided by XR_ML_pixel_sensor
XrResult xrCreatePixelSensorSpaceML(
XrSession session,
const XrPixelSensorCreateSpaceInfoML* info,
XrSpace* space);
The XrPixelSensorCreateSpaceInfoML structure is defined as:
// Provided by XR_ML_pixel_sensor
typedef struct XrPixelSensorCreateSpaceInfoML {
XrStructureType type;
const void* next;
XrPixelSensorML sensor;
XrPosef offset;
} XrPixelSensorCreateSpaceInfoML;
12.108.16. Example Sensor Pose
Some applications need to know exactly where a sensor is at a given time. xrCreatePixelSensorSpaceML allows the application to retrieve an XrSpace, which can then be used to get the location of the sensor.
Typically this would be combined with sensor data extraction.
class Sensor {
public:
bool Start() {
XrPath sensorPath;
if (XR_SUCCESS != xrStringToPath(m_Instance, "/pixelsensor/world/center", &sensorPath)) {
    return false;
}
if (!CreateSensor(sensorPath)) { return false; }
XrPixelSensorCreateSpaceInfoML info{XR_TYPE_PIXEL_SENSOR_CREATE_SPACE_INFO_ML};
info.sensor = m_Sensor;
info.offset.orientation.w = 1.0f;
CHK_XR(xrCreatePixelSensorSpaceML(m_Session, &info, &m_SensorSpace));
return true;
}
void OnGameTick(XrTime displayTime) {
if (m_SensorSpace==XR_NULL_HANDLE) {
return;
}
XrSpaceLocation location{XR_TYPE_SPACE_LOCATION};
// Instead of displayTime, XrPixelSensorDataML::captureTime can be used
// to get the pose of the camera at capture time.
CHK_XR(xrLocateSpace(m_SensorSpace, m_LocalSpace, displayTime, &location));
// do something with the pose of the sensor.
}
void Stop() {
// Clean up the space.
xrDestroySpace(m_SensorSpace);
m_SensorSpace = XR_NULL_HANDLE;
// Clean up the sensor.
xrDestroyPixelSensorML(m_Sensor);
m_Sensor = XR_NULL_HANDLE;
}
protected:
// Previously initialized.
XrInstance m_Instance;
XrSession m_Session;
XrSpace m_LocalSpace;
XrPixelSensorML m_Sensor{XR_NULL_HANDLE};
XrSpace m_SensorSpace{XR_NULL_HANDLE};
bool CreateSensor(XrPath sensor) {
// Enumerate all the available sensors.
uint32_t count;
CHK_XR(xrEnumeratePixelSensorsML(m_Session, 0, &count, nullptr ));
std::vector<XrPath> sensors(count);
CHK_XR(xrEnumeratePixelSensorsML(m_Session, count, &count, sensors.data() ));
// Check if the requested sensor is currently available.
if (std::find(sensors.begin(), sensors.end(), sensor) == sensors.end()){
return false;
}
// Create the handle.
XrPixelSensorCreateInfoML createInfo{XR_TYPE_PIXEL_SENSOR_CREATE_INFO_ML, nullptr, sensor};
CHK_XR(xrCreatePixelSensorML(m_Session, &createInfo, &m_Sensor));
return true;
}
};
12.108.17. Example Sensor Configuration
This example builds on the previous example and adds the sensor negotiation to the mix.
class SensorNegotiate : public Sensor {
public:
bool Start() {
XrPath sensorPath;
if (XR_SUCCESS != xrStringToPath(m_Instance, "/pixelsensor/world/center", &sensorPath)) {
    return false;
}
if (!CreateSensor(sensorPath)) { return false; }
if (!NegotiateSensorStreamCapabilities()) { return false; }
return true;
}
void Stop() {
// cleanup the configuration.
DeleteConfigs();
// clean up the sensor.
xrDestroyPixelSensorML(m_Sensor);
m_Sensor = XR_NULL_HANDLE;
}
protected:
static constexpr uint32_t m_StreamCount = 2;
std::array<int, m_StreamCount> m_FrameRate {30, 30};
std::array<XrExtent2Di, m_StreamCount> m_FrameResolution { {{640,480}, {640,480}} };
std::array<XrPixelSensorFrameFormatML, m_StreamCount> m_FrameFormat {XR_PIXEL_SENSOR_FRAME_FORMAT_GRAYSCALE_U8_ML, XR_PIXEL_SENSOR_FRAME_FORMAT_GRAYSCALE_U8_ML};
std::array<float, m_StreamCount> m_ExposureUs {8.5, 25.5};
std::vector<XrPixelSensorCapabilityTypeML> requiredCapabilities {XR_PIXEL_SENSOR_CAPABILITY_TYPE_UPDATE_RATE_ML, XR_PIXEL_SENSOR_CAPABILITY_TYPE_RESOLUTION_ML,
XR_PIXEL_SENSOR_CAPABILITY_TYPE_FORMAT_ML, XR_PIXEL_SENSOR_CAPABILITY_TYPE_MANUAL_EXPOSURE_TIME_ML};
std::vector<XrPixelSensorCapabilityConfigBaseHeaderML*> m_ConfigChain;
void AppendConfig(XrPixelSensorCapabilityConfigBaseHeaderML* config) {
m_ConfigChain.push_back(config);
}
void DeleteConfigs() {
for (auto & config: m_ConfigChain) {
switch ( config->type) {
case XR_TYPE_PIXEL_SENSOR_CAPABILITY_CONFIG_UINT32_ML:
delete reinterpret_cast<XrPixelSensorCapabilityConfigUint32ML*>(config);
break;
case XR_TYPE_PIXEL_SENSOR_CAPABILITY_CONFIG_FLOAT_ML:
delete reinterpret_cast<XrPixelSensorCapabilityConfigFloatML*>(config);
break;
case XR_TYPE_PIXEL_SENSOR_CAPABILITY_CONFIG_XR_EXTENT_2DI_ML:
delete reinterpret_cast<XrPixelSensorCapabilityConfigXrExtent2DiML*>(config);
break;
default:
// If we reach this we missed one of the types.
break;
};
}
m_ConfigChain.clear();
}
XrPixelSensorCapabilityConfigBaseHeaderML *const * Config() const {
return m_ConfigChain.size() > 0 ? &m_ConfigChain[0] : nullptr;
}
bool PickFormat(uint32_t stream, XrPixelSensorFrameFormatML desiredFormat) {
XrPixelSensorCapabilityQueryInfoML queryInfo{XR_TYPE_PIXEL_SENSOR_CAPABILITY_QUERY_INFO_ML};
queryInfo.stream = stream;
queryInfo.capabilityType = XR_PIXEL_SENSOR_CAPABILITY_TYPE_FORMAT_ML;
// Use XrPixelSensorCapabilityML.capabilityRangeType to determine the range type to use here.
XrPixelSensorCapabilityRangeDiscreteUint32ML capValues{XR_TYPE_PIXEL_SENSOR_CAPABILITY_RANGE_DISCRETE_UINT32_ML};
capValues.valueCapacityInput = 0;
capValues.values = nullptr;
CHK_XR(xrQueryPixelSensorCapabilityRangeML(m_Sensor, &queryInfo, m_ConfigChain.size(), Config(), reinterpret_cast<XrPixelSensorCapabilityRangeBaseHeaderML*>(&capValues)));
if (capValues.valueCountOutput==0) {
return false; // No formats available.
}
std::vector<XrPixelSensorFrameFormatML> values(capValues.valueCountOutput);
capValues.valueCapacityInput = capValues.valueCountOutput;
capValues.values = reinterpret_cast<uint32_t*>(values.data());
CHK_XR(xrQueryPixelSensorCapabilityRangeML(m_Sensor, &queryInfo, m_ConfigChain.size(), Config(), reinterpret_cast<XrPixelSensorCapabilityRangeBaseHeaderML*>(&capValues)));
// Check if the required format is supported.
auto it = std::find(values.begin(), values.end(), desiredFormat);
if (it==values.end()) {
return false;
}
// Configure the sensor with the requested format.
XrPixelSensorCapabilityConfigUint32ML * formatConfig = new XrPixelSensorCapabilityConfigUint32ML();
formatConfig->type = XR_TYPE_PIXEL_SENSOR_CAPABILITY_CONFIG_UINT32_ML;
formatConfig->next = nullptr;
formatConfig->stream = queryInfo.stream;
formatConfig->capabilityType = queryInfo.capabilityType;
formatConfig->value = desiredFormat;
AppendConfig(reinterpret_cast<XrPixelSensorCapabilityConfigBaseHeaderML*>(formatConfig));
return true;
}
bool PickResolution(uint32_t stream, XrExtent2Di desiredResolution) {
XrPixelSensorCapabilityQueryInfoML queryInfo{XR_TYPE_PIXEL_SENSOR_CAPABILITY_QUERY_INFO_ML};
queryInfo.stream = stream;
queryInfo.capabilityType = XR_PIXEL_SENSOR_CAPABILITY_TYPE_RESOLUTION_ML;
// Use XrPixelSensorCapabilityML.capabilityRangeType to determine the range type to use here.
XrPixelSensorCapabilityRangeDiscreteXrExtent2DiML capResolution{XR_TYPE_PIXEL_SENSOR_CAPABILITY_RANGE_DISCRETE_XR_EXTENT_2DI_ML};
capResolution.valueCapacityInput = 0;
capResolution.values = nullptr;
CHK_XR(xrQueryPixelSensorCapabilityRangeML(m_Sensor, &queryInfo, m_ConfigChain.size(), Config(), reinterpret_cast<XrPixelSensorCapabilityRangeBaseHeaderML*>(&capResolution)));
if (capResolution.valueCountOutput==0) {
return false; // No resolutions available.
}
std::vector<XrExtent2Di> resolutions(capResolution.valueCountOutput);
capResolution.valueCapacityInput = capResolution.valueCountOutput;
capResolution.values = resolutions.data();
CHK_XR(xrQueryPixelSensorCapabilityRangeML(m_Sensor, &queryInfo, m_ConfigChain.size(), Config(), reinterpret_cast<XrPixelSensorCapabilityRangeBaseHeaderML*>(&capResolution)));
// Check if the required resolution is supported.
bool found = false;
for(const auto& resolution : resolutions) {
if(resolution.width == desiredResolution.width && resolution.height == desiredResolution.height) {
found = true;
break;
}
}
if (found == false) {
// As a fallback if the desired resolution is not available then pick
// the first available resolution. Other heuristics (i.e. closest) can
// be used to pick resolution as well.
desiredResolution = resolutions[0];
}
// Configure the sensor with the resolution.
XrPixelSensorCapabilityConfigXrExtent2DiML * resolutionConfig = new XrPixelSensorCapabilityConfigXrExtent2DiML();
resolutionConfig->type = XR_TYPE_PIXEL_SENSOR_CAPABILITY_CONFIG_XR_EXTENT_2DI_ML;
resolutionConfig->next = nullptr;
resolutionConfig->stream = queryInfo.stream;
resolutionConfig->capabilityType = queryInfo.capabilityType;
resolutionConfig->value = desiredResolution;
AppendConfig(reinterpret_cast<XrPixelSensorCapabilityConfigBaseHeaderML*>(resolutionConfig));
return true;
}
bool PickFrameRate(uint32_t stream, int desiredFrameRate) {
XrPixelSensorCapabilityQueryInfoML queryInfo{XR_TYPE_PIXEL_SENSOR_CAPABILITY_QUERY_INFO_ML};
queryInfo.stream = stream;
queryInfo.capabilityType = XR_PIXEL_SENSOR_CAPABILITY_TYPE_UPDATE_RATE_ML;
// Use XrPixelSensorCapabilityML.capabilityRangeType to determine the range type to use here.
XrPixelSensorCapabilityRangeDiscreteUint32ML capValues{XR_TYPE_PIXEL_SENSOR_CAPABILITY_RANGE_DISCRETE_UINT32_ML};
capValues.valueCapacityInput = 0;
capValues.values = nullptr;
CHK_XR(xrQueryPixelSensorCapabilityRangeML(m_Sensor, &queryInfo, m_ConfigChain.size(), Config(), reinterpret_cast<XrPixelSensorCapabilityRangeBaseHeaderML*>(&capValues)));
if (capValues.valueCountOutput==0) {
return false; // No frame rates available.
}
std::vector<uint32_t> values(capValues.valueCountOutput);
capValues.valueCapacityInput = capValues.valueCountOutput;
capValues.values = values.data();
CHK_XR(xrQueryPixelSensorCapabilityRangeML(m_Sensor, &queryInfo, m_ConfigChain.size(), Config(), reinterpret_cast<XrPixelSensorCapabilityRangeBaseHeaderML*>(&capValues)));
// Check if the required frame rate is supported.
// If not pick the closest available frame rate.
auto it = std::min_element(values.begin(), values.end(),
[desiredFrameRate](int a, int b) {
return std::abs(a - desiredFrameRate) < std::abs(b - desiredFrameRate);
});
if (it==values.end()) {
return false;
}
// Configure the sensor with the requested frame rate.
XrPixelSensorCapabilityConfigUint32ML * frameRateConfig = new XrPixelSensorCapabilityConfigUint32ML();
frameRateConfig->type = XR_TYPE_PIXEL_SENSOR_CAPABILITY_CONFIG_UINT32_ML;
frameRateConfig->next = nullptr;
frameRateConfig->stream = queryInfo.stream;
frameRateConfig->capabilityType = queryInfo.capabilityType;
frameRateConfig->value = *it;
AppendConfig(reinterpret_cast<XrPixelSensorCapabilityConfigBaseHeaderML*>(frameRateConfig));
return true;
}
bool PickExposure(uint32_t stream, float desiredExposureUs) {
XrPixelSensorCapabilityQueryInfoML queryInfo{XR_TYPE_PIXEL_SENSOR_CAPABILITY_QUERY_INFO_ML};
queryInfo.stream = stream;
queryInfo.capabilityType = XR_PIXEL_SENSOR_CAPABILITY_TYPE_MANUAL_EXPOSURE_TIME_ML;
// Use XrPixelSensorCapabilityML.capabilityRangeType to determine the range type to use here.
XrPixelSensorCapabilityRangeContinuousFloatML range{XR_TYPE_PIXEL_SENSOR_CAPABILITY_RANGE_CONTINUOUS_FLOAT_ML};
CHK_XR(xrQueryPixelSensorCapabilityRangeML(m_Sensor, &queryInfo, m_ConfigChain.size(), Config(), reinterpret_cast<XrPixelSensorCapabilityRangeBaseHeaderML*>(&range)));
float selectedExposureUs = desiredExposureUs;
if ( selectedExposureUs < range.minValue ) {
selectedExposureUs = range.minValue;
} else if ( selectedExposureUs > range.maxValue ) {
selectedExposureUs = range.maxValue;
}
// Configure the sensor with the exposure time.
XrPixelSensorCapabilityConfigFloatML * floatConfig = new XrPixelSensorCapabilityConfigFloatML();
floatConfig->type = XR_TYPE_PIXEL_SENSOR_CAPABILITY_CONFIG_FLOAT_ML;
floatConfig->next = nullptr;
floatConfig->stream = queryInfo.stream;
floatConfig->capabilityType = queryInfo.capabilityType;
floatConfig->value = selectedExposureUs;
AppendConfig(reinterpret_cast<XrPixelSensorCapabilityConfigBaseHeaderML*>(floatConfig));
return true;
}
// Pick closest fps, resolution, and exposure times.
bool NegotiateSensorStreamCapabilities() {
if (m_Sensor==XR_NULL_HANDLE){
return false;
}
// Order these capability discovery functions in the way that prioritizes
// capabilities that are important to the use case of the application.
for (uint32_t i=0;i<m_StreamCount;i++) {
uint32_t count;
CHK_XR(xrEnumeratePixelSensorCapabilitiesML(m_Sensor, i, 0, &count, nullptr));
std::vector<XrPixelSensorCapabilityML> availableCapabilities(count);
CHK_XR(xrEnumeratePixelSensorCapabilitiesML(m_Sensor, i, count, &count, availableCapabilities.data()));
// Check if all the required capabilities are present.
for(auto& capabilityType : requiredCapabilities) {
bool found = false;
for(auto& availableCapability : availableCapabilities) {
if(availableCapability.capabilityType == capabilityType) {
found = true;
break;
}
}
if(found == false) {
return false;
}
}
if (!PickFormat(i, m_FrameFormat[i])) {
return false;
}
if (!PickResolution(i, m_FrameResolution[i])) {
return false;
}
if (!PickFrameRate(i, m_FrameRate[i])) {
return false;
}
if (!PickExposure(i, m_ExposureUs[i])) {
return false;
}
}
return true;
}
};
12.108.18. Example Frame Data
This last example shows how image data can be captured after the sensor has been setup as in the previous examples.
class SensorStreaming : public SensorNegotiate {
XrFutureEXT m_ConfigFuture{XR_NULL_FUTURE_EXT};
XrFutureEXT m_StartFuture{XR_NULL_FUTURE_EXT};
XrFutureEXT m_StopFuture{XR_NULL_FUTURE_EXT};
enum State { IDLE, CONFIGURING, STARTING, STREAMING, STOPPING };
State m_State{IDLE};
struct StreamInfo {
XrPixelSensorBufferML buffer{XR_TYPE_PIXEL_SENSOR_BUFFER_ML};
XrPixelSensorFrameML* frame{nullptr};
float exposureMs{0};
XrPixelSensorFisheyeIntrinsicsML intrinsics;
XrTime lastCaptureTime{0};
};
std::vector<StreamInfo> m_StreamInfo;
public:
bool Start() {
XrPath sensorPath;
if (XR_SUCCESS != xrStringToPath(m_Instance, "/pixelsensor/world/center", &sensorPath)) {
    return false;
}
if (!CreateSensor(sensorPath)) { return false; }
if (!NegotiateSensorStreamCapabilities()) {
return false;
}
if (!ConfigureStreams()) {
return false;
}
return true;
}
void OnGameTick() {
// Check for data.
switch (m_State) {
case IDLE:
break;
case CONFIGURING:
CheckStreamConfig();
break;
case STARTING:
CheckStartSensor();
break;
case STREAMING:
HandleData();
break;
case STOPPING:
CheckStopSensor();
break;
}
}
bool Stop() {
m_State = IDLE;
std::vector<uint32_t> streams(m_StreamCount);
std::iota(streams.begin(), streams.end(), 0);
XrPixelSensorStopInfoML stopInfo {XR_TYPE_PIXEL_SENSOR_STOP_INFO_ML, nullptr, m_StreamCount, streams.data()};
bool ok = xrStopPixelSensorAsyncML(m_Sensor, &stopInfo, &m_StopFuture) == XR_SUCCESS;
if ( ok ) {
m_State = STOPPING;
return true;
}
// Log error
return false;
}
private:
bool ConfigureStreams() {
if ( m_State != IDLE ) {
return false;
}
// Notice that we pass the configuration chain that was built in the
// 2nd example above to the configure function.
std::vector<uint32_t> streams(m_StreamCount);
std::iota(streams.begin(), streams.end(), 0);
XrPixelSensorConfigInfoML configInfo {XR_TYPE_PIXEL_SENSOR_CONFIG_INFO_ML};
configInfo.streamCount = streams.size();
configInfo.streams = streams.data();
configInfo.configurationCount = m_ConfigChain.size();
configInfo.configurations = Config();
bool ok = xrConfigurePixelSensorAsyncML(m_Sensor, &configInfo, &m_ConfigFuture) == XR_SUCCESS;
if ( ok ) {
m_State = CONFIGURING;
return true;
}
// Log an error
return false;
}
// Since configuring camera sensors can take significant time
// the XR_EXT_future pattern is used here.
void CheckStreamConfig() {
XrFuturePollResultEXT futureResult{XR_TYPE_FUTURE_POLL_RESULT_EXT};
XrFuturePollInfoEXT pollInfo{XR_TYPE_FUTURE_POLL_INFO_EXT};
pollInfo.future = m_ConfigFuture;
CHK_XR(xrPollFutureEXT(m_Instance, &pollInfo, &futureResult));
switch (futureResult.state) {
case XR_FUTURE_STATE_PENDING_EXT:
break;
case XR_FUTURE_STATE_READY_EXT: {
XrPixelSensorConfigureCompletionML completion{XR_TYPE_PIXEL_SENSOR_CONFIGURE_COMPLETION_ML};
CHK_XR(xrConfigurePixelSensorCompleteML(m_Sensor, m_ConfigFuture, &completion));
if (completion.futureResult==XR_SUCCESS) {
if (SetupStreamInfo() == false) {
    // Log an error.
    m_State = IDLE;
    return;
}
// Configuration successful, start the sensor streams.
std::vector<uint32_t> streams(m_StreamCount);
std::iota(streams.begin(), streams.end(), 0);
XrPixelSensorStartInfoML startInfo {XR_TYPE_PIXEL_SENSOR_START_INFO_ML, nullptr, m_StreamCount, streams.data()};
bool ok = xrStartPixelSensorAsyncML(m_Sensor, &startInfo, &m_StartFuture) == XR_SUCCESS;
if ( ok ) {
m_State = STARTING;
return;
} else {
// Log an error.
m_State = IDLE;
}
} else {
// Log an error.
m_State = IDLE;
}
}
}
}
void CheckStartSensor() {
XrFuturePollResultEXT futureResult{XR_TYPE_FUTURE_POLL_RESULT_EXT};
XrFuturePollInfoEXT pollInfo{XR_TYPE_FUTURE_POLL_INFO_EXT};
pollInfo.future = m_StartFuture;
CHK_XR(xrPollFutureEXT(m_Instance, &pollInfo, &futureResult));
switch (futureResult.state) {
case XR_FUTURE_STATE_PENDING_EXT:
break;
case XR_FUTURE_STATE_READY_EXT: {
XrPixelSensorStartCompletionML completion{XR_TYPE_PIXEL_SENSOR_START_COMPLETION_ML};
CHK_XR(xrStartPixelSensorCompleteML(m_Sensor, m_StartFuture, &completion));
if (completion.futureResult==XR_SUCCESS) {
m_State = STREAMING;
} else {
// Log an error.
m_State = IDLE;
}
}
}
}
void CheckStopSensor() {
XrFuturePollResultEXT futureResult{XR_TYPE_FUTURE_POLL_RESULT_EXT};
XrFuturePollInfoEXT pollInfo{XR_TYPE_FUTURE_POLL_INFO_EXT};
pollInfo.future = m_StopFuture;
CHK_XR(xrPollFutureEXT(m_Instance, &pollInfo, &futureResult));
switch (futureResult.state) {
case XR_FUTURE_STATE_PENDING_EXT:
break;
case XR_FUTURE_STATE_READY_EXT: {
XrPixelSensorStopCompletionML completion{XR_TYPE_PIXEL_SENSOR_STOP_COMPLETION_ML};
CHK_XR(xrStopPixelSensorCompleteML(m_Sensor, m_StopFuture, &completion));
if (completion.futureResult==XR_SUCCESS) {
ClearStreamInfo();
if (m_Sensor!=XR_NULL_HANDLE) {
xrDestroyPixelSensorML(m_Sensor);
m_Sensor = XR_NULL_HANDLE;
}
DeleteConfigs();
m_State = IDLE;
} else {
// Log an error.
m_State = IDLE;
}
}
}
}
void ClearStreamInfo() {
    for (auto & streamInfo: m_StreamInfo) {
        delete[] static_cast<char*>(streamInfo.buffer.buffer);
        // frame points into the buffer released above; it is not a
        // separate allocation, so it must not be deleted.
        streamInfo.frame = nullptr;
    }
    m_StreamInfo.clear();
}
bool SetupStreamInfo() {
ClearStreamInfo();
m_StreamInfo.resize(m_StreamCount);
std::vector<XrPixelSensorMetadataML> requiredMetadatas {XR_PIXEL_SENSOR_METADATA_EXPOSURE_TIME_ML, XR_PIXEL_SENSOR_METADATA_FISHEYE_CAMERA_MODEL_ML};
for (uint32_t i=0;i<m_StreamCount;i++) {
uint32_t count;
CHK_XR(xrEnumeratePixelSensorMetadataML(m_Sensor, i, 0, &count, nullptr));
std::vector<XrPixelSensorMetadataML> availableMetadatas(count);
CHK_XR(xrEnumeratePixelSensorMetadataML(m_Sensor, i, count, &count, availableMetadatas.data()));
// Check if all the required metadata is present.
bool allMetatadataPresent = std::all_of(requiredMetadatas.begin(), requiredMetadatas.end(),
[&availableMetadatas](XrPixelSensorMetadataML metadata) {
return std::find(availableMetadatas.begin(), availableMetadatas.end(), metadata) != availableMetadatas.end();
});
if(!allMetatadataPresent) {
return false;
}
XrPixelSensorBufferPropertiesInfoML propertiesInfo{XR_TYPE_PIXEL_SENSOR_BUFFER_PROPERTIES_INFO_ML, nullptr, i,
static_cast<uint32_t>(requiredMetadatas.size()), requiredMetadatas.data()};
XrPixelSensorBufferPropertiesML properties{XR_TYPE_PIXEL_SENSOR_BUFFER_PROPERTIES_ML};
CHK_XR(xrGetPixelSensorBufferPropertiesML(m_Sensor, &propertiesInfo, &properties));
m_StreamInfo[i].lastCaptureTime = 0;
m_StreamInfo[i].buffer.buffer = new char[properties.bufferSize];
m_StreamInfo[i].buffer.bufferSize = properties.bufferSize;
}
return true;
}
void HandleData() {
XrPixelSensorDataGetInfoML info{XR_TYPE_PIXEL_SENSOR_DATA_GET_INFO_ML};
XrPixelSensorDataML sensorData{XR_TYPE_PIXEL_SENSOR_DATA_ML};
XrPixelSensorExposureTimeML exposureMetadata{XR_TYPE_PIXEL_SENSOR_EXPOSURE_TIME_ML};
XrPixelSensorFisheyeIntrinsicsML fisheyeMetadata{XR_TYPE_PIXEL_SENSOR_FISHEYE_INTRINSICS_ML};
sensorData.next = &exposureMetadata;
exposureMetadata.next = &fisheyeMetadata;
for (uint32_t i=0;i<m_StreamCount;i++) {
info.stream = i;
info.lastCaptureTime = m_StreamInfo[i].lastCaptureTime;
info.timeout = 0;
switch(xrGetPixelSensorDataML(m_Sensor, &info, &m_StreamInfo[i].buffer, &sensorData)) {
case XR_SUCCESS:
// ***************
// DATA AVAILABLE!
// ***************
m_StreamInfo[i].lastCaptureTime = sensorData.captureTime;
m_StreamInfo[i].frame = sensorData.frame;
m_StreamInfo[i].exposureMs = exposureMetadata.exposureTime;
m_StreamInfo[i].intrinsics = fisheyeMetadata;
break;
case XR_TIMEOUT_EXPIRED:
    // Nothing yet.
    break;
default:
    // Log an error.
    break;
}
}
}
};
12.108.20. New Enums
XrResult enumeration is extended with:
XR_ERROR_PIXEL_SENSOR_PERMISSION_DENIED_ML
XR_ERROR_PIXEL_SENSOR_NOT_SUPPORTED_ML
XR_ERROR_PIXEL_SENSOR_CAPABILITY_NOT_SUPPORTED_ML
XR_ERROR_PIXEL_SENSOR_SPACE_NOT_SUPPORTED_ML
XrStructureType enumeration is extended with:
XR_TYPE_PIXEL_SENSOR_CREATE_INFO_ML
XR_TYPE_EVENT_DATA_PIXEL_SENSOR_AVAILABILITY_CHANGED_ML
XR_TYPE_PIXEL_SENSOR_CAPABILITY_ML
XR_TYPE_PIXEL_SENSOR_CAPABILITY_QUERY_INFO_ML
XR_TYPE_PIXEL_SENSOR_CAPABILITY_CONFIG_UINT32_ML
XR_TYPE_PIXEL_SENSOR_CAPABILITY_CONFIG_FLOAT_ML
XR_TYPE_PIXEL_SENSOR_CAPABILITY_CONFIG_XR_EXTENT_2DI_ML
XR_TYPE_PIXEL_SENSOR_CAPABILITY_RANGE_CONTINUOUS_FLOAT_ML
XR_TYPE_PIXEL_SENSOR_CAPABILITY_RANGE_CONTINUOUS_UINT32_ML
XR_TYPE_PIXEL_SENSOR_CAPABILITY_RANGE_DISCRETE_XR_BOOL32_ML
XR_TYPE_PIXEL_SENSOR_CAPABILITY_RANGE_DISCRETE_XR_EXTENT_2DI_ML
XR_TYPE_PIXEL_SENSOR_CAPABILITY_RANGE_DISCRETE_UINT32_ML
XR_TYPE_PIXEL_SENSOR_EXPOSURE_TIME_ML
XR_TYPE_PIXEL_SENSOR_PINHOLE_INTRINSICS_ML
XR_TYPE_PIXEL_SENSOR_FISHEYE_INTRINSICS_ML
XR_TYPE_PIXEL_SENSOR_DEPTH_FRAME_ILLUMINATION_ML
XR_TYPE_PIXEL_SENSOR_DEPTH_CONFIDENCE_BUFFER_ML
XR_TYPE_PIXEL_SENSOR_DEPTH_FLAG_BUFFER_ML