XR_ML_physical_world_occlusion
This API is an experimental extension that is not included in the official OpenXR registry and is subject to change.
Dynamic Occlusion
The XR_ML_physical_world_occlusion extension allows applications to specify occlusion settings on a per-layer basis in OpenXR. This extension provides the ability to occlude virtual content with real-world objects detected by different sources, improving the sense of presence and realism.
User Experience
If enabled, the selected sources will automatically occlude virtual objects behind them from the user's view. This also applies to the visibility of the segmented dimmer.
For example, if an application has controller occlusion enabled and the controller is held in front of a virtual object, neither the virtual object nor the segmented dimmer around it will be visible to the user in the areas occluded by the controller.
Occlusion Sources
There are four occlusion sources available to developers: Controller, Hands, Environment, and Depth. Note that the Depth and Environment sources both rely on the depth sensor. Each source has a corresponding XrStructureType value, listed below; see the API section for the full extension definition, and a short declaration sketch after this list:
- Controller: XR_TYPE_PHYSICAL_WORLD_OCCLUSION_SOURCE_CONTROLLER_ML
- Hands: XR_TYPE_PHYSICAL_WORLD_OCCLUSION_SOURCE_HANDS_ML
- Environment: XR_TYPE_PHYSICAL_WORLD_OCCLUSION_SOURCE_ENVIRONMENT_ML
- Depth: XR_TYPE_PHYSICAL_WORLD_OCCLUSION_SOURCE_DEPTH_SENSOR_ML
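As a quick orientation, here is a minimal sketch declaring one structure per source; the struct and enum names are taken from the API section below, and nothing else is assumed:
// Each occlusion source is a small structure identified by its XrStructureType.
XrPhysicalWorldOcclusionSourceControllerML controllerSource{
    XR_TYPE_PHYSICAL_WORLD_OCCLUSION_SOURCE_CONTROLLER_ML};
XrPhysicalWorldOcclusionSourceHandsML handsSource{
    XR_TYPE_PHYSICAL_WORLD_OCCLUSION_SOURCE_HANDS_ML};
XrPhysicalWorldOcclusionSourceEnvironmentML environmentSource{
    XR_TYPE_PHYSICAL_WORLD_OCCLUSION_SOURCE_ENVIRONMENT_ML};
XrPhysicalWorldOcclusionSourceDepthSensorML depthSource{
    XR_TYPE_PHYSICAL_WORLD_OCCLUSION_SOURCE_DEPTH_SENSOR_ML};
// The depth sensor source additionally carries nearRange/farRange fields
// (see Best Practices and the API section below).
Each structure is chained into a composition layer at xrEndFrame time, as shown in the example render loop at the end of this page.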
Limitations
Enabling this feature leaves apps with less GPU time per frame compared to not enabling it, and apps should budget for the additional cost.
Both the depth and environment occlusion sources rely on the depth sensor. Because environment occlusion uses the meshing system and relies on the long-range mode, if both depth and environment occlusion are enabled, the far depth range is set to a minimum of 0.91 m so that environment occlusion can function.
Switching between sources or changing depth ranges can cause very noticeable delays, depending on the source:
- Controller -- no delay
- Hand tracking -- medium delay
- Depth sensor -- large delay
- Environment -- large delay
Currently, enabling the first occlusion source incurs an additional startup delay while the occlusion feature initializes, regardless of which source it is.
While there is no direct frame-rate drop-off at the 0.91 m depth cutoff, any increase in depth range increases the amount of geometry being rendered and has some performance impact.
Best Practices
Developers can request a specific range for the depth sensor when using depth occlusion. A far range of <= 0.9 m means the short-range 30 Hz depth sensor mode is active, while a value above 0.9 m means the long-range 5 Hz mode is active.
Developers who wish to use the depth sensor source at 30 Hz should use the following configuration (a sketch follows the list):
- All layers that use depth sensor occlusion should have a far range of less than or equal to 0.9 m
- No layers should use environment occlusion
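For illustration, here is a minimal sketch of that configuration, assuming the XrPhysicalWorldOcclusionSourceDepthSensorML definition from the API section below; the near range value is an arbitrary illustrative choice:
// Depth source configured so the short-range 30 Hz depth sensor mode stays active.
XrPhysicalWorldOcclusionSourceDepthSensorML depthSource{
    XR_TYPE_PHYSICAL_WORLD_OCCLUSION_SOURCE_DEPTH_SENSOR_ML};
depthSource.nearRange = 0.25f; // meters; illustrative value
depthSource.farRange = 0.9f;   // <= 0.9 m keeps the 30 Hz short-range mode
// In addition, no layer submitted this frame should chain an
// XrPhysicalWorldOcclusionSourceEnvironmentML source, since environment
// occlusion forces the far range up to at least 0.91 m.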
It is recommended to use depth occlusion for moving real-world objects, such as handheld objects, pets, and other people, and environment occlusion for static surroundings.
API
12.108. XR_ML_physical_world_occlusion
- Name String
XR_ML_physical_world_occlusion
- Extension Type
Instance extension
- Registered Extension Number
479
- Revision
1
- Extension and Version Dependencies
- Last Modified Date
2023-08-16
- Contributors
Andrei Aristarkhov, Magic Leap
Ron Bessems, Magic Leap
Overview
The XR_ML_physical_world_occlusion extension allows applications to specify occlusion settings on a per-layer basis in OpenXR. This extension provides the ability to occlude virtual content with real-world objects detected by different sources, improving the sense of presence and realism.
New Enum Constants
The XrStructureType enumeration is extended with:
XR_TYPE_COMPOSITION_LAYER_PHYSICAL_WORLD_OCCLUSION_ML
XR_TYPE_PHYSICAL_WORLD_OCCLUSION_SOURCE_CONTROLLER_ML
XR_TYPE_PHYSICAL_WORLD_OCCLUSION_SOURCE_HANDS_ML
XR_TYPE_PHYSICAL_WORLD_OCCLUSION_SOURCE_DEPTH_SENSOR_ML
XR_TYPE_PHYSICAL_WORLD_OCCLUSION_SOURCE_ENVIRONMENT_ML
XR_TYPE_VIEW_CONFIGURATION_OCCLUSION_CONTROLLER_PROPERTIES_ML
XR_TYPE_VIEW_CONFIGURATION_OCCLUSION_HANDS_PROPERTIES_ML
XR_TYPE_VIEW_CONFIGURATION_OCCLUSION_DEPTH_SENSOR_PROPERTIES_ML
XR_TYPE_VIEW_CONFIGURATION_OCCLUSION_ENVIRONMENT_PROPERTIES_ML
New Structures
To activate automatic occlusion, the application must submit the occlusion configuration each frame via the xrEndFrame function, chaining XrCompositionLayerPhysicalWorldOcclusionML structures to the next pointers of the corresponding XrCompositionLayerBaseHeader structures.
For each composition layer specified in XrFrameEndInfo::layers the application may provide a corresponding XrCompositionLayerPhysicalWorldOcclusionML structure containing zero or more occlusion sources.
To enable any occlusion sources for a projection layer the application must also submit depth layer information using the XrCompositionLayerDepthInfoKHR structure in accordance with the XR_KHR_composition_layer_depth extension.
The XrCompositionLayerPhysicalWorldOcclusionML structure is defined as:
// Provided by XR_ML_physical_world_occlusion
typedef struct XrCompositionLayerPhysicalWorldOcclusionML {
XrStructureType type;
const void* next;
uint32_t occlusionSourceCount;
const XrPhysicalWorldOcclusionSourceBaseHeaderML* const* occlusionSources;
} XrCompositionLayerPhysicalWorldOcclusionML;
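To make the chaining concrete, here is a minimal sketch for a single projection layer. It uses the hands source defined later in this section together with XrCompositionLayerDepthInfoKHR from XR_KHR_composition_layer_depth; the layer, views, and depth swapchain are assumed to be set up elsewhere:
XrCompositionLayerProjection projectionLayer;        // previously initialized
XrCompositionLayerProjectionView projectionViews[2]; // previously initialized, referenced by projectionLayer.views
XrPhysicalWorldOcclusionSourceHandsML handsSource{
    XR_TYPE_PHYSICAL_WORLD_OCCLUSION_SOURCE_HANDS_ML};
const XrPhysicalWorldOcclusionSourceBaseHeaderML* sources[] = {
    reinterpret_cast<const XrPhysicalWorldOcclusionSourceBaseHeaderML*>(&handsSource)};
XrCompositionLayerPhysicalWorldOcclusionML occlusionInfo{
    XR_TYPE_COMPOSITION_LAYER_PHYSICAL_WORLD_OCCLUSION_ML};
occlusionInfo.occlusionSourceCount = 1;
occlusionInfo.occlusionSources = sources;
projectionLayer.next = &occlusionInfo;  // per-layer occlusion settings
// Because occlusion is enabled on a projection layer, depth information is
// also required: chain an XrCompositionLayerDepthInfoKHR to each view.
XrCompositionLayerDepthInfoKHR depthInfo{XR_TYPE_COMPOSITION_LAYER_DEPTH_INFO_KHR};
// depthInfo.subImage, minDepth/maxDepth, and nearZ/farZ are filled from the
// app's depth swapchain and projection parameters (not shown).
projectionViews[0].next = &depthInfo;   // repeat for each view with its own depth info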
The XrPhysicalWorldOcclusionSourceBaseHeaderML structure is defined as:
// Provided by XR_ML_physical_world_occlusion
typedef struct XrPhysicalWorldOcclusionSourceBaseHeaderML {
XrStructureType type;
const void* next;
} XrPhysicalWorldOcclusionSourceBaseHeaderML;
The XrPhysicalWorldOcclusionSourceControllerML structure is defined as:
// Provided by XR_ML_physical_world_occlusion
typedef struct XrPhysicalWorldOcclusionSourceControllerML {
XrStructureType type;
const void* next;
} XrPhysicalWorldOcclusionSourceControllerML;
An application can check whether this occlusion source is supported for a given view configuration by extending the XrViewConfigurationProperties with an XrViewConfigurationOcclusionControllerPropertiesML structure when calling xrGetViewConfigurationProperties.
The XrViewConfigurationOcclusionControllerPropertiesML structure is defined as:
// Provided by XR_ML_physical_world_occlusion
typedef struct XrViewConfigurationOcclusionControllerPropertiesML {
XrStructureType type;
const void* next;
XrBool32 supportsOcclusion;
} XrViewConfigurationOcclusionControllerPropertiesML;
When calling xrGetViewConfigurationProperties, the application can provide a pointer to an XrViewConfigurationOcclusionControllerPropertiesML structure in the next chain of XrViewConfigurationProperties.
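For example, a minimal query sketch; instance and systemId are assumed to have been created elsewhere, and the view configuration type is an illustrative choice:
XrViewConfigurationOcclusionControllerPropertiesML controllerProps{
    XR_TYPE_VIEW_CONFIGURATION_OCCLUSION_CONTROLLER_PROPERTIES_ML};
XrViewConfigurationProperties viewConfigProps{XR_TYPE_VIEW_CONFIGURATION_PROPERTIES};
viewConfigProps.next = &controllerProps;  // chain the query structure
xrGetViewConfigurationProperties(instance, systemId,
    XR_VIEW_CONFIGURATION_TYPE_PRIMARY_STEREO, &viewConfigProps);
if (controllerProps.supportsOcclusion == XR_TRUE) {
    // Controller occlusion may be requested on composition layers.
}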
The XrPhysicalWorldOcclusionSourceHandsML structure is defined as:
// Provided by XR_ML_physical_world_occlusion
typedef struct XrPhysicalWorldOcclusionSourceHandsML {
XrStructureType type;
const void* next;
} XrPhysicalWorldOcclusionSourceHandsML;
An application can check whether this occlusion source is supported for a given view configuration by extending the XrViewConfigurationProperties with an XrViewConfigurationOcclusionHandsPropertiesML structure when calling xrGetViewConfigurationProperties.
The XrViewConfigurationOcclusionHandsPropertiesML structure is defined as:
// Provided by XR_ML_physical_world_occlusion
typedef struct XrViewConfigurationOcclusionHandsPropertiesML {
XrStructureType type;
const void* next;
XrBool32 supportsOcclusion;
} XrViewConfigurationOcclusionHandsPropertiesML;
When calling xrGetViewConfigurationProperties, the application can provide a pointer to an XrViewConfigurationOcclusionHandsPropertiesML structure in the next chain of XrViewConfigurationProperties.
The XrPhysicalWorldOcclusionSourceDepthSensorML structure is defined as:
// Provided by XR_ML_physical_world_occlusion
typedef struct XrPhysicalWorldOcclusionSourceDepthSensorML {
XrStructureType type;
const void* next;
float nearRange;
float farRange;
} XrPhysicalWorldOcclusionSourceDepthSensorML;
An application can check whether this occlusion source is supported for a given view configuration, and inspect the limits for the nearRange and farRange parameters, by extending the XrViewConfigurationProperties with an XrViewConfigurationOcclusionDepthSensorPropertiesML structure when calling xrGetViewConfigurationProperties.
The XrViewConfigurationOcclusionDepthSensorPropertiesML structure is defined as:
// Provided by XR_ML_physical_world_occlusion
typedef struct XrViewConfigurationOcclusionDepthSensorPropertiesML {
XrStructureType type;
const void* next;
XrBool32 supportsOcclusion;
float minNearRange;
float maxNearRange;
float minFarRange;
float maxFarRange;
} XrViewConfigurationOcclusionDepthSensorPropertiesML;
When calling xrGetViewConfigurationProperties, the application can provide a pointer to an XrViewConfigurationOcclusionDepthSensorPropertiesML structure in the next chain of XrViewConfigurationProperties.
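As a sketch, an application might query these limits and clamp its requested ranges to them (instance and systemId assumed as before; the 0.25 m/0.9 m requests are illustrative):
XrViewConfigurationOcclusionDepthSensorPropertiesML depthProps{
    XR_TYPE_VIEW_CONFIGURATION_OCCLUSION_DEPTH_SENSOR_PROPERTIES_ML};
XrViewConfigurationProperties viewConfigProps{XR_TYPE_VIEW_CONFIGURATION_PROPERTIES};
viewConfigProps.next = &depthProps;
xrGetViewConfigurationProperties(instance, systemId,
    XR_VIEW_CONFIGURATION_TYPE_PRIMARY_STEREO, &viewConfigProps);
if (depthProps.supportsOcclusion == XR_TRUE) {
    XrPhysicalWorldOcclusionSourceDepthSensorML depthSource{
        XR_TYPE_PHYSICAL_WORLD_OCCLUSION_SOURCE_DEPTH_SENSOR_ML};
    // std::clamp is from <algorithm> (C++17).
    depthSource.nearRange = std::clamp(0.25f, depthProps.minNearRange, depthProps.maxNearRange);
    depthSource.farRange  = std::clamp(0.9f, depthProps.minFarRange, depthProps.maxFarRange);
}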
The XrPhysicalWorldOcclusionSourceEnvironmentML structure is defined as:
// Provided by XR_ML_physical_world_occlusion
typedef struct XrPhysicalWorldOcclusionSourceEnvironmentML {
XrStructureType type;
const void* next;
} XrPhysicalWorldOcclusionSourceEnvironmentML;
An application can check whether this occlusion source is supported for a given view configuration by extending the XrViewConfigurationProperties with an XrViewConfigurationOcclusionEnvironmentPropertiesML structure when calling xrGetViewConfigurationProperties.
The XrViewConfigurationOcclusionEnvironmentPropertiesML structure is defined as:
// Provided by XR_ML_physical_world_occlusion
typedef struct XrViewConfigurationOcclusionEnvironmentPropertiesML {
XrStructureType type;
const void* next;
XrBool32 supportsOcclusion;
} XrViewConfigurationOcclusionEnvironmentPropertiesML;
When calling xrGetViewConfigurationProperties, the application can provide a pointer to an XrViewConfigurationOcclusionEnvironmentPropertiesML structure in the next chain of XrViewConfigurationProperties.
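Because each of these property structures extends XrViewConfigurationProperties in the same way, an application interested in several sources can chain them into a single query; a sketch, with instance and systemId assumed as before:
XrViewConfigurationOcclusionControllerPropertiesML controllerProps{
    XR_TYPE_VIEW_CONFIGURATION_OCCLUSION_CONTROLLER_PROPERTIES_ML};
XrViewConfigurationOcclusionHandsPropertiesML handsProps{
    XR_TYPE_VIEW_CONFIGURATION_OCCLUSION_HANDS_PROPERTIES_ML};
XrViewConfigurationOcclusionDepthSensorPropertiesML depthProps{
    XR_TYPE_VIEW_CONFIGURATION_OCCLUSION_DEPTH_SENSOR_PROPERTIES_ML};
XrViewConfigurationOcclusionEnvironmentPropertiesML envProps{
    XR_TYPE_VIEW_CONFIGURATION_OCCLUSION_ENVIRONMENT_PROPERTIES_ML};
// Build one next chain: viewConfigProps -> controller -> hands -> depth -> environment.
controllerProps.next = &handsProps;
handsProps.next = &depthProps;
depthProps.next = &envProps;
XrViewConfigurationProperties viewConfigProps{XR_TYPE_VIEW_CONFIGURATION_PROPERTIES};
viewConfigProps.next = &controllerProps;
xrGetViewConfigurationProperties(instance, systemId,
    XR_VIEW_CONFIGURATION_TYPE_PRIMARY_STEREO, &viewConfigProps);
// Each supportsOcclusion field (and the depth range limits) is now populated.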
12.108.1. Example render loop code
The following example code demonstrates how to use the extension.
XrSession session; // previously initialized
XrCompositionLayerProjection projectionLayer; // previously initialized
XrCompositionLayerQuad quadLayer; // previously initialized
// The app renders two layers: the projection layer uses both environment and
// hands occlusion, while the quad layer uses hands occlusion only.
while (1) {
    XrFrameWaitInfo frameWaitInfo{XR_TYPE_FRAME_WAIT_INFO};
    XrFrameState frameState{XR_TYPE_FRAME_STATE};
    xrWaitFrame(session, &frameWaitInfo, &frameState);

    XrFrameBeginInfo frameBeginInfo{XR_TYPE_FRAME_BEGIN_INFO};
    xrBeginFrame(session, &frameBeginInfo);

    std::vector<const XrCompositionLayerBaseHeader*> layers;

    // Enable projection layer occlusion by hands and the static environment
    XrPhysicalWorldOcclusionSourceHandsML occlusionSourceHands{
        XR_TYPE_PHYSICAL_WORLD_OCCLUSION_SOURCE_HANDS_ML};
    XrPhysicalWorldOcclusionSourceEnvironmentML occlusionSourceEnv{
        XR_TYPE_PHYSICAL_WORLD_OCCLUSION_SOURCE_ENVIRONMENT_ML};
    XrPhysicalWorldOcclusionSourceBaseHeaderML* occlusionSourcesProjection[] = {
        reinterpret_cast<XrPhysicalWorldOcclusionSourceBaseHeaderML*>(&occlusionSourceHands),
        reinterpret_cast<XrPhysicalWorldOcclusionSourceBaseHeaderML*>(&occlusionSourceEnv)
    };
    XrCompositionLayerPhysicalWorldOcclusionML projectionLayerOcclusionInfo{
        XR_TYPE_COMPOSITION_LAYER_PHYSICAL_WORLD_OCCLUSION_ML};
    projectionLayerOcclusionInfo.occlusionSourceCount = 2;
    projectionLayerOcclusionInfo.occlusionSources = occlusionSourcesProjection;
    projectionLayer.next = &projectionLayerOcclusionInfo;
    layers.push_back(reinterpret_cast<XrCompositionLayerBaseHeader*>(&projectionLayer));

    // Enable quad layer occlusion by hands only
    XrPhysicalWorldOcclusionSourceBaseHeaderML* occlusionSourcesQuad[] = {
        reinterpret_cast<XrPhysicalWorldOcclusionSourceBaseHeaderML*>(&occlusionSourceHands)
    };
    XrCompositionLayerPhysicalWorldOcclusionML quadLayerOcclusionInfo{
        XR_TYPE_COMPOSITION_LAYER_PHYSICAL_WORLD_OCCLUSION_ML};
    quadLayerOcclusionInfo.occlusionSourceCount = 1;
    quadLayerOcclusionInfo.occlusionSources = occlusionSourcesQuad;
    quadLayer.next = &quadLayerOcclusionInfo;
    layers.push_back(reinterpret_cast<XrCompositionLayerBaseHeader*>(&quadLayer));

    XrFrameEndInfo frameEndInfo{XR_TYPE_FRAME_END_INFO};
    frameEndInfo.displayTime = frameState.predictedDisplayTime;
    frameEndInfo.environmentBlendMode = XR_ENVIRONMENT_BLEND_MODE_ALPHA_BLEND;
    frameEndInfo.layerCount = static_cast<uint32_t>(layers.size());
    frameEndInfo.layers = layers.data();
    xrEndFrame(session, &frameEndInfo);
}
Version History
Revision 1, 2023-08-16 (Andrei Aristarkhov)
Initial extension description