Mixed Reality Capture Tips
This document provides tips for ensuring proper composition of virtual and real-world content when using mixed reality capture.
Ensuring Accurate Alpha Values
Mixed Reality Capture requires accurate alpha values to correctly composite virtual content over the real world. If post-processing effects or HDR capture are enabled, Unity may clear the alpha values and set them to fully opaque, causing the mixed reality capture or stream to display virtual content on a black background instead of blending it with the physical world. To avoid this issue, follow the steps below to disable HDR and post-processing effects in your project.
Disabling HDR and Post-Processing Effects
For Universal Render Pipeline (URP) Projects
Disable HDR:
- Go to Edit > Project Settings.
- Select Graphics from the left-hand menu.
- Under the Scriptable Render Pipeline Settings, click on your URP Asset to open its settings in the Inspector.
- In the Inspector, scroll down to the Rendering section.
- Uncheck the HDR option to disable it for your entire project.
Disable Post-Processing:
- While still in the URP Asset settings in the Inspector, scroll to the Rendering section.
- Uncheck the Post Processing option to disable it across your entire project.
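If you prefer to apply these settings from a script, for example as part of an editor utility that prepares a project for capture, the following is a minimal sketch. It assumes the asset assigned under Graphics is a UniversalRenderPipelineAsset and toggles per-camera post-processing through UniversalAdditionalCameraData as an alternative path to the same result; the menu path is illustrative only, and property names should be verified against the URP version in your project.

```csharp
// Editor utility sketch: disables HDR on the active URP asset and turns off
// post-processing on every camera loaded in the open scene. This mirrors the
// manual steps above; verify the property names against your URP version.
#if UNITY_EDITOR
using UnityEditor;
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;

public static class MixedRealityCaptureSetupUrp
{
    [MenuItem("Tools/MRC/Disable HDR and Post-Processing (URP)")]
    public static void DisableHdrAndPostProcessing()
    {
        // The URP asset assigned under Project Settings > Graphics.
        var urpAsset = GraphicsSettings.currentRenderPipeline as UniversalRenderPipelineAsset;
        if (urpAsset == null)
        {
            Debug.LogWarning("No URP asset is assigned in Graphics settings.");
            return;
        }

        // Equivalent to unchecking HDR in the URP asset's Rendering section.
        urpAsset.supportsHDR = false;
        EditorUtility.SetDirty(urpAsset);

        // Post-processing can also be toggled per camera.
        foreach (var camera in Object.FindObjectsOfType<Camera>())
        {
            var cameraData = camera.GetUniversalAdditionalCameraData();
            cameraData.renderPostProcessing = false;
            EditorUtility.SetDirty(cameraData);
        }

        AssetDatabase.SaveAssets();
        Debug.Log("Disabled HDR on the URP asset and post-processing on scene cameras.");
    }
}
#endif
```

Because the modified assets are marked dirty and saved, the change persists just like the manual steps above.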
For Built-in Render Pipeline (Non-URP) Projects
Disable HDR:
- Go to Edit > Project Settings.
- Select Player from the left-hand menu.
- In the Other Settings section, find the Rendering subsection.
- Uncheck the Use HDR option to disable HDR for the entire project.
Disable Post-Processing:
- Go to Edit > Project Settings.
- Select Graphics from the left-hand menu.
- In the Tier Settings section, locate the settings for each tier (Low, Medium, High).
- For each tier, uncheck the Use Post Processing option to disable post-processing effects globally.
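A comparable sketch for Built-in Render Pipeline projects is shown below. It clears the tier-level HDR flag and per-camera HDR, which may sit in slightly different places than the checkboxes described above depending on your Unity version; the build target group and menu path are placeholders to adjust for your project, and post-processing components still need to be disabled as described in the steps above.

```csharp
// Editor utility sketch for the Built-in Render Pipeline: clears the HDR flag
// for each graphics tier and disables HDR on scene cameras. The build target
// group below is an example; swap in the group your project builds for.
// Post-processing (for example, Post Processing Stack v2 layers) must still be
// disabled separately, as described in the steps above.
#if UNITY_EDITOR
using UnityEditor;
using UnityEditor.Rendering;
using UnityEngine;
using UnityEngine.Rendering;

public static class MixedRealityCaptureSetupBuiltIn
{
    [MenuItem("Tools/MRC/Disable HDR (Built-in)")]
    public static void DisableHdr()
    {
        var buildGroup = BuildTargetGroup.Android; // Example target; adjust as needed.

        foreach (GraphicsTier tier in new[] { GraphicsTier.Tier1, GraphicsTier.Tier2, GraphicsTier.Tier3 })
        {
            TierSettings settings = EditorGraphicsSettings.GetTierSettings(buildGroup, tier);
            settings.hdr = false; // Corresponds to the HDR checkbox in the tier settings.
            EditorGraphicsSettings.SetTierSettings(buildGroup, tier, settings);
        }

        // HDR can also be allowed or blocked per camera in the Built-in pipeline.
        foreach (var camera in Object.FindObjectsOfType<Camera>())
        {
            camera.allowHDR = false;
            EditorUtility.SetDirty(camera);
        }

        Debug.Log("Cleared HDR for all graphics tiers and disabled HDR on scene cameras.");
    }
}
#endif
```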
Content Alignment
By default, Mixed Reality Capture often appears offset because the virtual content is rendered from the left eye display and then warped to align with the RGB camera located at the center of the headset. This warping can leave virtual content misaligned with the real world. Two features can help mitigate this issue: Secondary View and Focus Distance.
Secondary View
The Magic Leap 2 Secondary View feature uses the OpenXR `XR_MSFT_first_person_observer` extension to enhance Mixed Reality Capture by rendering a secondary view from the RGB camera's position. This ensures precise alignment of physical and virtual content, addressing common issues in MR capture and significantly improving the overall quality.
However, using Secondary View introduces a performance overhead because it requires rendering an additional image from the RGB camera’s position.
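If you want to verify at runtime that the underlying OpenXR extension is actually active, a minimal check with Unity's OpenXR plugin might look like the sketch below; enabling Secondary View itself is covered in the Secondary View guide.

```csharp
// Minimal runtime check: logs whether the OpenXR runtime enabled the
// XR_MSFT_first_person_observer extension that Secondary View relies on.
using UnityEngine;
using UnityEngine.XR.OpenXR;

public class SecondaryViewExtensionCheck : MonoBehaviour
{
    private void Start()
    {
        bool extensionEnabled = OpenXRRuntime.IsExtensionEnabled("XR_MSFT_first_person_observer");
        Debug.Log(extensionEnabled
            ? "XR_MSFT_first_person_observer is enabled; Secondary View can be used."
            : "XR_MSFT_first_person_observer is not enabled; check the Secondary View feature settings.");
    }
}
```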
For more details on how to implement this feature, refer to the Secondary View guide.
Focus Distance
If your application cannot afford the performance cost of enabling Secondary View, you can adjust the Focus Distance to improve the alignment of virtual content during Mixed Reality Capture. Focus Distance works by tweaking the existing warp so that content at a specific, user-defined distance lines up with the real world.
Because it does not require an additional rendering pass, Focus Distance is the more performant of the two options, making it a lightweight way to improve visual fidelity in MR capture.
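As an illustration, a focus distance that follows a target object could be driven by a small component like the one below. This sketch assumes the Focus Distance feature reads the main camera's Camera.stereoConvergence value; the component and field names are hypothetical, so confirm the exact mechanism in the Focus Distance guide.

```csharp
// Hedged sketch: keeps the focus distance aligned with a target object by
// updating the main camera's stereo convergence each frame. Assumes Focus
// Distance reads Camera.stereoConvergence; verify this in the Focus Distance guide.
using UnityEngine;

public class FocusDistanceUpdater : MonoBehaviour
{
    [Tooltip("The object the mixed reality capture should stay aligned with.")]
    public Transform focusTarget;

    private Camera mainCamera;

    private void Start()
    {
        mainCamera = Camera.main;
    }

    private void Update()
    {
        if (mainCamera == null || focusTarget == null)
        {
            return;
        }

        // Distance from the headset to the content the capture should favor.
        float distance = Vector3.Distance(mainCamera.transform.position, focusTarget.position);
        mainCamera.stereoConvergence = distance;
    }
}
```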
For more information on configuring the Focus Distance, visit the Focus Distance guide.