AR Cloud
AR Cloud was deprecated on Jan 22, 2025. The following instructions for configuring AR Cloud are for users who purchased an Enterprise license with AR Cloud prior to Jan 22, 2025.
See our full section on AR Cloud
The Magic Leap 2 has a variety of sensors and tracking capabilities that enable the creation of experiences where virtual content appears to stick and behave as though it were part of the user’s physical environment. As a developer of XR applications, you have the challenging task of leveraging the many features and technologies exposed by the platform to build experiences that work as well as possible on the hardware for your particular use case. Incorporating the user’s physical environment into your application is both a technical and a design challenge that is unique to developing for XR platforms where the user can see their physical surroundings. You’ll need to understand which technologies are available, and the tradeoffs between them, to design experiences that blend virtual objects with physical spaces as seamlessly as possible.
Meshing is the creation of triangle-based meshes from the World Reconstruction model created by Magic Leap devices. The mesh is used for real-time occlusion rendering and for collision detection with digital content.
Learn how to create meshes from surfaces detected by the Magic Leap.
An overview of the meshing demo scene included in the Magic Leap 2 Examples Project, which uses Unity's XR Interaction Toolkit.
This section details how to use Magic Leap 2's Meshing Subsystem Support feature in Unity. This feature allows applications to access a mesh that represents real-world geometry. Developers can use Unity's AR Mesh Manager or AR Point Cloud Manager component to visualize the mesh. Magic Leap specific settings, such as MeshingQuerySettings, can be configured via the MagicLeapMeshingFeature class.
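The configuration described above can be sketched in a small Unity script. This is a minimal, hedged example: it assumes the Magic Leap Unity SDK's OpenXR `MagicLeapMeshingFeature` with members such as `MeshBoundsOrigin`, `MeshBoundsScale`, and `UpdateMeshQuerySettings`, whose exact names and signatures should be verified against your installed SDK version.

```csharp
using UnityEngine;
using UnityEngine.XR.OpenXR;
using UnityEngine.XR.OpenXR.Features.MagicLeapSupport;

public class MeshQueryConfig : MonoBehaviour
{
    void Start()
    {
        // Retrieve the Magic Leap meshing OpenXR feature (it must be enabled in
        // Project Settings > XR Plug-in Management > OpenXR).
        var meshingFeature = OpenXRSettings.Instance.GetFeature<MagicLeapMeshingFeature>();
        if (meshingFeature == null || !meshingFeature.enabled)
        {
            Debug.LogError("MagicLeapMeshingFeature is not enabled.");
            return;
        }

        // Center the query volume on the headset and size it to a 10 m cube.
        meshingFeature.MeshBoundsOrigin = Camera.main.transform.position;
        meshingFeature.MeshBoundsRotation = Quaternion.identity;
        meshingFeature.MeshBoundsScale = Vector3.one * 10f;

        // Apply Magic Leap specific query settings (member names assumed;
        // check the MeshingQuerySettings API in your SDK).
        var settings = MeshingQuerySettings.DefaultSettings();
        meshingFeature.UpdateMeshQuerySettings(in settings);
    }
}
```

Attach this to any active GameObject in a scene that already uses AR Foundation meshing; the mesh manager will then generate mesh blocks within the configured bounds.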
Learn how to adjust the location and bounds of mesh queries.
Learn how the Meshing Subsystem component works and what features it offers.
Learn to adjust the appearance of the mesh generated by the Magic Leap 2.
Magic Leap 2's Occlusion API offers developers a way to enhance immersion in mixed reality applications by generating a mesh representation of nearby physical objects. This allows virtual objects to interact with the real world in a more believable way by appearing to be occluded (or "masked") by real-world objects.
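The occlusion effect described above can be approximated with a generic technique that is independent of any vendor API: render the reconstructed mesh with a material that writes to the depth buffer but not the color buffer, so real-world surfaces hide virtual content behind them. This is a sketch of that common depth-mask approach, not the Occlusion API itself.

```shaderlab
Shader "Unlit/MeshOcclusion"
{
    SubShader
    {
        // Render just before regular geometry so the mesh fills
        // the depth buffer first.
        Tags { "Queue" = "Geometry-1" }
        ColorMask 0   // write no color...
        ZWrite On     // ...but do write depth
        Pass {}
    }
}
```

Assign a material using this shader to the mesh prefab your mesh manager instantiates; virtual objects behind real surfaces will then be masked out.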
Before adding the sample scripts to your scene, make sure to complete the following:
Magic Leap 2's Meshing and Plane Finding APIs provide facilities for applications to understand the real-world surfaces around the user in real time using the device's time-of-flight depth sensor. Apps can use this representation to occlude rendering, place digital objects, and drive physics-based interactions or ray casting. The depth sensor has a resolution of 544 × 480 pixels and a field of view of 75° (h) × 70° (v).
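The placement and ray-casting uses mentioned above can be sketched with standard Unity physics. This assumes the mesh prefab used by your mesh manager includes a MeshCollider, and the `marker` and `headsetCamera` fields are hypothetical names you would assign in the Inspector.

```csharp
using UnityEngine;

// Sketch: place a marker where a ray from the headset hits the world mesh.
public class MeshPlacement : MonoBehaviour
{
    [SerializeField] Transform marker;       // object to place on the surface
    [SerializeField] Camera headsetCamera;   // typically the Main Camera under the XR rig

    void Update()
    {
        // Cast from the headset along its gaze direction.
        var ray = new Ray(headsetCamera.transform.position,
                          headsetCamera.transform.forward);

        if (Physics.Raycast(ray, out RaycastHit hit, maxDistance: 10f))
        {
            // Snap the marker to the meshed surface, oriented to its normal.
            marker.SetPositionAndRotation(hit.point,
                                          Quaternion.LookRotation(hit.normal));
        }
    }
}
```

Because the world mesh participates in Unity physics like any other collider, the same pattern also supports controller-driven pointing or dropping rigidbodies onto real surfaces.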
In this tutorial, you'll learn how to set up the Spatial Mapping component with Magic Leap 2 in Unity. By the end of this guide, you'll have a scene capable of meshing the world around you, providing a foundation for immersive mixed reality experiences. Spatial Mapping on Magic Leap 2 works much like meshing with AR Foundation's mesh manager.
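The setup described above typically centers on AR Foundation's ARMeshManager component. This is a minimal sketch: it assumes a scene with an XR Origin (the ARMeshManager must live on a child of it) and a mesh prefab you create yourself containing a MeshFilter, a MeshRenderer, and optionally a MeshCollider for physics.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Attach to a child GameObject of the XR Origin.
public class MeshingSetup : MonoBehaviour
{
    // Prefab with MeshFilter + MeshRenderer (+ MeshCollider for physics).
    [SerializeField] MeshFilter meshPrefab;

    void Start()
    {
        // Add and configure the AR Foundation mesh manager at runtime.
        var meshManager = gameObject.AddComponent<ARMeshManager>();
        meshManager.meshPrefab = meshPrefab;
        meshManager.density = 0.5f; // 0..1 — trade mesh detail for performance
    }
}
```

In practice you can also add and configure the ARMeshManager directly in the Editor; the script form is shown here only to make each setting explicit.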