Version: 21 Aug 2024

Sensor Data

Sensor Data Access Overview

Visualization of sensor data input from ML2

Sensor Data Access is a system that lets developers access the various sensors integrated into the Magic Leap 2 device and configure how they gather data (e.g., frame rate, exposure, gain). We refer to this as Sensor Data Access & Control.

The ability to access and control these data streams allows developers to build more valuable algorithms and applications. They can access the RGB camera, world cameras, depth sensor, eye cameras, Inertial Measurement Units (IMUs), magnetometers, ambient light sensor, altimeters and microphones.

Download an overview PDF of the sensor features here.

Sensors

  1. RGB Camera
  2. World Cameras
  3. Depth Sensor
  4. Microphones
  5. Ambient Light Sensor (ALS)
  6. Altimeter
  7. Eye Cameras
  8. Inertial Measurement Units (IMUs)
  9. Magnetometer

RGB Camera

Visualization of RGB Camera feed
  • What is it?
    • The RGB Camera is a high-pixel-density color camera that attempts to replicate how the human eye views the world.
    • It is a general purpose 12.6 MP camera that includes an autofocus element. It can be used to take pictures or videos of the world, capturing both the environment and user interactions.
    • The higher pixel density means this camera captures clear, sharp, high-resolution images, but it also consumes more power and requires more data processing.
  • How does it work?
    • The RGB Camera delivers in-color images of people and objects by capturing light in red, green, and blue wavelengths (RGB). It sees the visible wavelength and attempts to replicate how the human eye views the world.
    • Developers can access raw color pixel data from the RGB camera and have the option to configure media capture type (photo or video), frame rate, resolution, media output format, capture mode (real-world, virtual content), autofocus mode, auto exposure, white balance, and color correction. Camera intrinsics are also available (i.e. focal length, field-of-view, aperture, and more).
  • Why is it important?
    • The RGB camera enables solutions that require a detailed understanding of the user’s surroundings and interactions. For example, it can be used for the object recognizer & scene understanding modules, for virtual 3D overlay, for marker tracking, or for capturing and recording the user’s interactions with the physical and digital worlds.
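Because the API exposes camera intrinsics such as focal length, the field-of-view can be derived with standard pinhole-camera geometry. A minimal sketch in Python, using hypothetical pixel values rather than actual ML2 calibration data:

```python
import math

def horizontal_fov_deg(focal_length_px: float, image_width_px: int) -> float:
    """Horizontal field-of-view in degrees for a pinhole camera model."""
    return math.degrees(2.0 * math.atan(image_width_px / (2.0 * focal_length_px)))

# Hypothetical values for illustration only (not ML2 calibration data):
# a 4096-pixel-wide image with a focal length of 2900 pixels.
fov = horizontal_fov_deg(2900.0, 4096)  # roughly 70 degrees
```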

World Cameras

Visualization of World Camera feed
  • What is it?
    • On the Magic Leap 2 device, there are three outward-facing 1.0 MP cameras which collect wide-view images and video in grayscale of the user’s surroundings and interactions.
    • These cameras capture lower-resolution images and video than the RGB camera; however, this means they consume less power and are more efficient for data processing.
  • How does it work?
    • Each World Camera has 2 modes: Normal Exposure (NE) mode and Low Exposure (LE) mode.
    • Developers have the ability to configure the media capture type (image or video), frame rate, gain, and exposure of these cameras. The camera intrinsics are also available (i.e. focal length, field-of-view, aperture, and more).
    • They can access raw grayscale pixel data from the left world camera, right world camera, and center world camera.
  • Why is it important?
    • The world cameras enable solutions that require an understanding of the user’s surroundings and interactions, but don't require the detail of a high-pixel density camera. They have a wider field-of-view (meaning they can capture more of a scene at once) and they use less power, making them more efficient for data processing.
    • World cameras can determine user-related data, like the position of the user’s head, and they can identify hand gestures to be used for input and content manipulation. They are also used for refining spatial scans of the environment, for virtual 3D overlay, marker tracking, and object tracking.

Depth Sensor

Visualization of Depth Sensor feed
  • What is it?
    • Magic Leap 2 has a time-of-flight depth sensor that consists of an image sensor and illuminator.
    • Depth sensors collect depth images of the user’s surroundings, which can be used to build spatial scans of the environment. They measure the distance between the device and objects, as well as the distance between objects.
    • Depending on the wavelength selected, it can also detect heat.
  • How does it work?
    • Time-of-flight distance sensors use the time that it takes for photons to travel between two points to calculate the distance between those points. Based on the time-of-flight, we can derive the depth of user’s surroundings.
    • Developers can access processed depth data, confidence data, and raw environment data with and without the illuminator activated.
    • Developers are able to configure the depth mode (short or long), frame rate, and exposure. The camera intrinsics are available (i.e. focal length, field-of-view, aperture, and more).
  • Why is it important?
    • Depth sensors are relevant any time spatial awareness is important for a user. They assist in building spatial scans of the user’s environment (by identifying the relative distance of nearby objects) and can be used to detect objects, count people, and aid in the navigation of a space.
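The time-of-flight principle described above reduces to a simple relationship: the measured round-trip time of light, halved, times the speed of light. A minimal sketch (the timing value is illustrative, not an ML2 sensor reading):

```python
SPEED_OF_LIGHT_M_S = 299_792_458.0

def tof_distance_m(round_trip_time_s: float) -> float:
    """Distance to a target from the photon round-trip time: light covers
    the sensor-to-target path twice, so halve the total travel distance."""
    return SPEED_OF_LIGHT_M_S * round_trip_time_s / 2.0

# A round trip of about 13.3 nanoseconds corresponds to roughly 2 metres.
d = tof_distance_m(13.3e-9)
```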

Microphones

Visualization of microphone sound sources
  • What is it?
    • Microphones on Magic Leap 2 devices capture user-generated sounds such as voice during video recordings, calls, meetings, and when using voice commands.
    • They also collect acoustic sound from the world around the user.
  • How does it work?
    • The Magic Leap 2 Headset includes 4 microphones. Processed microphone streams with acoustic echo cancellation and a stereo pair are currently available.
    • Developers are also able to access raw microphone streams.
  • Why is it important?
    • Microphones serve to collect sounds from the user and the environment, as well as create realistic digital sound effects.
    • When used in combination with other sensors, microphones can help developers understand audio activity happening with the user and within specific environments.
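As a rough illustration of the acoustic-echo-cancellation idea mentioned above, the sketch below uses a tiny least-mean-squares (LMS) adaptive filter to remove a simulated speaker echo from a mic signal. This is a textbook simplification, not the device's actual AEC pipeline:

```python
import math

def lms_echo_cancel(mic, far_end, taps=4, mu=0.1):
    """Tiny LMS adaptive filter: estimate the echo of the far-end (speaker)
    signal present in the mic signal and subtract it, returning the residual."""
    w = [0.0] * taps  # adaptive filter weights
    residual = []
    for n in range(len(mic)):
        x = [far_end[n - k] if n - k >= 0 else 0.0 for k in range(taps)]
        echo_est = sum(wi * xi for wi, xi in zip(w, x))
        e = mic[n] - echo_est  # error = mic minus estimated echo
        residual.append(e)
        for k in range(taps):
            w[k] += 2.0 * mu * e * x[k]
    return residual

# Simulated far-end signal, and a mic that hears only its delayed echo:
far = [math.sin(0.3 * n) for n in range(400)]
mic = [0.0] + [0.5 * s for s in far[:-1]]
res = lms_echo_cancel(mic, far)  # residual shrinks as the filter adapts
```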

Ambient Light Sensor (ALS)

Visualization of Ambient Light sensor feed
  • What is it?
    • Ambient light sensors measure the light intensity in the environment around the user. This can be used to appropriately auto-adjust display brightness and dimming depending on the lighting in the user's location.
  • How does it work?
    • The ambient light sensor works by capturing the light level in the environment using photodetectors.
    • When a user moves around (for example, from a bright room to a dark room) the display on the Magic Leap 2 headset can adjust accordingly, so that the user continues to see content with clarity.
  • Why is it important?
    • The ambient light sensor enables our Dynamic Dimming™ technology, which creates the seamless integration of virtual content and elements with the real world. It enables a realistic user experience that is adaptive to the spaces and the environments users move through.
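Mapping a lux reading to a display brightness level is commonly done on a logarithmic scale, since perceived brightness is roughly logarithmic in luminance. A minimal sketch (the lux range and mapping are illustrative assumptions, not the device's actual dimming curve):

```python
import math

def brightness_from_lux(lux: float, min_lux: float = 1.0, max_lux: float = 10_000.0) -> float:
    """Map ambient illuminance (lux) to a 0..1 brightness level on a log
    scale; the lux bounds here are illustrative assumptions, not ML2 constants."""
    lux = min(max(lux, min_lux), max_lux)  # clamp into the supported range
    return math.log(lux / min_lux) / math.log(max_lux / min_lux)

# A dim room maps low, office lighting mid-range, direct sun clamps to 1.0.
```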

Altimeter

Visualization of Altimeter feed
  • What is it?
    • Barometric pressure sensors, usually referred to as altimeters, collect air pressure readings of the user’s surroundings, which can be used to determine changes in altitude above ground.
  • How does it work?
    • Altimeters are included in the Headset and Compute Pack. Ambient air pressure can be measured in units of hPa or mbar, and these measurements can be converted to altitude.
    • Absolute barometric pressure varies dramatically from environment to environment and drifts with the weather, but over short periods of time, relative pressure changes can be used to accurately estimate changes in altitude. Users can take a reference barometric pressure reading at a starting location and measure changes in pressure as they move through a space.
  • Why is it important?
    • Altimeters serve as navigational aids, localization aids, and assist in monitoring environments. For example, in a new construction environment, altimeters can determine what floor the user is on in an unmarked building. In existing buildings, altimeters can assist with determining which room the user is actually located in (especially when there are many same-room layouts in one building, like a hotel).
    • For navigational purposes, altimeters assist with guiding a user from point A to point B when there is a change in altitude (i.e. going to different floors in a building).
    • With environmental monitoring, altimeters can be used to measure changes in pressure. For enterprise field workers in condition-dependent workflows, the altimeters can provide information that assists with providing the best service or understanding of maintenance concerns. For example, measurement of room air pressure when configuring an HVAC system.
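Converting a pressure reading in hPa to altitude uses the international barometric formula. The sketch below shows the relative-reference workflow described above (the specific readings are illustrative):

```python
def pressure_to_altitude_m(pressure_hpa: float, reference_hpa: float = 1013.25) -> float:
    """International barometric formula: altitude (m) above the reference level."""
    return 44_330.0 * (1.0 - (pressure_hpa / reference_hpa) ** (1.0 / 5.255))

# Take a reference reading at the starting floor, then convert later
# readings to a change in altitude (values are illustrative):
delta_h = pressure_to_altitude_m(1012.85, reference_hpa=1013.25)  # a few metres up
```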

Eye Cameras

Visualization of Eye Cameras
  • What is it?
    • Eye cameras are inward-facing cameras on the Magic Leap 2 device. They can be used to determine where the user is looking, as well as other important information about user behaviors and conditions in relation to their eye movements or expressions.
  • How does it work?
    • The Magic Leap 2 device leverages two cameras and six LEDs per eye. They are located on the inside of the device to view the user’s eye movements.
    • Developers can access data from gaze tracking, gaze behavior classification, eyeball center, eye expressions, pupil location, pupil diameter, and raw eye images.
  • Why is it important?
    • Eye tracking cameras help the device understand where the user is placing their visual attention, can indicate cognitive processes, and can be used for clinical assessments.
    • They can also be leveraged by enterprises to better understand a user’s experience from the inside out, and aid with training scenarios that require analysis of gaze behavior and response.
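As a simplified illustration of how a gaze direction can be derived from the eyeball-center and pupil-location data mentioned above, the sketch below normalizes the vector between the two points. The function and its inputs are hypothetical, illustrative geometry only, not the actual ML2 API:

```python
def gaze_direction(eyeball_center, pupil_center):
    """Unit gaze vector pointing from the eyeball center through the pupil.
    Hypothetical 3D points; illustrative geometry only (not the ML2 API)."""
    d = [p - c for p, c in zip(pupil_center, eyeball_center)]
    norm = sum(v * v for v in d) ** 0.5
    return [v / norm for v in d]

# An eye centered at the origin with the pupil straight ahead on the z-axis:
g = gaze_direction((0.0, 0.0, 0.0), (0.0, 0.0, 2.0))  # -> [0.0, 0.0, 1.0]
```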

Inertial Measurement Units (IMUs)

Visualization of IMU feed
  • What is it?
    • Inertial Measurement Units (IMUs) are sensors that measure the relative position, orientation, and motion of the Magic Leap 2 device.
    • There are two of these sensors in the Headset and one in the Compute Pack.
  • How does it work?
    • IMUs are motion tracking devices that measure accelerations and rotations to estimate position and orientation over time.
    • The accelerometer serves as a linear motion sensor on the XYZ axes. The gyroscope captures rotations around the XYZ axes. An IMU combines both so that the state of the device can be tracked in three dimensions.
    • IMU data is available at rates up to 1,000 Hz.
  • Why is it important?
    • When used in combination with World Cameras, IMUs make it possible to align digital content accurately in relation to the three-dimensional positioning of the headset. This is important for creating successful SLAM algorithms.
    • IMUs are also used for tracking the movements and motions of users, such as muscular tremors when used for diagnostic aid. IMUs also provide bio-behavioral data, like tracking ambulatory movements such as walking and gait.
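The accelerometer-integration idea above can be sketched as naive one-axis dead reckoning. Real systems fuse IMU samples with world-camera tracking because pure integration error grows quickly; the values here are illustrative:

```python
def integrate_motion(accels, dt):
    """Naive dead reckoning along one axis: integrate acceleration samples
    (m/s^2) into velocity and position. Drift accumulates, which is why IMU
    data is fused with camera tracking in practice."""
    velocity, position = 0.0, 0.0
    for a in accels:
        velocity += a * dt
        position += velocity * dt
    return velocity, position

# One second of constant 1 m/s^2 acceleration sampled at 1,000 Hz
# (the maximum IMU rate mentioned above):
v, p = integrate_motion([1.0] * 1000, dt=0.001)  # v ~ 1.0 m/s, p ~ 0.5 m
```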

Magnetometer

Visualization of Magnetometer feed
  • What is it?
    • Magnetometers are magnetic sensors that can detect orientation of the device based on Earth’s magnetic field.
  • How does it work?
    • When combined with a gravity vector which points towards the center of the Earth, the magnetometers enable determination of the user's orientation with respect to cardinal directions (north, south, east, and west).
    • Two magnetometers are present in the Headset. Developers have the ability to sample both magnetometers at rates up to 100 Hz. End user calibration of the magnetometer supports e-compass and pedestrian navigation use cases.
  • Why is it important?
    • Magic Leap 2 does not have GPS, so magnetometers aid in orienting the user in both mapped and unmapped spaces.
    • Apps can provide directional guidance to the user regardless of a space's mapped status. For example, an app can use the magnetometer data to direct the user to exit from the north side of a building. Magnetometers are also used to help improve motion vector determination.
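As a simplified illustration of the compass computation described above: once the gravity vector has been used to project the magnetic field into the horizontal plane, the heading follows from the two horizontal components. The sketch below assumes that projection has already happened:

```python
import math

def heading_deg(mag_x: float, mag_y: float) -> float:
    """Compass heading in degrees clockwise from magnetic north, from the
    horizontal magnetometer components (device assumed level, i.e. the
    gravity vector was already used to remove the vertical component)."""
    return math.degrees(math.atan2(mag_y, mag_x)) % 360.0

# Field along +x: facing magnetic north; field along +y: facing east.
```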