
Using the Controller with AR Plugin

This document describes how to use the Magic Leap 2 (ML2) controller when working with the AR Experience plugin. Topics covered include navigation and code samples that show how to get the controller pose and interact with objects in a scene.

When you enable the AR Experience plugin, spatial input using the ML2 controller becomes available to the Kit application you are working with. The plugin provides a set of XR tools, analogous to the 2D tools in the Omniverse USD Composer toolbox, for manipulating content in your USD scene. When you start streaming content to your ML2 device, the buttons on the ML2 controller are bound to basic tools for navigating and manipulating your scene. To see this, do the following:

  1. Select Show Tooltips in the AR panel of the USD Composer to see the controls that are currently bound to the ML2 controller.
Show Tooltips in the AR Panel
  2. In your scene, a menu now shows how the controls map to the ML2 controller. The image below shows the default mapping after you enable tooltips.
Mapping from ML2 Controller to Controls

Default ML2 Controller Bindings Table

| ML Control | Binding |
| --- | --- |
| Trigger | Select items in the scene. primary_controller_button_trigger -> primary_controller:sel:select |
| Bumper | Hold the button and move the controller around to move prims in the scene when in raycast mode. primary_controller_button_grip -> primary_controller:grab, primary_controller:sel:grab |
| Touchpad L/R | Rotate the view. primary_controller_button_dpad_left -> primary_controller:nav:rotate:left, primary_controller_button_dpad_right -> primary_controller:nav:rotate:right |
| Touchpad Up | Toggle raycast mode. primary_controller_button_dpad_up -> primary_controller:sel:aim |
| Touchpad Down | Hold while pointing at a surface to target a destination; release to teleport there. primary_controller_button_dpad_down -> primary_controller:nav:teleport:forward |
| Menu Button | Brings up the Settings menu. primary_controller_button_menu -> menu:tool |
| Bumper (hold) + Touchpad L/R (tap) | Rotates the prim by 30 degrees. |
| Bumper (hold) + Touchpad Up (hold) | Push the prim. |
| Bumper (hold) + Touchpad Down (hold) | Pull the prim. |
note

You will need a surface to teleport onto for the teleport action to work properly. The default scene in USD Composer contains a plane at the origin that will work fine.
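If your stage does not have a surface, you can also create one from script. This is a minimal sketch, assuming the CreateMeshPrimWithDefaultXform command (the command behind USD Composer's Create > Mesh menu) is registered in your Kit application:

import omni.kit.commands

# Create a plane mesh prim at the origin to act as a teleport surface.
# CreateMeshPrimWithDefaultXform is assumed to be provided by the mesh extension.
omni.kit.commands.execute("CreateMeshPrimWithDefaultXform", prim_type="Plane")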

Building Custom Interactions Using the ML2 Controller

This section shows how to handle ML2 controller input to implement custom behaviors in your plugin code.

Get the Controller Pose

This code sample shows how to get the controller pose when the controller is in your right hand. The XRDevice object has a get_virtual_world_pose() function that you can call to determine the current location of the ML2 controller in the scene.

note

This sample reads the device at index 1 of the device list and will not work if you use the controller in your left hand.

from omni.kit.xr.core import XRCore

singleton = XRCore.get_singleton()
if singleton.is_xr_display_enabled():
    curr_profile = singleton.get_current_xr_profile()
    device_list = curr_profile.get_device_list()
    controller = device_list[1]  # index 1: the right-hand controller
    pose = controller.get_virtual_world_pose()
    print(pose)
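If you are unsure which index in the device list corresponds to which device, you can enumerate the list and print each pose. This sketch reuses only the calls from the sample above:

from omni.kit.xr.core import XRCore

singleton = XRCore.get_singleton()
if singleton.is_xr_display_enabled():
    curr_profile = singleton.get_current_xr_profile()
    # Print every tracked device's index and world pose to identify the controllers.
    for index, device in enumerate(curr_profile.get_device_list()):
        print(index, device.get_virtual_world_pose())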

Determine When the User Selects an Object

This code sample shows how to determine when a user selects an object in the scene. The SELECTION_CHANGED stage event is raised when the user selects an object in the scene using the ML2 controller. You can handle this event in your plugin to implement custom object selection behavior.

import carb.events
import omni.usd

def on_stage_event(e: carb.events.IEvent):
    if e.type == int(omni.usd.StageEventType.SELECTION_CHANGED):
        print("selection changed")

# Keep a reference to the subscription; it is unsubscribed when garbage collected.
stage_event_sub = (
    omni.usd.get_context()
    .get_stage_event_stream()
    .create_subscription_to_pop(on_stage_event, name="My Subscription Name")
)
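If you also need to know which prims are now selected, you can query the selection on the USD context inside the handler. A minimal sketch of an extended handler, using omni.usd's selection API:

import carb.events
import omni.usd

def on_stage_event(e: carb.events.IEvent):
    if e.type == int(omni.usd.StageEventType.SELECTION_CHANGED):
        # Ask the USD context which prim paths are currently selected.
        selection = omni.usd.get_context().get_selection()
        print(selection.get_selected_prim_paths())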

Determine What Object the User Selected

You can use the pose returned by the controller object in the Get the Controller Pose section to raycast into the scene and determine which object the user selected.

from pxr import Gf
from omni.kit.xr.core import XRCore, XRRay, XRRayQueryResult

def _callback(ray: XRRay, result: XRRayQueryResult):
    usd_target_path = result.get_target_enclosing_model_usd_path()
    # Do my raycast work

singleton = XRCore.get_singleton()
if singleton.is_xr_display_enabled():
    curr_profile = singleton.get_current_xr_profile()
    device_list = curr_profile.get_device_list()
    controller = device_list[1]  # index 1: the right-hand controller
    pose = controller.get_virtual_world_pose()
    origin = pose.ExtractTranslation()
    direction = pose.TransformDir(Gf.Vec3d(0, 0, -1))  # controller forward (-Z, assuming Y-up)
    ray = XRRay(origin, direction, 0, 10000)  # cast from origin, distance 0 to 10000
    curr_profile.submit_raycast_query(ray, _callback)
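As a usage example, the callback can hand the hit prim to the selection API so your raycast selects it in the stage. This is a sketch, not the plugin's built-in behavior; it assumes get_target_enclosing_model_usd_path() returns an empty string when the ray misses:

import omni.usd
from omni.kit.xr.core import XRRay, XRRayQueryResult

def _callback(ray: XRRay, result: XRRayQueryResult):
    usd_target_path = result.get_target_enclosing_model_usd_path()
    if usd_target_path:  # assumption: empty path means the ray hit nothing
        # Select the prim the ray hit; True also expands it in the Stage window.
        omni.usd.get_context().get_selection().set_selected_prim_paths([usd_target_path], True)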