

Mapping VR Headset and Controllers with XR Interaction Toolkit


Deconstruct XR Interaction Toolkit basics for VR by configuring XR Origin, enabling head and controller tracking, and assigning Input Actions for a complete VR experience.

In “Building Your First VR Application,” we learned about the XR Interaction Toolkit. It helps us quickly develop cross-platform VR projects in Unity. In this lesson, we’ll deconstruct some fundamentals of the Toolkit to learn how to map our VR headset (Meta Quest 2) and controllers to Unity World Space.

By now, we hope that you’ve set up a scene and configured the project from the previous lesson by selecting URP, importing XR Plug-in Management, and setting up the XRI Toolkit. The following image shows how your scene should look right now:

Previously, in “Building Your First VR Application,” we added the XR Origin (VR) GameObject via the “Hierarchy” view that automatically added other GameObjects necessary for the VR headset and controller tracking. However, let’s take a bottom-up approach this time.

XR Origin

The XR Origin GameObject represents the center of the Unity World Space (in Unity, World Space refers to the coordinate system in which GameObjects and elements exist) in a VR Scene. It’s responsible for mapping objects and trackable features to their final position and orientation in the scene.

So, create an empty GameObject and name it XR Origin. In the “Inspector” window, click the “Add Component” button and search for the “XR Origin” script to attach it to the GameObject.

Now, create a child GameObject Camera Offset and move your “Main Camera”/“Camera” under it. Lastly, on the “XR Origin” script, set “Camera Floor Offset Object” to “Camera Offset” (that you just created), “Camera GameObject” to “Main Camera (Camera),” and “Tracking Origin Mode” to “Floor.”

Adding Main Camera as a child to Camera Offset
Camera Floor Offset set to Camera Offset and Tracking Origin Mode to Floor
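For reference, the same hierarchy can be built and wired up from code. This is a minimal sketch, assuming XR Interaction Toolkit 2.x (which provides the `XROrigin` class in `Unity.XR.CoreUtils`); the GameObject names mirror the steps above:

```csharp
using UnityEngine;
using Unity.XR.CoreUtils;

public static class XROriginSetup
{
    // Builds the XR Origin > Camera Offset > Main Camera hierarchy and
    // configures the XR Origin component the same way as in the Inspector.
    public static XROrigin Create(Camera mainCamera)
    {
        var originGO = new GameObject("XR Origin");
        var offsetGO = new GameObject("Camera Offset");
        offsetGO.transform.SetParent(originGO.transform, false);

        // Re-parent the existing Main Camera under Camera Offset.
        mainCamera.transform.SetParent(offsetGO.transform, false);

        var origin = originGO.AddComponent<XROrigin>();
        origin.CameraFloorOffsetObject = offsetGO;  // "Camera Floor Offset Object"
        origin.Camera = mainCamera;                 // "Camera GameObject"
        origin.RequestedTrackingOriginMode =
            XROrigin.TrackingOriginMode.Floor;      // "Tracking Origin Mode"
        return origin;
    }
}
```

Doing this in the editor, as described above, is the usual workflow; the script form is only meant to make the component’s fields explicit.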

Press “Build and Run,” and you’ll see the following in your VR:

You can see the game scene in VR by clicking “Run Application” below:


Notice that you were spawned exactly where you placed your XR Origin. You can experiment with this by moving around the XR Origin.

However, you may have noticed that we can only see a static image in the headset. To enable head tracking, we’ll need to attach the “Tracked Pose Driver” component to our “Main Camera.”

Tracked Pose Driver

The “Tracked Pose Driver” component applies the pose value of a tracked device to the Transform of the GameObject it’s attached to.

Create a new Binding for each of “Position Input” and “Rotation Input” by pressing the “+” button. Next, set the Control path for each Binding using the table below:

Control Paths for HMD Input Bindings

Binding        | Control Path
Position Input | XR HMD > centerEyePosition
Rotation Input | XR HMD > centerEyeRotation

Here’s what the “Tracked Pose Driver” component will look like:

Now, when you run your game, you’ll be able to move your head around to experience the scene in VR.
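The same component can also be configured from a script. This is a minimal sketch, assuming the Input System package (the `TrackedPoseDriver` in `UnityEngine.InputSystem.XR`); the binding paths match the table above:

```csharp
using UnityEngine;
using UnityEngine.InputSystem;
using UnityEngine.InputSystem.XR;

public static class HeadTrackingSetup
{
    // Attaches a Tracked Pose Driver to the camera and binds it to the
    // headset's center-eye pose, mirroring the Inspector setup above.
    public static void Configure(Camera mainCamera)
    {
        var driver = mainCamera.gameObject.AddComponent<TrackedPoseDriver>();

        var position = new InputAction(binding: "<XRHMD>/centerEyePosition");
        var rotation = new InputAction(binding: "<XRHMD>/centerEyeRotation");

        driver.positionInput = new InputActionProperty(position);
        driver.rotationInput = new InputActionProperty(rotation);
        // Directly referenced actions like these are enabled by the driver
        // when the component is enabled.
    }
}
```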

Although our head tracking works perfectly, our viewpoint is still stuck at floor level. This is because we placed the XR Origin at the floor: its purpose is to serve as the zero reference, so the user’s feet in the real world translate to the position of the XR Origin in Unity World Space. How, then, do we correctly position the camera representing the user’s head?

Tracking Origin Mode

“Tracking Origin Mode” specifies how to set the height of the camera. Primarily, we have two options: “Floor” and “Device.”

The “Floor” mode

The “Floor” mode sets “Camera Y Offset” automatically, based on the height reported by the headset. If you use this mode, you’ll get the following result, depending on your height and the device’s Guardian configuration:

The “Device” mode

The “Device” mode doesn’t provide the height of the user; therefore, we can set the height by specifying a value in the “Camera Y Offset” field, as shown below:

This is how the scene will look in VR:

Note: Notice that in the “Device” mode, the camera sits at the fixed “Camera Y Offset” regardless of how tall the user is. In the “Floor” mode, the camera takes the user’s actual height above the real floor into account.
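The two modes can be summarized in code. A minimal sketch, assuming the `XROrigin` component from XR Interaction Toolkit 2.x (the `1.6f` value is an arbitrary example height, not a recommended setting):

```csharp
using UnityEngine;
using Unity.XR.CoreUtils;

public class TrackingModeExample : MonoBehaviour
{
    void Start()
    {
        var origin = GetComponent<XROrigin>();

        // "Floor": the runtime reports head height above the real floor,
        // so Camera Y Offset is managed for you.
        origin.RequestedTrackingOriginMode = XROrigin.TrackingOriginMode.Floor;

        // "Device": the runtime gives no floor reference, so you supply a
        // fixed height yourself via Camera Y Offset:
        // origin.RequestedTrackingOriginMode = XROrigin.TrackingOriginMode.Device;
        // origin.CameraYOffset = 1.6f;
    }
}
```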

For our project, we’ll set the tracking origin mode to “Floor.” You can see the game scene in VR by clicking “Run Application” below (APK with “Tracked Pose Driver”):


Before we move on to controller tracking, let’s ensure that our default Input Actions asset gets enabled when the game boots up.

Input Action Manager

The “Input Action Manager” component simply enables/disables the list of “Action Assets” attached to it. Since we intend to use references to Input Actions ahead, we’ll need them to be enabled. So, attach the “Input Action Manager” script to your XR Origin GameObject and add the “Default Input Actions” asset (imported as part of the Interaction Toolkit) to its “Action Assets” list.
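What the component does is small enough to sketch. Assuming XR Interaction Toolkit 2.x (`InputActionManager` lives in `UnityEngine.XR.Interaction.Toolkit.Inputs`), this mirrors the editor setup:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.InputSystem;
using UnityEngine.XR.Interaction.Toolkit.Inputs;

public static class InputActionsSetup
{
    // Attaches an Input Action Manager to the XR Origin and registers the
    // default actions asset; the manager enables every asset in its list
    // in OnEnable and disables them in OnDisable.
    public static void Attach(GameObject xrOrigin, InputActionAsset defaultActions)
    {
        var manager = xrOrigin.AddComponent<InputActionManager>();
        manager.actionAssets = new List<InputActionAsset> { defaultActions };
    }
}
```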

XR Controller (Action-based)

Since our head tracking works perfectly, let’s move on to controller tracking. To track the position and rotation of our controllers, we’ll create two empty GameObjects under “Camera Offset” named LeftHand Controller and RightHand Controller. Then, to enable controller tracking, we’ll attach an “XR Controller (Action-based)” component to each.

Similar to the “Tracked Pose Driver,” the “XR Controller (Action-based)” component applies the position and rotation of the tracked controller device to the Transform of the GameObject it’s attached to.

Controller orientation tracking

On both the LeftHand Controller and RightHand Controller, attach the “XR Controller (Action-based)” component and ensure “Position Action,” “Rotation Action,” and “Tracking State Action” are attached as shown below.

XR Controller on LeftHand Controller GameObject
XR Controller on RightHand Controller GameObject
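The pose wiring can be sketched in code as well. This assumes XR Interaction Toolkit 2.x (`ActionBasedController`) and uses the left-hand usage tag; swap `{LeftHand}` for `{RightHand}` on the right controller GameObject:

```csharp
using UnityEngine;
using UnityEngine.InputSystem;
using UnityEngine.XR.Interaction.Toolkit;

public static class ControllerTrackingSetup
{
    // Attaches an action-based XR Controller and binds its pose actions
    // to the left-hand controller device.
    public static void Configure(GameObject handGO)
    {
        var controller = handGO.AddComponent<ActionBasedController>();

        controller.positionAction = new InputActionProperty(
            new InputAction(binding: "<XRController>{LeftHand}/devicePosition"));
        controller.rotationAction = new InputActionProperty(
            new InputAction(binding: "<XRController>{LeftHand}/deviceRotation"));
        controller.trackingStateAction = new InputActionProperty(
            new InputAction(binding: "<XRController>{LeftHand}/trackingState"));
    }
}
```

In practice, you’ll usually reference the ready-made actions from the “Default Input Actions” asset in the Inspector, as described above, rather than creating bindings inline.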

If you run the game at this point, you still won’t see any visuals update with controller movement, even though the “Transform” component gets updated. To solve this, we’ll assign a visual model to each “XR Controller.”

We’ve provided you with a set of hands in the package below. You can use these or any other resource you find online.

Hands (Left & Right)-Package

Once you’ve imported the package, it will be located in the “Imported Packages > Hand Prefabs” folder in your project’s “Assets.” Simply set the “Model Prefab” attribute of each XR Controller to the corresponding left or right hand Prefab.

Now, when you run the game, you’ll observe that your hands move and rotate based on the orientation of the respective controllers.
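Assigning the model can also be done from a script. A minimal sketch, assuming XR Interaction Toolkit 2.x; the serialized field is a hypothetical slot for the hand Prefab you imported:

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

public class HandModelAssigner : MonoBehaviour
{
    [SerializeField] Transform handPrefab; // drag the imported hand Prefab here

    void Awake()
    {
        // The controller instantiates modelPrefab under its Model Parent,
        // which is what makes the hand visible and follow the controller.
        var controller = GetComponent<ActionBasedController>();
        controller.modelPrefab = handPrefab;
    }
}
```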

Controller input tracking

The second purpose of an XR Controller is to communicate state changes for that controller to the associated Interactor (you’ll study this in the next lesson). The XR Controller detects these state changes (such as a button press) via Input Actions. So, let’s attach all such Input Actions to our controllers as shown below:

XR Controller (Action-based): LeftHand
XR Controller (Action-based): RightHand
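A sketch of how these state-change actions are wired, assuming XR Interaction Toolkit 2.x and Input Action References taken from the “Default Input Actions” asset (the field names mirror the Inspector; the actual references are assigned in the editor):

```csharp
using UnityEngine;
using UnityEngine.InputSystem;
using UnityEngine.XR.Interaction.Toolkit;

public class ControllerInputSetup : MonoBehaviour
{
    [SerializeField] InputActionReference selectAction;   // e.g., grip button
    [SerializeField] InputActionReference activateAction; // e.g., trigger
    [SerializeField] InputActionReference uiPressAction;  // e.g., trigger over UI

    void Awake()
    {
        // Forward the referenced actions to the action-based controller so it
        // can report state changes (select, activate, UI press) to Interactors.
        var controller = GetComponent<ActionBasedController>();
        controller.selectAction = new InputActionProperty(selectAction);
        controller.activateAction = new InputActionProperty(activateAction);
        controller.uiPressAction = new InputActionProperty(uiPressAction);
    }
}
```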

You can see the game scene in VR by clicking “Run Application” below (APK with hand models):


In this lesson, you learned to set up the XR Origin and the “Tracked Pose Driver,” enable head and controller tracking, and include hand models. Congratulations on completing the basic components of the XR Interaction Toolkit in Unity!
