Updated on October 11, 2023
VR Builder is a free and open-source asset on the Unity Asset Store that adds and configures the XR Interaction Toolkit for you automatically. If you are not using VR Builder, you can set up the XR Interaction Toolkit manually by going through the following steps.
Open the Package Manager and select Packages: Unity Registry. Scroll down to find the XR Interaction Toolkit, select it and press Install.
Can't find it in the list? This is a bug in some Unity versions at the time of writing. In this case, add the package manually: click Add package by name…, type in com.unity.xr.interaction.toolkit and press Add.
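If you prefer to automate this step, the Package Manager also exposes an editor API. Below is a minimal sketch (the class name and menu path are just examples); only the package identifier comes from the step above.

```csharp
// Editor-only sketch: place this file in an "Editor" folder.
using UnityEditor;
using UnityEditor.PackageManager;
using UnityEditor.PackageManager.Requests;
using UnityEngine;

public static class XriPackageInstaller
{
    static AddRequest addRequest;

    [MenuItem("Tools/Install XR Interaction Toolkit")]
    public static void Install()
    {
        // Same identifier as typed into "Add package by name…".
        addRequest = Client.Add("com.unity.xr.interaction.toolkit");
        EditorApplication.update += Progress;
    }

    static void Progress()
    {
        if (!addRequest.IsCompleted)
            return;

        if (addRequest.Status == StatusCode.Success)
            Debug.Log("Installed " + addRequest.Result.displayName);
        else
            Debug.LogError(addRequest.Error.message);

        EditorApplication.update -= Progress;
    }
}
```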
The Unity XR Interaction Toolkit requires the new input system and uses its own set of interaction layers instead of the previously used Unity layers. After the package is imported, Unity may prompt you to switch to the new input system and to automatically convert the layers to the new system. Click Yes on the first pop-up to change the input system, and Cancel on the second if you are creating a new project. If you are upgrading an existing project, you might want to select the I Made a Backup, Go Ahead! option. After making a backup, of course!
Since we will later need the Starter Assets (formerly Default Input Actions), import them as well. They are listed under Samples.
In the Project Settings, install XR Plug-in Management. Afterwards, select the device category you want to support with your app, for instance Windows Mixed Reality devices.
Depending on your device type, enable Oculus or OpenXR. If you want to use the device simulator, check Unity Mock HMD.
If you are using Unity 2021 or later, you might notice that you can no longer select Magic Leap Zero Iteration or Windows Mixed Reality from this list. This is because these plug-ins have been deprecated. Select OpenXR instead.
If you are upgrading an existing project, also make sure to remove the deprecated packages from the Package Manager, e.g. the Windows XR Plugin if you are upgrading from a WMR project.
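Removing a package can be scripted with the same Package Manager API. A minimal sketch follows; the identifier com.unity.xr.windowsmr is an assumption, so verify the exact name of the deprecated package in your Package Manager first.

```csharp
// Editor-only sketch: removes the deprecated Windows XR Plugin by name.
using UnityEditor;
using UnityEditor.PackageManager;

public static class DeprecatedXrPackageRemover
{
    [MenuItem("Tools/Remove Windows XR Plugin")]
    public static void Remove()
    {
        // Fire-and-forget: check the Package Manager window or the Console
        // afterwards to confirm the package is gone.
        Client.Remove("com.unity.xr.windowsmr");
    }
}
```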
This example shows how to set up OpenXR in the XR Plug-in Management for Windows Mixed Reality devices, but setting it up for other hardware works the same way.
First, you will notice the warning next to OpenXR:
Click on it to open the OpenXR Project Validator.
This will tell you that you have not selected an interaction profile. Press Edit.
This will open the OpenXR tab of the XR Plug-in Management in the Project Settings. In the Interaction Profiles section, you can add a corresponding interaction profile, e.g. the Microsoft Motion Controller Profile for WMR controllers.
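If you script your project setup, the same profile can be toggled through the OpenXR settings API. The following is a minimal sketch, assuming the OpenXR plug-in is installed and that OpenXRSettings.Instance resolves to the build target you are configuring in the editor.

```csharp
// Editor-only sketch: enables the Microsoft Motion Controller interaction
// profile via the OpenXR settings API instead of the Project Settings UI.
using UnityEditor;
using UnityEngine;
using UnityEngine.XR.OpenXR;
using UnityEngine.XR.OpenXR.Features.Interactions;

public static class OpenXrProfileSetup
{
    [MenuItem("Tools/Enable WMR Interaction Profile")]
    public static void EnableWmrProfile()
    {
        var profile = OpenXRSettings.Instance.GetFeature<MicrosoftMotionControllerProfile>();
        if (profile != null)
        {
            profile.enabled = true;
            // Mark the settings asset dirty so the change is saved with the project.
            EditorUtility.SetDirty(OpenXRSettings.Instance);
            Debug.Log("Microsoft Motion Controller Profile enabled.");
        }
        else
        {
            Debug.LogWarning("Profile feature not found; is the OpenXR plug-in installed?");
        }
    }
}
```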
In the main menu bar, click on GameObject > XR > XR Origin (action based).
If you now run the app, your head tracking should already work, but your controllers won't. We will fix this with the next two steps.
Select the XR Origin and add the Input Action Manager component. Then, add the XRI Default Input Actions from the imported sample as an action asset.
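Normally you do this entirely in the Inspector, but for reference, here is a minimal sketch of the same wiring in code. The field name is just an example; assign the XRI Default Input Actions asset from the imported Starter Assets sample to it.

```csharp
// Minimal sketch: attaches an Input Action Manager to the XR Origin at runtime
// and registers the XRI Default Input Actions asset so its actions get enabled.
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.InputSystem;
using UnityEngine.XR.Interaction.Toolkit.Inputs;

public class XrOriginInputSetup : MonoBehaviour
{
    [SerializeField] InputActionAsset xriDefaultInputActions;

    void Awake()
    {
        var manager = gameObject.AddComponent<InputActionManager>();
        manager.actionAssets = new List<InputActionAsset> { xriDefaultInputActions };

        // The component enables its assets in OnEnable; since we assigned them
        // after it was added, enable them explicitly here.
        manager.EnableInput();
    }
}
```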
Replace the XR controller components on the rig's Hand Controller GameObjects with the default ones from the sample.
Now your controllers are tracked, and you can see the interaction rays move with your controllers. Since no controller model is provided, you will not see the controllers themselves. As a quick solution, you can add proxy objects as children of the Hand Controller GameObjects. However, we recommend referencing a proper controller prefab in the XR controller component.
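As a quick placeholder, such a proxy visual can also be spawned from script. Below is a minimal sketch, assuming a hypothetical controllerVisualPrefab field assigned in the Inspector on each Hand Controller object; a proper setup assigns the prefab to the XR controller's Model Prefab field instead.

```csharp
// Minimal sketch: instantiates a placeholder visual under the hand controller
// so it follows the tracked pose.
using UnityEngine;

public class ControllerVisualSpawner : MonoBehaviour
{
    [SerializeField] Transform controllerVisualPrefab;

    void Start()
    {
        // Parent the visual to the controller object and reset its local pose.
        var visual = Instantiate(controllerVisualPrefab, transform);
        visual.localPosition = Vector3.zero;
        visual.localRotation = Quaternion.identity;
    }
}
```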
There are further tutorials on Unity Learn that explain, for instance, how to add teleportation. If you need further help, we invite you to join our VR Builder Community on Discord.