
Eyes, hands, simulation, and samples: What’s new in Unity XR Interaction Toolkit 2.3

March 13, 2023 in Engine & platform | 8 min. read

The XR Interaction Toolkit (XRI) is a high-level, component-based interaction system for creating VR and AR experiences. It provides a common framework for interactions and streamlines cross-platform creation. This update adds three key features: eye gaze and hand tracking for more natural interactions, audiovisual affordances to bring interactions to life, and an improved device simulator to test in-Editor. To help you get started, let’s explore each addition in more detail.

For a more in-depth breakdown of the update, check out what’s new in XRI 2.3, or explore the sample project.

XR developer and founder of LearnXR.io, Dilmer Valecillos, has put together an awesome video tutorial on XRI 2.3.

Full support for articulated hand tracking

Along with XRI 2.3, we’re shipping the Unity XR Hands package in prerelease. XR Hands is a new XR subsystem that adds APIs for hand tracking in Unity. At release, it includes built-in support for OpenXR, with support for Meta platforms soon to follow. In addition, external hardware providers can pipe in hand-tracking data from their existing XR SDKs by following the provided API documentation.
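
To give a feel for the API, here’s a minimal sketch of reading joint data through XR Hands: it polls the active XRHandSubsystem each frame and logs the pose of the right index fingertip. Treat it as a starting point rather than a complete hand-tracking setup.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.Hands;

// Minimal sketch: poll the XR Hands subsystem and log the right index fingertip pose.
public class IndexTipLogger : MonoBehaviour
{
    static readonly List<XRHandSubsystem> s_Subsystems = new List<XRHandSubsystem>();
    XRHandSubsystem m_Subsystem;

    void Update()
    {
        if (m_Subsystem == null || !m_Subsystem.running)
        {
            // Look for a hand subsystem created by the active XR loader (e.g., OpenXR).
            SubsystemManager.GetSubsystems(s_Subsystems);
            m_Subsystem = s_Subsystems.Count > 0 ? s_Subsystems[0] : null;
            if (m_Subsystem == null)
                return;
        }

        var hand = m_Subsystem.rightHand;
        if (!hand.isTracked)
            return;

        var joint = hand.GetJoint(XRHandJointID.IndexTip);
        if (joint.TryGetPose(out Pose pose))
            Debug.Log($"Right index tip (tracking space): {pose.position}");
    }
}
```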

This release of XRI includes the Hands Interaction Demo, a sample package showcasing a hand interaction setup that lets you switch between hands and controllers on-device without changing anything in your scene. Using this functionality, your content can start with a standard controller setup, then transition seamlessly to hands for specific tasks or natural interactions in gameplay.

XRI 2.3 also supports natural poking interactions through the XR Poke Interactor, which lets you use hands or controllers to poke 3D UI or XRI-enabled UGUI Canvas elements.
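
The Poke Interactor itself typically lives on your hand or controller rig, so the code side is mostly about the target. Here’s a hedged sketch of a simple 3D poke target built from a standard XRSimpleInteractable and a collider; depending on your poke settings you may also want a poke filter on the target to constrain press direction and depth, which this sketch leaves out.

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Sketch of a pokeable 3D button: an interactable whose select events fire
// when an XR Poke Interactor (hand fingertip or controller) presses it.
[RequireComponent(typeof(BoxCollider))]
public class PokeButton : MonoBehaviour
{
    void Awake()
    {
        var interactable = gameObject.AddComponent<XRSimpleInteractable>();
        interactable.selectEntered.AddListener(_ => Debug.Log("Button pressed"));
        interactable.selectExited.AddListener(_ => Debug.Log("Button released"));
    }
}
```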

Interacting using eye gaze

New headsets like the HoloLens 2, Meta Quest Pro, and PlayStation® VR2 include sensors to track where users are looking. Gaze-based interactions can help you build XR apps that feel more natural and provide an additional way to engage with content. To support this type of interaction, we have introduced the XR Gaze Interactor, driven by eye-gaze or head-gaze poses. You can use this interactor for direct manipulation, like hovering or selecting by dwelling on interactables.
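
On the interactable side, gaze hover and dwell-based selection are opt-in. The sketch below turns those options on from code; the property names mirror the Allow Gaze Interaction, Allow Gaze Select, and Gaze Time To Select fields on an XRI 2.3 interactable in the Inspector, so verify them against your package version.

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Sketch: opt an interactable into gaze hover and dwell-based selection.
// Property names follow the XRI 2.3 interactable Inspector fields.
public class GazeDwellTarget : MonoBehaviour
{
    [SerializeField] float m_DwellSeconds = 1.0f;

    void Awake()
    {
        var interactable = GetComponent<XRBaseInteractable>();
        interactable.allowGazeInteraction = true;      // let the XR Gaze Interactor hover this object
        interactable.allowGazeSelect = true;           // allow selection by dwelling
        interactable.overrideGazeTimeToSelect = true;  // use our own dwell time instead of the interactor default
        interactable.gazeTimeToSelect = m_DwellSeconds;
    }
}
```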

Since we generally don’t recommend that apps be controlled entirely with eyes, we have introduced an additional form of controller and hand-based interaction assistance to help users select specific objects: the XR Interactable Snap Volume. This component complements the gaze interactor, as it allows for snapping interactions to a nearby interactable when aiming at a defined area around an object. Snap volumes can also be used without the gaze interactor to enable easier object selection for users.
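
A snap volume is usually set up in the Inspector: a trigger collider noticeably larger than the object, plus a reference to the interactable it should snap to. Here’s a rough runtime equivalent; treat the property names as assumptions to double-check against the component’s documentation.

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Rough sketch: wrap an interactable in a larger trigger volume so ray or gaze
// aim snaps to it when pointing near the object.
public class SnapVolumeSetup : MonoBehaviour
{
    [SerializeField] XRBaseInteractable m_Interactable;  // the object to snap to

    void Awake()
    {
        // Trigger collider larger than the interactable's own colliders.
        var trigger = gameObject.AddComponent<SphereCollider>();
        trigger.isTrigger = true;
        trigger.radius = 0.15f;

        var snapVolume = gameObject.AddComponent<XRInteractableSnapVolume>();
        snapVolume.interactable = m_Interactable;  // assumed property names; verify in your XRI version
        snapVolume.snapCollider = trigger;
    }
}
```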

Tobii, a global leader in eye-tracking technology, assisted with the concepts and research. If you’re interested in learning more, you can browse their knowledge base of eye-tracking concepts.

Bringing interactions to life with affordances

Using hands for interaction is different from using controllers in that there’s no haptic or tactile feedback to confirm when an interaction takes place. The affordance system, a set of performant components that animates objects or triggers sound effects in reaction to an object’s interaction state, helps mitigate this feedback gap. This system is built to work with any combination of interactor and interactable in both new and existing projects.
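
The affordance system itself is assembled from state providers, themes, and receiver components in the Inspector rather than per-object scripts. Purely to illustrate the kind of feedback it automates, here’s a hand-rolled stand-in that tints a material on hover and plays a clip on select using the standard interactable events; it is not the affordance system API.

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Hand-rolled stand-in for affordance-style feedback: tint on hover, sound on select.
// Assumes a Renderer and an XRBaseInteractable (e.g., XRGrabInteractable) on this GameObject.
public class SimpleAffordanceFeedback : MonoBehaviour
{
    [SerializeField] AudioSource m_AudioSource;   // assumed to reference a short select clip
    [SerializeField] Color m_HoverColor = Color.cyan;

    Renderer m_Renderer;
    Color m_BaseColor;

    void Awake()
    {
        m_Renderer = GetComponent<Renderer>();
        m_BaseColor = m_Renderer.material.color;

        var interactable = GetComponent<XRBaseInteractable>();
        interactable.hoverEntered.AddListener(_ => m_Renderer.material.color = m_HoverColor);
        interactable.hoverExited.AddListener(_ => m_Renderer.material.color = m_BaseColor);
        interactable.selectEntered.AddListener(_ => m_AudioSource.Play());
    }
}
```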

Stretch, swing, and spin using both hands

The new XR General Grab Transformer reduces hierarchy complexity by letting one general-purpose transformer support both single-handed and two-handed interactions on an interactable, rather than requiring multiple grab transformers. It also enables two-handed scaling, letting you scale objects up and down by moving your hands apart or together, similar to pinch-zooming on a mobile phone.
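
As a minimal sketch of that setup (assuming the object already has a collider to grab), the interactable is allowed to hold two interactors at once and a single general grab transformer is registered for both single- and two-handed grabs:

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;
using UnityEngine.XR.Interaction.Toolkit.Transformers;

// Sketch: one grab transformer handling one- and two-handed move, rotate, and scale.
public class TwoHandedGrabSetup : MonoBehaviour
{
    void Awake()
    {
        var grab = gameObject.AddComponent<XRGrabInteractable>();

        // Let two interactors (for example, both hands) hold the object at the same time.
        grab.selectMode = InteractableSelectMode.Multiple;

        // Register the general-purpose transformer for both single and multiple grabs.
        var transformer = gameObject.AddComponent<XRGeneralGrabTransformer>();
        grab.AddSingleGrabTransformer(transformer);
        grab.AddMultipleGrabTransformer(transformer);
    }
}
```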

We’ve also added an Interaction Group component. This behavior lets a developer group interactors together and sort them by priority, so that only a single interactor per group can interact at a given time. For example, when a Poke, Direct, and Ray Interactor are grouped together, poking a button temporarily blocks the other interactors from interacting with the scene. This can keep you from accidentally grabbing something nearby while you’re working at a distance, and prevents rays from shooting into the scene while you’re grabbing or poking an object up close.
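
Groups are typically assembled in the Inspector through the component’s starting members list, but here’s a hedged runtime sketch of the same idea, with priority set by the order the members are added (earliest wins):

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Sketch: group a hand's interactors so only the highest-priority one with a valid
// target interacts at a time (poke beats direct grab, which beats the far ray).
public class HandInteractorGroup : MonoBehaviour
{
    [SerializeField] XRPokeInteractor m_Poke;
    [SerializeField] XRDirectInteractor m_Direct;
    [SerializeField] XRRayInteractor m_Ray;

    void Start()
    {
        var group = gameObject.AddComponent<XRInteractionGroup>();

        // Members added earlier get higher priority within the group.
        group.AddGroupMember(m_Poke);
        group.AddGroupMember(m_Direct);
        group.AddGroupMember(m_Ray);
    }
}
```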

Iterating for XR without a headset just got easier

Testing XR apps on a headset is important, but testing in-Editor helps reduce iteration time. In this release, the XR Device Simulator received a major usability update with a new onscreen UI widget that makes it easier to see what inputs drive the simulator, and which ones are currently active.

New simulation modes have also been added so you can toggle between commonly used control schemes. At startup, the device simulator activates the new first-person shooter (FPS) mode, which manipulates the headset and controllers together, as if the player were turning their whole body. You can then cycle through the other modes to manipulate individual devices: the headset, the left controller, and the right controller. To use the XR Device Simulator, import the sample from the Package Manager.

Take a tour of our new XRI sample project

It’s been a long time coming, and our updated sample project is finally here. It showcases the array of XR experience building blocks you can use in XRI 2.3. The project is divided into stations that help you understand how each major feature of XRI works, and includes both simple and advanced examples for each. You can access the sample project on GitHub and use it to kick-start your next XR app.

Looking ahead

Though it’s still early days for eyes and hands in the XR Interaction Toolkit, we’re always working to make building expressive XR experiences easier. As we head towards XRI 2.4 and beyond, we would appreciate your feedback. We’d also love to see what you build with these tools, so feel free to include the hashtag #unityXRI when posting on social media.
