This blog is the latest in Games Focus, a series that shares what Unity is doing for all game developers – now, next year, and in the future. This installment covers the status, upcoming release plans, and future vision for XR.
I’m Tarrah Wilson, director of product for XR at Unity, where I drive product strategy for augmented, virtual, and mixed reality. I’ve been on the ground floor of building XR experiences for the last six years, including shipping HoloLens and social VR apps at Microsoft.
Prior to working in XR, I made worldbuilding games and 3D creation apps with a heavy focus on user-generated content. The XR leadership team at Unity shares a passion for the practical use of XR in people’s everyday lives and for building the best tools for creators and developers. Scott Flynn, Dave Ruddell, Dorrene Brown, and Will McDonald are colleagues from the XR team who have also contributed technical and engineering expertise to this blog.
We’re fortunate to have an incredible XR game developer community building up around Unity during a period of rapid growth and evolution in XR technology and hardware. We anticipate that the XR space will continue to evolve – pushing us and all of you to keep up with new input paradigms alongside ever more capable hardware, world understanding, and user expectations.
In this edition of the Games Focus series, we’ll cover newly supported hardware and updates to OpenXR and AR Foundation. With XRI (XR Interaction Toolkit), we now provide key interactions like natural object selection and manipulation, common locomotion patterns, and support for touchscreen gestures when combined with AR Foundation.
Making XR multiplatform has always been essential to giving customers maximum choice and flexibility. That’s why we prioritize building API abstractions and features over rapidly changing platform APIs and capabilities. We’ve invested heavily in low-level abstractions that make supporting a wide variety of XR hardware possible while providing you with a unified offering. We also support the growing OpenXR standard so you can reach more users on more headsets.
As we continue to invest in reducing fragmentation at the device and SDK layer, we also recognize the growing fragmentation and challenges in managing input, sensor data, testing, and iteration time. To address these challenges, we’re continuing to build out both AR Foundation and XRI.
Unity’s XR support spans handheld AR platforms (ARKit for iOS, ARCore for Android), mixed reality devices such as Magic Leap and HoloLens, and fully immersive VR HMDs like Meta Quest (including the recently announced Meta Quest Pro) and PlayStation®VR. It also supports OpenXR plug-ins for conformant runtimes.
Unity’s OpenXR support has continued to evolve as well. In recent versions, we have added:
The recently released AR Foundation 5.0 adds two important quality-of-life features: Simulation and the AR Debug Menu. For those unfamiliar with AR Foundation, it is Unity’s abstraction layer that delivers “build once, run anywhere” convenience for developing AR experiences. Platform-specific provider packages relieve you of having to interface with different platform SDKs: you write to a common C# API instead. It’s also extensible, so third-party platforms can add new features.
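As a rough sketch of what writing to that common C# API looks like, the component below subscribes to plane detection events through AR Foundation’s ARPlaneManager; the same code runs unchanged whether the provider underneath is ARKit, ARCore, or another platform. The class name, field wiring, and logging here are illustrative, not from an official sample:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Illustrative sketch: logs newly detected planes via AR Foundation's
// cross-platform API. Attach to a GameObject in an AR scene and assign
// the scene's ARPlaneManager in the Inspector.
public class PlaneLogger : MonoBehaviour
{
    [SerializeField] ARPlaneManager planeManager;

    void OnEnable()  => planeManager.planesChanged += OnPlanesChanged;
    void OnDisable() => planeManager.planesChanged -= OnPlanesChanged;

    void OnPlanesChanged(ARPlanesChangedEventArgs args)
    {
        // args also exposes updated and removed planes.
        foreach (var plane in args.added)
            Debug.Log($"Detected plane {plane.trackableId} ({plane.alignment})");
    }
}
```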
AR Foundation’s Simulation frees you from the time-consuming loop of building, deploying to a device, and then testing on that device by letting you test your app directly in the Editor. On pressing Play, AR Foundation loads a simulation environment and detects planes, markers, and other AR features, feeding them to the application as the player navigates the environment. Out of the box, you’ll find a variety of environments representing different indoor and outdoor use cases, and you can also build your own to model a specific use case or target environment.
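To make the iteration benefit concrete, here is a minimal, hypothetical tap-to-place interaction built on AR Foundation raycasts. Because Simulation feeds detected planes to the app the way a device would, logic like this can be exercised in Play mode before ever deploying to hardware; the prefab and field names are assumptions for the sketch:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Illustrative sketch: places a prefab where a screen tap hits a detected plane.
public class TapToPlace : MonoBehaviour
{
    [SerializeField] ARRaycastManager raycastManager;
    [SerializeField] GameObject placementPrefab;
    static readonly List<ARRaycastHit> s_Hits = new List<ARRaycastHit>();

    void Update()
    {
        if (Input.touchCount == 0) return;
        var touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Began) return;

        // Raycast against planes detected by AR Foundation (real or simulated).
        if (raycastManager.Raycast(touch.position, s_Hits, TrackableType.PlaneWithinPolygon))
        {
            var pose = s_Hits[0].pose;
            Instantiate(placementPrefab, pose.position, pose.rotation);
        }
    }
}
```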
The AR Debug Menu shows you information and available configurations while your application is running on the device. It displays statistics including the current FPS and tracking mode, and you can add visualizers to help you understand how the device is perceiving the world or to show a device’s available configurations and capabilities.
Unity’s XR Interaction Toolkit (or XRI) provides a high-level system for creating interactive XR experiences. It’s an interaction framework that’s designed to make it easier to translate input into interactions with 3D and UI objects. It does this by providing an abstraction layer over the Input System package (or legacy XR Input if required), so you can build against individual actions rather than specific input devices.
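As a sketch of what building against actions rather than devices looks like, the component below reads a generic “activate” action through the Input System; the action can be rebound to a controller trigger, a keyboard key for testing, or another source without touching the code. The class, field, and action names are placeholders:

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

// Illustrative sketch: reacts to an input *action* rather than a specific device.
// Bind activateAction (e.g., to a controller trigger) in the Inspector.
public class ActivateOnAction : MonoBehaviour
{
    [SerializeField] InputActionProperty activateAction;

    void OnEnable()  => activateAction.action.Enable();
    void OnDisable() => activateAction.action.Disable();

    void Update()
    {
        if (activateAction.action.WasPressedThisFrame())
            Debug.Log("Activate pressed – device-agnostic via the Input System");
    }
}
```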
XRI has matured so that it is now the foundation for the latest Microsoft Mixed Reality Toolkit (MRTK3). The XRI package allows you to quickly add grabbable objects, a locomotion system, UGUI interaction, and more, simply by setting up your scenes with the provided components.
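For example, making an object grabbable is largely a matter of adding the right component. The sketch below does so at runtime with XRGrabInteractable; the movement-type choice and logging are illustrative, and in practice you would typically add the component in the Editor as described above — runtime setup just keeps the sketch self-contained:

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Illustrative sketch: makes this object grabbable by XRI interactors
// (e.g., ray or direct interactors on the controllers).
[RequireComponent(typeof(Rigidbody))]
public class MakeGrabbable : MonoBehaviour
{
    void Awake()
    {
        var grab = gameObject.AddComponent<XRGrabInteractable>();
        // Track the hand with physics so the object collides naturally.
        grab.movementType = XRBaseInteractable.MovementType.VelocityTracking;
        grab.selectEntered.AddListener(args =>
            Debug.Log($"{name} grabbed by {args.interactorObject.transform.name}"));
    }
}
```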
With the release of the latest versions of XRI, the package now includes the following support:
For more information about what’s new in XRI, please see our documentation. XRI’s main features and capabilities are demonstrated in the included Starter Assets sample package. You can also check them out in our upcoming XRI Examples project, launching soon on GitHub.
We have extended and improved device support in the 2022.2 beta with the addition of Magic Leap 2, including newly supported segmented dimming. The same release adds support for PlayStation®VR2, including foveated rendering for improved performance and graphics fidelity.
We’ll be releasing a new sample package for XRI on GitHub toward the end of the year, which will include examples of hand tracking, interaction assistance using eye gaze, two-handed object interaction, physics-based components like hinged doors and sliding drawers, 2D and 3D UI controls, and more.
Building on the OpenXR standard, we will be introducing a multiplatform solution for hand and eye input as well. These new inputs will be made available through our XR Interaction Toolkit early next year.
We’re thrilled for 2023 to be an incredible year in XR for our customers and the industry as a whole. XR hardware is moving in an exciting direction – not only with big improvements in performance but also with new capabilities. New hardware functionality, such as passthrough video, hand tracking, and gaze or eye tracking, promises to unlock an entirely new set of interaction models and opportunities for creators to bring more immersive and intuitive experiences to their customers. We plan to continue innovating alongside the growing XR ecosystem to bring Unity developers the very best tools for creating exciting and engaging games in AR, VR, and the emerging mixed reality space.
You can learn more about VR and AR development, as well as find examples of successful Made with Unity games, on our solutions web page. You can also check out a recent Creator Spotlight with Ramen VR on the team’s VR MMO Zenith: The Last City, or a case study outlining how Fictioneers used AR Foundation to build a city-scale AR app.
That’s not all. Take time to discover some of our other incredible XR resources:
We are best guided by the games you make and the challenges you encounter along the way. It’s our job to remove the obstacles, so we’d love to get to know you more. Tell us what game you are building next: Which features are you using and why? What do you think of our vision and our direction?
Stay tuned for the next Games Focus update and don’t forget to join us at Unite 2022 on Tuesday, November 1, where you’ll learn even more about the Unity vision.