
This blog is the latest in Games Focus, a series that shares what Unity is doing for all game developers – now, next year, and in the future. This installment covers the status, upcoming release plans, and future vision for XR.

I’m Tarrah Wilson, director of product for XR at Unity, where I drive product strategy for augmented, virtual, and mixed reality. I’ve been on the ground floor of building XR experiences for the last six years, including shipping HoloLens and social VR apps at Microsoft.

Prior to working in XR, I made worldbuilding games and 3D creation apps with a heavy focus on user-generated content. The XR leadership team at Unity shares a passion for the practical use of XR in people’s everyday lives and for building the best tools for creators and developers. Scott Flynn, Dave Ruddell, Dorrene Brown, and Will McDonald are colleagues from the XR team who have also contributed technical and engineering expertise to this blog.

We’re fortunate to have an incredible XR game developer community building up around Unity during a period of rapid growth and evolution in XR technology and hardware. We anticipate that the XR space will continue to evolve – pushing us and all of you to keep up with new input paradigms alongside ever more capable hardware, world understanding, and user expectations.

In this edition of the Games Focus series, we’ll cover newly supported hardware and updates to OpenXR and AR Foundation. With XRI (XR Interaction Toolkit), we now provide key interactions like natural object selection and manipulation, common locomotion patterns, and support for touchscreen gestures when combined with AR Foundation.

XR

Meta Quest Pro

Making XR multiplatform has always been essential to giving customers maximum choice and flexibility. That’s why we prioritize building API abstractions and features on top of rapidly changing platform APIs and capabilities. We’ve invested heavily in low-level abstractions to make support for a wide variety of XR hardware possible while providing you with a unified offering. We also support the growing OpenXR standard so you can reach more users on more headsets.

As we continue to invest in reducing fragmentation at the device and SDK layer, we also recognize the growing fragmentation and the challenges of managing input, sensor data, testing, and iteration time. To address these challenges, we’re continuing to build out both AR Foundation and XRI.

What’s ready for you today

Unity’s XR support covers handheld AR platforms (ARKit for iOS and ARCore for Android), mixed reality devices such as Magic Leap and HoloLens, and fully immersive VR HMDs like Meta’s Quest (including the recently announced Meta Quest Pro) and PlayStation®VR. It also supports OpenXR plug-ins for conformant runtimes.

Still from Ramen VR's Zenith: The Last City

Unity’s OpenXR support has continued to evolve as well. In recent versions, we have added:

  • Generic Android loader, enabling you to make one binary target for all Android XR devices 
  • Support for foveated rendering and motion vector-based space warp
  • Support for 3+ views (enabling flexible support for future XR devices)
  • The Oculus XR and HoloLens integrations for Unity now natively support OpenXR

The recently released AR Foundation 5.0 adds two important new quality-of-life features: Simulation and the AR Debug Menu. For those unfamiliar with it, AR Foundation is Unity’s abstraction layer that delivers “build once, run anywhere” convenience for developing AR experiences. Platform-specific provider packages relieve you of having to interface with different platform SDKs; instead, you write to a common C# API. It’s also extensible, so third-party platforms can add new features.
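
To make that concrete, here’s a minimal sketch of what writing to the common C# API looks like: a hypothetical component that listens for detected planes through ARPlaneManager. The manager and its event are part of AR Foundation; the script itself is illustrative, and the same code runs unchanged whether ARKit, ARCore, or another provider package supplies the data.

    using UnityEngine;
    using UnityEngine.XR.ARFoundation;

    // Illustrative sketch: log planes detected by whichever platform
    // provider is active. No platform-specific code is required.
    [RequireComponent(typeof(ARPlaneManager))]
    public class PlaneLogger : MonoBehaviour
    {
        ARPlaneManager m_PlaneManager;

        void OnEnable()
        {
            m_PlaneManager = GetComponent<ARPlaneManager>();
            m_PlaneManager.planesChanged += OnPlanesChanged;
        }

        void OnDisable()
        {
            m_PlaneManager.planesChanged -= OnPlanesChanged;
        }

        void OnPlanesChanged(ARPlanesChangedEventArgs args)
        {
            // added, updated, and removed hold ARPlane trackables
            // regardless of which platform SDK produced them.
            foreach (var plane in args.added)
                Debug.Log($"New plane {plane.trackableId} ({plane.alignment})");
        }
    }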

AR Foundation’s Simulation lets you test your app directly in the Editor, freeing you from the time-consuming loop of building, deploying to a device, and testing on that device while you develop. On pressing Play, AR Foundation loads a simulation environment and detects planes, markers, and other AR features, feeding them to the application as a player navigates the environment. Out of the box, you’ll find a variety of environments representing different indoor and outdoor use cases, and you can also build your own to model a specific use case or target environment.

The AR Debug Menu shows you information and available configurations while your application is running on the device. It displays statistics such as the current FPS and tracking mode, and you can add visualizers to help you understand how the device perceives the world or to show the device’s available configurations and capabilities.

Image from Rune Skovbo Johansen's Eye of the Temple

Unity’s XR Interaction Toolkit (or XRI) provides a high-level system for creating interactive XR experiences. It’s an interaction framework that’s designed to make it easier to translate input into interactions with 3D and UI objects. It does this by providing an abstraction layer over the Input System package (or legacy XR Input if required), so you can build against individual actions rather than specific input devices. 

XRI has matured so that it is now the foundation for the latest Microsoft Mixed Reality Toolkit (MRTK3). The XRI package allows you to quickly add grabbable objects, a locomotion system, UGUI interaction, and more, simply by setting up your scenes with the provided components.
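
As a rough illustration of how little setup that takes, here’s a hypothetical script that makes an ordinary GameObject grabbable at runtime. The components are from the XRI package; in practice you’d typically add them in the Inspector instead.

    using UnityEngine;
    using UnityEngine.XR.Interaction.Toolkit;

    // Illustrative sketch: turn this GameObject into a grabbable
    // interactable. XRGrabInteractable adds a Rigidbody automatically
    // if one is missing, but it needs a Collider to be targetable.
    public class MakeGrabbable : MonoBehaviour
    {
        void Awake()
        {
            if (GetComponent<Collider>() == null)
                gameObject.AddComponent<BoxCollider>();

            var grab = gameObject.AddComponent<XRGrabInteractable>();

            // Track the controller's velocity so thrown objects feel natural.
            grab.movementType = XRBaseInteractable.MovementType.VelocityTracking;

            grab.selectEntered.AddListener(args =>
                Debug.Log($"{name} grabbed by {args.interactorObject.transform.name}"));
        }
    }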

The latest versions of XRI add support for the following:

  • Multi-hand grabbing and manipulation through our grab transformer system
  • Expanded starter Prefabs that allow you to bootstrap your project with the most common XR interaction and locomotion settings
  • Additional locomotion options such as directional teleportation, flight, and grab-to-move locomotion, which lets players pull and rotate themselves by grabbing the world around them (see the sketch after this list)
  • Intention filtering to make selecting objects more intuitive by using input characteristics such as where a user is looking or how close their hand is to the edge of an object
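
As a hedged sketch of how these locomotion pieces fit together in code, the following hypothetical script wires up teleportation using XRI’s locomotion components. It assumes an XR Origin rig and a floor object with a Collider already exist in the scene; the starter Prefabs mentioned above set most of this up for you.

    using Unity.XR.CoreUtils;
    using UnityEngine;
    using UnityEngine.XR.Interaction.Toolkit;

    // Illustrative sketch: set up teleportation locomotion at runtime.
    public class TeleportSetup : MonoBehaviour
    {
        [SerializeField] GameObject m_Floor; // flat geometry with a Collider

        void Start()
        {
            // The LocomotionSystem mediates access to the XR Origin so
            // providers (teleport, move, turn) don't fight over it.
            var locomotion = gameObject.AddComponent<LocomotionSystem>();
            locomotion.xrOrigin = FindObjectOfType<XROrigin>();

            var teleport = gameObject.AddComponent<TeleportationProvider>();
            teleport.system = locomotion;

            // Mark the floor as a valid teleport destination.
            var area = m_Floor.AddComponent<TeleportationArea>();
            area.teleportationProvider = teleport;
        }
    }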

For more information about what’s new in XRI, please see our documentation. XRI’s main features and capabilities are demonstrated in the included Starter Assets sample package. You can also check them out in our upcoming XRI Examples project, launching soon on GitHub.

What’s next

We have extended and improved our device support in the 2022.2 beta with the addition of Magic Leap 2, including newly supported segmented dimming. The same release adds support for PlayStation®VR2, including foveated rendering for improved performance and graphics fidelity.

We’ll be releasing a new sample package for XRI on GitHub toward the end of the year, which will include examples for hand tracking, interaction assistance using eye gaze, two-handed object interaction, physics components like hinged doors and sliding drawers, 2D and 3D UI controls, and more.

Building on the OpenXR standard, we will be introducing a multiplatform solution for hand and eye input as well. These new inputs will be made available through our XR Interaction Toolkit early next year.

We expect 2023 to be an incredible year in XR for our customers and the industry as a whole. XR hardware is moving in an exciting direction – not only with big improvements in performance but also with new capabilities. New hardware functionality, such as passthrough video, hand tracking, and gaze or eye tracking, promises to unlock an entirely new set of interaction models and opportunities for creators to bring more immersive and intuitive experiences to their customers. We plan to continue to innovate alongside the growing XR ecosystem to bring Unity developers the very best tools to create exciting and engaging games in AR, VR, and the emerging mixed reality space.

Resources

You can learn more about VR and AR development, as well as find examples of some of the successful Made with Unity games, on our solutions web page. You can also check out a recent Creator Spotlight with Ramen VR on the team’s VR MMO Zenith: The Last City, or a case study outlining how Fictioneers used AR Foundation to build a city-scale AR app.


That’s not all. Take time to discover some of our other incredible XR resources:

  • Explore OpenXR and how Unity enables conformant runtimes via the OpenXR Plugin manual pages. 
  • Check out an informative session from AWE 2022 that covers the simulation feature in AR Foundation 5.0 (find more details in our forum).
  • Learn about XRI and MRTK3 from our session at Microsoft’s Mixed Reality Dev Day earlier this year.
  • Find out more about our support for PlayStation VR2 from a 2022 GDC session.
  • Uncover all that’s available through the new Unity Learn VR pathway, designed to help those who want to learn how to build for VR with Unity.

We are best guided by the games you make and the challenges you encounter along the way. It’s our job to remove the obstacles, so we’d love to get to know you more. Tell us what game you are building next: Which features are you using and why? What do you think of our vision and our direction?

Stay tuned for the next Games Focus update and don’t forget to join us at Unite 2022 on Tuesday, November 1, where you’ll learn even more about the Unity vision.
