
Get started developing mixed reality for Meta Quest 3 with Unity

June 20, 2023 in Engine & platform | 5 min. read
Caucasian man seated on couch wearing Meta Quest 3 headset while game characters and sets unfold around him in mixed reality.


We’re excited to unveil a new way for you to create captivating, cross-platform, immersive experiences for Meta Quest. In this blog, we detail a brand-new preview of mixed reality development tools for Meta Quest 3, Meta Quest 2, and Meta Quest Pro, powered by OpenXR and Unity’s AR Foundation. With this release, get ready to revolutionize the way you interact with the world around you.

African American woman in action, smiling while wearing a Meta Quest 3 headset in a living room interior.

The power of mixed reality

Mixed reality enables you to interact with digital content in the real world, enhancing your surroundings with virtual objects, characters, and experiences. Advanced sensors and tracking technologies allow for precise mapping of the physical environment and accurate placement of virtual content within it. Mixed reality also enhances the way we perceive and engage with our surroundings, offering a truly transformative and immersive user experience. Unity’s new toolset aims to provide you with the resources you need to create compelling cross-platform mixed reality experiences for Quest devices.

OpenXR and AR Foundation

OpenXR is a royalty-free standard that simplifies AR and VR development by enabling applications to reach a wide range of hardware without the need to rewrite code. Developed by a consortium of industry leaders, OpenXR’s interoperability makes it easier to create content that reaches a wide audience.

Unity’s AR Foundation is a cross-platform framework, purpose-built for creating applications across mobile and headworn AR/VR devices. It allows developers to create experiences and deploy them to multiple platforms. By leveraging features from common SDKs, such as ARCore, ARKit, and the OpenXR standard, AR Foundation provides a seamless workflow in Unity so you can focus on unleashing your creativity.
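To give a sense of that workflow, here is a minimal sketch of the common AR Foundation pattern for checking device support before starting a session. The class name and the serialized ARSession reference are placeholders for illustration.

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Minimal sketch: check whether AR is supported on the current device
// before enabling the session. ARSession is an AR Foundation component.
public class SessionBootstrap : MonoBehaviour
{
    [SerializeField] ARSession session; // assign in the Inspector (assumed scene reference)

    IEnumerator Start()
    {
        // Ask the platform whether AR is available; this may take a frame or two.
        yield return ARSession.CheckAvailability();

        if (ARSession.state == ARSessionState.Unsupported)
        {
            Debug.Log("AR is not supported on this device.");
        }
        else
        {
            // Enabling the ARSession component starts the underlying XR session.
            session.enabled = true;
        }
    }
}
```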

White-colored Meta Quest 3 headset and controllers floating over ombre pastel background

In preview

We are introducing a preview of AR Foundation support for Quest through a new Meta OpenXR package.

This preview release offers Quest support for essential features such as passthrough, plane detection, device tracking, raycasting, and anchors. It also includes Quest-specific updates for samples like Simple AR, which demonstrates basic plane detection and raycasting, and Anchors, which demonstrates how to create an object that specifies the position and orientation of an item in the physical environment.
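In the spirit of those samples, the sketch below raycasts against detected planes and attaches an anchor at the hit pose. The class, field names, and prefab are hypothetical; ARRaycastManager, ARAnchor, and ARRaycastHit are AR Foundation types, and an ARAnchorManager is assumed to be present in the scene.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Illustrative sketch: raycast against detected planes and place an anchored object
// at the hit pose, similar in spirit to the Simple AR and Anchors samples.
public class TapToAnchor : MonoBehaviour
{
    [SerializeField] ARRaycastManager raycastManager; // assumed to be on the XR Origin
    [SerializeField] GameObject contentPrefab;        // placeholder prefab to attach

    static readonly List<ARRaycastHit> s_Hits = new List<ARRaycastHit>();

    public void PlaceAt(Vector2 screenPoint)
    {
        // Raycast from a screen or controller point against detected planes.
        if (raycastManager.Raycast(screenPoint, s_Hits, TrackableType.PlaneWithinPolygon))
        {
            Pose hitPose = s_Hits[0].pose;

            // Adding an ARAnchor keeps the content locked to that position and
            // orientation in the physical environment. An ARAnchorManager must
            // also exist in the scene for the anchor to be tracked.
            var anchored = Instantiate(contentPrefab, hitPose.position, hitPose.rotation);
            anchored.AddComponent<ARAnchor>();
        }
    }
}
```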

Let’s take a closer look at passthrough and plane detection.

Blend the real world with digital content

With passthrough support, developers can now seamlessly blend the virtual and real worlds, allowing users to see and interact with their physical environment while engaging with virtual content.

Imagine creating games where players can navigate their living rooms or offices while battling virtual enemies, or designing applications that overlay virtual objects onto real-world surfaces with unmatched precision. The possibilities are truly limitless.
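How you wire this up depends on your project, but a minimal sketch might look like the following, under the assumption that passthrough on Quest is surfaced through AR Foundation's camera components: add an ARCameraManager to the XR camera and clear it to a fully transparent color so the real-world feed shows through behind rendered content.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Sketch (assumption: passthrough is exposed via AR Foundation's camera components on Quest).
// A fully transparent camera background lets the passthrough feed show behind virtual content.
[RequireComponent(typeof(Camera))]
[RequireComponent(typeof(ARCameraManager))]
public class PassthroughCamera : MonoBehaviour
{
    void Start()
    {
        var cam = GetComponent<Camera>();

        // Clear to a solid color with zero alpha so the real world remains visible.
        cam.clearFlags = CameraClearFlags.SolidColor;
        cam.backgroundColor = new Color(0f, 0f, 0f, 0f);
    }
}
```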

Understand physical space

Plane detection in AR Foundation opens up a realm of possibilities for developers seeking to create context-aware experiences for Meta Quest. With plane detection, your applications can analyze and interpret the physical environment, allowing virtual objects to interact intelligently with the real world.

Imagine building games where characters navigate obstacles in real time, or designing levels that adapt to different room layouts. AR Foundation’s plane detection for Quest will give you the data you need to understand physical space and push the boundaries of immersion.
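As an illustration, a small component like the one below subscribes to ARPlaneManager's planesChanged event and inspects each newly detected plane's alignment and extents. The class and field names are placeholders; the event and types are part of AR Foundation.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Sketch: react to planes as AR Foundation detects them. ARPlaneManager raises
// planesChanged with the planes that were added, updated, or removed.
public class PlaneListener : MonoBehaviour
{
    [SerializeField] ARPlaneManager planeManager; // assumed to be on the XR Origin

    void OnEnable()  => planeManager.planesChanged += OnPlanesChanged;
    void OnDisable() => planeManager.planesChanged -= OnPlanesChanged;

    void OnPlanesChanged(ARPlanesChangedEventArgs args)
    {
        foreach (var plane in args.added)
        {
            // plane.alignment tells you whether the surface is horizontal or vertical;
            // plane.extents gives its approximate size. Use these to adapt your content.
            Debug.Log($"Detected {plane.alignment} plane with extents {plane.extents}");
        }
    }
}
```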

Start building for Quest 3

Unity Editor displaying the welcome window of the mixed reality template overlaid on top of a mixed reality template project.

We know that having robust templates, sample content, and predefined interactions can save you a lot of time. That’s why we’re adding new XR templates and samples to Unity. You’ll be able to streamline your project setup, explore complex object interactions, and see examples of user interfaces. Stay tuned for the release of these templates in Unity Hub.

You can get started building apps for Quest 3 with AR Foundation and OpenXR today by downloading Unity 2022 LTS or later. You will also need the experimental Meta OpenXR package. To install it, open the Package Manager from inside the Unity Editor, click the plus (+) button in the top left, select “Add package by name,” and enter com.unity.xr.meta-openxr. Installing it automatically pulls in the other required packages, such as the OpenXR Plugin and AR Foundation. For sample content, check out Simple AR and Anchors on GitHub.
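If you prefer to script that install step, the Editor’s Package Manager also exposes a C# API. The menu path and class below are purely illustrative.

```csharp
// Editor-only sketch: add the experimental package by name from a menu item.
// Place this script in an Editor folder.
using UnityEditor;
using UnityEditor.PackageManager;

public static class MetaOpenXRInstaller
{
    [MenuItem("Tools/Install Meta OpenXR Package")]
    public static void Install()
    {
        // Equivalent to "Add package by name" in the Package Manager window;
        // the request runs asynchronously in the background.
        Client.Add("com.unity.xr.meta-openxr");
    }
}
```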

The Unity XR team is continuously improving AR Foundation. As we carry on with development, we want to hear from you and would love to see what you build with these tools. Feel free to include the hashtag #unityARF when posting about your project on social media.
