Following the Apple Vision Pro and visionOS announcements at Apple’s Worldwide Developers Conference (WWDC) 2023, we are excited to share that Unity’s beta program for creating spatial experiences on the visionOS platform starts today. We worked closely with Apple to provide a deep integration of visionOS with Unity, enabling creators to bring beloved games and apps to a whole new audience and ecosystem, or create something entirely new.
The visionOS platform represents an exciting opportunity for developers to create the next generation of compelling spatial experiences using the Unity Editor they know and love. We’re also thrilled to debut Unity’s PolySpatial technology, which will power Unity content alongside other apps in the Shared Space on Apple Vision Pro.
We know developers are excited to get started with this new platform. Beta participants will be added to the program over the next few months, but there are lots of things you can do today to start preparing content. Let’s dive into what you need to know.
WWDC 2023 was an exciting moment for Unity and the XR ecosystem as a whole, as Apple announced its collaboration with Unity to help bring creators into the era of spatial computing through Apple Vision Pro.
To learn more about Apple Vision Pro, visionOS, the SDK, and core concepts around spatial design, check out the Apple Developer website.
Two important Unity learning sessions were released as part of the WWDC event. We highly encourage interested developers to watch each session to learn more about Unity development for visionOS:
Let’s review the ways apps can run on Apple Vision Pro. There are three main approaches to creating spatial experiences on the visionOS platform with Unity.
Porting an existing application or creating an entirely new one is straightforward with Unity. Here’s a quick overview:
Workflow: With full support for the visionOS platform in Unity, you can see your projects running on Vision Pro in just a few steps. To start, select the build target for the platform, enable the XR plug-in, and generate an Xcode project. Then, from within Xcode, you can build and run on either an Apple Vision Pro device or the visionOS simulator.
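The export step above can also be scripted from the Unity Editor. The sketch below assumes a Unity version that exposes a visionOS build target (`BuildTarget.VisionOS`); the menu item, scene path, and output folder are illustrative, not part of any official workflow:

```csharp
// Editor-only build script sketch. Assumes BuildTarget.VisionOS is
// available in your Unity version; names below are illustrative.
using UnityEditor;

public static class VisionOSBuild
{
    [MenuItem("Build/Export visionOS Xcode Project")]
    public static void Export()
    {
        var options = new BuildPlayerOptions
        {
            scenes = new[] { "Assets/Scenes/Main.unity" }, // your scene paths
            locationPathName = "Builds/VisionOS",          // Xcode project output folder
            target = BuildTarget.VisionOS,
        };
        // Generates an Xcode project you then build and run from Xcode.
        BuildPipeline.BuildPlayer(options);
    }
}
```

From there, the generated Xcode project is opened and deployed to the device or simulator as usual.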
Graphics: Unity recommends using the Universal Render Pipeline for visionOS projects because it enables foveated rendering, a technique that concentrates rendering detail where the eye is looking, for higher-fidelity visuals.
Input: People will use their hands and eyes to interact with content on Vision Pro. Unity’s XR Interaction Toolkit adds hand tracking to make it easier for you to adapt existing projects. You can also react to built-in system gestures with the Unity Input System, and access raw hand joint data for custom interactions with the XR Hands package.
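As a sketch of the raw joint data workflow mentioned above, here is a minimal MonoBehaviour built on the XR Hands package (`com.unity.xr.hands`). The component name and logging are illustrative; it assumes an XR Hands provider is running in the project:

```csharp
// Logs the right index fingertip position each frame using the XR Hands package.
// Assumes an XR Hands subsystem provider is configured for the project.
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.Hands;

public class IndexTipLogger : MonoBehaviour
{
    XRHandSubsystem m_Subsystem;

    void Update()
    {
        // Lazily find the running hand subsystem.
        if (m_Subsystem == null || !m_Subsystem.running)
        {
            var subsystems = new List<XRHandSubsystem>();
            SubsystemManager.GetSubsystems(subsystems);
            if (subsystems.Count > 0)
                m_Subsystem = subsystems[0];
            return;
        }

        var hand = m_Subsystem.rightHand;
        if (!hand.isTracked)
            return;

        // Query a single joint's pose from the tracked hand.
        var joint = hand.GetJoint(XRHandJointID.IndexTip);
        if (joint.TryGetPose(out Pose pose))
            Debug.Log($"Right index tip at {pose.position}");
    }
}
```

The same pattern extends to any joint in `XRHandJointID`, which is how custom gestures and interactions can be layered on top of the built-in system gestures.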
Shared Space: Unity’s new PolySpatial technology enables developers to create apps that can run side by side in the Shared Space.
In addition to immersive apps, developers can also run content in a window that the user can resize and reposition in their space. This is the easiest way to bring existing mobile and desktop applications to visionOS, and is the default mode for content targeting the visionOS platform. Beta support for windowed applications is available to try today in Unity 2022 LTS (2022.3.5f1 or newer).
While Unity’s beta for visionOS gradually rolls out to participants, there are several important steps you can take to prepare your projects for this new platform:
Register your interest in joining Unity’s beta program by signing up today. You’ll be notified by email when participants are selected to join the beta program. We can’t wait to see what you create!