Discover tools for body tracking, a new collection of content, a companion app, and more.
Unity MARS provides augmented reality (AR) creators with specialized tools and a streamlined workflow to create intelligent mixed and augmented reality experiences that fully integrate with the real world. By bringing environment and sensor data into the creative workflow, Unity MARS helps you to build AR apps that are context-aware and responsive to physical space.
Over the last couple of releases, we’ve focused on expanding the functionality of Unity MARS to include body tracking, improving backward compatibility with AR Foundation, and delivering a better onboarding experience for users. These enhancements bring new types of world understanding to AR applications, make these tools more useful for Unity AR projects that are already in progress, and address known bugs.
Dan Miller, Senior Developer Advocate, XR, demonstrates the simulation as a subsystem workflow.
Unity MARS is built on top of AR Foundation, the framework that powers many existing in-development or already-released AR applications. To provide an easier path for existing AR Foundation projects to integrate with Unity MARS, we’ve built simulation as a subsystem. This allows you to import Unity MARS into an existing AR Foundation project, where everything just works. You can now test, iterate, and continue to develop your app leveraging the powerful Simulation system offered with Unity MARS.
The Simulation system is one of the most popular features in Unity MARS, as it drastically speeds up iteration cycles as you create your AR experience. The Simulation view gives you the power to visualize and test your AR content against a variety of scenes and scenarios using real-world or simulated data. It also helps to ensure that your AR experience works smoothly in your desired location.
We currently support simulation of planes, image markers, face landmarks, point clouds, and raycasting directly to AR Foundation, and we will continue adding support in subsequent releases.
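To make the raycasting path concrete, here is a minimal, illustrative AR Foundation snippet (class and field names are our own, not part of Unity MARS) that raycasts a screen tap against tracked planes. In a Unity MARS project, the same AR Foundation call is driven by Simulation data in the Editor and by real device data at runtime:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Illustrative sketch: place content where a screen tap hits a tracked plane.
public class TapToPlace : MonoBehaviour
{
    [SerializeField] ARRaycastManager raycastManager;
    [SerializeField] GameObject contentPrefab;

    static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();

    void Update()
    {
        if (Input.touchCount == 0)
            return;

        var touch = Input.GetTouch(0);
        if (raycastManager.Raycast(touch.position, hits, TrackableType.PlaneWithinPolygon))
        {
            // Hits are sorted by distance; use the closest pose.
            var pose = hits[0].pose;
            Instantiate(contentPrefab, pose.position, pose.rotation);
        }
    }
}
```

Because this goes through the standard `ARRaycastManager` API, the same script works unchanged whether the data comes from a simulated environment or a physical device.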
Body tracking features offered by AR platforms (iOS in particular) make it possible to virtually try on clothes, wear digital costumes, and run inexpensive motion capture directly from a mobile device. With Unity MARS, we want to ensure that AR developers can seamlessly describe how their content attaches to the human body and simulate that experience in the Unity Editor.
Unity MARS proxies now support human avatar rigs and body poses defined by Mecanim. Unity MARS Simulation also supports synthetic human avatars that can be rigged and animated to create example environments for testing your experience. We power all of this with AR Foundation.
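Since body poses are defined by Mecanim, attaching content to a tracked body boils down to working with humanoid bone transforms. Here is an illustrative sketch (script and field names are our own) using Unity’s standard `Animator.GetBoneTransform` API, which applies equally to a tracked human and a synthetic avatar in Simulation:

```csharp
using UnityEngine;

// Illustrative sketch: pin a piece of content to the head bone of a
// Mecanim humanoid rig. The Animator must use a humanoid avatar; in
// Unity MARS the rig may come from body tracking or a synthetic avatar.
public class AttachToHead : MonoBehaviour
{
    [SerializeField] Animator humanoidAnimator;
    [SerializeField] Transform attachedContent; // e.g. a hat model

    void LateUpdate()
    {
        // Returns null if the avatar does not map this bone.
        var head = humanoidAnimator.GetBoneTransform(HumanBodyBones.Head);
        if (head != null)
            attachedContent.SetPositionAndRotation(head.position, head.rotation);
    }
}
```

Using `LateUpdate` ensures the attachment happens after the animation system has posed the rig for the current frame.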
Unity MARS is one of our most content-packed products. We’ve been closely watching what’s been working for users and where we could do better. We’ve found challenges with discovery – from verified sources such as the package manager and package samples, to more experimental or fast-paced sources such as GitHub and our forums. We’ve also found challenges with delivering sample content that introduces new dependencies to Unity MARS. And we want users to have an a la carte experience – keeping file sizes low and upgrading their projects as new content becomes available.
The content manager (beta) is our solution, and we’re thrilled that users will be able to test it out now. It’s a product-centric view that surfaces content available in GitHub, forums, and the package manager in one place. We’re rolling this out slowly – but in coming versions you’ll see a much slimmer Unity MARS install, with new and existing content split into Content Packs.
After installing Unity MARS, you’ll find our content hub in Unity’s main menu: go to Window -> Content Manager. We’re sitting on a wealth of experiences, assets, and demos that we can’t wait for you to see – and this is just the first step.
The Unity MARS Companion app (beta) works alongside the authoring environment to help deliver more efficiency and ensure higher accuracy in how AR apps run in their target environments.
With the app, available on iOS and Android devices, creators can capture real-world data directly on their device and bring it into the Unity Editor, as well as create content and lay out assets on a device.
The app is available in open beta, and we encourage you to try it out and share your feedback as we continue to develop the tool. You can learn more about the beta here and submit your feedback to this survey.