3 reasons to consider Unity Mars for your next AR project

April 23, 2021 in Games | 8 min. read
An animated pond with lily pads, rocks surrounding it and tall grass dotted around. On one side there are little wooden Japanese-style houses and pink cherry blossom trees; on the other, a wooden Japanese-style tower. This animated scene has been created in AR and is displayed on the floor and part of a wall of a room with light wooden floors and white walls.

Unity Mars is a tool to help kickstart and further support AR development. Read on to explore three specific scenarios and use cases that can make Unity Mars a win for your team’s next AR project.

You have a diverse team with various skill sets working in the Unity Editor

Unity Mars brings AR creation right into the Unity Editor. In addition to providing a unique set of samples, the Simulation view enables drag-and-drop functionality to produce and place AR content, as well as the ability to visualize markers for image tracking-based applications.

If you prefer to write scripts, Unity Mars is built on top of AR Foundation, so you can access each platform's unique features and functionality through provider interfaces. You can also hook directly into Unity Mars components and the Unity Mars query and data APIs via scripts.

Let’s look at two ways to trigger an animation when a Unity Mars object finds a match for its conditions: the first uses the Unity Mars Match Action component, while the second involves a custom script.

To trigger an animation on match, first add the Match Action component to any proxy object. From there, you can add events for Match Acquired, Updated, Lost and Timed Out. In our example, we hook into the Match Acquired event to reference an Animator and set its Grow trigger.

The Unity Editor with Unity Mars menus on the right and an animated room interior shown in the Simulation view.

To accomplish the same thing in code, you can store a reference to the proxy component and subscribe to the MatchChanged event. Check the callback to see if the query result is null. If it’s not null, this means that a match has been found. From there, you can call SetTrigger on the animator passing through the Trigger Name.
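
Below is a minimal sketch of that scripted approach, following the description above. The namespaces and the QueryResult parameter type are assumptions that may differ between Unity Mars package versions, so treat this as illustrative rather than drop-in code.

    using Unity.MARS;        // assumed namespace for the Proxy component
    using Unity.MARS.Query;  // assumed namespace for the QueryResult type
    using UnityEngine;

    // Sketch of the scripted approach described above: subscribe to the
    // proxy's MatchChanged event and fire an animator trigger on a match.
    public class PlayAnimationOnMatch : MonoBehaviour
    {
        [SerializeField] Animator m_Animator;           // assigned in the Inspector
        [SerializeField] string m_TriggerName = "Grow"; // animator trigger to set

        Proxy m_Proxy;

        void OnEnable()
        {
            // Store a reference to the proxy component and subscribe to MatchChanged
            m_Proxy = GetComponent<Proxy>();
            m_Proxy.MatchChanged += OnMatchChanged;
        }

        void OnDisable()
        {
            m_Proxy.MatchChanged -= OnMatchChanged;
        }

        void OnMatchChanged(QueryResult queryResult)
        {
            // A non-null query result means the proxy found a match for its conditions
            if (queryResult != null)
                m_Animator.SetTrigger(m_TriggerName);
        }
    }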

Here’s a look at the script hooked up in the Editor.

Screenshot of the Unity Editor with the Inspector and Hierarchy windows open. In the Inspector, a script called 'Play Animation On Match' is circled. The Game view shows an animated flower with red buds sitting in dirt surrounded by stones, the Scene view is zoomed in on the lower part of the stem and the stones, and the Animator window is open with the animation state flow set up.

This example highlights the flexibility that Unity Mars offers developers and non-developers alike. Both can approach the same task differently, whether by writing scripts or by relying on the Unity Mars interface. These workflows empower the creators on your team to work directly in the Unity Editor with Unity Mars.

Your AR experience is based on transforming and interacting with the user’s environment

Many AR applications prioritize the placement of digital content in the real world. With Unity Mars, you can do even more thanks to the proxy-based and rules workflows, which let you configure the conditions that determine how and where your content appears. In other words, an app can not only place content in the real world, it can transform your environment into a far richer experience. Unity Mars also handles integration with core Unity systems like NavMesh and physics within an AR context.

This video shows how different environments, models and textures are spawned based on rules. The models used are from Synty Studios on the Unity Asset Store.

With Unity Mars, you can quickly create content that spawns procedurally in your world based on the different surfaces you’ve scanned. Unity Mars extensions also make it easier to use core Unity systems like NavMesh in AR, so characters can pathfind and move fluidly across real-world surfaces.
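
For a rough sense of what that pathfinding looks like in a script, here is a hedged sketch that uses Unity's standard NavMeshAgent API rather than any Unity Mars-specific extension. It assumes a NavMesh already covers the matched surface (for example, generated at runtime) and simply steers an agent toward a piece of spawned content.

    using UnityEngine;
    using UnityEngine.AI;

    // Illustrative only: assumes a NavMesh already exists on the scanned,
    // matched surface and that a character with a NavMeshAgent has been
    // placed on that surface.
    public class SendAgentToTarget : MonoBehaviour
    {
        [SerializeField] NavMeshAgent m_Agent; // character that should walk the surface
        [SerializeField] Transform m_Target;   // spawned content to walk toward

        void Update()
        {
            // Standard Unity pathfinding call; the agent follows the NavMesh
            // covering the real-world surface
            if (m_Agent != null && m_Target != null)
                m_Agent.SetDestination(m_Target.position);
        }
    }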

You have limited access to the space where your AR app will run

Over the last year, remote work has reached an all-time high. It’s now more common than ever to collaborate with teams across the globe, from different locations and time zones. While AR has the capacity to enhance a space, it needs to work seamlessly to feel believable, or like a natural extension of the real world.

If you’re working on an app that is predicated on a specific location, or you have limited access to certain spaces, the Unity Mars Companion app (beta) is a great solution for bringing captured AR sessions back into the Unity Editor. The companion app’s AR capture and data recording features let you scan any environment and record surface data, camera paths and videos that can later be imported into Unity Mars and used in Simulation view. The Simulation view in the Editor then allows you to iterate and adjust parameters to better control how and when your content appears.

See how a capture is imported from the Unity Mars Companion app into the Unity Editor. You can then adjust the parameters on the proxy objects and track their changes against the capture in Simulation view.

Testing and iterating in the location where your app or experience will be used is crucial for creating compelling AR content. With the Unity Mars Companion app, you can record several AR sessions from any location, at any time, save them to the cloud, and import them directly back into the Unity Mars Simulation view, so you don’t have to be onsite to see how your content and updates will run.

Get started with Unity Mars

To get started with Unity Mars, try our 45-day trial at no cost. After you start your free trial, be sure to check out "First Steps in Unity Mars" – a step-by-step course on Unity Learn that unpacks the foundation of the product so you can create an AR application.
