Designing for mixed reality: Learnings from the Unity Mars HMD companion app

The spatial design team, part of the authoring tools group in Unity Labs, is currently working on a head-mounted display (HMD) companion app for Unity Mars, a Unity extension that gives creators the power to build mixed and augmented reality experiences that intelligently interact with the real world. The HMD companion app acts as a bridge between the Editor and the real world. While the Editor experience is the primary authoring environment for Unity Mars, both the mobile and HMD companion apps provide on-device solutions that extend authoring beyond just a mouse and keyboard. In the HMD companion app, you can quickly capture real-world environments to later import into the Editor for simulation, edit content, greybox, and author conditions and queries. The app leverages the work of both EditorXR and the XR Interaction Toolkit, which provide reusable components that enable you to build interactive and immersive experiences quickly and easily. This post shares insight into our design process to show the thinking behind our work – creating meaningful connections between bits and atoms to help produce better interfaces for augmented reality (AR).

Spatial design principles

When designing for screens, digital representations are often closely coupled to the design tool. When designing for AR, however, the gap between the actual thing and its representation can be deceptive. We live in an unpredictable world, and our virtual spaces need to adapt and respond to the contexts they inhabit. Because of this, we favor working in 3D, at 1:1 scale, and on-device throughout the design process. In our minds, this is the only way to truly know how users are going to experience the completed app. Constraints such as a limited field of view, the near clipping plane, and brightness/contrast levels can drastically influence the decision-making process, so gaining feedback quickly is of the utmost importance.

AR HMDs’ limited field of view imposes many design challenges that we have had to address directly. For example, in Unity Mars, when you are authoring for particular conditions in the environment, matches can populate all around the user, even in another room. Only by prototyping on the device with a large spatial map did we understand how critical it would be to overcome this large range and have a “miniworld” version of the space. Being able to see all the objects in your scene from multiple angles through the miniworld gives you a better sense of the space and reduces moments of occlusion.

Miniworld prototype showing a 3D map of your surroundings
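
To make the idea concrete, here is a minimal Unity (C#) sketch of how a miniworld view could be assembled: clone the scanned room content and shrink it to a handheld scale. The class name, the `roomRoot` reference, and the scale factor are illustrative assumptions, not the Unity Mars implementation.

```csharp
using UnityEngine;

// Hypothetical sketch: build a handheld "miniworld" by cloning the scanned
// room geometry and shrinking it to tabletop scale. Names and values are
// illustrative only, not the shipped Unity Mars implementation.
public class MiniworldView : MonoBehaviour
{
    [Tooltip("Root transform containing the spatial map / scene content to miniaturize.")]
    public Transform roomRoot;

    [Tooltip("Uniform scale applied to the miniature (e.g. 1:50).")]
    public float miniatureScale = 0.02f;

    GameObject miniature;

    public void Rebuild()
    {
        if (miniature != null)
            Destroy(miniature);

        // Clone the room content and parent it under this object so the
        // miniature can be grabbed, rotated, and inspected from any angle.
        miniature = Instantiate(roomRoot.gameObject, transform);
        miniature.transform.localPosition = Vector3.zero;
        miniature.transform.localRotation = Quaternion.identity;
        miniature.transform.localScale = Vector3.one * miniatureScale;
    }
}
```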

The creator is at the center of how we design for spatial computing, and we explore practices of structuring space as a series of layers surrounding the human body. Our clothing, objects, and interior/exterior boundaries are all layers that influence how we think about the potential organization of space. These layers translate to how we perceive space, which is largely based on our relationship with scale.

Understanding scale within the design process is a dense topic. How do spatial UI elements respond and adapt to these varying layers surrounding the user? How much space does it take to perform a given task? What is the right method of manipulation at a given scale? The HMD companion app will favor Unity Mars functionality that is inherently spatial and either requires 3D manipulation or benefits from the juxtaposition of real-world contextual information.

GrabOffset prototype for distance manipulation

Designing workflows

The traditional workflows for design, whether for screens or physical objects, don’t meet the needs of current XR development. More than other digital media, XR demands accurate visualizations, faster iteration, and some way to account for ever-changing environments. The spatial team set out to fill the gaps in traditional processes by designing our own workflows in order to better understand the complexities of creating XR experiences. Here are some of the workflows that we tried – and what we learned as a result.

Paper prototyping

Spatial design requires an understanding of scale, not just for objects and spaces, but also for bodily movement in order to avoid fatigue. One-to-one paper prototypes shine as a means of testing movement and ergonomics.

For example, we designed and tested a virtual tray that would fall or rise based on the user’s head tilt. The physical prototype, a paper model of the tray, hung around the neck directly below the chin. During a normal work day, we put frequently used tools in the tray, such as a phone, glasses, pens, and so on. After wearing the paper prototype for an hour, we found the neck movement associated with the summoning interaction quite tiring, as were the frequent arm movements to access tools near the neck region. We therefore chose not to move forward with this design proposal. When designing for mixed reality, building quick-and-dirty paper prototypes can save you time and help you think spatially.

Paper prototype of a proposed chin tray for tools

Simulating AR in VR

User flows for mixed reality are difficult to review effectively using traditional 2D tools like presentations and animated mockups. Instead of creating 2D storyboards, we built prototypes in VR using Tvori. Tvori enabled fast iteration while editing and animating interactions. We used the timeline feature to keyframe cameras, potential UI elements, virtual objects, and input devices.

Flow prototyped in Tvori VR with AR FOV view

Using VR, we were able to change the scale of our world at will. We could experience the world firsthand at human scale while simultaneously miniaturizing the scene to make editing easier. When we were presenting flows to others on the team, they could simply put on the VR headset, set the scale to 1:1, and scrub the timeline.

To get a better understanding of the appropriate scale and proportion of virtual elements, we introduced a typical AR headset field of view “safe area” as an efficient way to quickly simulate AR in VR. The rectangle restricting the viewing angle is a very practical constraint for spatial designers: it influences how immersive our experiences are, not only in terms of what we see but also how we move (e.g., hand gestures).
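
As a rough illustration of this technique, the sketch below (Unity C#) draws a rectangular frame at a fixed distance in front of the VR camera to stand in for an AR headset’s field of view. The FOV angles and frame distance are placeholder values, not measurements of any specific device.

```csharp
using UnityEngine;

// Hypothetical sketch: visualize a typical AR headset field of view as a
// rectangular "safe area" in front of a VR camera. The FOV angles below are
// placeholder values, not measurements of a specific headset.
[RequireComponent(typeof(LineRenderer))]
public class ARSafeAreaFrame : MonoBehaviour
{
    public Camera vrCamera;
    public float horizontalFovDegrees = 40f;
    public float verticalFovDegrees = 30f;
    public float frameDistance = 1.5f;   // meters in front of the camera

    void LateUpdate()
    {
        // Half extents of the rectangle at frameDistance.
        float halfWidth  = frameDistance * Mathf.Tan(0.5f * horizontalFovDegrees * Mathf.Deg2Rad);
        float halfHeight = frameDistance * Mathf.Tan(0.5f * verticalFovDegrees * Mathf.Deg2Rad);

        var t = vrCamera.transform;
        Vector3 center = t.position + t.forward * frameDistance;

        // Draw the frame as a closed loop of four corners in world space.
        var line = GetComponent<LineRenderer>();
        line.loop = true;
        line.positionCount = 4;
        line.SetPositions(new[]
        {
            center + t.right * -halfWidth + t.up *  halfHeight,
            center + t.right *  halfWidth + t.up *  halfHeight,
            center + t.right *  halfWidth + t.up * -halfHeight,
            center + t.right * -halfWidth + t.up * -halfHeight
        });
    }
}
```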

User flow and input mapping

Input mapping for palette menu

Throughout the design process, we kept two living documents, one for input mapping and the other for user flow. The user flow document was continuously updated as the source of truth. Here, we determined both how and where a user would accomplish a certain task. Is a task accomplished within a particular workspace – our word for an individual piece of XR UI – or in the real world, or can it be performed in both?

We also employed an input mapping exercise to better understand the actions performed in the app and their corresponding frequency. It can be described simply as the following steps:

  • List all known actions and rank them in terms of their priority. 
  • Write down how often each action will be performed.
  • Decide on the type of interaction. 

As complexity increased, we had to enhance our prototype fidelity. By using the “mind map” view, we could better understand overlapping inputs and how to adjust accordingly. This understanding is increasingly important with input devices such as the Magic Leap One controller, where the number of inputs can be limited. By requiring hover states for particular actions, we were able to distribute inputs while avoiding conflicts. For generic actions, we decided to use the bumper for a contextual menu and the home tap to summon the main palette menu. Also, in order to address fatigue, we avoided any input that required a trigger or bumper press in combination with the touchpad.
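
Here is a hypothetical sketch of how the output of that exercise might be captured as data in C#. Apart from the bumper and home-tap bindings mentioned above, the actions, frequencies, and controls listed are invented for illustration and are not the shipped Unity Mars mapping.

```csharp
using System.Collections.Generic;

// Hypothetical sketch of the input-mapping exercise as data: each action is
// listed with its expected frequency and the control it maps to.
public enum Frequency { Rare, Occasional, Frequent }

public struct ActionMapping
{
    public string action;
    public Frequency frequency;
    public string control;          // physical input on the controller
    public bool requiresHover;      // only valid while pointing at an object

    public ActionMapping(string action, Frequency frequency, string control, bool requiresHover)
    {
        this.action = action;
        this.frequency = frequency;
        this.control = control;
        this.requiresHover = requiresHover;
    }
}

public static class InputMap
{
    public static readonly List<ActionMapping> Mappings = new List<ActionMapping>
    {
        new ActionMapping("Open main palette menu", Frequency.Frequent,   "Home tap",       false),
        new ActionMapping("Open contextual menu",   Frequency.Frequent,   "Bumper",         false),
        new ActionMapping("Select / grab object",   Frequency.Frequent,   "Trigger",        true),
        new ActionMapping("Scale object",           Frequency.Occasional, "Touchpad swipe", true),
    };
}
```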

Rapid on-device iteration

Using Unity, we assembled 3D models, UI designs, and controller scripts to quickly get designs onto the Magic Leap. The goal was to get feedback from quick iterations of interaction prototypes, while not being concerned with writing perfect code early in the process. In addition to building our own prototypes, we were inspired by the many apps developed by the broader XR community, and we often used them as references to convey particular interaction behaviors.

Most interactions rely on “magic” numbers such as timings, thresholds, and distances. The first step after building a prototype was to tune those values to match the feeling that we imagined for the interaction. In the Unity Editor, we could tweak values while in Play Mode, but we also wanted to be able to adjust values in a runtime build. To help with this, we built a simple reusable settings menu within the app. Any prototype script could use a single line of code to add a value slider or checkbox to tweak those magic numbers.

Tuning the “magic” numbers for interaction settings
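
We haven’t published that prototype code, but a minimal sketch of what such a one-line registration API could look like is shown below. The `SettingsMenu` class, its method signatures, and the tunable values are assumptions for illustration only.

```csharp
using System;
using System.Collections.Generic;
using UnityEngine;

// Hypothetical sketch of a one-line registration API for a runtime settings
// menu. The real implementation would also spawn slider/checkbox widgets;
// here only the registration and value callback are shown.
public static class SettingsMenu
{
    static readonly List<string> entries = new List<string>();

    public static void AddSlider(string label, float min, float max, float initial, Action<float> onChanged)
    {
        entries.Add(label);
        onChanged(initial); // UI omitted: immediately apply the initial value.
    }

    public static void AddToggle(string label, bool initial, Action<bool> onChanged)
    {
        entries.Add(label);
        onChanged(initial);
    }
}

public class GrabOffsetPrototype : MonoBehaviour
{
    float grabDistance = 0.35f;   // one of the "magic" numbers to tune
    bool snapToSurface = true;

    void Start()
    {
        // One line per tunable value adds it to the in-app settings menu.
        SettingsMenu.AddSlider("Grab distance", 0.1f, 1f, grabDistance, v => grabDistance = v);
        SettingsMenu.AddToggle("Snap to surface", snapToSurface, v => snapToSurface = v);
    }
}
```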

The next step was to get feedback and iterate. Here are some tips that we found useful when someone else is testing an interaction:

  • Say as little as possible at first. If it’s truly necessary, you can tell a user what they should be trying to do, and maybe briefly explain how. Remember, you don’t come with the software, so you need to observe how they interact without your influence.
  • Look for red flags such as confusion, frustration, or surprise, then ask the user what they expected to happen (versus what actually happened).
  • If the user is having trouble explaining their feedback, you can stream or record the headset view so they can demonstrate and gesture towards virtual objects.

UI scaling

A virtual object’s scale is hard to judge against the real world, and it should be responsive to the user’s viewing distance. We cannot use real-world scale as the only reference, since the user can only sense that a UI element is physically larger through stereoscopic vision and other depth cues, such as the ray’s translational movement. If a user closes one eye or looks at a UI element without any other context, they can’t tell how large or far away it is.

We used the distance-independent millimeter (dmm) system created by Google’s Daydream team for VR design. The idea is to scale size proportionately with distance. For example, a 1 × 1 dmm element corresponds to a 1 × 1 mm button placed 1 meter away, or a 3 × 3 mm button placed 3 meters away. Using the dmm system, Google provided a set of suggested font and minimum ray-based hit sizes, such as 20 dmm as the smallest font size and 64 dmm with 16 dmm padding as the minimum hit size.
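
As a minimal sketch of how dmm sizing can be applied in Unity, the component below rescales an element each frame so that a size specified in dmm keeps a constant angular size at any distance. The class and field names are our own for illustration, and it assumes the element is a simple quad-like object scaled in meters.

```csharp
using UnityEngine;

// Minimal sketch of distance-independent millimeter (dmm) sizing: an element
// specified in dmm is scaled with its distance from the viewer so it always
// covers the same visual angle. Names here are illustrative.
public class DmmSizedElement : MonoBehaviour
{
    public Camera viewerCamera;

    [Tooltip("Element size in dmm (millimeters at 1 meter).")]
    public Vector2 sizeInDmm = new Vector2(64f, 64f);   // e.g. the suggested minimum hit size

    void LateUpdate()
    {
        float distance = Vector3.Distance(viewerCamera.transform.position, transform.position);

        // 1 dmm = 1 mm at 1 m, so world size in meters = dmm * 0.001 * distance.
        // A 64 dmm button is 64 mm wide at 1 m and 192 mm wide at 3 m.
        Vector2 worldSize = sizeInDmm * 0.001f * distance;
        transform.localScale = new Vector3(worldSize.x, worldSize.y, 1f);
    }
}
```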

When we tested the suggested sizes with a Magic Leap, we felt that they were too large, and we ended up making things a bit smaller. This is likely because of the Magic Leap’s smaller field of view and 6DoF controller, compared to Daydream’s wider field of view and 3DoF controller. When the UI scales proportionally with distance, it renders at the same pixel size and covers the same angular region for a raycast, yet we noticed that it feels very different at different distances. We’d be curious to hear from the community whether they have similar findings.

Visual design

The spatial UI design for the companion app is centered around the concept of using light as a material. The layering of light and shadow helps to create a more tactile experience, and it is meant to enhance the user’s perception of depth in an AR HMD. Ultimately, light is relational, in that we only perceive it by how it’s shaped by other materials and surfaces. All of the spatial UI elements in the companion app are treated as if they were physical constructions, with careful consideration of the relationship between the light source and the plane of light.

This effect is created by employing a negative shadow technique. AR head-mounted displays use a light-based additive color space. Black renders transparent, as it is the absence of light. If a black object is positioned in front of other virtual objects, it acts as a mask. In order to create the illusion of shadows, we have to surround the darker shadow area with light. Human perceptual systems rely heavily on the cues provided by shadows to infer the motion and depth of 3D objects. AR apps built for these devices that neglect to incorporate shadows deprive the user of important depth cues.

Depth cues provided to the user: button hover shadow (top) and textured legibility mask (bottom)

In addition to shadows, the use of texture gives users cues to understand scale and the relative position of objects. Banding is a problem when dealing with color in light-based HMDs and often occurs when using high-contrast gradients. Using this as a constraint, we decided to incorporate texture into the design of the spatial UI elements, not only to expand the range of colors we could use for gradients, but also as another way to improve the perception of depth within the authoring experience.

Baked shadow texture maps with noise for the palette menu (left) and context menu (right)

The relationship between a spatial UI element and a light source is designed to highlight the potential behavior of the interface. These visual modalities (alongside audio and physical ones) help to communicate affordances or provide input feedback to the user. Much like in our everyday spaces, light accents create visual interest and highlight objects or specific features of a space. Borrowing from these affordances found in the physical world, we created various relationships between the light source and spatial UI elements that resemble embedded, cove, and recessed lighting conditions.

Input feedback visual/motion studies for spatial UI elements

What’s next

We are always searching for new workflows that can help us to build better experiences. We hope the learnings we’ve shared in this post help you with your own spatial designs, and please don’t hesitate to reach out with techniques that you have discovered. As a community of creators, sharing knowledge is crucial to shaping this new medium and enabling us to create things we never thought were possible. If you are interested in learning more about Unity Mars or the companion app for AR HMDs, join us in the forums or reach out to us directly at mars@unity3d.com.
