Unity Slices: Table, a look into the future of social mixed reality

October 28, 2021 in News | 8 min. read

Today at Facebook Connect, the Unity Labs team shared a new proof of concept, Unity Slices: Table, to show how Passthrough VR can support mixed reality social experiences focused on tangible interfaces.

At today’s Facebook Connect event, our team showed off a new mixed reality (MR) experience called Unity Slices: Table. This Unity Labs project explores the future of tangible interfaces, blending realities using the Oculus Passthrough feature to bring people together around humanity’s most enduring social hub, the table.

We view Unity Slices: Table as a social mixed reality proof of concept built to bring people together, whether they share the same physical table or are an ocean apart.


As a Senior Research Engineer on the Unity Labs team, I help transform ideas surrounding the future of XR into tangible prototypes. That’s how Unity Slices: Table began.

This upcoming proof-of-concept vertical slice, made for the Oculus App Lab, brings up to four players together for classic tabletop fun. Over the past year, our team at Unity Labs has been prototyping and iterating on this experience, and we’re excited to give you a behind-the-scenes look at how we made it happen.

Building a prototype

The first milestone we had to reach while building our prototype was to align the virtual representation of a table to a physical table or desk. Since Oculus Quest 2 does not yet offer a way to accurately detect planes in Unity, we adopted a manual approach using the controllers, so that users could quickly and precisely align the virtual desk to the physical table.
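The post doesn’t detail the alignment math, but the core idea can be sketched simply: sample two controller positions at opposite corners of the physical table and derive the virtual table’s center, yaw, and size from them. This is a minimal Python sketch of that geometry; the function name, the two-corner scheme, and the return values are illustrative assumptions, not the Unity Labs implementation.

```python
import math

def align_table(corner_a, corner_b):
    """Fit a virtual table transform from two controller samples.

    corner_a, corner_b: (x, y, z) world positions captured when the user
    touches two opposite corners of the physical table with a controller.
    Returns (center, yaw_radians, diagonal_length).
    """
    cx = (corner_a[0] + corner_b[0]) / 2
    cy = (corner_a[1] + corner_b[1]) / 2  # table height = average corner height
    cz = (corner_a[2] + corner_b[2]) / 2
    dx = corner_b[0] - corner_a[0]
    dz = corner_b[2] - corner_a[2]
    yaw = math.atan2(dz, dx)              # rotation about the vertical axis
    diagonal = math.hypot(dx, dz)         # gives the table's footprint size
    return (cx, cy, cz), yaw, diagonal
```

Two samples are enough because a table surface is assumed horizontal: only its center, heading, and extent need to be recovered.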

Once the alignment was complete, we needed to find a way to network it. While this might seem simple, there are a number of factors to consider when centering your social experience around a table. Whether you’re in the same space or connected remotely, the tangible table interface is shared and needs to look and feel right for everyone participating.
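One way to make a shared table "look and feel right for everyone" is to network poses in table-local coordinates: each client keeps its own world-to-table transform from its own alignment step, so a piece at the same table-local spot lands on the same spot of everyone’s physical table. A minimal Python sketch of that coordinate change, under the assumption of a yaw-only table rotation; the function names are hypothetical, not the project’s networking API.

```python
import math

def world_to_table(p, table_center, table_yaw):
    """Express a world-space point (x, y, z) in the shared table frame."""
    x = p[0] - table_center[0]
    y = p[1] - table_center[1]
    z = p[2] - table_center[2]
    c, s = math.cos(-table_yaw), math.sin(-table_yaw)  # inverse yaw rotation
    return (c * x - s * z, y, s * x + c * z)

def table_to_world(p, table_center, table_yaw):
    """Inverse: place a networked table-local point into this client's world."""
    c, s = math.cos(table_yaw), math.sin(table_yaw)
    x, z = c * p[0] - s * p[2], s * p[0] + c * p[2]
    return (x + table_center[0],
            p[1] + table_center[1],
            z + table_center[2])
```

Only table-local poses travel over the network; each headset converts them back through its own alignment, so remote players with differently positioned desks still see a consistent table.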

As such, we fleshed out what the shared computing experience would look like. With support from Oculus Quest 2’s advanced, articulated hand tracking, we managed to build a system that allowed us to turn any table into a giant touchscreen. But this is VR, and our aim was to do more than build flat interfaces, so we began experimenting with reactive 3D objects.
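Turning a table into a "giant touchscreen" with hand tracking boils down to comparing each tracked fingertip’s height against the table plane. A sketch of one common approach, with hysteresis so a fingertip hovering near the threshold doesn’t flicker between touching and not touching; the thresholds are illustrative guesses, not the project’s shipped values.

```python
TOUCH_ENTER = 0.01   # metres above the surface to register a touch
TOUCH_EXIT = 0.02    # larger exit threshold (hysteresis) to avoid flicker

def update_touch(fingertip_height, was_touching):
    """Treat the table plane as a touchscreen using tracked fingertips.

    A touch begins when the fingertip drops below TOUCH_ENTER and only
    ends once it rises above TOUCH_EXIT.
    """
    if was_touching:
        return fingertip_height < TOUCH_EXIT
    return fingertip_height < TOUCH_ENTER
```

With touch state resolved per fingertip, the rest looks like ordinary touchscreen input handling: taps, drags, and releases projected onto the table plane.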

We primarily tested a game of chess by prototyping voluminous 3D pieces that collapsed vertically as the user’s hand approached. However, early user testing revealed that the shape of the objects could lead players to misunderstand how to interact with them. Since the pieces collapsed vertically, users thought they had to raise their hand and point downward to interact with the piece they wanted to move. This was a problem because hand tracking systems don’t work as well when they can’t clearly discern the top silhouette of your hand.

Despite our reluctance to turn away from the appeal of futuristic 3D interfaces, we decided to flatten the pieces users interact with. This had two key benefits: first, it became easier to select the pieces users were targeting; second, users no longer felt the urge to contort their hands, and simply touched and dragged.

We learned that, while interactions along a surface should remain simple and loosely similar to traditional touchscreen interfaces, there is new potential in what can be derived from a user’s hand position relative to an object in 3D space. Unlike touchscreens, these interfaces can light up in anticipation of being touched, which makes them more playful and predictable.
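The anticipatory lighting described above can be driven by the same tracked-hand height used for touch detection: a glow that ramps up as the hand descends toward a control. A minimal Python sketch; the ramp shape and `max_height` tuning value are assumptions for illustration.

```python
def hover_glow(hand_height, max_height=0.15):
    """Light up a surface control before it is touched.

    Returns a glow intensity in 0..1 that rises linearly as the tracked
    hand descends from max_height (metres) toward the surface.
    """
    if hand_height >= max_height:
        return 0.0
    if hand_height <= 0.0:
        return 1.0
    return 1.0 - hand_height / max_height
```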

It took us a while to get to a system that worked smoothly, but the moment we first hopped into a networked session with expressive avatars, and could both see and hear the other person tapping our table over voice chat as if we were in the same room, was truly mind-blowing. It felt almost magical to bring this tangible part of our reality into a shared experience.

Passthrough volumes

If you’re not yet familiar with Passthrough VR, it is a novel way of bringing AR to VR headsets by showing a video feed of the real world inside the headset. While remote virtual experiences are already pretty fantastic, Passthrough also enables you to share your physical space with others in the same virtual experience. Not only can you see the avatar of a friend connecting from a distant place, but you can also observe the people sitting right next to you.

Once we incorporated Passthrough into this experience, we were faced with a variety of options for implementation. With traditional AR, content simply exists on top of the real world, and the illusion of presence is easily broken as you move your hand up to the content and notice it hiding your hand, for instance. With Passthrough, however, we could selectively control how the real and virtual worlds merged together, giving us the ability to create an occlusion mesh that allows your hands to feel as though they truly blend with the experience.
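The occlusion idea can be pictured as a per-pixel depth comparison: where the tracked hand mesh is closer to the camera than the virtual content, the passthrough feed wins, so your real hand appears in front of virtual objects instead of behind them. This Python sketch is purely conceptual, illustrating the compositing rule rather than the actual Oculus Passthrough rendering pipeline, which works on the GPU.

```python
def composite_pixel(virtual_rgb, passthrough_rgb, virtual_depth, hand_depth):
    """Resolve one pixel of the blended image.

    hand_depth is the depth of the occlusion (hand) mesh at this pixel,
    or None where no hand covers it. Whichever surface is nearer to the
    camera decides whether the pixel shows virtual content or the
    real-world passthrough feed.
    """
    if hand_depth is not None and hand_depth < virtual_depth:
        return passthrough_rgb   # real hand in front: show the camera feed
    return virtual_rgb           # virtual content in front (or no hand here)
```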

Going further with this idea of moving beyond overlaid content, and controlling how the real and virtual interact, we began playing with the idea of portals into virtual worlds that you could stick your head and hands right into.

We landed on a bubble of reality encircling the table. When engaged in a game or app, the virtual content has your full attention, and the virtual world around this content can similarly come to life. But when you look away from the table, the real world is present, so you can easily eat or drink without taking your headset off. Watching hands transform as they move in and out of the bubble has been a favorite XR experience among our Labs team members.
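A bubble like this is essentially a distance-based cross-fade: fully virtual within some radius of the table center, fully passthrough beyond it, with a feathered band in between so hands transform smoothly as they cross the boundary. A minimal Python sketch; the radius and feather values are illustrative tuning assumptions.

```python
def bubble_blend(dist_to_table_center, radius=1.2, feather=0.3):
    """Virtual-world opacity (0..1) for a point at a given distance
    (metres) from the table center: 1 inside the bubble, 0 outside,
    with a linear feathered falloff across the boundary band."""
    if dist_to_table_center <= radius:
        return 1.0
    if dist_to_table_center >= radius + feather:
        return 0.0
    return 1.0 - (dist_to_table_center - radius) / feather
```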

As we further fleshed out three ways to experience reality, we wanted to give our users the freedom to move between them with a simple interaction. That’s how we landed on a slider-based approach. The power to move freely between virtual and physical reality makes for an impressive experience that you need to try to fully understand.
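One simple way a reality slider could combine with the bubble is multiplicatively: the slider scales the bubble’s virtual-world opacity, so sliding to 0 shows pure passthrough everywhere and sliding to 1 restores the fully virtual bubble. A hedged Python sketch; the multiplicative rule is an assumption for illustration, not the project’s actual blend.

```python
def apply_slider(slider, bubble_opacity):
    """Final virtual-world opacity for a pixel.

    slider: 0 = fully physical, 1 = fully virtual (user-controlled).
    bubble_opacity: 0..1 value from the table bubble falloff.
    Both inputs are clamped before multiplying.
    """
    clamp = lambda v: max(0.0, min(1.0, v))
    return clamp(slider) * clamp(bubble_opacity)
```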

Getting your hands on Passthrough and Unity Slices: Table

We’re excited to share the Unity Slices: Table experience with our community of creators. On behalf of the Unity Labs team, we hope that this experiment inspires the creation of your own mixed reality experiences. You can also start building your own Oculus Passthrough experiences with the experimental functionality found in the Oculus SDK on the Unity Asset Store.

We’re still working hard to make sure that Unity Slices: Table will soon be accessible to anyone who wants to give it a try. Keep an eye on our social channels for future announcements.
