
Unity reveals latest AR Companion app feature at Apple WWDC 21

Image of phone capturing an image of a guitar

At this year’s WWDC, Apple announced their exciting new technology for capturing real-world objects. Unity has been working closely with Apple to bring Object Capture into our Companion-to-Editor workflows.

Since the introduction of Apple’s mobile technology, Unity has made it easy for users to seamlessly create experiences that harness the power of new tech on Apple’s cutting-edge devices. Coming later this year, the Unity AR Companion app will include Object Capture to allow XR creators to scan real-world objects and generate 3D assets.

One pain point for many Unity developers is figuring out how to integrate the unpredictable real world into their digital experience – everything from scanning the space to figuring out where things are, to adapting and embedding real-world information into your experience. The Unity AR Companion app extends authoring beyond the Editor to provide new ways to create and capture in context. While we’ve enabled you to capture plane, mesh, and other environment info to help you build your app, we’re excited to finally add real-world objects to the list.

An exciting next step in content creation & AR

The Object Capture project also speaks to a larger trend in creation tools: the use of companion apps on specific hardware in conjunction with the Unity Editor. This means you can use each medium and device for what it's best suited to. In particular, an AR-enabled phone is a great device for environment capture, AR session recording, and, most recently, object capture.

It’s worth noting that this new technology also supports the processing of any existing photo sets. This “traditional camera” use case is important to support, as this opens up the tech to an even wider audience, and brings with it the capacity to do photogrammetry with images you've captured in the past. Say that, years ago, you took photos of a favorite, now-lost heirloom; with this tech, you might be able to see that object in its full 3D glory once again.

The capture experience

This new Object Capture functionality is built into the iOS version of the Unity AR Companion App. The app will release later this fall, and we'll publish full how-to documentation then. For today, we want to give an overall rundown of the workflow and discuss some of the thinking that went into it.

The experience starts in the AR Companion App, where, this fall, you'll find a new mode: Object Capture. Before you start capturing photos, you'll be presented with an interactive UI to set up a guide object over your object to capture. After lining up the guide, you can start taking your photos.
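
To make the interaction concrete, here's a minimal sketch (in native RealityKit terms, not our actual implementation) of the raycast-and-anchor pattern this kind of guide-placement UI builds on; the dome size and material are placeholder choices:

```swift
import ARKit
import RealityKit
import UIKit

// Illustrative only: raycast from a screen tap against detected
// surfaces, then anchor a guide object at the hit point.
func placeGuide(at screenPoint: CGPoint, in arView: ARView) {
    // Find a real-world surface under the user's tap.
    guard let hit = arView.raycast(from: screenPoint,
                                   allowing: .estimatedPlane,
                                   alignment: .horizontal).first else { return }

    // Anchor a tinted dome over the hit point to act as the capture guide.
    // Radius and material are placeholder choices; a real UI would let the
    // user scale and reposition the guide interactively.
    let anchor = AnchorEntity(world: hit.worldTransform)
    let dome = ModelEntity(
        mesh: .generateSphere(radius: 0.25),
        materials: [SimpleMaterial(color: UIColor.cyan.withAlphaComponent(0.3),
                                   isMetallic: false)]
    )
    anchor.addChild(dome)
    arView.scene.addAnchor(anchor)
}
```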

Image of a phone screen looking at a guitar, reading, "Take photos of object from all sides with 70% overlap between photos."

Like any photogrammetry process, you'll want to take quite a few photos of the object, from every angle you can. For each photo, a “pin” is dropped on the shell to communicate that this angle has been captured. At any point, you can flip the object over or set it on its side to capture the bottom or underhanging edges.
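
One simple way to model the pin bookkeeping (the sketch below is illustrative, not our actual implementation) is a fixed set of directions on the guide sphere, where each photo lights up the pin nearest the camera's direction from the object; the Fibonacci-sphere layout and pin count here are arbitrary choices:

```swift
import Foundation
import simd

// Hypothetical model of the pin bookkeeping: a fixed set of directions
// on the guide sphere, each marked covered when a photo is taken from
// roughly that direction.
struct PinGrid {
    let directions: [SIMD3<Float>]
    private(set) var covered: Set<Int> = []

    init(pinCount: Int = 60) {
        // Fibonacci-sphere layout gives evenly spaced directions.
        let golden = Float.pi * (3 - sqrt(Float(5)))
        directions = (0..<pinCount).map { i in
            let y = 1 - 2 * (Float(i) + 0.5) / Float(pinCount)
            let r = sqrt(max(0, 1 - y * y))
            let theta = golden * Float(i)
            return SIMD3(r * cos(theta), y, r * sin(theta))
        }
    }

    // Light up the pin nearest the camera's direction from the object.
    mutating func recordPhoto(cameraPosition: SIMD3<Float>,
                              objectCenter: SIMD3<Float>) -> Int? {
        let dir = simd_normalize(cameraPosition - objectCenter)
        guard let nearest = directions.indices.max(by: {
            simd_dot(directions[$0], dir) < simd_dot(directions[$1], dir)
        }) else { return nil }
        covered.insert(nearest)
        return nearest
    }

    var coverage: Float { Float(covered.count) / Float(directions.count) }
}
```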

Phone screen grabbing an image of a guitar

Each photo is analyzed as you take it in order to identify low-quality images that can lead to a bad result. When a blurry or otherwise unusable photo is detected, its pin appears red, so you can investigate and possibly delete and retake the photo.
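
As an illustration of the kind of heuristic involved (not our actual analysis), a common blur check is the variance of the image's Laplacian: sharp photos have strong local intensity changes, blurry ones don't. A minimal sketch over grayscale pixels, with a threshold you'd tune in practice:

```swift
// Hypothetical quality check: the variance of the image's Laplacian.
// Sharp photos have strong local intensity changes (high variance);
// blurry ones don't. Operates on 8-bit grayscale pixels, row-major.
func laplacianVariance(pixels: [UInt8], width: Int, height: Int) -> Double {
    var responses: [Double] = []
    responses.reserveCapacity((width - 2) * (height - 2))
    for y in 1..<(height - 1) {
        for x in 1..<(width - 1) {
            let i = y * width + x
            // 4-neighbor Laplacian: 4 * center minus the neighbors.
            let lap = 4 * Int(pixels[i])
                - Int(pixels[i - 1]) - Int(pixels[i + 1])
                - Int(pixels[i - width]) - Int(pixels[i + width])
            responses.append(Double(lap))
        }
    }
    let mean = responses.reduce(0, +) / Double(responses.count)
    return responses.reduce(0) { $0 + ($1 - mean) * ($1 - mean) } / Double(responses.count)
}

// The threshold is illustrative; a real pipeline would tune it per
// device and resolution before flagging a pin red.
func isLikelyBlurry(pixels: [UInt8], width: Int, height: Int) -> Bool {
    laplacianVariance(pixels: pixels, width: width, height: height) < 100
}
```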

Image of phone screen grabbing image of guitar with red pin

Once you've taken all your photos, it's time to head to Unity on the Mac to process them, generate your model, and put it to use.

We support a few different entry points. First, we've added the ability to use local wireless file transfer for this project. We also support our existing, robust Companion Resources Sync workflow, where “Captured Objects” joins existing resource types such as image markers and environment scans. Finally, you can simply use a directory of local images, regardless of whether they were captured with the Companion App on an iPhone or with a conventional DSLR camera.

Whatever the source, you can select the images you want to use and quickly start processing. This happens in two stages. First, a preview-quality model is generated, which you can use to adjust a bounding box and make any necessary translation and rotation adjustments.

Image of guitar model in Unity

Then, with one more button click, the full-quality model is processed. This step reuses much of the data from the preview model generation, so despite its higher quality, the second processing stage won't take significantly longer than the preview.
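
For the curious, this preview-then-full flow maps onto Apple's new Object Capture API in RealityKit. Here's a rough sketch of what the two-stage request looks like in Swift on macOS 12, with placeholder paths and error handling trimmed for brevity:

```swift
import Foundation
import RealityKit

// Sketch of the underlying two-stage flow using Apple's Object Capture
// API (PhotogrammetrySession, macOS 12+), e.g. in a command-line tool.
let imagesFolder = URL(fileURLWithPath: "/path/to/captured-images", isDirectory: true)
let session = try PhotogrammetrySession(input: imagesFolder,
                                        configuration: PhotogrammetrySession.Configuration())

// Watch for progress and completed models as the session works.
Task {
    for try await output in session.outputs {
        switch output {
        case .requestProgress(_, let fraction):
            print("Progress: \(Int(fraction * 100))%")
        case .requestComplete(_, .modelFile(let url)):
            print("Model written to \(url.path)")
        case .requestError(_, let error):
            print("Request failed: \(error)")
        default:
            break
        }
    }
}

// Stage 1 produces a fast preview model for bounding-box and transform
// tweaks; stage 2 produces the full-detail model, reusing the image
// analysis already computed for the preview.
try session.process(requests: [
    .modelFile(url: URL(fileURLWithPath: "/tmp/preview.usdz"), detail: .preview),
    .modelFile(url: URL(fileURLWithPath: "/tmp/final.usdz"), detail: .full),
])

RunLoop.main.run()  // keep the tool alive until processing completes
```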

And that's it – the final model is imported into your project as a prefab, ready to be used in your app.

Image of guitar model in Unity

Building in best practices

By testing early versions of the Object Capture mode in the AR Companion App, we realized that we had a great opportunity to use AR to guide the user toward best practices for photogrammetry capture. While the first version had simple written instructions, we were met with the time-honored tradition of users not reading or simply dismissing the prompt. That’s why we introduced the guide object.  

While the guide isn't strictly necessary, we found it valuable for giving the user feedback on how to take the images and for maximizing coverage by clearly showing the areas that haven't been captured yet. Along with the guide, we introduced the photo pins to show exactly where photos have already been taken and to flag low-quality photos.

Image of phone screen grabbing image of guitar with all green pins

The guide and pins, while enormously helpful, did introduce a new consideration: since we're not tracking the object being scanned, we can't automatically move the guide when the object is moved. If the guide and pins remain where they were after the object has been turned over, the photos represented by the pins no longer correspond to the side of the object now facing the camera. We tried a few clever solutions to address this, but settled on the simplest one: after moving the object, we encourage the user to adjust the guide and immediately reset the pins. We continue to track the total percentage of the object covered to communicate that the previous photos haven't been lost.
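
That bookkeeping can be sketched in a few lines (illustrative, not our shipping code): a reset banks the pins captured so far and clears the visible set, while the overall percentage keeps counting:

```swift
// Hypothetical sketch of the reset-but-keep-total bookkeeping described
// above: flipping the object clears the visible pins, while the overall
// percentage keeps counting angles captured in earlier orientations.
struct CaptureCoverage {
    private(set) var activePins: Set<Int> = []  // pins shown for the current orientation
    private var bankedCount = 0                 // pins captured before earlier resets
    let totalPins: Int

    init(totalPins: Int = 60) { self.totalPins = totalPins }

    mutating func markCaptured(pin: Int) { activePins.insert(pin) }

    // Called after the user flips the object and re-aligns the guide.
    mutating func resetPins() {
        bankedCount += activePins.count
        activePins.removeAll()
    }

    // Survives resets, so users can see previous photos weren't lost.
    var overallCoverage: Double {
        min(1.0, Double(bankedCount + activePins.count) / Double(totalPins))
    }
}
```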

All in all, we found AR to be a helpful teaching technology, especially for new interactions that involve moving around physically. 

Looking forward

This project is exciting to us for a number of reasons. First and foremost, it deeply aligns with our mission to continue democratizing content creation. While some game developers have used photogrammetry in their pipelines for years, it can be a highly specialized and frustrating process. However, Apple's announcement of this functionality means that it is now much more accessible to a wide range of creators. We look forward to seeing indie game developers, mid-sized studios, and students, among others, start to use real-world object capture in their processes.

But what we're perhaps even more excited about is how this powerful toolset – now more accessible on everyday devices – can unlock content creation and curation for non-developers. When we started the project, we gathered use cases for the tech and were quickly struck by how far the impact extends beyond games. For example, the owner of a music store, previously limited to posting images of instruments coming through the shop, can use Object Capture to create stunning, realistic, high-quality 3D captures of each instrument, ready to share online or even in AR.

By integrating this creative technology directly into our tools, we empower non-traditional users and unlock their potential. From curators and architects to artists and designers, you will be able to bring your ideas to life almost instantly. We can’t wait to see what you come up with later this year. Learn more about building intelligent AR applications with Unity MARS.
