A lot has happened since we first announced the AR Foundation package for multi-platform handheld AR development. We want to take this opportunity to share how the package has evolved since developers started using it, and where it’s headed.
We also want to provide some resources to help you better understand how AR Foundation fits into the handheld AR development ecosystem and how to use it to build great handheld AR applications.
We recently made significant updates to AR Foundation and other XR packages.
You now have more control over rendering: ARCore and ARKit apps built with AR Foundation can use the Lightweight Render Pipeline.
This also lets you use Unity’s Shader Graph to create interesting effects in a visual node-based editor.
We now provide low-level access to the camera image on the CPU, along with optimized utilities for converting the image to RGB or grayscale. This is ideal for developers who want to run their own image-processing or computer-vision algorithms.
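As a sketch of how that CPU image access can be used, the component below acquires the latest camera image and converts it to a grayscale buffer. This assumes a recent AR Foundation release (the `XRCpuImage` API; names differ in older versions), and the pointer-based `Convert` overload requires "Allow unsafe code" in Player Settings:

```csharp
using System;
using Unity.Collections;
using Unity.Collections.LowLevel.Unsafe;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

[RequireComponent(typeof(ARCameraManager))]
public class CpuImageSample : MonoBehaviour
{
    unsafe void Update()
    {
        var cameraManager = GetComponent<ARCameraManager>();

        // Acquire the latest camera image on the CPU; it must be disposed.
        if (!cameraManager.TryAcquireLatestCpuImage(out XRCpuImage image))
            return;

        using (image)
        {
            // Ask the conversion utility for single-channel (grayscale) output.
            var conversionParams = new XRCpuImage.ConversionParams
            {
                inputRect = new RectInt(0, 0, image.width, image.height),
                outputDimensions = new Vector2Int(image.width, image.height),
                outputFormat = TextureFormat.R8, // use RGBA32 for color instead
                transformation = XRCpuImage.Transformation.None
            };

            int size = image.GetConvertedDataSize(conversionParams);
            var buffer = new NativeArray<byte>(size, Allocator.Temp);
            image.Convert(conversionParams,
                          new IntPtr(buffer.GetUnsafePtr()),
                          buffer.Length);

            // buffer now holds raw pixels for your own computer-vision code.
            buffer.Dispose();
        }
    }
}
```

Converting every frame on the main thread is costly; for sustained processing, the asynchronous conversion variant or a downscaled `outputDimensions` is usually a better fit.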
We added support for ARKit’s ARWorldMap feature, which lets you create persistent and multi-user AR experiences. Note that this works only on ARKit-enabled iOS devices.
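Because world maps are ARKit-only, they are exposed through the ARKit session subsystem rather than the cross-platform API. The sketch below, assuming the `ARKitSessionSubsystem` API from a recent ARKit XR Plugin release, requests the current world map and serializes it to bytes that could be written to disk or sent to another device:

```csharp
using System.Collections;
using Unity.Collections;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARKit;

public class WorldMapSaver : MonoBehaviour
{
    public ARSession session; // assign in the Inspector

    public void Save()
    {
        // ARWorldMap support is ARKit-only; check for the ARKit subsystem.
        if (session.subsystem is ARKitSessionSubsystem arKitSubsystem &&
            arKitSubsystem.worldMapSupported)
        {
            StartCoroutine(SaveWorldMap(arKitSubsystem));
        }
    }

    IEnumerator SaveWorldMap(ARKitSessionSubsystem subsystem)
    {
        var request = subsystem.GetARWorldMapAsync();
        while (!request.status.IsDone())
            yield return null;

        if (request.status.IsError())
            yield break;

        using (var map = request.GetWorldMap())
        {
            // Serialized bytes can be persisted for later sessions or
            // shared with other devices for multi-user experiences.
            var bytes = map.Serialize(Allocator.Temp);
            // ... persist or transmit bytes ...
            bytes.Dispose();
        }
    }
}
```

Restoring a saved map is the reverse: deserialize the bytes back into an `ARWorldMap` and apply it to the subsystem, which relocalizes the session against the stored anchors.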
AR Foundation now includes support for ARKit’s face tracking feature, which lets you track a face and access blend shapes for several facial features.
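A minimal sketch of reading those blend shapes, assuming a recent AR Foundation version with the ARKit Face Tracking package (the blend-shape API is ARKit-specific, so it is accessed through the ARKit face subsystem):

```csharp
using Unity.Collections;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARKit;

[RequireComponent(typeof(ARFaceManager))]
public class BlendShapeLogger : MonoBehaviour
{
    void Update()
    {
        var faceManager = GetComponent<ARFaceManager>();

        // Blend shape coefficients are an ARKit-specific concept,
        // exposed through the ARKit face subsystem.
        if (faceManager.subsystem is ARKitFaceSubsystem faceSubsystem)
        {
            foreach (ARFace face in faceManager.trackables)
            {
                using (var coefficients = faceSubsystem.GetBlendShapeCoefficients(
                           face.trackableId, Allocator.Temp))
                {
                    // Each coefficient is a normalized 0–1 weight for one
                    // facial feature (e.g. jaw open, eye blink).
                    foreach (var c in coefficients)
                        Debug.Log($"{c.blendShapeLocation}: {c.coefficient}");
                }
            }
        }
    }
}
```

In practice you would map these coefficients onto a rigged mesh’s blend shapes rather than logging them.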
Today, AR Foundation provides a platform-agnostic scripting API and MonoBehaviours for making ARCore and ARKit apps that use core functionality shared between both platforms. This lets you develop your app once and deploy to both platforms without any changes. For a full list of currently supported features in AR Foundation, refer to the chart below.
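To illustrate what that platform-agnostic API looks like in practice, here is a small MonoBehaviour that subscribes to AR Foundation’s plane-detection events; the same script runs unchanged on ARCore and ARKit (the component and event names below are from AR Foundation’s `ARPlaneManager`):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

[RequireComponent(typeof(ARPlaneManager))]
public class PlaneLogger : MonoBehaviour
{
    ARPlaneManager planeManager;

    void Awake() => planeManager = GetComponent<ARPlaneManager>();

    // Subscribe to plane-detection events while this component is active.
    void OnEnable() => planeManager.planesChanged += OnPlanesChanged;
    void OnDisable() => planeManager.planesChanged -= OnPlanesChanged;

    void OnPlanesChanged(ARPlanesChangedEventArgs args)
    {
        // args also carries updated and removed planes each frame.
        foreach (ARPlane plane in args.added)
            Debug.Log($"New plane detected: {plane.trackableId}");
    }
}
```

The underlying plane data comes from ARCore or ARKit depending on the build target, but the script never references either SDK directly.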
However, AR Foundation does not yet implement every ARKit and ARCore feature, so if your app depends on a feature that isn’t yet exposed, you can use the platform-specific SDKs directly. We are constantly adding features to AR Foundation and hope it will eventually serve all the needs of developers targeting ARCore or ARKit.
If you are only targeting ARCore and want the full feature set, Google maintains an SDK for Unity. If you are only targeting ARKit and want the full feature set, we still maintain the original ARKit plugin for Unity.
The charts below summarize the differences:
A major feature we are testing and hope to roll out next year is remoting, which is the ability to stream sensor data from a device running ARCore or ARKit to the Mac or PC Editor. This should improve iteration time and aid in debugging your AR apps.
In addition to remoting, we are adding in-Editor simulation. This will let you develop and test an AR app without ever connecting an Android or iOS device to your computer. This can dramatically improve development time and debugging.
In 2019 we are going to expand platform support beyond handheld AR to include wearable AR devices as well.
We created a sample repository containing a Unity project and scene with the AR Foundation packages already included. It has scripts for visualizing planes and feature points, placing objects on detected planes, and using light estimation. We recently added UX features to the samples, including animations that guide the user to find planes and place objects, and that fade out planes when they are no longer being updated. Check out the SampleUXScene for all of these features and more!
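The object-placement pattern from the samples can be sketched as a simple tap-to-place script: raycast from the touch position against detected planes and spawn a prefab at the hit pose. This assumes AR Foundation’s `ARRaycastManager` (earlier versions exposed `Raycast` on `ARSessionOrigin` instead), and `placedPrefab` is a hypothetical prefab you assign in the Inspector:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

[RequireComponent(typeof(ARRaycastManager))]
public class TapToPlace : MonoBehaviour
{
    public GameObject placedPrefab; // hypothetical prefab, set in the Inspector

    ARRaycastManager raycastManager;
    static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();

    void Awake() => raycastManager = GetComponent<ARRaycastManager>();

    void Update()
    {
        if (Input.touchCount == 0)
            return;

        Touch touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Began)
            return;

        // Raycast against detected planes, constrained to their boundaries.
        if (raycastManager.Raycast(touch.position, hits,
                                   TrackableType.PlaneWithinPolygon))
        {
            // Hits are sorted by distance; place at the nearest one.
            Pose hitPose = hits[0].pose;
            Instantiate(placedPrefab, hitPose.position, hitPose.rotation);
        }
    }
}
```

The sample repository’s version adds the UX touches described above on top of this same raycast-and-instantiate core.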