EditorXR Runtime

December 20, 2019 in Technology | 10 min. read

In 2016, Unity Labs' Authoring Tools Group released the first version of EditorXR, our extension for working in Unity's Scene View in XR head-mounted displays (HMDs). EditorXR lets you create content spatially, bringing the full functionality of the Unity Editor's authoring tools into reality. Since its introduction, our experimental EditorXR feature has evolved alongside the Unity platform and the XR community. We're using this blog post to share our recent progress and take a deep dive into an exciting feature that we released earlier this year: EditorXR Runtime.

Latest updates (0.3 release)

We recently released version 0.3 on GitHub. This version includes updates to support the latest versions of Unity, but more importantly, it removes our dependency on the Oculus and SteamVR SDKs. This change makes the setup process much easier. Another major feature is the Spatial UI, which is part of an ongoing effort to improve how menus work and enable quick mode-switching in EditorXR. This feature focuses on ease of use and compatibility with AR HMDs like Magic Leap and HoloLens, which do not include two tracked controllers. Out of the box, EditorXR still only works on Oculus and SteamVR, but we plan to officially support more devices in the future.

We use the underlying systems of EditorXR to drive menus and interactions in the Unity Mars companion apps, which provide reality capture and authoring capabilities for smartphones and AR HMDs. Other minor but still noteworthy enhancements include improved two-handed direct manipulation and menu updates for the default Annotation and CreatePrimitive tools. You can read more about these changes on our GitHub Releases page.

We've also added a permanent EditorXR feature page to the Unity website, which will serve as a hub for information and updates on all things EditorXR.

Why we built EditorXR Runtime

One of the biggest lessons from our four years of XR development is how deeply the tools and interaction patterns for editing an app in XR are intertwined with what the user can do in that app. That's why we decided to make all of the tools we created for EditorXR available for our developer community to use in their own apps as well.

What is it?

EditorXR Runtime allows you to include any of our features in an XR application made with Unity. From the early days of EditorXR's development, we made a conscious decision to make as few changes as possible to the core engine and to write our code using systems and APIs that are available to any user. This means that you can use almost all of the features we designed for EditorXR inside your own projects.

Many of Unity's Editor features are not available in player builds. Because of this, EditorXR Runtime includes a subset of the features available in Edit Mode. For example, the SerializedObject and SerializedProperty classes, which are used by the Inspector, do not exist at runtime. In the short term, we simply do not include the Inspector, Hierarchy (and Locked Objects), Project, Profiler, and Console Workspaces in builds. The Inspector, Hierarchy (and Locked Objects), and Project Workspaces are also not available in Play Mode. We have plans to replace these Editor systems with runtime equivalents, some of which will be released alongside the Unity Mars companion apps.
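To make that concrete: Editor-only APIs such as SerializedObject live in the UnityEditor assembly, which is not present in player builds, so any Inspector-style code has to be compiled out. Here is a minimal sketch of the usual guard; the component and field names are invented for this example:

```csharp
using UnityEngine;
#if UNITY_EDITOR
using UnityEditor; // Editor-only assembly, absent from player builds
#endif

public class SpeedTweaker : MonoBehaviour
{
    public float speed = 1f;

#if UNITY_EDITOR
    // SerializedObject and SerializedProperty only exist in the Editor,
    // so this Inspector-style edit must be conditionally compiled.
    public void DoubleSpeedLikeTheInspector()
    {
        var serialized = new SerializedObject(this);
        serialized.FindProperty("speed").floatValue *= 2f;
        serialized.ApplyModifiedProperties(); // applies the change and records undo
    }
#endif
}
```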

How do you use EditorXR in runtime?

In Edit Mode, the VRView window is the entry point into EditorXR: we bootstrap the system when you open the view and shut it down when you close it. In Play Mode and player builds, we rely on OnEnable and OnDisable instead. The EditingContextManager component starts up and shuts down EditorXR, and it must be included in any scene that uses EditorXR Runtime. In the default configuration, EditorXR starts up as soon as the EditingContextManager is enabled and shuts down when the component is disabled. It is also possible to replace EditingContextManager with your own manager to control the lifecycle of EditorXR from user scripts.
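Because startup and shutdown follow the component's enabled state, a user script can drive EditorXR's lifecycle without replacing EditingContextManager at all. A minimal sketch, with an arbitrary key binding chosen for the example:

```csharp
using UnityEngine;

// A minimal sketch: control EditorXR's lifecycle from user code by
// toggling the EditingContextManager component. Assign the scene's
// EditingContextManager to the field below in the Inspector; the F1
// binding is an arbitrary choice for this example.
public class EditorXRToggle : MonoBehaviour
{
    [SerializeField]
    Behaviour m_EditingContextManager;

    void Update()
    {
        if (Input.GetKeyDown(KeyCode.F1))
        {
            // Enabling starts up EditorXR; disabling shuts it down.
            m_EditingContextManager.enabled = !m_EditingContextManager.enabled;
        }
    }
}
```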

Why use it?

One advantage of running EditorXR in a player or in Play Mode is very simple: much better performance! In Edit Mode, EditorXR draws the scene in sequence with the Editor GUI, which results in some unavoidable CPU overhead and limits performance when you use a complicated layout. As a reminder, you will get the best performance in Edit Mode if you close all Editor windows, including those in the default layout; in Play Mode, you don't need to.

You'll also be able to take advantage of scene manipulation tools in context while your game code is running. For example, a few minutes into a play session, you might want to move a tree a little to the left; in EditorXR, you can use the transform tool to nudge it over and see how that feels. Of course, you will still have to transfer your Play Mode changes back into your scene after exiting Play Mode, but this can easily be done by making a prefab out of the modified objects (see the sketch at the end of this section) or by using an Editor extension like Play Mode Saver.

We are working to make the Hierarchy and Inspector available at runtime; when this feature is released, you will be able to inspect and debug your scene objects in a build, inside the HMD. In some situations, like if you're trying to use EditorXR tools on an Oculus Quest, player builds are your only option. These capabilities haven't yet realized their full potential, but the current version serves as a solid foundation upon which to build authoring tools for platforms that do not support the Unity Editor.

As we continue to develop and improve EditorXR, we plan to extend the default UI to work on AR smartphones, HMDs, and even on a conventional tablet or screen using touch or mouse-and-keyboard interactions. The idea is to have a single framework for Unity authoring on any device, in any context. Users are free to do whatever they want with our code. Unity is also building player-based authoring tools, like the Unity Mars companion apps, to support specialized workflows that take advantage of devices, XR or otherwise, that cannot run the Unity Editor.

Another use case for EditorXR Runtime is to ship a level editor with your game, or to build creation and collaboration software using EditorXR as a foundation. You can leverage existing features like the Annotation Tool or Poly Workspace and build customized data pipelines that cater to a specific workflow. You could even create and sell a VR productivity tool!
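For the prefab route mentioned above, a small Editor utility is enough. This is a rough sketch rather than a recommended tool; the menu path and output folder are arbitrary choices for the example:

```csharp
#if UNITY_EDITOR
using UnityEditor;
using UnityEngine;

// A rough sketch of the prefab route: save the currently selected object
// as a prefab asset while still in Play Mode, so the changes survive
// exiting Play Mode. The menu path and output folder are arbitrary.
public static class PlayModeChangeSaver
{
    [MenuItem("Tools/Save Selection As Prefab")]
    static void SaveSelection()
    {
        var selection = Selection.activeGameObject;
        if (selection == null)
            return;

        // Assets written through the AssetDatabase persist across Play Mode.
        PrefabUtility.SaveAsPrefabAsset(selection, "Assets/" + selection.name + ".prefab");
    }
}
#endif
```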

What’s next?

There are still some unanswered questions about how to incorporate the EditorXR Runtime into an existing project. For example, what if you want to use your own controller models? How do you open the EditorXR menu or use tools if existing interactions require using the trigger – or any other button mappings for that matter? How do you save changes out of a player build? Should you extend EditorXR’s menus to drive the experience or create your own to drive EditorXR? At this point, all of these questions are up to you; you can simply fork EditorXR and make modifications for your own needs. If you’ve already been extending EditorXR or if this post inspires you to include EditorXR in your project, please get in touch and let us know what you’re working on. Pull requests on GitHub are welcome!

XR Interaction Toolkit

With the release of Unity's XR Interaction Toolkit, Unity developers have a de facto standard set of interaction code with which to create XR experiences. Because of this, we can now start making some assumptions about how user projects handle certain aspects of their scene setup, and, even better, we can delete some code in EditorXR that serves a similar purpose. Starting with the code that controls the XR Camera Rig and the MultipleRayInputModule, we will be replacing general-purpose code in EditorXR with its equivalent in the XR Interaction Toolkit. This way, we can use these components in existing scenes exactly as they are already configured: EditorXR will pick up customized controller models and interaction settings if they are set up using the XR Interaction Toolkit. It is also possible to use EditorXR in a non-VR project, in which case a default setup is created when EditorXR starts up and removed when it shuts down.
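As a sketch of that direction (this is not EditorXR's actual startup code), detect-or-create logic against the XR Interaction Toolkit might look like this:

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// A sketch of the detect-or-create behavior described above, not
// EditorXR's actual code: if the scene already contains an XR
// Interaction Toolkit rig, reuse it as configured; otherwise build a
// bare-bones default rig.
public class RigBootstrap : MonoBehaviour
{
    void Awake()
    {
        var rig = FindObjectOfType<XRRig>();
        if (rig != null)
            return; // respect the project's existing setup

        var rigObject = new GameObject("Default XR Rig");
        rig = rigObject.AddComponent<XRRig>();

        var cameraObject = new GameObject("XR Camera");
        cameraObject.transform.SetParent(rigObject.transform, false);
        cameraObject.AddComponent<Camera>();
        rig.cameraGameObject = cameraObject; // point the rig at its camera
    }
}
```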

Smartphones and AR HMDs

As part of the Unity Mars companion apps, we've been updating EditorXR's systems to work on smartphones and to use touchscreen input for manipulators and menus. The existing VR menus and tools don't translate directly to AR today. Our goal is to introduce a generic main menu and workspace equivalent for smartphones and AR HMDs. The first step in this direction is simply enabling the underlying systems to function on these platforms so that they can be set up from user code.

Serialization

Saving and loading scenes, undo/redo, and the Hierarchy and Inspector are crucial features for Unity authoring. Because they are missing from runtime builds, you're limited to scene manipulation that lasts only as long as the build is running. That said, bringing objects from the Poly Workspace into your game, if only temporarily, is lots of fun! To bridge these gaps, we're working on a runtime serialization package that can import and export Unity scenes in a player build. The plan is to be able to build a basic scene editor out of EditorXR when this feature is ready. The Inspector and undo/redo functionality also rely on serialization, so we will be able to provide runtime versions of those features as well.
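The package itself is still in development, but here is a minimal sketch of the underlying idea: capturing scene state in a format a player build can write and read back. The SavedTransform type is invented for this example:

```csharp
using UnityEngine;

// A minimal sketch of runtime scene serialization, not the actual
// package: capture an object's transform as JSON that a player build
// can save to disk and restore later.
[System.Serializable]
public class SavedTransform // invented for this example
{
    public Vector3 position;
    public Quaternion rotation;
    public Vector3 scale;
}

public static class RuntimeSceneIO
{
    public static string Export(Transform target)
    {
        var data = new SavedTransform
        {
            position = target.position,
            rotation = target.rotation,
            scale = target.localScale
        };
        return JsonUtility.ToJson(data); // available in player builds
    }

    public static void Import(Transform target, string json)
    {
        var data = JsonUtility.FromJson<SavedTransform>(json);
        target.SetPositionAndRotation(data.position, data.rotation);
        target.localScale = data.scale;
    }
}
```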

Assets

Finally, the ability to publish new assets to your EditorXR player builds in a predictable way via AssetBundles will let you work in a facsimile of the Project Workspace, and could eventually evolve into a cloud-based asset pipeline like the Poly Workspace (a sketch of the player-side loading code follows below).

We plan on tackling these features in the coming months and years, and we're excited to see what you build on top of EditorXR. As always, we welcome your feedback and input. What do you think we should make next?
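As promised above, here is a rough sketch of the player-side half of that AssetBundle workflow, using APIs that ship with Unity today; the bundle file name and asset name are hypothetical:

```csharp
using System.Collections;
using System.IO;
using UnityEngine;

// A sketch of loading published content in a player build: open a
// bundle shipped alongside the build and instantiate an asset from it.
// The bundle file name and asset name are hypothetical.
public class BundleLoader : MonoBehaviour
{
    IEnumerator Start()
    {
        var path = Path.Combine(Application.streamingAssetsPath, "props.bundle");
        var request = AssetBundle.LoadFromFileAsync(path);
        yield return request;

        if (request.assetBundle == null)
            yield break; // bundle missing or incompatible

        var prefab = request.assetBundle.LoadAsset<GameObject>("Tree");
        Instantiate(prefab);
    }
}
```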

Authoring as gameplay

EditorXR Runtime opens up a creative possibility that we're very excited about: the potential to ship in-XR authoring functionality as a feature of your app and let your players design their own content, in the medium. This could take the form of a level editor for your app, with all the benefits of user-generated content that similar games have enjoyed in the past. In a VR or AR app, you also get all the advantages native to authoring in XR: direct, natural, two-handed manipulation of 3D objects, the construction of environments at actual scale, and the transfer of muscle memory and embodied experience (such as drawing or sculpting) into the digital realm without having to learn complicated new controls.

Going a step further, if you're looking to build an app entirely about creation, the next Tilt Brush or Gravity Sketch, then EditorXR Runtime can offer you a massive shortcut and a solid architectural foundation. Starting from the base we've provided here, you can mod, extend, and stylize it to match your vision of what the next great XR design app will be.

The idea of using XR technologies to enable more and different kinds of creators is something our team is very passionate about. We think that EditorXR and our plans for its features deeply embody Unity's vision of the world being a better place with more creators in it.

Get more information and updates on all things EditorXR.
