Since we first released the pre-alpha of EditorXR in December, we’ve been hard at work refining, collaborating, and experimenting. Today we’ll be covering some of the highlights. Bonus: our friends at Kinematic Soup have been hard at work on Scene Fusion and have an update to support our latest version. Try them both out today!
EditorXR now works in vanilla Unity 2017.2p1 and above. Devs rejoice! Now, with version 0.0.9, you can simply grab the latest version of Unity, download the EditorXR package, TextMesh Pro, and a partner SDK (SteamVR or Oculus Utilities), and you’re set.
With this shift, we hope to see a lot more people creating with, and more importantly on top of, EditorXR. We’ve barely scratched the surface of XR tools. While we’ve created a solid foundation for scene layout and exploration, object browsing, and some interaction tweaking, the sky’s the limit for bespoke pipeline tools and assistive editing tools. We encourage you to create with #EditorXR and share your creations, whether on the Asset Store, Unity Connect, or social media (remember to @ Unity!). And as always, we’re actively developing EditorXR publicly on GitHub. Come share your feature ideas and report issues. Pull requests are welcome.
Scene Fusion is a Unity Editor extension that enables multi-user scene editing within Unity. Users can start a “session” in the Editor, and any other users who join this session and have the same assets in their project will download the host’s current scene, then send and receive any edits made in the Scene view. Not only is it an incredible technical achievement, it’s pretty magical to work on a scene together in Unity. We are constantly surprising ourselves with how much better it is to work collaboratively in an actual 3D scene, in XR. It’s also a whole lot of fun.
We show off Scene Fusion and EditorXR every chance we get. Our last big demo was at SIGGRAPH’s Real-Time Live! presentation. We’ve also gotten a lot of value out of collaborating on EditorXR in Scene Fusion, both when we’re far apart and when we’re at adjacent desks. The old adage is true: two (XR) heads are better than one.
You can see the foundations of a new discoverability system in the latest release. Currently it is limited to dynamic tooltips attached to controller affordances (buttons, triggers, etc.) and basic visual changes like outlines and transparency control. The affordances also now animate based on user input.
While all of this may sound like a basic task, we designed the system to support any conceivable controller (“Proxy”, in EXR terms) or control mapping you could throw at it. We also wanted interacting with the system to be simple, so that user tools can take advantage of the feature while maintaining a consistent look.
To this end, a tool provides the hand, an affordance ID, and tooltip text, if any. The rest, including arbitrating conflicting feedback requests, is handled by EditorXR’s Proxy system. We’ll be pushing another release soon and detailing our UX improvements then.
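To make the request/arbitration idea concrete, here is a minimal sketch of the pattern in Python. This is purely illustrative: the names (`FeedbackRequest`, `ProxyFeedback`, the `priority` field) are assumptions for the example, not EditorXR’s actual C# API.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class FeedbackRequest:
    """What a tool submits: a hand, an affordance ID, and optional tooltip text."""
    hand: str                    # which controller, e.g. "left" or "right"
    affordance_id: str           # e.g. "trigger", "thumbstick"
    tooltip: Optional[str] = None
    priority: int = 0            # hypothetical knob for resolving conflicts

class ProxyFeedback:
    """Collects requests per (hand, affordance) and arbitrates conflicts."""

    def __init__(self):
        self._requests = {}      # (hand, affordance_id) -> list of requests

    def add_request(self, request: FeedbackRequest) -> None:
        key = (request.hand, request.affordance_id)
        self._requests.setdefault(key, []).append(request)

    def active_tooltip(self, hand: str, affordance_id: str) -> Optional[str]:
        """Return the tooltip of the winning request, if any.

        Here the highest priority wins; the earliest request wins ties.
        """
        requests = self._requests.get((hand, affordance_id), [])
        if not requests:
            return None
        return max(requests, key=lambda r: r.priority).tooltip

# Two tools request feedback on the same trigger; the proxy arbitrates.
proxy = ProxyFeedback()
proxy.add_request(FeedbackRequest("left", "trigger", "Select object"))
proxy.add_request(FeedbackRequest("left", "trigger", "Delete", priority=1))
print(proxy.active_tooltip("left", "trigger"))  # prints "Delete"
```

The point of the pattern is that tools only describe *what* they want shown; deciding *which* request wins, and rendering the tooltip on the right affordance, stays centralized in the proxy, which is what keeps the look consistent across tools.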
The Controllers Project marks the end of our Year of Usability. 2018 holds exciting opportunities to try new concepts like editing in AR, diving into animation, post effects, or interacting with more advanced Unity features. Of course, there is still plenty of low-hanging fruit for improving the scene layout experience, and we will continue to refine existing functionality. Why not try some ideas of your own and let us know how it goes? We’re eagerly awaiting your pull request. Happy editing!