This is a guest blog post collaboration by Daisy Leak, executive producer; David Swift, Director of Software Engineering; and Ben Grossmann, co-founder at Magnopus. Magnopus is a team of 170+ artists, designers, and engineers with studios in the US and UK that create, deploy, and operate cross-reality experiences at scale.
At Magnopus, we’re focused on bringing people together with great experiences across the physical and digital worlds. Over the past three years, we developed and launched the largest geospatial Metaverse experience in the world, and we leveraged our Magnopus technology stack as well as the Unity engine to build the experience.
This incredible and challenging project started almost four years ago when we joined a team from Expo 2020 Dubai to develop a visitor experience for the Mobility Pavilion, one of the three signature pavilions reflecting the themes of the World Expo: Mobility, Sustainability, and Opportunity. During the discovery phase with the team, we were compelled by the mission statement conveyed by Her Excellency Reem Al Hashimi: “connecting minds, creating the future.”
Many pavilions espouse a narrative of “past, present, future,” but we proposed to take the visitors on a journey of human mobility through the physical world, into the digital world, arriving at a world where both are unified. This story empowered children to overcome the present limitations of mobility, to build the world they wanted to see, as their ancestors had for millennia.
It was during this exploration that we all wondered: Could we build an “alpha” of this vision across the entirety of the Expo site? Could we share the experience of the physical site in the form of a social digital twin with people around the world, and connect them across both?
It would be much more complicated than creating a game and would require building on constantly shifting sands the entire time, but we gained the support of the leadership of Expo to pursue this ambitious vision. So, we assembled a diverse team of over 200 engineers, designers, and artists working across seven countries, and a network of companies. Together, we worked night and day to create a city-scale cross-reality space where on-site and remote visitors to Expo 2020 Dubai could connect in real-time in shared experiences.
After 39 months of development, 6 months of live operations, and a global pandemic, that future-facing experience is now live. Expo Dubai Xplorer is accessible through mobile apps, the web, cloud technologies, and digital signage on the site.
The experience consists of two key components united by complex interoperable technologies: a digital layer of content that enhances the real-world Expo, and a digital twin of the 4.38 km² site, filled with inspiring experiences and more than 200 buildings from the world’s leading architects, including 192 unique country pavilions.
Virtual reality was used to collaboratively develop the designs and experiences so they could be tested before the physical site was even built.
We created a huge assortment of digital art installations, which we call Activations, and located them throughout the site to reveal insightful stories of the UAE, Expo 2020 themes, country pavilions, and the little details of the site itself that transcend what’s physically present.
More than just spectacles, our Activations invite users to engage with the World Expo in meaningful ways that earn them progress towards “Seeds of Change”, which they can then pledge to real-world causes. This creates a positive cycle of play that rewards users for their continuous curiosity and, in turn, empowers them to make the world a better place.
Connectivity across the physical and digital worlds
The multiplayer experience features real-time connectivity between on-site and remote users, allowing us to bridge the physical/digital divide, extending reach and engagement. Standing on the Expo site, a visitor may see a friend at home on the other side of the world (viewed through the lens of their phone) as an avatar they can share experiences with, and communicate with via group creation and messaging.
Together, both types of visitors can explore the interactive digital twin. A “collect them all” capability incentivizes users to explore the expansive site, leading them to find additional content and site features they might have otherwise missed. We were able to anchor these specifically to low-traffic areas so that the game loop encourages and rewards people for “going off the beaten path.” Treasure artifacts are linked together as a set of related items with clues to find the next one.
Precise spatial mapping
One of the world’s largest deployments of Google’s persistent ARCore Cloud Anchors allows millions of on-site visitors to enjoy augmented reality spectacles aligned accurately to real-world locations. Remote visitors can access the same AR content and experiences over a connected digital twin of the site, via an interface similar to popular mobile games like Roblox.
Powered by the technology stack and cloud-hosted services written by Magnopus, content is automatically streamed down to the user’s device when they approach a location, without impacting the application’s install size. Geospatial authoring interfaces and global content delivery networks for updating and publishing new content in real time enable creators to release new interactive AR experiences relevant to a visitor’s position on the site.
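Conceptually, the location-triggered streaming described above boils down to a proximity query: given the visitor’s position, find the content anchored within a trigger radius and fetch only that. The Python sketch below is purely illustrative (the catalog entries, IDs, and coordinates are hypothetical, not Magnopus’s actual data or API); it uses a haversine distance check over a small content catalog.

```python
import math

# Hypothetical catalog: content bundles anchored to real-world locations,
# each with a trigger radius in meters. Entries are illustrative only.
CATALOG = [
    {"id": "fountain_show",   "lat": 24.9614, "lon": 55.1500, "radius_m": 150},
    {"id": "pavilion_tour",   "lat": 24.9640, "lon": 55.1460, "radius_m": 100},
    {"id": "hidden_artifact", "lat": 24.9700, "lon": 55.1600, "radius_m": 75},
]

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two WGS84 points."""
    r = 6_371_000  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def bundles_near(lat, lon):
    """Return IDs of bundles whose trigger radius contains the visitor."""
    return [e["id"] for e in CATALOG
            if haversine_m(lat, lon, e["lat"], e["lon"]) <= e["radius_m"]]
```

A production system would back this with a spatial index and a CDN rather than a linear scan, but the download-only-what-is-nearby principle is the same.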
A social, interactive digital twin of the 4.38km² Expo site
Hundreds of artists working around the world spent more than two years creating the living digital replica, enhancing it with dynamic lighting, art installations, animated experiences, and stunning spectacles.
The platform geolocates content on the site in real-time, so the digital twin was built to a high degree of accuracy—a 1:1 scale from the architect’s CAD or BIM files—and is streamed to visitors based on their location and proximity, without impacting the application’s install size.
Using advanced stylized avatar digitization technology, users can create a feature-animation-quality 3D character of themselves by uploading a single photo. The solution we created produces an animation-ready, personalized 3D head model, including hair, which can be attached to a body model and fully customized from over 700 options. The avatars run entirely in the cloud and can be accessed through a web browser or a mobile device.
Accurate virtual humans offer a more engaging and personalized experience, while the custom selection system we built reduces user friction to make the whole process as simple as possible.
To create this cross-reality experience, our teams leveraged our internal technology stack, a robust set of custom solutions, and Unity to solve a diverse set of problems, including:
Lightweight, extensible, and cross-platform
Unity is built to make cross-platform applications easy to create with minimal platform-specific code. This allowed the team to concentrate on implementing engaging features, custom Magnopus integrations, and eye-catching content without spending too much time catering to specific devices, particularly across the very diverse Android ecosystem.
During the development process, we were able to generate a virtual reality version on desktop PCs, a simplified app to be embedded inside native mobile applications, and even a version built for the outdoor kiosks on the Expo site. The kiosk application, built for Windows 10 PCs with dedicated GPUs, used cameras to show a live, high-quality augmented reality feed of animated content and the same multiplayer avatars as in the mobile application.
Streamlining the AR development process with AR Foundation
Unity’s AR Foundation, which wraps ARKit and ARCore, provided a straightforward base for building the AR application for Expo Dubai Xplorer, and also meant we could implement features in a generic fashion within Unity. This enabled the team to build both the full virtual world experience and the AR experience from the same codebase and source assets, rather than having to build multiple different applications.
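The single-codebase approach above is essentially a presentation abstraction: shared experience logic talks to an interface, and each build (AR or virtual world) supplies its own implementation. The Python sketch below illustrates the pattern only; the class and method names are hypothetical, and the real project implements this in Unity/C# on top of AR Foundation.

```python
from abc import ABC, abstractmethod

class PresentationLayer(ABC):
    """Shared experience logic depends on this interface, not on a
    specific front end. Each build supplies its own implementation."""

    @abstractmethod
    def place_content(self, content_id: str, lat: float, lon: float) -> str:
        """Show a piece of geolocated content to the user."""

class ARPresentation(PresentationLayer):
    """On-site build: content is pinned to the physical world."""
    def place_content(self, content_id, lat, lon):
        return f"anchor {content_id} to a cloud anchor near ({lat}, {lon})"

class VirtualWorldPresentation(PresentationLayer):
    """Remote build: the same content appears in the digital twin."""
    def place_content(self, content_id, lat, lon):
        return f"spawn {content_id} in the digital twin at ({lat}, {lon})"

def run_activation(presentation: PresentationLayer) -> str:
    """Shared game logic: identical for both builds."""
    return presentation.place_content("fountain_show", 24.9614, 55.1500)
```

The payoff is that features like the treasure hunt or Activations are written once against the interface and ship in both experiences.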
Dynamic content using Addressables
Our team created a custom over-the-air delivery system for users in both the AR and digital twin experiences. Instead of rebuilding the entire application every time a piece of content changed, we expanded our Magnopus CI/CD solution to leverage the Addressables build process to rebuild and deploy only the content that had changed. For small changes, this could be the difference between a half-hour build for all platforms and a few minutes.
Since the content and code are decoupled, we could deliver updates to content even after the application had passed through the lengthy approval process for Android and iOS. We built a service that provides Unity Addressable bundles on demand, delivered through our custom location-triggered content delivery network.
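The incremental rebuild idea can be sketched simply: hash each content group, compare against the manifest from the last published build, and rebuild only the groups whose hashes changed. This Python sketch is a conceptual illustration under assumed data shapes (the function names and the files-as-bytes representation are hypothetical), not the actual Magnopus CI/CD or Unity Addressables pipeline.

```python
import hashlib

def group_hash(files: dict) -> str:
    """Stable content hash over a group's files (sorted name + bytes)."""
    h = hashlib.sha256()
    for name in sorted(files):
        h.update(name.encode())
        h.update(files[name])
    return h.hexdigest()

def groups_to_rebuild(groups: dict, last_manifest: dict) -> list:
    """Compare current group hashes against the last published manifest
    and return only the groups whose content actually changed."""
    return [name for name, files in groups.items()
            if last_manifest.get(name) != group_hash(files)]
```

In practice the build system would persist the manifest alongside each published catalog, so a one-asset change triggers a rebuild of one group rather than every platform’s full player build.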
Our tools for geospatial authoring allowed us to associate Addressable IDs with physical locations so designers could update the placement of content in real-time. The client application could then quickly download just the content it needed for nearby activities based on our cloud APIs.
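One common way to make "content near me" queries cheap is to index placements into a coarse spatial grid, so the client only asks for the cells around its position. The sketch below is a hypothetical illustration of that indexing idea (cell size, IDs, and coordinates are invented), not the actual Magnopus authoring tools or cloud API.

```python
CELL_DEG = 0.001  # roughly 100 m per cell at Dubai's latitude; illustrative

def cell_key(lat: float, lon: float) -> tuple:
    """Quantize a position into a coarse grid cell key."""
    return (int(lat // CELL_DEG), int(lon // CELL_DEG))

def index_content(placements: dict) -> dict:
    """Build a cell -> [addressable_id] index from authored placements,
    where placements maps an ID to its (lat, lon)."""
    index = {}
    for aid, (lat, lon) in placements.items():
        index.setdefault(cell_key(lat, lon), []).append(aid)
    return index

def nearby_ids(index: dict, lat: float, lon: float) -> list:
    """IDs in the visitor's cell and the 8 surrounding cells."""
    cx, cy = cell_key(lat, lon)
    found = []
    for dx in (-1, 0, 1):
        for dy in (-1, 0, 1):
            found.extend(index.get((cx + dx, cy + dy), []))
    return found
```

Because authoring writes into the same index the client queries, moving a placement in the tools immediately changes which visitors are offered that bundle.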
The personal growth we’ve all experienced as a result of facing the unique challenges of a project of this scale and ambition has been unbelievable, and we’ve learned an incredible amount along the way.
We created Expo Dubai Xplorer as an example of what the future offers: a better way to unite people and places. From what we learned on this large-scale “alpha,” we’re refactoring for performance and flexibility, and designing new capabilities. We aim to help others create similar large-scale experiences, letting them spend more of their effort on the content that’s unique to them rather than on technologies that should already be consistent across big experiences like this.