
The virtual production behind Stowaway’s space walk scenes

September 13, 2021 in Industry | 8 min. read
The director and cinematographer of Stowaway talking through a space scene


How Unity’s virtual camera offering was used by the filmmakers

This article by VFX journalist Ian Failes was originally published June 2021 in befores & afters.

In Stowaway, director Joe Penna’s Netflix film about a fraught trip to Mars, some of the most dramatic scenes center around a crucial spacewalk the characters make in an attempt to retrieve spare oxygen from tanks outside their spacecraft. Ultimately, the sequence would involve filming live-action actors in spacesuits on partial sets, along with extensive visual effects additions.

With weightlessness, wirework and plenty of VFX required, significant pre-planning was necessary. Traditionally this might involve a previs process where artists would imagine shots using a 3D tool. But here the filmmakers took previs a step further by utilizing Unity’s virtual camera rigs plugged into the Unity game engine to craft scenes with previs assets made by RISE FX (which also crafted the final visual effects). Furthermore, director Penna, director of photography Klemens Becker and VFX cinematographer Jannicke Mikkelsen all had a direct hand in imagining the shots themselves at RISE FX’s office in Berlin.  


The video above from Unity showcases some of the virtual production shoot for the previs, which took place at RISE FX’s offices. Footage credit: Ryan Morrison, Patrick Nugent. Music track: Nima Fakhrara.

During this pre-production stage on Stowaway, a team from Unity (back then the outfit was known as Digital Monarch Media, or DMM, which was acquired by the game engine maker in late 2018) visited RISE FX to work with the filmmakers on the previs. They brought with them their virtual camera (V-cam) unit known then as Expozure VFT. The idea was that the controller of the V-cam could see the scene on a tablet-like screen, as well as on a bigger external screen, and use the bespoke unit to “operate” a virtual camera, which was tracked in space, to come up with the necessary angles, camera moves and storytelling points for the spacewalk.

This Unity game engine workflow, of course, meant that previsualization could all be done in real-time, with instant feedback, enabling numerous iterations. Ultimately, various “takes” could be recorded and edited together to form a previs’d sequence, which served as the template for the final shoot and final visual effects.


Director Joe Penna

How it worked

Things kicked off with RISE building the assets; at this point they were previs-ready only, but would inform the final assets seen in the film. Animation of the characters was keyframed as basic moves into a master scene. “Then,” advises DMM co-founder Habib Zargarpour, who had previously overseen similar previs work on films such as Blade Runner 2049 and Greyhound, “the filmmakers, using the Unity system, could go and film things bit by bit in any way they wanted to cover that scene, from the beginning to the end.”

The idea was to envisage coverage, just as if the filmmakers really were on the spacecraft shooting stage with real cameras and film gear. Zargarpour notes that Penna, Becker and Mikkelsen picked up the operation of the V-cam quickly. “There’s a weight to it,” he says, “that automatically gives them a sense of momentum, and you don’t have to worry about your hand shaking too much. You get some automatic dampening from just the mass. There’s that analogue aspect of it that it should feel similar to when you’re physically operating something.”  
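The physical heft Zargarpour describes gives the operator natural damping for free; virtual camera pipelines often supplement this in software by smoothing the raw tracking data. A minimal sketch of that idea, assuming a simple exponential moving average (this is illustrative only, not the actual Expozure/Unity implementation, and the class and parameter names are invented):

```python
# Illustrative sketch of software damping for a tracked virtual camera.
# Not Unity's actual code; names and values are assumptions.

class DampedCameraFilter:
    """Exponential moving average over tracked camera positions.

    alpha near 0 -> heavy damping (a 'heavier' camera);
    alpha near 1 -> raw, jittery tracking passes straight through.
    """

    def __init__(self, alpha=0.2):
        self.alpha = alpha
        self.smoothed = None  # last smoothed (x, y, z) position

    def update(self, raw_position):
        if self.smoothed is None:
            self.smoothed = raw_position  # first sample: no history yet
        else:
            # blend the new sample with the running average, per axis
            self.smoothed = tuple(
                self.alpha * r + (1.0 - self.alpha) * s
                for r, s in zip(raw_position, self.smoothed)
            )
        return self.smoothed
```

A sudden one-unit jump in the raw tracking data then moves the smoothed camera only a fraction of the way per frame, which is roughly the effect the mass of the physical unit provides.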

VFX cinematographer Jannicke Mikkelsen

The filmmakers could set up the V-cam in two different configurations: one with a viewfinder on the handheld unit, and one where they simply viewed the large television screen. An initial phase saw them explore the virtual set and compose shots. They could then go back in and refine those shots into more of a technical visualization, i.e., to check that the shots would actually be feasible, or filmable, on the real-world set.

Indeed, one of the features of the system was that it could replicate real-world camera lenses and rigs like dollies, curved dollies, drones or tripods. “Mimicking real-world rigs instantly like that is important, especially if they’re visualizing something they’re going to do on the practical set,” advises Zargarpour. “Say they want to bring in a dolly and then see, how does that work for the shot? Where would they attach the dolly? What kind of a height does it need to have?”
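One way a system can mimic a rig like a dolly is to constrain the operator's freely tracked position to the geometry of the rig, such as projecting it onto a straight track of fixed length and height. The sketch below shows that constraint under those assumptions; it is not Unity's actual rig code, and the function names are invented:

```python
# Illustrative sketch: constraining a tracked camera to a straight
# dolly track. Not Unity's implementation; names are assumptions.

def clamp(t, lo, hi):
    return max(lo, min(hi, t))

def constrain_to_dolly(camera_pos, track_start, track_end, rig_height):
    """Project the operator's tracked position onto the dolly track,
    clamped to the track's physical length, at a fixed rig height."""
    sx, sy, sz = track_start
    ex, ey, ez = track_end
    dx, dy, dz = ex - sx, ey - sy, ez - sz
    length_sq = dx * dx + dy * dy + dz * dz
    px, py, pz = camera_pos
    # parameter t of the closest point on the infinite track line
    t = ((px - sx) * dx + (py - sy) * dy + (pz - sz) * dz) / length_sq
    t = clamp(t, 0.0, 1.0)  # the dolly cannot leave its track
    return (sx + t * dx, rig_height, sz + t * dz)
```

However the operator moves, the virtual camera stays on the track at dolly height, which answers exactly the kind of questions Zargarpour lists: where the dolly would sit, and what height it needs.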

A frame from the real-time previs

The takes recorded in the V-cam system were “live.” This meant it was possible to go back into the Unity scene, open a particular take, and adjust it: the camera movement could be changed, a different lens tried, the lighting altered, or the focus shifted to something else. “You can even update your models, up-res them or do any kind of modification,” says Zargarpour.
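A take can stay "live" like this when the system records camera parameters per frame rather than baked pixels, so any parameter can be edited and the scene re-rendered in-engine. A minimal data-model sketch of that idea (an assumption for illustration, not Unity's actual format; all names are invented):

```python
# Illustrative sketch of a 'live' take: per-frame camera parameters
# that can be edited after recording. Not Unity's actual data model.

from dataclasses import dataclass, replace
from typing import List, Tuple

@dataclass(frozen=True)
class CameraFrame:
    position: Tuple[float, float, float]  # (x, y, z) in scene units
    rotation: Tuple[float, float, float]  # (pitch, yaw, roll), degrees
    focal_length_mm: float
    focus_distance_m: float

@dataclass
class Take:
    name: str
    frames: List[CameraFrame]

    def with_lens(self, focal_length_mm: float) -> "Take":
        """Return a copy of the take with a different lens; the
        recorded camera move itself is untouched."""
        return Take(
            name=self.name,
            frames=[replace(f, focal_length_mm=focal_length_mm)
                    for f in self.frames],
        )
```

Because the move, lens, and focus are stored as data, swapping a 35mm for a 50mm lens is a field update followed by a re-render, rather than a reshoot.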

On set, rendered-out previs scenes served as a guide for filmmaking, though Zargarpour notes that another option was always to keep things “in-engine,” continuing to make tweaks or browse the virtual set for shot compositions as necessary.

Director of photography Klemens Becker

“What I’m really happy about is seeing the key creators using the tools hands-on,” states Zargarpour. “It’s not like someone else decides what the shots should be, and then the director has to give them notes. I think that was the success of it, that it was done live, it was done together.”

The future of virtual production at Unity

Since the work carried out for Stowaway, which was released on Netflix in April, Unity’s virtual cinematography offerings have continued to expand.

Cinematographer Ryan Morrison reviews a space scene in Unity

In particular, Unity recently released a suite of Cinematics features. Some of these tools take advantage of tracking and AR capabilities on iPad Pros, for instance, while others dive further into virtual camera, facial capture and sequence editing. All the while, Unity’s ability to deliver ray-traced scenes, complex FX sims, volumetrics and other cinematic looks has been on a path of continuous improvement.

The idea here is to make the real-time tools even more accessible, says Zargarpour. “We’re going to build on top of the new tools and make them even more powerful and maintain capabilities that we’ve already had in our on-set systems. I’m hoping the new tools are going to be used by all the creators on a project, right from the inception through to the final shots.”  

Learn more

Animation creators: read about all the cinematics features now available in Unity Pro, and download the free Unity virtual camera app for iOS from the App Store.
