Creating Dark Asset: From previs to final pixel with VFX Artist Setareh Samandari

October 21, 2022 in Industry | 15 min. read

Dark Asset, the latest film project from director Michael Winnick (Guns, Girls and Gambling), is a tale of spies, microchips, and military scientists – one of whom is played by T-1000 himself, Robert Patrick. Talk about the perfect combo for an action thriller.

We won’t ruin the plot. But we can reveal some of the virtual production methods that Setareh Samandari, the lead visual effects (VFX) artist on the film, used to bring the project to life and accelerate the overall production workflow via previs in Unity. In this article, Setareh, who studied at the Gnomon School of Visual Effects in Hollywood, shares what she has learned over her four years as a VFX artist.

We’d love to hear a bit about the early production process for Dark Asset. What led you to choose a virtual set for this production?

Michael Winnick and I discussed whether we should build a virtual location or find an actual one. We talked through the benefits, and ultimately, it was both cost and time savings that led us to build a virtual set. We created two environments for use on LED walls, and two other scenes that were shot entirely on green screen.

A virtual set provided a host of other benefits due to the nature of working in real-time with Unity. With previs, we could discuss which shots would be CG, which would be on green screen, and how we were going to storyboard and decide on shot angles instantly – all done over Zoom.

How did using real-time assets affect the dynamics on set? 

With Dark Asset – and indeed, with other projects that I’ve worked on, such as the music video “Breathe Free” for the artist Shani – using Unity meant that key creatives like the director, DP, production designer, and VFX supervisor got to see the virtual set and interact with it directly in real-time, which made things more efficient. Both communication and direction were clearer.

We could also do the keying on set and composite the virtual background to make sure the green-screened characters fit. All of this helped give Michael and the rest of the team a lot of options, and meant that we could remain close to our creative intent.

It helped the actors as well. In cases where a green screen was used instead of LED volumes, we could use the same tools to provide a real-time preview composite of the actors on the virtual stage, and also give them a preview of camera moves.

All the lighting and virtual set dressing could be adjusted interactively, which was so much easier than the old way of going back into NUKE and figuring it out. It saved us a lot of time in post-production and was a bit of a lifesaver; especially after having done all the previs work. It meant that everyone was much more aware of what was going to happen on set.

That sounds like a big change for the better. Can you take us through a scene that you created using these real-time assets?

There’s one scene where our female protagonist is sitting in an office, and I built the 3D background for it in Unity. I could match the perspective of the camera and integrate real lighting with the green screen to make it look real.

I had so much fun working with Michael to design the office style he wanted for the scene. I could just grab some assets and materials that I already had in Unity and lay out the office, while Michael directed me to get the look he wanted.

Using the Unity Virtual Camera for tracking shots on green screen and inserting the actor into the RT3D scene in Unity

Did working with Unity make any obvious difference on the set of Dark Asset?

It made things a lot quicker and easier! I didn’t have to worry about long render times, as I could make changes and see the results immediately. Compositing is a much slower process compared to the real-time 3D workflow.

For one particular set of shots, we needed some glass to break, and it didn’t break on the day of the shoot. So we added an element of the same glass, broken later in front of a green screen, all lit and rendered in Unity. It was great because we could make all the edits in real-time as if we were shooting it live. It gave us more control over the final shot without having to reshoot the scene, providing us all the obvious time and cost savings.

On set with Habib Zargarpour: Unity in action

We used the Unity Virtual Camera for tracking to preview the virtual set on the green screen shoots. It was also used to create new camera moves in CG environments where we inserted the actors from the green screen into the virtual sets.

The Virtual Camera paired with the iPad tracking system, so when we would move the device, it would move the camera in the CG scene and stream the footage back onto the device. I could walk around freely and see the shot composition, then film a scene by recording the camera movement.
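
The Unity Virtual Camera Setareh describes here is part of Unity’s Live Capture tooling. As a rough illustration of the idea, rather than of that package’s API, here is a minimal sketch of a component that mirrors an externally tracked device pose onto the CG camera and records the move for later playback; the `trackedPose` source and the recording format are assumptions.

```csharp
// Hypothetical sketch (not the Live Capture / Virtual Camera API): drive a CG
// camera from an externally tracked pose and capture the move as samples.
using System.Collections.Generic;
using UnityEngine;

public class TrackedCameraRecorder : MonoBehaviour
{
    public Camera cgCamera;       // the camera in the CG scene
    public Transform trackedPose; // pose streamed from the tracking device (assumed)
    public bool record;

    // Recorded samples; playback/export is omitted for brevity.
    readonly List<(float time, Vector3 pos, Quaternion rot)> samples =
        new List<(float, Vector3, Quaternion)>();

    void LateUpdate()
    {
        // Mirror the tracked device so walking around moves the CG camera.
        cgCamera.transform.SetPositionAndRotation(trackedPose.position, trackedPose.rotation);

        if (record)
            samples.Add((Time.time, trackedPose.position, trackedPose.rotation));
    }
}
```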

We also found another use for Unity, which saved us a lot of time. Throughout the film, we had to have the actors interact with an app. Normally, this would have involved an iPad with a green screen and corner-tracking markers, plus hundreds of tracking shots with screen replacements, so we thought: Why not build an app in Unity?

We were able to quickly build a “prop” app in Unity and then update it on the fly with whatever the director wanted it to do. It had all the interfaces that the actors needed, and they could just use it on camera which saved a lot of time and effort with tracking!

Interactive “prop” app created in Unity
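
To give a sense of how simple such a “prop” app can be, here is a loose sketch of one interaction built with standard Unity UI; the panel names and button behavior are hypothetical, not the app used on the film.

```csharp
// Hypothetical "prop" app screen: tapping the button advances to the next panel,
// so the actor can use the device naturally on camera with no screen replacement.
using UnityEngine;
using UnityEngine.UI;

public class PropAppScreen : MonoBehaviour
{
    public Button actionButton;     // button the actor taps
    public GameObject currentPanel; // panel shown before the tap
    public GameObject nextPanel;    // panel revealed after the tap

    void Awake()
    {
        actionButton.onClick.AddListener(() =>
        {
            currentPanel.SetActive(false);
            nextPanel.SetActive(true);
        });
    }
}
```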

You mentioned the Virtual Camera. Why was this so important for the shoot?

We wanted to preview the virtual environments on the green screen stage while on set. Then we could later add details and render the final pixel directly out of Unity into NUKE. 

For the virtual scenes, we used the Unity Virtual Camera to create new camera moves and entire shots. It allowed us to decide on the lighting, lens, and focus for the shots, scrub the timeline, and see the animation right in the scene. Then, for final shots that are all CG or that have green screen elements inserted, we used it to create actual final camera moves.

You’ve also shared how virtual production is speeding up the process for crafting the lighting on set. How did you achieve your vision for lighting? 

We used Probe Volumes as the lighting mode. This allowed us to bake the lighting very quickly, interpolating the lighting per pixel so that it’s more detailed and realistic. The old way was per-object interpolation of probes.

I then combined this with the High Definition Render Pipeline (HDRP), real-time Ray-Traced Global Illumination, and real-time reflections. It’s nice to be able to crank up the settings for final renders.

All of this, combined with the HDRP Depth of Field component, and other post-processes, really gave me full control over tuning each shot to exactly how I wanted it to look.
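
As one example of that per-shot tuning, here is a minimal sketch of driving the HDRP Depth of Field override through a post-process Volume from script; it assumes the scene’s Volume profile already contains the override, and the focus distance value is just a placeholder.

```csharp
// Minimal sketch: set HDRP Depth of Field on a global post-process Volume.
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.HighDefinition;

public class ShotDepthOfField : MonoBehaviour
{
    public Volume postProcessVolume;   // global post-process Volume in the scene
    public float focusDistance = 2.5f; // per-shot value is a placeholder

    void Start()
    {
        if (postProcessVolume.profile.TryGet(out DepthOfField dof))
        {
            dof.focusMode.value = DepthOfFieldMode.UsePhysicalCamera;
            dof.focusDistance.value = focusDistance;
        }
    }
}
```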

You’ve touched on HDRP, one of Unity’s Scriptable Render Pipelines (SRP). Tell me a little more about your work with HDRP.

As a VFX artist, HDRP allowed me to achieve realistic lighting in real-time. Not only can you use it for previs, but also to render the final images for compositing into the movie. 

For Dark Asset, I rendered HDR EXRs in linear color space with alpha channels, and used the ACES color space for Tonemapping in the Post-process Volume, though tonemapping has to be turned off when rendering out in linear space.

Scene rendered in HDRP for final comps, using HDR EXRs with alpha channels in Recorder
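
For reference, here is a small sketch of how that tonemapping toggle could be handled from script, assuming the post-process Volume profile already contains an HDRP Tonemapping override; the flag for linear EXR output is illustrative.

```csharp
// Sketch: use ACES tonemapping for preview, but disable tonemapping entirely
// when rendering linear-space EXRs with alpha for compositing in NUKE.
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.HighDefinition;

public class CompOutputToggle : MonoBehaviour
{
    public Volume postProcessVolume;
    public bool renderingLinearExrForComp; // illustrative flag

    void Start()
    {
        if (postProcessVolume.profile.TryGet(out Tonemapping tonemapping))
        {
            tonemapping.mode.value = TonemappingMode.ACES;
            tonemapping.active = !renderingLinearExrForComp;
        }
    }
}
```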

How else were you able to make the scenes look so realistic?

For added realism, I used Area Lights for the interiors. But there was one complex scene where we had to create the effect of the characters appearing as a holographic display.

For this, I took the green screen footage into Adobe After Effects to develop the first part of the look, then used the HDRP Volumetric Fog in post-processing to light the beams for the holograms. I placed the After Effects footage into the scene by creating a polygon and assigning the footage to it. Then I used the render texture to place the characters there. This way, they fit the lighting of the scene and also interacted with the set lighting by casting shadows onto the virtual set and Volumetric Fog!
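
A minimal sketch of that render-texture setup: a VideoPlayer writes the pre-composited footage into a RenderTexture, which is then assigned to the material on the polygon placed in the virtual set. The asset references here are assumptions for illustration.

```csharp
// Sketch: play pre-composited footage onto a polygon in the scene via a RenderTexture.
using UnityEngine;
using UnityEngine.Video;

public class FootageCard : MonoBehaviour
{
    public VideoPlayer player;        // plays the After Effects render
    public Renderer quadRenderer;     // polygon placed in the virtual set
    public RenderTexture footageTexture;

    void Start()
    {
        player.renderMode = VideoRenderMode.RenderTexture;
        player.targetTexture = footageTexture;
        quadRenderer.material.mainTexture = footageTexture;
        player.Play();
    }
}
```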

It’s a new way to do lighting. Basically, instead of using lightmaps, you bake the lighting into these Probe Volumes, and just give it an area you want to work with. You can tell it how many bounces you want to calculate and at what density, and it will bake in four or so seconds. There’s no waiting, and no distinction between dynamic or static objects, so if a 3D character moves through the scene, it’s going to be lit by the Probe Volumes – and the same for a static object.

You can still add ray tracing and screen space global illumination (SSGI) on top, and then Unity can decide as it’s rendering the different objects. It gives an extremely close match to what a real set would have looked like.

HDRP Probe Volumes in Dark Asset

Had you previously used Unity for film prior to this project?

Yes, I had already built a whole virtual world for a trilogy of short films set to music by Jack Lenz, after only a week of learning Unity. I found the tools easy to pick up. The great thing is that I had the flexibility to import and move assets around for each film, all on one set.

Besides your work on Dark Asset, what effect has virtual production had on your industry, and where do you see it going?

Virtual production makes my work more accessible, especially for indie projects. Normally, large teams are needed, but virtual production methods allow us to do more with less, on any size project.

I love the innovation of VFX, and especially virtual production. Everything is constantly advancing in terms of technology, and it gives me the ability to express my ideas on set. You can change the whole world on set without having to leave to scout locations, so you can worry less about time and other limitations.

There are many aspects of VFX work that are very time-consuming, like lighting environments, rendering full scenes, and compositing complex shots. But working in real-time for previs, especially for lighting and rendering, saves me so much time. And for things like LED volumes, you can also save the expense of being on location.

Using LED volumes allows us to do shoots quickly and make use of the same assets from previs to final pixel VFX shots. The industry is turning more and more to working predominantly in real-time because there are so many benefits to a shoot.

Discover how Unity can help bring your next project to life with our tools for virtual production, broadcast, and animation. If you need bespoke support, reach out to our solutions team.

To learn more about the amazing ways you can use Unity and where we’re taking our solutions, join us virtually or in person on Tuesday, November 1, for Unite 2022.
