Based on the Dark Horse graphic novels written by My Chemical Romance’s lead singer, Gerard Way, The Umbrella Academy was released on Netflix in February 2019. A visual feast for VFX fans, the second season hit the number one spot on the streamer’s most-watched list at the time of its release.
With great superpowers come great visual effects. Previsualization (previs) was paramount in bringing different teams together to map the show’s complex scenes and deliver a compelling narrative. Making decisions earlier in the production process also helped the team complete the project on time.
Andrea Aniceto-Chavez and her team at Cinecode, the LA-based virtual production division of DigitalFilm Tree (Ted Lasso, Our Flag Means Death, NCIS: Los Angeles), led this behind-the-scenes process.
“Previs can help define anything within any department that’s leveraging it.”
How did you find yourself working in virtual production?
I’ve been working in this industry for about four years. I studied film and television at New York University, while also majoring in business. During this time, I was really interested in post-production.
I understood the film and TV world because I had already done so much in that space. I also had experience as a game engine artist from building a VR game. I was able to combine these experiences when I was approached by DigitalFilm Tree’s CEO, Ramy Katrib, to help build a virtual production department. Eventually I became a producer, and I now lead my own team.
Tell us a little about DigitalFilm Tree, Cinecode, and the type of work you specialize in.
Cinecode is the virtual production division of DigitalFilm Tree, a post-production house specializing in VFX, color, editorial, dailies, and remote secure cloud-based solutions. We previsualize for film and television.
We plan shots, scenes, stunt work, blocking, location, environment building, and much more, leveraging game engine technology like Unity to help our clients save time and money, and reduce their carbon footprint. DigitalFilm Tree worked on the VFX for The Umbrella Academy season one, and as soon as previs was mentioned for season two, we got to work.
What was the initial ask for previs?
They [The Umbrella Academy producers] first wanted to use previs for their version of the JFK assassination, a central theme in season two. This extended to scenes in other episodes that take place in Dealey Plaza.
Their number one problem was time. They were planning to shoot a week’s worth of shots on location, but found out that they could only use Dealey Plaza for two days. Our goal was to show how we could still tell the story in that location.
During previs, we blocked out the action and created the shots with the team. When they got to set, they only needed to shoot what was a priority and then tackle the rest, depending on time.
“90% of what we did in previs within Unity actually got shot. The 10% that didn’t was more so because they ran out of time and just wanted to prioritize other shots.”
Would you say that saving time is the most beneficial element of previs?
Saving time is much more complex than you’d first think. Previs is important because it can determine the way that production will shoot, and whether things will work or not, way ahead of time.
For example, a cinematographer can find out whether the crane they want to use will fit in the space before actually getting on set. We’re able to show them a sample of the space and a digital crane inside Unity. Also, a director can see whether the shots or direction that they’re going in with the actors is compelling enough to tell the story they’re envisioning. Previs can help define anything within any department that’s leveraging it.
Do you think previs is replacing traditional storyboarding?
Sometimes people see it like that, but I still think storyboarding is so important, as it shows us what the team is thinking of. Concept art teaches us more about textures and lighting and what the environment is actually going to look like.
We can use previs to bring those images to life, and show teams how their creative vision can appear. This includes elements like movement and how fast the camera is going to go.
How often does what you show in previs actually get shot?
We spoke to the cinematographers before the second season of The Umbrella Academy was released – we hadn’t seen the final footage yet – and they said that 90% of what we did in previs using Unity actually got shot. The 10% that didn’t was more so because they ran out of time and just wanted to prioritize other shots.
Why is staying true to what was shown in previs so important?
By using real-time, we have the capability to do many alternate versions of a sequence. I can show what sequences could look like on the spot, and that also gives us the ability to make adjustments right then and there.
In episode 310 of The Umbrella Academy, the actors are in Hotel Obsidian and it’s crumbling down around them. In previs, we helped plan those VFX and showed how the hotel was going to change as it crumbled. The production team then applied what we had done in previs to the production and had the actors standing on the same sigil.
For a scene like that, we work with many different departments. These include the art department, stunt coordinators, and the director and cinematographer themselves.
Has virtual production changed the dynamics on set, such as the way you work with a director, for instance?
I never go on set, but I do get to work with the director as well as other heads of production. Once the director or DP [director of photography] gets on board with how virtual production works, they tend to have more ideas than they would from reading a script or through traditional storyboarding alone. They end up being right there with us, conceptualizing ideas live.
The director can see what the sequence looks like after blocking it out, and create a camera for it. We could easily take that same sequence, duplicate it, or create another shot with a different lens to toggle between. We can do what the director would do on set, but virtually.
And is this what happened with The Umbrella Academy?
We’d meet with the showrunner, Steve Blackman, and the production designers to decide what each room would be used for. The director, Jeremy Webb, would direct many different versions of the scene on Zoom. We’d review each take and then the production team would decide on the best option.
Cinematographers Neville Kidd and Craig Wrobleski would create shots right after Jeremy directed them. This felt like being on set.
It created opportunities for the team to join our sessions whenever they had the time. They could figure out the most compelling story, make changes, and edit ahead of actually shooting the show.
It sounds like previs really benefits the entire production workflow.
Everyone benefits. We don’t always have productions that let every department utilize it, but this usually happens because they don’t have enough time. Now we see firsthand how it supports showrunners, directors, cinematographers, stunt coordinators, VFX, the art department… they all have different reasons to use previs, and it helps solve bottlenecks on set.
It also benefits the camera department because they’re able to shoot shots ahead of time to see what lenses and equipment they need. Again, working remotely saves them a lot of time and money, and reduces their carbon footprint.
Could you give us a particular example of how previs benefited a final shot on set?
In episode 306, we used a Phantom camera as the team wanted to see how the frame rate would work. They would cut between the 500 frames per second (fps) footage we shot in previs and other frame rates, with the rate changing throughout the sequence, as seen in the scene where Jayme [played by Cazzie David] spits toward the wall.
What was really cool was that we were able to export a shot like that from Unity and test it at different frame rates. The director then had all these options that he could work with and decide which was the best way to shoot, as opposed to spending so much time on set figuring it out without knowing the result each time.
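The math behind those frame-rate tests is simple to illustrate. The sketch below is not anything from the Cinecode pipeline, just a minimal illustration of how capture and playback frame rates determine how much a shot is slowed down on screen:

```python
# Illustrative sketch only: how capture and playback frame rates
# determine the slow-motion factor of a shot.

def slowmo_factor(capture_fps: float, playback_fps: float) -> float:
    """How many times slower than real time the footage plays back."""
    return capture_fps / playback_fps

def playback_duration(real_seconds: float, capture_fps: float,
                      playback_fps: float) -> float:
    """Screen time of a real-world event captured at capture_fps."""
    frames = real_seconds * capture_fps
    return frames / playback_fps

# A 1-second event shot at 500 fps, like the Phantom footage above,
# played back at a conventional 24 fps:
print(slowmo_factor(500, 24))           # ~20.8x slower than real time
print(playback_duration(1.0, 500, 24))  # ~20.8 seconds of screen time
```

Previewing these numbers virtually, rather than discovering them on set, is exactly the kind of decision previs moves earlier in the process.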
How did Unity help with the previs and making these preshoot decisions?
As I referenced earlier, there were a lot of scenes set at Dealey Plaza, which we had as a LiDAR scan with accurate heights and dimensions of the actual location.
We were able to quickly import that into Unity, so the team could decide on things like where they needed more room to shoot, or where stunts could take place. We could also have a production designer there, who would tell us if parts of the set weren’t removable, for example.
Are you using Unity for any other projects?
We recently used Unity on the Miracle Workers series, where we went from drawings on the back of a napkin to a fully 3D set for previs!
The team had drawn out how they wanted their medieval landscape to look and they gave us this napkin with a sketch on it. So we scanned it and brought it into Unity. We then populated it with the assets they were looking for, some of which were preexisting.
It was also really easy to light the set. This 3D environment gave them a better idea as to whether they would shoot in Prague or whether they could just build the scene in Los Angeles.
Do you create your own assets or get them from the Unity Asset Store?
Whenever we use Unity for a project, we create our own features but still use any preexisting features, assets, or components we can find. For instance, episode 303 of The Umbrella Academy was VFX-heavy, so we did a lot of techvis and planning for the VFX particles. The Unity Asset Store has a bunch of particle effects that we were able to work with, and that helped inspire how the team actually shot the explosions, and what colors and tones they would use.
Also, every time we use Unity, we leverage Cinemachine, post-processing, and 3D models from the Asset Store. Cinemachine is great because we can input all the components that the production is using on set with their camera to get it to match as closely as possible. From there, our developers can create a text file that has a camera report for each shot, as well as information like shot ID, sensor size, lens info, focus subject, focus distance, camera height, etc.
Post-processing helps us establish the look and color tone to match the way we plan on editing the show in post. The Asset Store has been a great resource, giving us characters or particle FX that are game-engine ready for use.
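To make the camera-report idea concrete, here is a hypothetical sketch of the kind of per-shot text file described above. The actual format and field names the Cinecode developers use are not public, so every field and value here is an assumption for illustration only:

```python
# Hypothetical per-shot camera report, modeled on the fields the
# interview lists (shot ID, sensor size, lens info, focus subject,
# focus distance, camera height). Field names and values are
# assumptions, not the real Cinecode format.
from dataclasses import dataclass

@dataclass
class CameraReport:
    shot_id: str
    sensor_size_mm: tuple   # (width, height) of the sensor
    lens_mm: float          # focal length
    focus_subject: str
    focus_distance_m: float
    camera_height_m: float

    def to_text(self) -> str:
        # One human-readable line per field, mirroring the text-file
        # report the interview describes.
        return "\n".join([
            f"Shot ID:        {self.shot_id}",
            f"Sensor size:    {self.sensor_size_mm[0]} x {self.sensor_size_mm[1]} mm",
            f"Lens:           {self.lens_mm} mm",
            f"Focus subject:  {self.focus_subject}",
            f"Focus distance: {self.focus_distance_m} m",
            f"Camera height:  {self.camera_height_m} m",
        ])

# Example values (invented for illustration):
report = CameraReport("EP306_014", (36.0, 24.0), 35.0, "Jayme", 2.4, 1.6)
print(report.to_text())
```

A plain-text report like this is easy for a camera department to read on set, which is presumably why a simple text file was chosen over a richer format.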
Disclaimer: The information and opinions contained in these interviews are those of the interviewees and are provided here for informational purposes only. Unity and its affiliates assume no liability for any inaccurate, delayed, or incomplete information, nor for any actions taken in reliance thereon. The information contained about each individual and company has been supplied by such individual or company without verification by us.