How indie studio VRFX uses Unity and ArtEngine for real-time filmmaking

April 9, 2021 in Industry | 12 min. read
A forest full of mushrooms

Want to know how to use Unity for virtual production and real-time iteration? Read on to discover how this innovative studio helped its clients stay on time and on budget while fine-tuning textures and lighting in real time.

As the pandemic swept the world into chaos, virtual reality (VR) and augmented reality (AR) experiences were in high demand. But traditional filmmaking processes were also disrupted: production needed to pivot quickly to accommodate safety concerns and the challenges of remote work. One indie studio recognized the opportunity to expand its scope.

Founded in 2017 as a group of freelance creatives, programmers and technical artists, VRFX Realtime Studio has been operating under a simple but ambitious mission: to bring stories to life. 

Based in Lucerne, Switzerland, the five-person team began their journey with Unity several years ago, when they recognized the potential of VR/AR and its wide applicability across industries – and saw a gap in the market for agencies that could develop these tailored solutions. Once they realized how open and accessible Unity’s VR/AR developer framework was, and how it would allow them to create a huge variety of software and experiences, choosing Unity as their platform was obvious.

Photo - man using VR headset and others watching screen where he is playing
VRFX has brought the power of real-time 3D to clients in industries that run the gamut from theater production companies to geological research centers to manufacturers. Pictured here is a VR experience created with Unity for a film festival.

As self-described “technovisual problem solvers,” the studio works with companies in a variety of industries to tailor solutions that leverage the power of real-time 3D. By gathering in-depth knowledge about each client’s customers, products, and positioning, they can collaborate closely with all parties to create the perfect visual content, be it a classic TV commercial or an AR experience for an art museum. The studio produces content that spans several formats, so the team constantly needs to stay up to date on the latest trends and find new and interesting ways to work with hardware and software.

Woman in yellow using VR while others watch her and a screen
A VR theater show developed in 2018, also made with Unity

“As we’ve expanded our team, we’ve looked for not only technical expertise with Unity and other animation software, but people with a positive attitude, a desire to own full workflows, and a thirst for new challenges,” explains Pascal Achermann, VRFX cofounder and technical director. “As a result, we’re a group of people that love what we do. I think this has been essential to our success thus far.”

An indie studio’s foray into real-time filmmaking

While VR and AR experiences remained the company’s sweet spot in 2020, the year presented opportunities for VRFX to expand into new fields, such as virtual production. Indeed, the team’s professional history in visual effects, 3D compositing, film production, and audio engineering positioned them well to explore the realm of real-time filmmaking.

Screenshot of an animated scene inside of Unity HDRP
Configuring an animated scene inside of Unity HDRP

While the Unity Editor and MiddleVR are their core tools, VRFX also leverages packages from other Unity developers to accomplish specific tasks, be it a tool for creating plants and trees with perfect meshes and UVs, a DMX controller to drive on-set lighting in real time, or their in-house tool “Snap Shot” for making changes to a scene while in Play Mode.
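
The article doesn’t detail how Snap Shot works internally. As an illustration of the general problem such a tool solves – keeping tweaks made in Play Mode, which Unity normally discards – here is a minimal, hypothetical Editor sketch; the component and field names are placeholders, not VRFX’s code.

```csharp
#if UNITY_EDITOR
using UnityEditor;
using UnityEngine;

// Hypothetical sketch only – not VRFX's actual Snap Shot implementation.
// Captures a component's serialized values during Play Mode and re-applies
// them afterwards, so tweaks made while playing are not lost.
public class PlayModeSnapshotSketch : MonoBehaviour
{
    [SerializeField] private Light targetLight; // component to snapshot (placeholder)

    // SessionState survives the domain reload that happens when exiting Play Mode.
    private const string Key = "PlayModeSnapshotSketch.Light";

    [ContextMenu("Capture Snapshot")]
    private void Capture()
    {
        // Serialize the component's inspector-visible fields to JSON.
        SessionState.SetString(Key, EditorJsonUtility.ToJson(targetLight));
    }

    [ContextMenu("Apply Snapshot")]
    private void Apply()
    {
        string json = SessionState.GetString(Key, string.Empty);
        if (string.IsNullOrEmpty(json)) return;

        Undo.RecordObject(targetLight, "Apply Snapshot"); // make the paste undoable
        EditorJsonUtility.FromJsonOverwrite(json, targetLight);
    }
}
#endif
```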

“When we don’t know how to do something, we research to find out who’s solved the problem before,” says Nick Schneider, VRFX engineer. “We have yet to encounter something that we can’t create. We’ve always found a solution by looking to the Unity community. It’s a fantastic resource.”

Screenshot of VRFX
View of “Snap Shot,” a tool that VRFX developed in-house

Project Silvester: Virtual production with Unity HDRP and ArtEngine

The studio’s debut in virtual production with LED walls was a project for the Lucerne government: a series of TV commercials encouraging residents to practice social distancing over the Christmas holidays to slow the spread of the coronavirus. VRFX pitched a virtual production setup, arguing it was the only way to stay on budget and meet the tight deadline. With about a week to write and deliver the content, there was little time for activities such as location scouting and post-production. VRFX knew that with a bit of pre-production planning, they could not only deliver the project with nearly zero post-production costs but also give the client the ability to reuse the same environment for future commercial spots. The client was sold, and planning began.

VRFX was responsible for the virtual production setup and real-time content, while partner agency Orisono wrote the story and handled the on-set lighting, filming, and post-production. The teams worked closely together at their shared base, Soundville Media Studios, to deliver the entire project in about a week, with the Unity work done in about three days.

Man setting up camera on rig
VRFX art director Claudio Antonelli testing the camera tracking system for a virtual production

One of VRFX’s key tasks was to create the virtual backdrop content. During pre-production, the team built a replica living room in Unity that represented what the real camera would see as a background, leveraging Unity’s High Definition Render Pipeline (HDRP) and MiddleVR. With this setup, they were able to use real-world lighting values, simulate the lens distortion of their Blackmagic URSA camera, and project the 3D content onto the LED wall. Using Unity ArtEngine, they quickly replicated the materials found on set, then used those materials to texture the virtual clone in Unity.
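
Unity’s built-in physical camera properties are one way to get a virtual camera to behave like a real lens. The sketch below is an assumption of how such a setup might look; the focal length and sensor size are placeholders rather than the actual Blackmagic URSA calibration, and the MiddleVR/LED-wall projection side is not shown.

```csharp
using UnityEngine;

// Hedged sketch: matching a virtual camera to a real one with Unity's physical
// camera properties. Values below are placeholders, not a URSA calibration.
[RequireComponent(typeof(Camera))]
public class PhysicalCameraMatch : MonoBehaviour
{
    [SerializeField] private float focalLengthMm = 35f;                        // placeholder lens
    [SerializeField] private Vector2 sensorSizeMm = new Vector2(25.3f, 14.3f); // placeholder sensor size (mm)

    void Start()
    {
        var cam = GetComponent<Camera>();
        cam.usePhysicalProperties = true; // derive field of view from focal length + sensor size
        cam.focalLength = focalLengthMm;
        cam.sensorSize = sensorSizeMm;
    }
}
```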

Setting up the virtual environment during pre-production

The first step in setting up a virtual production environment involved replicating the real set – a festive living room – in 3D. To do so, the team measured the set components and modeled a virtual clone in Modo.

A digital replica of a room
A digital replica of the real set, made in 3D modeling program Modo

With the digital model in hand, VRFX created a quick previz mockup using Marmoset Toolbag to help the client and film crew visualize the scene. With the previz approved, the team then created a higher-fidelity lookdev render to illustrate the vision for lighting, mood, and materials.

Previz render of room
Previz render
Lookdev render of the room
Lookdev render

With the lookdev scene, VRFX worked with the director of photography Alex Stratigenas to align on shot framing. After another round of client feedback, the scene was imported into Unity HDRP to apply the lighting and materials.

Diagram showcasing the shot framing with the LED wall

Iterating on set with ArtEngine

A key component of any virtual production setup is flexibility. While a real-time engine is at the core of this setup, other tools that support quick iteration and enable creativity, such as Unity ArtEngine and MiddleVR, play a critical role in cultivating an agile workflow.

For example, while on set, the crew decided to swap out several materials to better match the live set and the artistic vision. Previously, the materials were placeholders downloaded from a public library. Using their phone, a team member snapped a couple of photos of the set carpet and studio walls, imported them into ArtEngine, and within minutes had created tileable physically based rendering (PBR) materials.

“The speed at which we’re able to create textures from photos using ArtEngine is amazing,” explains Achermann. “For this production, we created the carpet material minutes before shooting. In a few clicks, we color-corrected the photo, removed seams and unwanted artifacts, and generated all the PBR maps. The ability to quickly iterate on set like this has been critical to our success in real-time filmmaking.”

Green carpet
Original photo of the set’s carpet, taken on a phone
Green carpet with ArtEngine’s Content-Aware Fill node on it
Using ArtEngine’s Content-Aware Fill node, an artist was able to quickly remove unwanted noise and dust
Final node graph inside ArtEngine

The 2K textures from ArtEngine were then painted into an 8K atlas using the Quixel Suite, and the atlas was imported into Unity.

Close-up of the carpet material in Unity, with lighting applied
The background scene in Unity, which was projected onto the LED screen

The scene was then projected onto the LED wall. “It was an amazing feeling seeing Unity projected onto such a big screen, and even cooler when we saw the scene captured from a real film camera,” explained Achermann.

The final virtual production setup, as seen through the camera

Matching lighting in the real and virtual worlds

Matching the lighting in Unity to that captured by the camera in real life is a critical part of ensuring realism with virtual production. In this case, the lighting and post-processing setup was relatively simple. 

In Unity, the team opted for Mixed Lights (as opposed to Baked or Realtime). Since lighting in HDRP is physically based, VRFX was able to set light intensities in lux or candela, and color temperatures in kelvin – standard units of measurement for physical lights in the real world.
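
As a rough illustration of what driving those physical units from code can look like (assuming HDRP’s HDAdditionalLightData API; the intensity and color temperature below are placeholders, not the production’s measured values):

```csharp
using UnityEngine;
using UnityEngine.Rendering.HighDefinition;

// Sketch: setting an HDRP light with physical units. Placeholder values only.
[RequireComponent(typeof(Light), typeof(HDAdditionalLightData))]
public class PhysicalLightSketch : MonoBehaviour
{
    void Start()
    {
        var hdLight = GetComponent<HDAdditionalLightData>();
        var unityLight = GetComponent<Light>();

        // Intensity in lux (suits a directional light; candela, lumen, etc. are also available).
        hdLight.SetIntensity(800f, LightUnit.Lux);

        // Color temperature in kelvin, e.g. a warm practical light on the set.
        unityLight.useColorTemperature = true;
        unityLight.colorTemperature = 3200f;
    }
}
```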

For the environment lighting, the team used a standard image-based lighting workflow, lighting the scene with high-dynamic-range imaging (HDRI) photographs of the real-world set, similar to the workflow used in other digital content creation (DCC) tools. For an added sense of global illumination, VRFX also baked lightmaps with the GPU Progressive Lightmapper, combined with OptiX denoising to keep bake times short.
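
A hedged, Editor-only sketch of what selecting the GPU lightmapper and OptiX denoiser from script might look like (this assumes Unity’s LightingSettings API and is not VRFX’s actual project configuration – in practice these options can simply be set in the Lighting window):

```csharp
#if UNITY_EDITOR
using UnityEditor;
using UnityEngine;
using UnityEngine.SceneManagement;

// Sketch (assumed API, Unity 2020+): point baking at the GPU Progressive
// Lightmapper with OptiX denoising for the active scene.
public static class GpuBakeSettingsSketch
{
    [MenuItem("Tools/Apply GPU Bake Settings (sketch)")]
    private static void Apply()
    {
        var settings = new LightingSettings
        {
            lightmapper = LightingSettings.Lightmapper.ProgressiveGPU, // bake on the GPU
            denoiserTypeDirect = LightingSettings.DenoiserType.Optix,  // NVIDIA OptiX denoiser
            denoiserTypeIndirect = LightingSettings.DenoiserType.Optix,
            denoiserTypeAO = LightingSettings.DenoiserType.Optix
            // Note: the Lighting window's Filtering mode controls whether these
            // per-target denoiser choices are actually used during the bake.
        };

        // Assign the settings to the currently open scene.
        Lightmapping.SetLightingSettingsForScene(SceneManager.GetActiveScene(), settings);
    }
}
#endif
```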

Man looking at three screens, there is a Christmas tree on the main screen
Once the director of photography was satisfied with the lighting, shooting began

Post-processing was minimal. The team applied simple Screen Space effects (Reflections, Ambient Occlusion, etc.), which were sufficient to make the environment look great.

Actors play out a fondue dinner scene in front of the LED wall

The final results were a series of short, tongue-in-cheek TV commercials that aired over the holidays. The client was pleased with the outcome and expressed hope for further collaboration with VRFX in the future.

Project Eichenfresser: Animation lookdev in HDRP and ArtEngine

VRFX’s expansion into real-time filmmaking has not stopped at live action. The company has recently been experimenting with animation in Unity. Part of this exploratory process has involved creating “playground projects,” one of which – nicknamed Project Eichenfresser – focused on using Unity HDRP for lookdev and on experimenting with storytelling in a spooky world. Such ongoing projects have allowed VRFX to play around with workflows using animation-ready assets and to build expertise for future client projects.

A dark night scene, with a green-lit character and will-o’-the-wisp lights floating around
Working with the camera inside Unity

The character art for this project was originally hand-drawn, then transformed into a 2.5D look using Cinema 4D and imported into Unity as baked animations. The vision was to keep the character designs hand-drawn and use 3D for the lookdev cinematics and final animations.

The aesthetic of the Eichenfresser world mixes cartoonish, paper-like assets with real-world, scan-based textures and 3D models. For the initial lookdev, VRFX leveraged content from Quixel Megascans. However, as the concept developed, they wanted to swap out the Megascans content for biomes true to the story’s Swiss setting, so they began photographing their own textures and models, using ArtEngine to clean up the images.

Rock-like scanned textures
ArtEngine graph of the material used in the cave (see screenshot below), created by combining two materials into a single material
The project inside Unity. A woolly mammoth type creature is on screen
View of the project being configured inside of Unity
Two different textures transformed with ArtEngine: one green and grass-like, the other brown and earth-like
Photographs taken of a forest floor in Switzerland, transformed into digital materials with ArtEngine for use in Project Eichenfresser

VRFX continues to experiment with real-time animation in Unity, specifically around creating character movements and controlling the camera and shots. For example, one current challenge is adapting the character rigs so they can be used directly in Unity to create movements. For this, they have been experimenting with motion capture techniques, assembling the captured clips with Unity’s Animator state machine; however, due to the deliberate 2.5D look and the way the characters’ topology is built, this remains a work in progress.
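
As a simple illustration of the Animator side of that experiment – not VRFX’s actual rig or state names, which aren’t described in the article – a baked mocap clip can be blended in from script like this:

```csharp
using UnityEngine;

// Hedged sketch: blending between baked motion-capture clips through Unity's
// Animator state machine. The state name and input are placeholders.
[RequireComponent(typeof(Animator))]
public class MocapStateDriver : MonoBehaviour
{
    private Animator animator;

    private void Awake() => animator = GetComponent<Animator>();

    private void Update()
    {
        // Example trigger: crossfade into the next captured clip on a key press.
        if (Input.GetKeyDown(KeyCode.Space))
            animator.CrossFadeInFixedTime("Walk", 0.25f); // 0.25 s blend into the "Walk" state
    }
}
```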

You can see a teaser of Project Eichenfresser here:

Video: teaser for Project Eichenfresser

Project Leolina: The road to real-time animation in Unity

In the early days of the pandemic, VRFX’s art director, Claudio Antonelli, also began working on another playground project that has since grown into a concept for an animated kids TV series. The stories, developed in partnership with teachers and parents, portray a family of three going about the ups and downs of daily life and are intended to teach a young audience meaningful lessons about the real world. The artwork was designed to represent life inside a dollhouse, with the characters and most objects appearing to be made out of wood.

A still from Project Leolina. A robot-like boy is pulling out a chair at a table in a kitchen.
A still from Project Leolina, rendered in Unity HDRP-DXR with RTX ray tracing

Beginning as a classic 3D animation project, Project Leolina is now being adapted for a real-time pipeline in Unity. Though the characters are wooden puppets, they were built to work well with a motion capture rig and can also be driven by a character controller for animation segments that need to be repeatable.

A still from Project Leolina: the same kitchen scene, this time shot from above
Inspector view in the Editor for the scene above, showing the Volume settings and the Screen Space Global Illumination override

Again, VRFX used ArtEngine to author materials for the scene. For example, to create the white wall paneling, VRFX downloaded a flat image from Textures.com and used ArtEngine to make several adjustments before generating the full PBR material.

Graph inside ArtEngine
The final PBR material for the white wall paneling inside the dollhouse

Stay tuned in the coming weeks for a teaser of Project Leolina.

The next chapter has yet to be written

VRFX is of the mindset that real-time filmmaking is a journey – workflows evolve, tools advance, and the community continues to uncover new problems and develop solutions for them. With recent successes on client projects in virtual production, and seeds planted in the animation space, VRFX remains optimistic about their future in the world of filmmaking with Unity.

To learn more about the studio’s work or share feedback, you can email the team, visit their website, or connect with them on LinkedIn.

If you want to experience how AI-assisted material creation can augment your real-time filmmaking pipeline, we invite you to give ArtEngine a try. Until May 17, ArtEngine is available for $19/month (versus the regular price of $95/month).

Recommended resources

Virtual production and real-time film tools

Texture libraries and packages

Renderers and 3D DCC tools

Unity Asset Store tools and developers
