Creating a character with the help of the Visual Effect Graph was an interesting challenge for the Unity Demo team. As someone who has spent a lot of his career waiting for renders to finish, Demo team Technical Artist Adrian Lazar has a deep appreciation for the creative options made possible by real-time authoring. Read the post below for his detailed breakdown of the process behind the character Morgan, as well as useful tips for anyone doing VFX in Unity.
My name is Adrian Lazar and I’ve been working in the computer-generated graphics industry for the last 18 years or so, starting with post-production in advertising and transitioning to real-time graphics with game development in 2009. I have a generalist background, and in the last few years I’ve been taking on more technical art tasks - this helped me ship my own indie title together with a small but talented team.
When I joined Unity’s Demo Team in early 2019 as a technical artist, we were getting ready to release the first part of The Heretic, so I helped with some finishing effects. Soon after that, we started talking about Morgan, the god-like, VFX-driven character introduced in the second part of the short film.
On the storytelling side, Vess (Veselin Efremov, writer, director and creative director of The Heretic) had some clear requirements: Morgan needed to morph between multiple states - calm and angry, female and male, or a combination of these - grow in height multiple times over, crumble, catch fire, and more.
Regarding the appearance, on the other hand, Vess intentionally left things quite open for exploration and experimentation. We had some early concepts created by our former colleague Georgi Simeonov, but those didn’t get into the VFX and shape-shifting aspects of the character, both fundamental to the final look - this meant that I had a pretty blank slate to start from, which was challenging but fun!
I started my initial tests in Houdini, a tool I was already comfortable with and one that gave me a good opportunity to explore and bounce some initial ideas off Vess.
Of course, for the production of the real-time short film, we wanted the effects that build Morgan to be developed inside Unity, so that it would be easy to iterate on the character and make sure it reacts correctly to everything else that happens around it. Therefore, I had to look for a different solution and move away from pre-simulating the effect in other software.
One thing that gives me joy about being part of Unity’s Demo Team is having the chance to stress test, improve, and sometimes develop tools and processes that our many users can make use of daily.
Early tests video
In Morgan’s case, the opportunity was twofold: create a complex VFX-driven character that runs in real-time, and, equally important, take a first step into real-time authoring of complex effects. This was very exciting for me.
Just think about it: being able to develop and iterate on the character’s look in the final environment, from the desired camera angle, and with the final lighting, post-processing, and other VFX! This is a dream that would have been unthinkable only a few years back.
It was not a smooth ride, but with a good team effort, we achieved both.
And so, with a visual look open to experimentation and with real-time playback and real-time authoring as the two technical goals, I turned to a Unity tool still in its infancy back then: the Visual Effect Graph, developed by Julien Fryer and the Paris team.
The VFX Graph was still quite new at the time and had a long road ahead until you could really use it for true, deep real-time authoring. However, the benefits it promised were huge. I was excited about not having to wait for the effects to be simulated in a DCC, exported and evaluated in Unity, and then sent back to the DCC for tweaks.
As a team, we knew that we wanted to be able to make changes until the very end, sometimes hours before the final deadline - and why wouldn’t we? This is one of the promises of real-time graphics.
One of the earlier versions of Morgan with particles flowing across the body
It was a back-and-forth familiar to anyone working on both the creative and the technical side: first, you need to have an idea of what you want to achieve, then you need to build the tech for it. But, as with any other creative process, things are rarely straightforward. Ideas are changed and adapted along the way, sometimes due to creative direction, other times due to tech restrictions.
To complicate things even more, we were in uncharted waters with a tool that hadn’t been used at this scale before, working towards real-time authoring of a complex VFX-driven character.
The second major version of the character
Second version morphing effect
So the tech and the look-dev closely followed each other, and when I had both it was great: fast iterations, fast experimentation, and just overall lots of fun. This creative freedom was addictive, with the visuals changing direction in a manner that was closer to working on concept art than on a production.
One added benefit of fast experimentation is that you can quickly change directions and repurpose a test, as was the case when I was working on some details across the face. It didn’t turn into anything useful for that purpose, but it inspired a new direction for the calm version of the hair.
Hair evolution
But there were a few times when I got stuck because one part couldn’t advance without the other. If I didn’t know where to take the character creatively or I couldn’t find a way to do what I wanted with the tech I had, things couldn’t move forward.
Luckily, my colleagues were there to help, so here’s a shout-out to them.
After 3 major versions and countless smaller experiments, the final version of Morgan was emerging, just as we were getting close to the deadline.
The fire effect started as some sort of energy burst
For The Heretic short film, we wanted a more physical destruction, so we combined this crumble effect with a simulation exported from Houdini.
Early crumble tests
Meteorite effect WIP
Morgan is made of 17 Visual Effect Graphs, each covering a different part of the character. We split the effect this way to make the graphs easier to manage.
First, we needed the particles to spawn on the skinned mesh and follow it during character animation. As skinned meshes aren’t yet supported by default in the VFX Graph, we had to find a way around it.
The positions, normals, and tangents of the base meshes are rendered into UV space and then set as texture parameters in the VFX graphs - this allows us to position and orient the particles correctly on the character.
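To make the data flow concrete, here is a minimal C# sketch of the idea. It is not the production setup: the actual project renders these maps on the GPU so every texel of the surface is covered, while this simplified CPU-side version writes just one texel per vertex. The exposed property name "PositionMap" and the bake resolution are assumptions made for the example.

```csharp
using UnityEngine;
using UnityEngine.VFX;

// Illustrative sketch only: feed a skinned mesh's current vertex positions
// to a Visual Effect Graph as a UV-space texture. "PositionMap" is a
// hypothetical texture property exposed on the graph.
[RequireComponent(typeof(VisualEffect))]
public class PositionMapBaker : MonoBehaviour
{
    public SkinnedMeshRenderer skin;
    const int Resolution = 256; // assumed bake resolution

    VisualEffect vfx;
    Mesh bakedMesh;
    Texture2D positionMap;

    void Start()
    {
        vfx = GetComponent<VisualEffect>();
        bakedMesh = new Mesh();
        positionMap = new Texture2D(Resolution, Resolution,
                                    TextureFormat.RGBAFloat, false);
    }

    void LateUpdate()
    {
        // Snapshot the current animated pose into a regular mesh.
        skin.BakeMesh(bakedMesh);

        Vector3[] vertices = bakedMesh.vertices;
        Vector2[] uvs = bakedMesh.uv;

        // Write each world-space vertex position into the texel its UV
        // maps to (assumes unit scale on the renderer's transform).
        for (int i = 0; i < vertices.Length; i++)
        {
            Vector3 p = skin.transform.TransformPoint(vertices[i]);
            int x = Mathf.Clamp((int)(uvs[i].x * Resolution), 0, Resolution - 1);
            int y = Mathf.Clamp((int)(uvs[i].y * Resolution), 0, Resolution - 1);
            positionMap.SetPixel(x, y, new Color(p.x, p.y, p.z, 1f));
        }
        positionMap.Apply();

        // Inside the graph, particles sample this texture to position
        // themselves on the animated surface.
        vfx.SetTexture("PositionMap", positionMap);
    }
}
```

The same pattern extends to the normal and tangent maps: each is rendered into its own UV-space texture and bound to the graphs through the same texture-parameter mechanism.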
The vertex color and the albedo texture are also rendered in UV space - these textures are used to manipulate certain properties like size, scale, angle, and pivot.

For the Morgan package, the process of generating the textures was greatly improved by my colleagues Robert Cupiz (tech and rendering lead for The Heretic) and Torbjorn Laedre (principal engineer at Unity Demo Team).
A custom editor centralizes all the graphs making up Morgan - this makes it easy to update shared properties fast. About 300 parameters are exposed, and there’s no real limit to how many can be added; however, having too many parameters in the interface can make it less practical to work with.
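As a rough illustration of the broadcasting idea behind such a centralizing setup, here is a hedged sketch - the component and the property name "GlobalIntensity" are hypothetical stand-ins for the roughly 300 parameters the real editor manages:

```csharp
using UnityEngine;
using UnityEngine.VFX;

// Hypothetical sketch of the centralizing idea: one component pushes a
// shared value to every Visual Effect Graph under the character root.
public class MorganGraphHub : MonoBehaviour
{
    [Range(0f, 1f)] public float globalIntensity = 1f;

    void Update()
    {
        // Morgan is split across many graphs, so broadcast to all of them.
        foreach (VisualEffect vfx in GetComponentsInChildren<VisualEffect>())
        {
            // Only set the property on graphs that actually expose it.
            if (vfx.HasFloat("GlobalIntensity"))
                vfx.SetFloat("GlobalIntensity", globalIntensity);
        }
    }
}
```

Checking HasFloat before setting means each graph can opt in to only the shared properties it cares about, which keeps the individual graphs independent while still allowing one-stop tweaks.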
Fire effect evolution
The best way to learn more about Morgan is to download the standalone package and play with it. You can find more blog posts and videos about the creation process behind this project on The Heretic landing page.
For someone who has spent a long time waiting for simulations and renders to finish, this real-time revolution that is reshaping so many industries is a dream come true. I'm looking forward to the next challenge we'll pick up.