A few months ago, we released Volume 1 of the Icon Collective, and it’s time to share more about what went into the animation and cinematics of Yggdrasil. Since one of the key goals of this ongoing project is to inspire developers and help them deliver amazing content themselves, in this blog post we want to reveal the tools, techniques and processes we used to achieve AAA quality.
To start, here are the main tools we used to produce the character animations.
This Autodesk tool makes it easy to retarget any motion-capture animation and to blend motions. We decided to use HumanIK for MotionBuilder rigging because it’s simple to set up and doesn’t require any third-party plugins. Think of it as a “friendly” controller for IK/FK switching and position/rotation pinning. Another key benefit is that HumanIK is also available in Autodesk Maya, if you’re more familiar with that software. And because Unity provides seamless integration for the FBX format, a simple drag and drop is all it takes to bring the animation in.
Because Unity understands game developers’ needs and workflows, it was easy for us to develop our content in third-party tools and get it quickly into Unity. For that reason, we decided to use Blender as our skinning and rigging tool, saving and importing our work as .blend files. In addition, since Blender supports squash-and-stretch scaling within the bone hierarchy, it encouraged the animators to push the boundaries of their creations.
Unity’s Timeline, along with the Cinemachine extensions, allowed us to create the cinematic trailer using a dynamic, flexible and non-destructive workflow. Once the character animation drafts were ready to be implemented in Timeline, we tailored Cinemachine’s virtual cameras around them.
Using the LookAt property, Cinemachine allowed us to aim a virtual camera at a custom target (in this case, a specific bone of the character). This made it possible to create shots that kept working even after the animator refined the animations, without having to recreate the camera rig or requiring much manual intervention.
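To give a feel for this setup, here is a minimal sketch of pointing a Cinemachine virtual camera at a character bone. The script and field names are our own (the actual rig was configured in the editor), and it assumes Cinemachine 2.x with a Humanoid-rigged character:

```csharp
using UnityEngine;
using Cinemachine;

// Attach to any GameObject in the scene. Makes a virtual camera track
// a specific bone of the character rather than a fixed point in space.
public class AimAtBone : MonoBehaviour
{
    public CinemachineVirtualCamera vcam;   // the virtual camera for this shot
    public Animator characterAnimator;      // the character's Animator (Humanoid rig)

    void Start()
    {
        // Grab a specific bone -- here the head -- and make the vcam aim at it.
        // Because LookAt follows the bone itself, the shot keeps working even
        // after the underlying animation is refined.
        Transform head = characterAnimator.GetBoneTransform(HumanBodyBones.Head);
        vcam.LookAt = head;
    }
}
```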
Since all the cameras used in the trailer were procedural (not keyframed), and we needed some dynamic shots on our Timeline (not the usual A-to-B transition), the Timeline’s “Mix” mode came in handy: it allowed us to blend different camera rigs (visualized as clips on the Timeline), literally interpolating all the camera settings and values from one camera to the other.
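Timeline’s Mix mode handles this interpolation for you in the editor, but the same kind of blend can also be triggered at runtime: when a higher-priority virtual camera becomes live, the CinemachineBrain on the main camera interpolates between the two rigs using its configured blend. A minimal sketch of that idea (the vcam names are our own):

```csharp
using UnityEngine;
using Cinemachine;

// Swaps which virtual camera is "live"; the CinemachineBrain on the main
// camera then interpolates position, rotation and lens settings from one
// rig to the other -- the same idea as blending clips in Mix mode.
public class ShotSwitcher : MonoBehaviour
{
    public CinemachineVirtualCamera wideShot;
    public CinemachineVirtualCamera closeUp;

    public void CutToCloseUp()
    {
        // The higher-priority camera wins; raising it starts the blend.
        closeUp.Priority = wideShot.Priority + 1;
    }
}
```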
The best part was using the Post-Processing Stack editor during finalization. When necessary, we created a different profile for each camera. For example, on one shot we needed to animate Depth of Field according to the storyboard requirements. With Unity, this process was completely automatic when blending one camera clip into another. In the end, not a single keyframe was used in the process.
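The per-camera profiles were set up in the editor, but to sketch what is involved (the script and field names are our own, and this assumes Post-Processing Stack v2 with the Cinemachine post-processing extension), each virtual camera carries its own profile, e.g. with a different Depth of Field setting:

```csharp
using UnityEngine;
using Cinemachine;
using Cinemachine.PostFX;
using UnityEngine.Rendering.PostProcessing;

// Attaches a post-processing profile to a virtual camera so the profile's
// settings (e.g. Depth of Field) blend automatically whenever one camera
// clip blends into another on the Timeline -- no keyframes required.
public class ShotProfile : MonoBehaviour
{
    public CinemachineVirtualCamera vcam;
    public PostProcessProfile shallowFocus;  // e.g. DoF with a wide aperture

    void Start()
    {
        var post = vcam.gameObject.AddComponent<CinemachinePostProcessing>();
        post.m_Profile = shallowFocus;
    }
}
```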
Check out the webpage and download Volume 1 to start creating your own projects!
Along with the Substance team, we are excited to bring you the next Asset Store challenge: the Inside the Vault: Unity 3D Environment Art Contest! Using Substance Painter and Designer – and your wildest imagination – show us what’s inside Yggdrasil’s vault.
Find out more here: https://connect.unity.com/challenges/inside-the-vault