Unity 2017.1 feature spotlight: Playable API

August 2, 2017 in Technology | 5 min. read

It’s time. It’s been a long run, but here we are: in Unity 2017.1 the Playable API is out of experimental.

The graph-like Playable API allows you to have precise programmatic control over animation and audio – we are also working on using the same API for video control. Our brand new Timeline tool in 2017.1 uses the Playable API.

For animation, this provides a level of control that was not possible before with the Animator:

  • Play an AnimationClip on an Animator without an AnimatorController
  • Direct control of the timing and the weight of each individual AnimationClip being played
  • Dynamically add AnimationClips to the graph at runtime
  • Mix AnimatorController with AnimationClips, other AnimatorControllers, or even with Timeline!
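To give a feel for the first point, here is a minimal sketch of playing a clip directly on an Animator with no AnimatorController involved; the component name and the assumption that the clip is assigned in the Inspector are mine, but the API calls are the 2017.1 Playable API:

```csharp
using UnityEngine;
using UnityEngine.Playables;
using UnityEngine.Animations;

public class PlayClipDirectly : MonoBehaviour
{
    public AnimationClip clip; // assigned in the Inspector
    PlayableGraph graph;

    void Start()
    {
        // Build a minimal graph: one clip playable wired to an animation output.
        graph = PlayableGraph.Create();
        var clipPlayable = AnimationClipPlayable.Create(graph, clip);
        var output = AnimationPlayableOutput.Create(graph, "Animation", GetComponent<Animator>());
        output.SetSourcePlayable(clipPlayable);
        graph.Play();
    }

    void OnDestroy()
    {
        // Playable graphs are not garbage collected; destroy them explicitly.
        graph.Destroy();
    }
}
```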

I wrote a blog post some time ago regarding this, which you can read here.

To make graphs easier to create and debug, we created a PlayableGraph Visualizer tool. The tool can be used to display any PlayableGraph in both Play and Edit mode and will always reflect the current state of the graph. Playables in the graph are represented by colored nodes, varying according to their type. Wire color intensity indicates the local weight of the blending. You can get the visualizer tool on our GitHub.
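Registering a graph with the visualizer is a one-liner; note that the `GraphVisualizerClient` helper below comes from the GitHub project, not from the Unity API itself, so check the repository's README for the exact entry point:

```csharp
// Register a PlayableGraph so it appears in the PlayableGraph Visualizer window.
// GraphVisualizerClient is provided by the graph-visualizer GitHub project.
GraphVisualizerClient.Show(graph, "My animation graph");
```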

Note that the API has changed a bit from 5.6 to 2017.1. We’ve just finished writing a manual so you can see what it’s all about.

Something that is really interesting about this API is that for all “native” playables that Unity provides, we use C# structs instead of C++ objects to hold the objects. Using structs has the advantage of not allocating GC memory. It can be a bit trickier to use, but since we are building a lot of our future technologies on this API, this was important for us.
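To see those struct handles in action, here is a sketch that blends two clips through an AnimationMixerPlayable; every Playable variable below is a struct handle into the graph rather than a heap object. The component and field names are assumptions of the example:

```csharp
using UnityEngine;
using UnityEngine.Playables;
using UnityEngine.Animations;

public class TwoClipBlend : MonoBehaviour
{
    public AnimationClip clipA; // assigned in the Inspector
    public AnimationClip clipB;
    [Range(0f, 1f)] public float blend = 0.5f;

    PlayableGraph graph;
    AnimationMixerPlayable mixer;

    void Start()
    {
        graph = PlayableGraph.Create();
        mixer = AnimationMixerPlayable.Create(graph, 2);

        // Wire each clip into one of the mixer's two inputs.
        graph.Connect(AnimationClipPlayable.Create(graph, clipA), 0, mixer, 0);
        graph.Connect(AnimationClipPlayable.Create(graph, clipB), 0, mixer, 1);

        var output = AnimationPlayableOutput.Create(graph, "Animation", GetComponent<Animator>());
        output.SetSourcePlayable(mixer);
        graph.Play();
    }

    void Update()
    {
        // Direct, per-frame control of each clip's weight.
        mixer.SetInputWeight(0, 1f - blend);
        mixer.SetInputWeight(1, blend);
    }

    void OnDestroy()
    {
        graph.Destroy();
    }
}
```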

It’s now possible for our users to implement and tailor their own systems and APIs that rely on Playables. For instance, for those who are still fond of the Legacy Animation system, we are currently implementing – using the Playable API – a frontend API that mimics the behavior of the Legacy system but runs on Mecanim. This will soon be available via our GitHub. For now, let’s just see a small example:

[Video: Legacy-style animation control example]

With this, you get the benefits of our multithreaded and retargetable animation engine but with a Legacy-like level of control.


When the Playables API was still in the experimental phase, we had something called ScriptPlayable. It was both the container (the playable node) and the content (your custom code). We’ve since split this concept into two: the ScriptPlayable is now just the container and the PlayableBehaviour is the content.

From the manual:

The Playables API allows you to create custom playables derived from PlayableBehaviour. By overriding the 'PrepareFrame' method, nodes can be handled as desired. Custom Playables can also override any of the other virtual methods of PlayableBehaviour, based on the event that they need to handle.

For instance, a basic Playable would look like this:

public class MyNicePlayableBehaviour : PlayableBehaviour
{
    public override void PrepareFrame(Playable owner, FrameData info)
    {
        // Per-frame logic for this node goes here.
    }
}

And for its creation, we do this:

var nicePlayable = ScriptPlayable<MyNicePlayableBehaviour>.Create(playableGraph);
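Putting the two pieces together, a complete setup might look like the sketch below; the MonoBehaviour wrapper and the output name are mine, not from the manual, but a ScriptPlayableOutput is what makes the graph actually evaluate the node:

```csharp
using UnityEngine;
using UnityEngine.Playables;

public class NicePlayableRunner : MonoBehaviour
{
    PlayableGraph playableGraph;

    void Start()
    {
        playableGraph = PlayableGraph.Create();

        // The ScriptPlayable is the node; MyNicePlayableBehaviour is its content.
        var nicePlayable = ScriptPlayable<MyNicePlayableBehaviour>.Create(playableGraph);

        // Connect the node to an output so it gets evaluated every frame.
        var output = ScriptPlayableOutput.Create(playableGraph, "NiceOutput");
        output.SetSourcePlayable(nicePlayable);

        playableGraph.Play();
    }

    void OnDestroy()
    {
        playableGraph.Destroy();
    }
}
```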

This is a bit different from what we did in the previous Playable version, but we believe that this aligns better with what we have planned for the future. And speaking of the future…

The future

Joachim recently gave a presentation about C# compute jobs. Go watch it; it’s really good stuff.

[Video: Joachim’s presentation on C# compute jobs]

We are working, as we speak, to bring AnimationPlayables to the C# jobs system. This will allow users to directly control the animation stream values (transforms, floats, muscle, etc.) in optimized, multithreaded, and thread-safe C# code. The user-made C# animation jobs can then be connected in the PlayableGraph, allowing you to create custom graph-based rigs and constraints.

The code, written in C#, will run in our multithreaded animation system and will be evaluated during the traversal of the Playable graph.

As a short example, this is how we do a very simple FullBodyIK algorithm.

public struct FullBodyIKJob : IProcessAnimationJob
{
    public TransformSceneHandle leftFootEffector;
    public TransformSceneHandle rightFootEffector;
    public TransformSceneHandle leftHandEffector;
    public TransformSceneHandle rightHandEffector;

    public void ProcessAnimation(AnimationStream stream)
    {
        FullBodyIK.IKGoal[] goals = new FullBodyIK.IKGoal[4];

        // Read the current IK goals from the animation stream.
        FullBodyIK.GetGoals(stream, goals);

        if (stream.IsValid(leftFootEffector))
        {
            goals[(int)AvatarIKGoal.LeftFoot].position = stream.GetPosition(leftFootEffector);
            goals[(int)AvatarIKGoal.LeftFoot].rotation = stream.GetRotation(leftFootEffector);
            goals[(int)AvatarIKGoal.LeftFoot].positionWeight = 1.0f;
        }

        // Repeat for each remaining effector (rightFootEffector, leftHandEffector, rightHandEffector).

        FullBodyIK.Solve(stream, goals);
    }
}

public static void Solve(AnimationStream stream, IKGoal[] goals)
{
    var human = stream.human;

    human.bodyRotation = new float4(0, 0, 0, 1);

    float3 bodyPosition = human.bodyPosition;
    float3 bodyPositionDelta = new float3(0, 0, 0);
    float sumWeight = 0;

    // Pull engine: accumulate each goal's weighted offset from its pose position.
    for (int goalIter = 0; goalIter < 4; goalIter++)
    {
        bodyPositionDelta += (goals[goalIter].position - stream.human.GetGoalPositionFromPose((AvatarIKGoal)goalIter)) * goals[goalIter].positionWeight;
        sumWeight += goals[goalIter].positionWeight;
    }

    if (sumWeight > 1)
        bodyPositionDelta /= sumWeight;

    human.bodyPosition = bodyPosition + bodyPositionDelta;

    SetGoals(stream, goals);
    stream.human.IKSolve();
}

The setup/instantiation of the playable looks like this:

ikPlayable = AnimationScriptPlayableJob<FullBodyIKJob>.Create();

var fullBodyIK = new FullBodyIKJob();
fullBodyIK.leftFootEffector = player.BindSceneTransform(leftFootEffector);
fullBodyIK.rightFootEffector = player.BindSceneTransform(rightFootEffector);
fullBodyIK.leftHandEffector = player.BindSceneTransform(leftHandEffector);
fullBodyIK.rightHandEffector = player.BindSceneTransform(rightHandEffector);

ikPlayable.SetJobData(fullBodyIK);

Note that this code is still an early prototype and is subject to change.

We also have prototyped the following use-cases:

  • CustomMixer: allows different weights per transform during mixing
  • IKSolver: a custom-made IK solver; currently 2-bone, but it could be CCD or another method
  • MotionStream: a special node that plays poses directly (instead of animation)
  • LiveMocap: reads mocap and writes it into the AnimationStream
  • KneePoppingFixer: adds scale when we are near full extension of body part

These are nice examples of what we can do with the C# AnimationPlayables, and we can’t wait to let you get your hands on this. Stay tuned!