
Visual Effect Graph empowers you to author next-generation visual effects through its node-based behaviors and GPU-based compute power. To get started, we released an introduction blog post that summarizes the philosophy of the editor. Since the initial preview release at Unite LA 2018, we've also been publishing various sample VFXs to our GitHub repository. Take a look at them and use them to build your own effects!

These samples illustrate different production scenarios that the Visual Effect Graph can handle, from simple particle systems to more complex systems with very specific behaviors. Each effect lives in its own scene, so you can browse and learn from them one at a time.

Getting the Samples

The first step for getting these samples is to make sure you're running Unity 2018.3; the more recent the 2018.3 editor version, the better. I advise using Unity Hub to ensure you get the latest. The Visual Effect Graph samples work in both the Windows and macOS editors.

Once you have the right version of the editor open, get the samples project. You can download a source code zip or tar.gz archive from the VFX Graph Releases GitHub page, or clone the repository if you want to receive updates regularly.

Sample Project Structure

Each sample is located in a subdirectory of the Assets/Samples directory. The main scene (used when building a player) sits at the root of /Assets. This scene sequentially loads all the samples declared in the scene build list in the Build Settings window.

If you need to build a player, make sure the VisualEffectsSamples scene is included in Build Settings at index zero, then add all the other scenes you want to cycle through.
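For reference, a scene cycler in the main scene might look something like this minimal sketch (the class name and key binding are hypothetical; the loader script actually shipped in the project may differ):

using UnityEngine;
using UnityEngine.SceneManagement;

// Hypothetical sketch: cycles through the sample scenes that follow the
// main VisualEffectsSamples scene (build index 0) in Build Settings.
public class SampleCycler : MonoBehaviour
{
    int m_Current = 1; // first sample scene sits right after the main scene

    void Update()
    {
        // Advance to the next sample scene on a key press, wrapping around.
        if (Input.GetKeyDown(KeyCode.RightArrow))
        {
            m_Current = m_Current % (SceneManager.sceneCountInBuildSettings - 1) + 1;
            SceneManager.LoadScene(m_Current);
        }
    }
}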

Sample #01 - Unity Cube

This sample is historically one of the very first effects prototyped with early versions of the Visual Effect Graph. It showcases a system of 400,000 particles with a moving emitting source that attracts the particles toward the volume of a Unity cube.

The emitting sphere and its motion are self-contained in the effect, and the position is animated using a combination of per-axis sin(Time). What's interesting about this computation is that we can determine sub-frame positions in order to reduce the discretization of the sphere's position. You can toggle this option to check the difference between the two modes. In the example below, when only the frame time is used, the sphere moves so fast that you can see its shape discretized over space. When using per-particle total time instead, this artifact is totally gone.
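Outside the graph, the idea reduces to evaluating the sine path at each particle's exact spawn time instead of once per frame. Here is an illustrative C# sketch (the frequencies and amplitudes are made up; the real effect does this with graph nodes):

using UnityEngine;

// Illustrative only: the emitter center is a per-axis sine of time.
// Evaluating it at frameTime + subFrameOffset (per-particle total time)
// removes the "stepping" you see when every particle spawned in a frame
// shares the same sphere position.
static class EmitterPath
{
    public static Vector3 CenterAt(float t)
    {
        // Hypothetical per-axis frequencies and amplitudes.
        return new Vector3(Mathf.Sin(t * 1.7f) * 2f,
                           Mathf.Sin(t * 2.3f),
                           Mathf.Sin(t * 1.1f) * 2f);
    }

    public static Vector3 SpawnCenter(float frameTime, float subFrameOffset)
    {
        return CenterAt(frameTime + subFrameOffset);
    }
}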

Once released by the sphere, the particles are driven by two vector fields: an attractor toward the Unity cube, and a noise that enriches the motion during the attraction. The particles also collide with the emitting sphere.

The color of the particles is driven by two gradients: one for the particles nearing the moving emitting sphere, which cycles every 5 seconds, and a standard blue-to-pink Color over Life gradient.

Using this masking trick, we simulate the emitting source applying some fake lighting to the particles near it.
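Conceptually, the trick is just a distance-based blend between the two gradients. A hedged sketch of the idea (the falloff shape and the 5-second cycle wiring are assumptions):

using UnityEngine;

// Illustrative sketch of the masking trick: blend a cycling "emitter"
// gradient over the regular color-over-life gradient, weighted by the
// particle's proximity to the emitting sphere.
static class FakeLighting
{
    public static Color Shade(Color overLife, Gradient emitterGradient,
                              Vector3 particlePos, Vector3 sphereCenter,
                              float radius, float time)
    {
        // 1 near the sphere, fading to 0 at 'radius' (hypothetical falloff).
        float mask = Mathf.Clamp01(1f - Vector3.Distance(particlePos, sphereCenter) / radius);
        // The emitter gradient cycles every 5 seconds.
        Color emitter = emitterGradient.Evaluate(Mathf.Repeat(time / 5f, 1f));
        return Color.Lerp(overLife, emitter, mask);
    }
}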

Sample #02 - Morphing Face

Morphing Face showcases the use of Point Caches to set the initial positions of particles, and to store other attributes such as normals. Particles are spawned randomly from a point cache we baked in Houdini, but we could also have used the Point Cache Bake Tool (Window/Visual Effects/Utilities/Point Cache Bake Tool) to generate this point cache from a Unity mesh.

Point cache files are imported into Unity and generate an asset with one texture per attribute (an attribute map). You can then use the Point Cache node to reference this asset: it populates all the attribute maps and displays one connector per attribute. You can then plug these into Attribute from Map blocks to fetch the values. In the example above, we sample points randomly from this point cache to create the particles.
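Under the hood, an attribute map is simply a texture whose texels each hold one baked point's attribute. Conceptually, fetching a random point looks like this sketch (the names are hypothetical, and the graph does this work on the GPU):

using UnityEngine;

// Conceptual sketch of Point Cache + "Attribute from Map": each attribute
// (position, normal, ...) lives in its own texture, and a random texel
// index picks one baked point.
static class PointCacheSampling
{
    public static void Sample(Texture2D positionMap, Texture2D normalMap,
                              out Vector3 position, out Vector3 normal)
    {
        int index = Random.Range(0, positionMap.width * positionMap.height);
        int x = index % positionMap.width;
        int y = index / positionMap.width;

        // Attribute maps store one attribute per texel.
        Color p = positionMap.GetPixel(x, y);
        Color n = normalMap.GetPixel(x, y);
        position = new Vector3(p.r, p.g, p.b);
        normal = new Vector3(n.r, n.g, n.b);
    }
}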

Once created, these particles aren't updated in a simulation (the system has no update context): they stay fixed in space and don't age or die. We just compute a mask over time in the output context (shown above with green/red coloring).

This mask lets us control many of the particles' parameters by blending between two states: small non-metallic cubes and longer metallic sticks. The orientation is also blended, from aligned cubes to randomly oriented sticks.

The scene also uses moving lights to show off the material changes while the mask animates.

Sample #03 - Butterflies

The Butterflies sample is an example of using multiple outputs to render one particle. In this sample, we simulate a swarm of butterflies orbiting around a central, vertical axis. Every butterfly is defined by a single particle element, and only its trajectory is simulated in the update context. In the example below, butterfly particles are highlighted by the red dots.

The animation of the wings and the body is then computed in three different output contexts: one for each wing and one for the body.

To orient a butterfly, we use a combination of its forward (velocity) vector and an up vector that we tilt back a little, so the body isn't aligned to the trajectory but instead lifts the head up from the belly. The body is animated using a sine with a random per-butterfly frequency. The wing angles are also animated using a sine with the same frequency, but slightly offset in time to simulate the damping and inertia of the body.
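As a rough C# equivalent of that orientation and wing math (the tilt amount, frequency, and amplitude are invented for illustration):

using UnityEngine;

// Illustrative sketch of the butterfly orientation and wing animation.
static class Butterfly
{
    public static Quaternion BodyRotation(Vector3 velocity, float tiltBack)
    {
        Vector3 forward = velocity.normalized;
        // Tilt the up vector toward the rear so the head lifts up from the belly.
        Vector3 up = (Vector3.up - forward * tiltBack).normalized;
        return Quaternion.LookRotation(forward, up);
    }

    public static float WingAngle(float age, float frequency, float phaseOffset)
    {
        // Same per-butterfly frequency as the body, slightly offset in time
        // to fake the damping and inertia of the body.
        return Mathf.Sin((age + phaseOffset) * frequency) * 60f; // degrees
    }
}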

Sample #04 - Grass Wind

Grass Wind showcases the simulation of something totally different from regular particles: grass on a terrain. Using a point cache generated from terrain data, we spawn grass crops across the terrain, each with an up vector blended between the terrain normal and the world up vector.

Every element then interacts with the player through Position, Radius, and Velocity parameters, sent to the effect and based on the player character's values.

Simulation is then driven by these rules:

  • Crops in the player's range bend in the player's moving direction
  • Crops that are already bent are no longer influenced, so stepping on them again won't affect them
  • Crops tend to regain their original orientation over time

To simulate crop bending, we store values in two unused attributes: velocity and alpha.

  • Velocity stores the crop bending orientation.
  • Alpha stores the bent state: an alpha of 1.0 means the crop is standing straight, while 0.0 means it is fully bent. Values down to the minimum (-2.0) read the same as 0.0 and are used to keep the crop bent for additional time.

When a crop is stepped on, its alpha goes down at a given rate until it reaches the minimum value (-2.0). When it is not stepped on, alpha grows back at a specific rate until it reaches 1.0. While transitioning from 0.0 to 1.0, the stored bend velocity is released and diminishes until the crop stands vertical again, as sketched below.
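Here is a sketch of that per-crop update rule, with velocity and alpha repurposed as described (the rates are hypothetical):

using UnityEngine;

// Sketch of the per-crop rules above: bendVelocity holds the bend
// direction, alpha holds the bent state.
static class GrassCrop
{
    const float MinAlpha = -2f;    // held fully bent for extra time
    const float BendRate = 4f;     // how fast stepping bends a crop
    const float RegrowRate = 0.5f;

    public static void Update(ref float alpha, ref Vector3 bendVelocity,
                              bool steppedOn, Vector3 playerMoveDir, float dt)
    {
        if (steppedOn)
        {
            // Only standing crops pick up a new bend direction; crops that
            // are already bent (alpha <= 0) are no longer influenced.
            if (alpha > 0f)
                bendVelocity = playerMoveDir;
            alpha = Mathf.Max(MinAlpha, alpha - BendRate * dt);
        }
        else
        {
            alpha = Mathf.Min(1f, alpha + RegrowRate * dt);
        }
    }

    // The rendered bend releases as alpha climbs back from 0.0 to 1.0.
    public static Vector3 EffectiveBend(Vector3 bendVelocity, float alpha)
        => bendVelocity * (1f - Mathf.Clamp01(alpha));
}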

For all crops that aren't affected by stepping and bending, we apply an additional wind noise in the output so the field looks less static when idle.

Sample #05 - Volumetric

The Volumetric sample is rather simple, but it demonstrates integration with the HD Render Pipeline's lighting and volumetric fog. The scene is set up as a split environment, and its background sky is a plain gray. Two light sources are used, one orange and one blue. To cast shadows, each source is composed of one spotlight oriented toward the camera, with real-time shadows enabled. To approximate a punctual source, we added another spotlight to each light source, pointing in the opposite direction.

Opaque particles are spawned from an animated source with a flipbook texture to simulate multiple elements per particle (this helps us keep the mass rich without having to use six times as many particles). The particle mass swirls around using a noise and is attracted toward a position near the camera.

Particles are rendered with shadow casting enabled and use a diffusion profile with transmittance, so light leaks through the particles.

Here’s a breakdown of the lighting we used for this sample.

Sample #06 - Portal

After seeing this Houdini tutorial, we wanted to challenge ourselves by re-creating an effect from a CG package and adding our own improvements. We also took some inspiration from the RiseFX Houdini demo reel.

As a breakdown, the effect is composed of a single particle system, an inner distortion circle, and a lighting rig made of 8 line lights, all rotating in play mode.

At spawn, the particles are categorized into two groups, swift corona particles and colliding particles, even though all the particles collide with the ground.

Sample #07 - AR Radar

AR Radar showcases a complex effect with many systems working together, sequenced both internally and externally from a Timeline, through a single float [0...1] parameter: Initialize.

This parameter is used numerous times throughout the graph to control the deploy effect while initializing the grid (see the sketch after this list):

  • From 0.0 to 0.1: controls the blinking dot
  • From 0.1 to 1.0: deploys the grid as well as the environment
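Driving several sub-effects from one parameter boils down to remapping sub-ranges of [0...1] back into [0...1], something like this sketch (the output names are hypothetical):

using UnityEngine;

// Sketch of how a single [0...1] Initialize parameter can drive several
// sub-effects over different sub-ranges, as described in the list above.
static class RadarSequencing
{
    // Remap value from [a, b] to [0, 1], clamped.
    static float Remap(float value, float a, float b)
        => Mathf.Clamp01((value - a) / (b - a));

    public static void Apply(float initialize,
                             out float blinkingDot, out float gridDeploy)
    {
        blinkingDot = Remap(initialize, 0.0f, 0.1f); // first 10% of the ramp
        gridDeploy  = Remap(initialize, 0.1f, 1.0f); // remaining 90%
    }
}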

Enemy ships are triggered after the base effect has deployed, using the dedicated VFX Timeline track. This track sends an event multiple times to spawn enemy ships around the radar.

At the center is a blinking dot whose position is linked to a scene point light by a Position Parameter Binder.

Here’s a breakdown:

Sample #08 - Voxelized Terrain

VoxelizedTerrain is a heightfield simulation driven by particles, each rendered as a cube.

Each particle is a point on a 256x256 2D grid and samples a 2D texture based on object-space coordinates. The coordinates can be offset and scaled so the terrain can pan and zoom.

By sampling this heightmap and storing the value in Scale.y, we can deform all the points to match the sampled height, color each cube based on its height, and adjust material properties (for instance, smoothness for water).
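Per particle, the terrain step amounts to something like the following sketch (the names and the water handling are assumptions; the real logic lives in graph nodes):

using UnityEngine;

// Conceptual sketch of the per-particle terrain step: sample the heightmap
// at the offset/scaled grid coordinate, store the height in Scale.y, and
// derive color/material properties from it.
static class VoxelTerrain
{
    public static void Evaluate(Texture2D heightMap, Vector2 gridUV,
                                Vector2 offset, float scale,
                                float elevation, float waterLevel,
                                out float scaleY, out bool isWater)
    {
        Vector2 uv = gridUV * scale + offset;           // pan and zoom the terrain
        float h = heightMap.GetPixelBilinear(uv.x, uv.y).r;
        scaleY = Mathf.Max(h * elevation, waterLevel);  // water fills the low cells
        isWater = h * elevation <= waterLevel;          // e.g. raise smoothness here
    }
}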

You can adjust the water level as well as the input height (read from the texture) and the final elevation. All these parameters are exposed and controlled by a global script (VoxelizedTerrainController.cs).

This script handles mouse/keyboard events to pan, scale, and rotate the camera, and sets all the parameters on the Visual Effect component. It relies on a helpful ExposedParameter struct that caches the parameter's string name and returns its integer index (from Shader.PropertyToID()).
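Based on that description, such a struct could look roughly like this sketch; the implicit conversions are what let fields like Position or Elevation be passed where an int ID is expected in the excerpt below:

using UnityEngine;

// Sketch of the ExposedParameter idea: cache Shader.PropertyToID() once so
// per-frame SetFloat/SetVector2 calls use an int instead of a string.
public struct ExposedParameter
{
    readonly int m_Id;
    public ExposedParameter(string name) { m_Id = Shader.PropertyToID(name); }
    public static implicit operator ExposedParameter(string name) => new ExposedParameter(name);
    public static implicit operator int(ExposedParameter p) => p.m_Id;
}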

// Clamp the camera distance, then place the camera along its viewing direction.
dist = Mathf.Clamp(dist, CameraMinMaxDistance.x, CameraMinMaxDistance.y);
ViewingCamera.transform.position = CameraRoot.transform.position + dist * dir;

// Push the pan/scale state to the effect.
VisualEffect.SetVector2(Position, m_Position);
VisualEffect.SetVector2(WorldSize, m_WorldSize);

// Sliders: remap each [0..1] slider value into its parameter range.
float inputHeightMapScale = Mathf.Lerp(InputHeightLevel.x, InputHeightLevel.y, InputHeightMapScaleSlider.value);
float elevation = Mathf.Lerp(ElevationRange.x, ElevationRange.y, ElevationSlider.value);
float waterElevation = Mathf.Lerp(WaterElevationRange.x, WaterElevationRange.y, WaterElevationSlider.value);

// Keep the camera pivot at the water surface and aim the camera at it.
CameraRoot.transform.position = new Vector3(CameraRoot.transform.position.x, waterElevation, CameraRoot.transform.position.z);
ViewingCamera.transform.LookAt(CameraRoot.transform);

VisualEffect.SetFloat(InputHeightMapScale, inputHeightMapScale);
VisualEffect.SetFloat(Elevation, elevation);
VisualEffect.SetFloat(WaterElevation, waterElevation);

Sample #09 - Genie

The Genie effect is a composition of many systems that share parameters and connect to each other through internal sequencing. The sample uses a simple script to toggle the effect on and off when you click the magic lamp.
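Such a toggle can be as small as the following sketch (the class and event names are hypothetical; note that the VFX namespace was still experimental in the 2018.3 preview):

using UnityEngine;
using UnityEngine.VFX; // UnityEngine.Experimental.VFX in the 2018.3 preview

// Hypothetical sketch: toggles the effect when the lamp is clicked.
public class LampToggle : MonoBehaviour
{
    public VisualEffect Effect;
    bool m_On;

    void OnMouseDown() // requires a collider on the lamp
    {
        m_On = !m_On;
        // "OnPlay"/"OnStop" are assumed event names exposed by the graph.
        Effect.SendEvent(m_On ? "OnPlay" : "OnStop");
    }
}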

The scene contains four points that define the Bézier curve driving the magic flow out of the lamp. To move the particles, we don't use velocity; instead, we set a position along this Bézier over the life of each particle, plus an offset computed from vector-field noise.
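Evaluating a position along a cubic Bézier from the four points is standard math; per particle, t is its normalized age (the noise offset is omitted in this sketch):

using UnityEngine;

// Sketch of driving particles along a cubic Bezier instead of integrating
// velocity: the position is evaluated directly from normalized age t.
static class GenieFlow
{
    public static Vector3 Evaluate(Vector3 p0, Vector3 p1, Vector3 p2, Vector3 p3, float t)
    {
        float u = 1f - t;
        return u * u * u * p0
             + 3f * u * u * t * p1
             + 3f * u * t * t * p2
             + t * t * t * p3;
    }
}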

The last point of the Bézier holds the position of the genie and is animated within the visual effect by a 3D sine-wave animation. This drives the last Bézier point as well as the Genie's body and its eyes.

The scene is set up using a single Timeline and a control rig that makes it run forward or backward. Using VFX Event Tracks, we control the start and stop of particle spawning. This Timeline also controls Cinemachine camera blending as well as a simple control rig.

Other Visual Effects and Future Sample Releases

All new samples will be released under the 2019.1 release track of the Visual Effect Graph package (5.x.x-preview). This means every sample up to now will be part of the new release track, but sadly no more updates will be made to the 2018.3 samples. Stay tuned to our Twitter and Facebook to be the first to grab the new samples when we release them for 2019.1.

You will also soon be able to find visual effects in the Fontainebleau Demo as well as the FPS Sample repository, with other production cases and solutions to inspire your own projects.

See you pretty soon for more visual effect adventures!

