Last year, at Unite LA, we unveiled our FPS Sample project and the Visual Effect Graph. FPS Sample aims to show developers how to set up a first-person shooter. The first release shipped with effects made using the Particle System, the current production-ready solution for particles. Afterward, we upgraded all the visual effects to the brand-new Visual Effect Graph in order to take advantage of HDRP. Since the Visual Effect Graph is still in preview, this was a valuable chance to gather information and try out production patterns so we can push the tool toward a production-ready state. Here’s a post-mortem of the upgrade process and some of the lessons we learned.
The Visual Effect Graph is a package that comes with a simple component that runs effects from templates. This is quite different from the Particle System solution, which embeds all effect data in a scene component. The Visual Effect Graph also lets you combine multiple particle systems into a single effect graph, so Unity needs only one component to render all of them.
However, we made the C# interface pretty close to the one the Particle System uses, so the behaviors are really similar and it’s quite easy to upgrade from one system to the other.
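As a minimal illustration of that similarity, basic playback control looks almost identical on both components (sketch only; the namespace is `UnityEngine.Experimental.VFX` in older package versions):

```csharp
using UnityEngine;
using UnityEngine.VFX;

public class PlayEffects : MonoBehaviour
{
    public ParticleSystem legacyEffect;
    public VisualEffect graphEffect;

    void OnEnable()
    {
        // The two APIs mirror each other for basic control
        legacyEffect.Play();
        graphEffect.Play();
    }

    void OnDisable()
    {
        legacyEffect.Stop();
        graphEffect.Stop();
    }
}
```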
For FPS Sample, the transition to the new system turned out to be pretty easy.
We started the transition by finding all references to Particle Systems in prefabs and code, then we replaced all the instances with placeholders, in a really naive manner. This was a pretty easy task, as all gameplay effects were stored in dedicated project folders under the Assets/Effects hierarchy.
All other effects in the scene were easily identified using the hierarchy filters. Most of these were stored in prefabs so the replacement with placeholders was an easy task too.
Once replaced, there was some latitude left to improve and iterate on effects.
Shooting systems rely on many effects whose performance depends on the pace of the game, the number of players, the playing-area topology, and the current game state. This means that many objects (impacts, for instance) can be spawned at once, filling the scene graph with many draw calls.
The FPS Sample implementation used a pool of 64 impacts, managed by an Entity Component System with a recycling scheme: one Particle System plus one Audio Source per impact.
The main drawback of this system was that every instance of an impact led to one draw call per system. Even with only a few draw calls per effect, the worst case would be N × 64 draw calls per kind of impact (N being the number of particle systems that compose each impact).
An improvement that came pretty quickly was to use a common simulation for all impacts of a single weapon type, so that all the rendering would be performed in the same simulation. We accepted the drawback of sorting all impact systems together, instead of sorting each impact individually, for the batching gains it brought.
To do this, we used the VisualEffect.SendEvent() API with a VFXEventAttribute payload carrying the impact position and the impact normal.
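A minimal sketch of sending such an event follows; the event name "OnImpact" is illustrative rather than the sample’s actual identifier, while "position" and "normal" are built-in particle attribute names:

```csharp
using UnityEngine;
using UnityEngine.VFX;

public class ImpactEventSender : MonoBehaviour
{
    public VisualEffect impactEffect; // shared simulation for this impact type

    public void SpawnImpact(Vector3 hitPosition, Vector3 hitNormal)
    {
        // Reusable payload carrying per-event spawn attributes
        VFXEventAttribute attrib = impactEffect.CreateVFXEventAttribute();
        attrib.SetVector3("position", hitPosition);
        attrib.SetVector3("normal", hitNormal);

        // Triggers the "OnImpact" event in the graph with the payload
        impactEffect.SendEvent("OnImpact", attrib);
    }
}
```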
The impact definition would only reference an asset template, while a scene-level manager would be in charge of handling master simulation objects and sending events: enter VFXSystem.
VFXSystem is a top-level class we developed for FPS Sample that’s in charge of handling all pooled effects.
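A simplified sketch of the pooling idea (not the sample’s actual implementation) is to keep one shared VisualEffect instance per effect template, so every impact of the same kind feeds the same simulation:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.VFX;

// Sketch: map each effect template to a single scene instance,
// so all events for that template go to one shared simulation.
public static class PooledVFX
{
    static readonly Dictionary<VisualEffectAsset, VisualEffect> instances =
        new Dictionary<VisualEffectAsset, VisualEffect>();

    public static VisualEffect Get(VisualEffectAsset template)
    {
        if (!instances.TryGetValue(template, out var vfx))
        {
            var go = new GameObject(template.name);
            vfx = go.AddComponent<VisualEffect>();
            vfx.visualEffectAsset = template;
            instances.Add(template, vfx);
        }
        return vfx;
    }
}
```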
This custom pooled system, while a major improvement in terms of performance, also came with some drawbacks.
The VFX Event Tester was a feature that came pretty early while working with pooled effects. As these effects need to be started using event attributes (impact position and normal, bullet hit-scan source and target positions), some C# is needed to provide these attributes.
The first solution was to use a dummy Timeline with VFX Activation Tracks that sent parametrized events in order to preview the effect. The drawback was that a prefab had to be embedded in the scene for editing and removed afterward, which was not ideal.
Instead, we developed a SceneView utility window named VFX Event Tester. This window enables sending events with attribute payloads to the currently selected Visual Effect.
The tool can be toggled on and off via the Edit > Visual Effects > Event Tester menu when needed. The source code can be accessed at the following location: Assets/VFX/VisualEffectGraph-Extras/Editor/Utility/VFXEventTester/VFXEventTesterWindow.cs
For all impacts, spawning every effect component at all times would be far more than necessary, especially for really small particles. To solve distance-dependent rendering, we implemented a rudimentary yet useful filtering system for particles: Cancel By Distance. The system is a kind of gate that cancels particle spawning when certain conditions are met.
Using this simple LOD system helped avoid unnecessarily spawning barely visible particles for the impacts. It was also used to stop spawning faraway effects, such as the rocks in the grinding drill machine effect.
You can find the source code of this custom Spawn block at: Assets/VFX/Script/CustomSpawners/CancelByDistance.cs
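The general shape of such a custom spawn block can be sketched with the VFXSpawnerCallbacks API; the property names `targetPosition` and `maxDistance` below are illustrative assumptions, not the sample’s actual parameters:

```csharp
using UnityEngine;
using UnityEngine.VFX;

// Sketch of a distance-based spawn gate, assuming hypothetical
// "targetPosition" and "maxDistance" block properties.
public class CancelByDistanceSketch : VFXSpawnerCallbacks
{
    public class InputProperties
    {
        public Vector3 targetPosition = Vector3.zero;
        public float maxDistance = 50f;
    }

    static readonly int targetPositionID = Shader.PropertyToID("targetPosition");
    static readonly int maxDistanceID = Shader.PropertyToID("maxDistance");

    public override void OnPlay(VFXSpawnerState state, VFXExpressionValues vfxValues, VisualEffect vfx) { }

    public override void OnUpdate(VFXSpawnerState state, VFXExpressionValues vfxValues, VisualEffect vfx)
    {
        Vector3 target = vfxValues.GetVector3(targetPositionID);
        float maxDistance = vfxValues.GetFloat(maxDistanceID);

        // Gate the spawn context: emit nothing when the effect is too far away
        if (Vector3.Distance(vfx.transform.position, target) > maxDistance)
            state.spawnCount = 0f;
    }

    public override void OnStop(VFXSpawnerState state, VFXExpressionValues vfxValues, VisualEffect vfx) { }
}
```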
Deploying effects in a short time frame can be problematic, so we chose to use a single effect around the player’s camera and the SRP Volume system to blend effect settings depending on the player’s position.
The volume mixer takes advantage of the SRP Core feature named Volume. These classes enable the use of parameters stored in the volume system, such as rendering parameters and post-processes, with overrides of the settings depending on the camera position in the level.
You can access the source code of this system at the following location: Assets/VFX/Script/VFXVolumeMixer
We used this system to deploy an environment effect that follows the player’s camera: heat effects for the exterior and the furnace room, and dust for the interiors and the cave.
To use this system, there’s a set of values you need to set up in the project settings. The system can control up to 8 floats, 8 vectors, and 8 color values. The settings let you choose how many of these you want to use and give each variable a name.
Then you can add a VFX Volume Mixer volume component to your scene volumes, which exposes these values so you can make local overrides in your level.
Finally, if you want to sample a value from these volumes, you can use the VFX Volume Mixer Parameter binders.
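The overall pattern can be sketched with the SRP Core Volume API; the component and property names below (`DustMixerVolume`, `dustIntensity`, `DustIntensity`) are hypothetical, and a render-pipeline-specific per-camera stack may apply instead of the global one:

```csharp
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.VFX;

// Hypothetical volume component holding one blendable effect parameter.
public class DustMixerVolume : VolumeComponent
{
    public ClampedFloatParameter dustIntensity = new ClampedFloatParameter(0f, 0f, 1f);
}

// Sketch of a binder pushing the blended value into a visual effect.
public class DustIntensityBinder : MonoBehaviour
{
    public VisualEffect effect;

    void Update()
    {
        // The stack blends all volumes affecting the current position
        var dust = VolumeManager.instance.stack.GetComponent<DustMixerVolume>();
        effect.SetFloat("DustIntensity", dust.dustIntensity.value);
    }
}
```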
FPS Sample also features some particle systems used in the Level_01 scene that aren’t part of a generic system. Here are some cases we solved:
The capture-point circle is a pretty interesting effect because it requires particles to follow a deformable path made out of a skinned tube with 16 control points. The Visual Effect Graph isn’t able to handle spawning on skinned meshes at the moment, but we still have information about this bone chain. Also, the Sample Bezier operator proved very valuable for smooth 3D interpolation in this effect.
In order to spawn particles on a path composed of multiple positions, a pretty straightforward method is to bake the position list into a position attribute map, then sample this texture’s pixels in order to determine where the particles need to be spawned.
For this specific case, we wrote a new parameter binder named Multiple Position binder, that takes a list of game objects, and will write into a texture every game object world-space position, then set the point count and the texture to parameters exposed in the specific visual effect graph.
Sampling this position map is then pretty easy. We consider a group of 2 pixels as 4 bezier points (by computing 2 intermediate bezier tangents), then interpolate everything so all the particles travel through all positions with bezier blending.
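The blending described above boils down to standard cubic Bezier interpolation between a pair of baked positions and two computed tangents (how the sample derives its tangents isn’t shown here, so this is just the evaluation step):

```csharp
using UnityEngine;

public static class BezierSampling
{
    // Standard cubic Bezier: blends p0..p3 at parameter t in [0, 1].
    // p0 and p3 are baked positions; p1 and p2 are the computed tangent points.
    public static Vector3 Cubic(Vector3 p0, Vector3 p1, Vector3 p2, Vector3 p3, float t)
    {
        float u = 1f - t;
        return u * u * u * p0
             + 3f * u * u * t * p1
             + 3f * u * t * t * p2
             + t * t * t * p3;
    }
}
```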
You can find this example in the prefab located at: Assets/Prefabs/Gameplay/Capturepoint_A, the effect is located in the Small_emitter Game Object.
The source code for the Multiple position binder is available in the project at this location: Assets/VFX/Script/ParameterBinders/VFXMultiplePositionParameterBinder.cs
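The core of such a binder can be sketched as follows: bake world-space positions into a one-row float texture, then push the texture and the point count to the graph. The exposed property names "PositionMap" and "PositionCount" are assumptions for illustration:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.VFX;

// Sketch of a multiple-position binder: bakes transform positions into a
// 1D float texture each frame and binds it to hypothetical graph properties.
[ExecuteAlways]
public class MultiplePositionBinderSketch : MonoBehaviour
{
    public VisualEffect effect;
    public List<Transform> points = new List<Transform>();

    Texture2D positionMap;

    void Update()
    {
        if (points.Count == 0 || effect == null)
            return;

        // One RGBAFloat pixel per point, so positions aren't clamped to [0, 1]
        if (positionMap == null || positionMap.width != points.Count)
        {
            positionMap = new Texture2D(points.Count, 1, TextureFormat.RGBAFloat, false);
            positionMap.filterMode = FilterMode.Point;
        }

        for (int i = 0; i < points.Count; i++)
        {
            Vector3 p = points[i].position; // world-space position
            positionMap.SetPixel(i, 0, new Color(p.x, p.y, p.z, 1f));
        }
        positionMap.Apply();

        effect.SetTexture("PositionMap", positionMap);
        effect.SetInt("PositionCount", points.Count);
    }
}
```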
The rock grinder is a simple decorative effect used at the canyon’s end to emphasize the power of a Grinding Drill. The Drill is stuck and is grinding the rock formation on its left, as well as the Terraformer’s egg that serves as a capture point.
Across the level, there are many light sources that needed to be decorated using simple steam, dust, and air flow effects. While most of these are static sources, some are animated, for instance, this rotating air duct.
To solve this case, we used specially lit particles to synchronize the steam reveal with the volumetric light shaft. To keep a decent frame rate, these particles needed to be faded out fairly far from the camera, because their shader was per-pixel lit and too resource-intensive to render full screen.
Other effects were placed throughout the level, this time set up as unlit particles to save performance. The big vertical air ducts could have been lit as well, but their size was a bit too problematic performance-wise, and the light shaft wasn’t sharp enough, so we ended up with simple unlit effects. All these instances were configured directly in the scene by setting up fading distances and color.
The furnace effect is a pretty simple effect that spawns bubbles on a circle shape and uses the experimental GPU Events to spawn bubble-burst flipbooks and rising sparkles upon each bubble collapse.
It is also decorated with non-localized rising sparkles, steam, and heat effects.
While it is possible to get really close to this effect, we used a Camera Fade block for the steam and heat to reduce overdraw and keep the frame rate under control.
The transition to the Visual Effect Graph in FPS Sample was smoother than expected: even though the package is still in preview, the system allows enough flexibility to customize the experience for the project’s needs. Luckily, the FPS Sample project structure is super clean and uses very few workarounds and/or custom implementations, so it was a really good test candidate for a Visual Effect Graph implementation. Still, we’re aware that it could have been a totally different and harder story in a wilder project. In any case, it enabled us to pinpoint production cases that we will address in the next versions of the Visual Effect Graph in 2019.
Some features were developed specifically for the needs of this integration; we will keep these as a reference in order to come up with more generic tools in the future.