
Global Illumination in Unity 5

September 18, 2014 in Technology | 12 min. read

Unity 5 is receiving a major makeover in terms of graphical fidelity, and lighting in particular. Unity has been limited to baked lightmaps since Unity 3.0, but since then a lot of progress has been made in the global illumination field. Now is the time to provide some of that goodness out of the box in Unity. One of the new graphics features in Unity 5 is real-time in-game global illumination combined with new and vastly improved lighting workflows; these are the focus of this post. But first some background.

Doll model rendered with path tracing in Unity. Courtesy of Paul Tham, Unity Singapore.

What is this global illumination all about?

Global illumination (GI) is the simulation of physically based light transport. It is a means to simulate how light is transferred between surfaces in your 3D scene, and it will greatly increase the realism of your game. Not only that, it is also a means to convey mood, and with clever use it can even improve the gameplay. GI algorithms take into account not only the light which comes directly from a light source (direct illumination), but also the subsequent bounces in which this light is reflected by surfaces with different materials in the scene (indirect illumination). Traditionally, indirect illumination has been too expensive to compute within the real-time constraints of a game.

It is all down to this innocent looking equation:
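For reference, written out with the symbols discussed in the next paragraph, the rendering equation reads (the cosine term accounts for the incidence angle of the incoming light):

    L(x, \omega) = L_e(x, \omega) + \int_{\Omega} \rho(x, \omega, \omega') \, L_i(x, \omega') \cos\theta' \, \mathrm{d}\omega'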

It is really rather simple. The lighting visible from some point in your scene is the sum of the light emitted from that surface point (Le) and the incident lighting from the hemisphere above the point, reflected into the viewing direction towards the viewer. Li describes incoming light from some direction ω' on the hemisphere towards the point x. The reflectance term ρ then describes how light is reflected towards the viewer and is dependent on the incident angle ω' and the exitant angle ω.

As the observant reader might have spotted, L(x,ω) is on both sides of the equation, and inside an integral to boot. Now if that hadn't been the case we would have had global illumination in Elite. Since the laws of physics are hard to alter, the research community set about coming up with solutions.

One of the most popular (and oldest) algorithms is path tracing, which basically attacks this equation head on, with some tricks added to spend the most time on areas known to be difficult. Path tracing is used a lot in CGI for film and television. Even though a massive amount of research has been poured into it, an image still takes on the order of seconds to render (even with a beefy GPU).
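The slow convergence comes from the fact that path tracing evaluates the integral above by Monte Carlo sampling; a rough sketch of the estimator, with p(\omega_k) the probability density used to pick the sample directions \omega_k:

    L(x, \omega) \approx L_e(x, \omega) + \frac{1}{N} \sum_{k=1}^{N} \frac{\rho(x, \omega, \omega_k) \, L_i(x, \omega_k) \cos\theta_k}{p(\omega_k)}

The standard error of such an estimator falls off as 1/\sqrt{N}, so halving the noise takes four times as many samples, which is why a clean image needs seconds even on fast hardware.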

Path tracing works in screen space, so the image needs to be re-rendered from scratch every frame. On the plus side this means it supports fully dynamic scenes: lighting, materials and geometry can all be freely animated. The drawback is that a new image has to be rendered whenever the camera moves, and since it takes on the order of seconds for the image to fully converge, it is not appropriate for games.

An image that has not fully converged contains disturbing noise, and it is not temporally coherent either, so the image will flicker badly as it approaches convergence. Filtering can help limit this effect, but it cannot be removed completely. Below are a few shots taken at various levels of convergence:

Path traced images at various stages of convergence.

Lately a number of hybrid approaches have been developed, many running on the GPU, such as voxel cone tracing. Most of them require a desktop GPU with a fair amount of memory and are appropriate for high-end desktop systems only. In order to provide global illumination that will work well on a wide swath of supported platforms (including mobile) some compromise has to be made.

Enter Enlighten

Enlighten provides an excellent solution to this problem. It very elegantly scales from mobile through consoles up to high-end systems because it constrains the problem it is solving. It has also been battle-tested as it was shipped in AAA titles such as Battlefield 4, MoH Warfighter, and others.

The basic idea is that if some of the visibility is precomputed (i.e. the integral on the right in the rendering equation above), it is possible to modify the lighting in real-time, even on mobile platforms.
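Broadly speaking, this is the classic precomputed radiosity idea; the sketch below is a textbook formulation rather than a description of Enlighten's actual internal representation, which this post does not go into. If the static scene is split into patches and the mutual visibility between patches is baked into form factors F_ij, the run-time only has to iterate a much cheaper linear system:

    B_i = E_i + \rho_i \sum_j F_{ij} B_j

Here B_i is the radiosity of patch i, E_i its emission, \rho_i its diffuse reflectivity and F_ij the precomputed form factor coupling patch j to patch i. Emission, reflectivity and the lights feeding into E_i can change every frame; only the F_ij require the geometry to stay put, which is exactly the trade-off described below.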

Enlighten allows dynamically changing the following (a small scripting sketch follows the list):

  • Light sources.
  • Environment lighting.
  • Material properties (diffuse reflectivity and surface emission).
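
As a minimal sketch of what that looks like from script, the example below switches a scene between two lighting moods at run-time. It assumes the Unity 5 scripting API (Light, RenderSettings.ambientLight and DynamicGI.SetEmissive); the class name, fields and colour values are made up for illustration.

    using UnityEngine;

    // Sketch: switching between two lighting moods at run-time.
    // Enlighten picks up the changed inputs and updates the bounce without a rebake.
    public class LightingMood : MonoBehaviour
    {
        public Light sun;              // directional light driving the scene
        public Renderer emissiveSign;  // a static, GI-contributing emissive mesh

        public void SetSunnyDay()
        {
            sun.color = new Color(1.0f, 0.96f, 0.84f);                     // warm, bright sun
            sun.intensity = 1.2f;
            RenderSettings.ambientLight = new Color(0.55f, 0.66f, 0.80f);  // blue-ish sky
            DynamicGI.SetEmissive(emissiveSign, Color.black);              // sign switched off
        }

        public void SetNight()
        {
            sun.intensity = 0.05f;
            RenderSettings.ambientLight = new Color(0.02f, 0.03f, 0.06f);
            // The emissive surface becomes the dominant light source; as noted further down,
            // its contribution is effectively shadow mapped for free by the GI solution.
            DynamicGI.SetEmissive(emissiveSign, new Color(1.0f, 0.6f, 0.2f) * 3.0f);
        }
    }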

The geometry that is part of the GI simulation has to be static, but dynamic geometry can be relit using light probes that are updated in real-time with the GI generated from the static geometry. In order to do this Enlighten precomputes the data needed for simulating GI at run-time. This data is consumed by a run-time module which is available for many platforms: OSX, Windows, Linux, iOS, Android, PS Vita, PS3, PS4, Windows Phone, Xbox360 and XboxOne. WebGL support is in the works, but there is no definite timeline for it yet.

Enlighten outputs the following data:

  • Real-time lightmaps.
  • Real-time lightprobes.
  • Real-time cubemaps.

Enlighten is limited to computing GI for diffuse transport. This is the most important mode of transport as it conveys the overall illumination of the scene. Since diffuse transport is generally low-frequency, the real-time lightmaps can be quite low-res and still be updated in real-time. Support for specular/glossy transport is added by providing dynamically updated cubemaps. In Heckbert notation Enlighten accounts for the following light path subset: L(D)*(S|G)?E. This means that most GI effects are covered. The main missing effect will be specular bounces via diffuse surfaces - commonly known as caustics. To get this effect you will need to cheat.
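To unpack that notation (L is the light, E the eye, D a diffuse bounce, S specular, G glossy; * and ? are read as in regular expressions), here is roughly which paths the subset does and does not contain:

  • L E - looking directly at a light source.
  • L D E and L D D E - one or more diffuse bounces, the core real-time GI case handled by the lightmaps and light probes.
  • L D S E / L D G E - diffuse GI seen via a single specular or glossy reflection, covered by the dynamically updated cubemaps.
  • L S D E - light reaching a diffuse surface via a specular bounce, i.e. a caustic, which falls outside the subset.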

Below is an example of four lighting setups rendered with Enlighten. These lighting setups were fully dynamic and switching between them is instantaneous.


Viking Village - dawn

In this next shot a brighter blue sky was used; the sun is higher and more intense:

Viking Village - sunny day

Here the environment lighting is grey and desaturated and the sun intensity is lower, so the scene is lit mainly by ambient lighting:

Viking Village - overcast day

Lastly, a shot with warm reds for a sunset mood:

Viking Village - sunset

Using this technique enables games with very realistic-looking time-of-day cycles.

Enlighten precompute

The compromise is that most of the geometry has to be static, effectively all the large scale geometry that will participate in the GI solution. During the precomputation phase Enlighten will automatically break up the scene into systems. It is also possible to affect how these systems are generated. The systems serve to make the precompute a massively parallel pipeline. The pipeline is relatively deep and solves per-system tasks. Here is an example of how the Viking Village level was automatically broken up into systems:


Editor visualization of the automatically generated Enlighten systems.

After the precompute has finished, the relationship between the systems is known, and this can be used at run-time to partially alleviate the limitation of static geometry. At run-time the amount of indirect light transferred between systems can be controlled. This allows you to fade bounce in and out between systems, which can be used to achieve effects such as destruction or opening doors.

Enlighten run-time

The Enlighten run-time is efficient enough to run on higher end mobile devices. It runs asynchronously on a CPU thread (or more threads if the platform allows it). The most prominent issue for mobile platforms will be that the direct lighting and shadow maps for dynamic lights need to be computed on the GPU, so on mobile platforms only a few dynamic lights will be feasible. However, the emissive properties of geometry can be adjusted in real-time almost for free. The emissive lighting computed by Enlighten encodes visibility too. This means that the emissive light is effectively shadow mapped for free, albeit at a low resolution.

Enlighten scales well up to desktop systems and next-gen consoles, which could support games using exclusively dynamic lights since more GPU power is available for direct lighting and shadow mapping.

Here is an example of a mobile demo using Enlighten running on an ARM-powered tablet:

What about baking?

For some titles baking will be the appropriate choice, so this workflow will be supported and developed well into the future. In Unity 5 the lighting inputs (light sources, emissive materials and environment lighting) can be tagged as using baked or real-time GI. Baked lighting inputs are baked into lightmaps in the same way as in previous versions of Unity, and dynamic lights are handled by the Enlighten run-time. The baked and real-time GI merge seamlessly.

There are also new baking features already in Unity 5. One of them is that the lightmaps are now broken up into components. For each atlas there are five lightmaps containing direct lighting, indirect lighting, direct directionality, indirect directionality and AO respectively. After a bake these are composited into the lightmaps used in the game. In the Editor there are controls that specify how these lightmaps are composited, and you can tweak these settings to adjust the look of the final output. For example, the indirect lighting can easily be boosted. This only requires a compositing job, which takes seconds, whereas previously a full rebake would have been needed.

Lighting workflow

Enlighten doesn't just provide in-game real-time GI. One of the most important improvements it brings is a vastly improved lighting workflow for artists. It allows for faster iteration on lighting, which in turn yields better looking content. An iterative mode has been added, which removes the need for explicitly baking the scene. The precompute and bake will instead run in the background without any user intervention. The Editor will automatically track changes made to the scene and execute the tasks needed to fix up the lighting. In many cases, when iterating on lighting, these tasks will be nearly instantaneous.

The following video shows a live session using the new lighting workflow:

What types of games will work with Enlighten?

As already discussed, the limitation of real-time GI is that the large scale scene setup must be known in advance. We had to choose between pursuing a fully dynamic solution that would only work on high-end systems, or a solution that covers many use-cases but also works on mobile hardware, thus reaching a wider part of the market. We chose the latter for 5.0. Below is a list of game scenarios, including some that may prove difficult with real-time GI:

  • Q: Can I do opening of doors and gates?
    A: As discussed above, authoring the systems and scripting the exchange of bounce between them depending on the state of the door or gate will solve this in some cases. (The scripting extensions needed for this are expected to arrive in 5.x.)
  • Q: Can I support destruction?
    A: Support for making objects transparent dynamically exists in Enlighten. This means that objects that may be destroyed in the game must be marked as such in advance. While this is not ideal it should support many use-cases. (This feature should arrive in 5.x.)
  • Q: Can I download (stream) chunks of level additively?
    A: Yes, systems can be streamed in additively. The systems can be made so that they do not straddle your streaming chunks. This can be done manually or with scripting. (This feature should be available in 5.0.)
  • Q: Can I create a game world semi-procedurally with premade chunks (endless runner type)?
    A: Yes, bounce between consecutive chunks would be possible if the various combinations that premade chunks are arranged in are precomputed separately. This is similar to how you would bake lightmaps (without seams) for an endless runner. (The precompute API needed for this should be available in 5.x.)
  • Q: Can I create a game world fully procedurally, Minecraft style?
    A: Short answer, no. However, we are doing multiple projects with Imagination Technologies using their PowerVR Ray Tracing technology for GI previews and hybrid rendering. This could potentially yield a solution to procedural or user generated scenes. The PowerVR Ray Tracing technology supports interactive GI for fully dynamic scenes. It is not yet fully real-time and the first few frames will have visible noise as the solution converges, but it is a very promising direction. It will certainly be able to light user generated scenes assuming some delay is acceptable. (There is no ETA for this as it is still an R&D project.)

What will be in Free and what will be in Pro?

The feature split between Unity Free and Unity Pro has not yet been decided. We will announce this in a future blog post.

What is in 5.0

These are the main features shipping in Unity 5.0:

  • Enlighten run-time for in game real-time GI.
  • Iterative workflow (for both real-time GI and baking).
  • Reflection probes.

Beyond 5.0

As we are making a lot of changes to the lighting workflows and lighting in general, some features will not make it into 5.0 as they need more polish. These are some of the features that will surface during the Unity 5 cycle:

  • PowerVR path traced previews for real-time and baked lightmaps (details here).
  • API for controlling the real-time GI contribution between systems.
  • Enlighten real-time cubemaps.
  • Support for real-time GI transparency.
  • Cloud/cluster precompute/bake.

Some more material is available on this subject: slides for the Unite 2014 talk on GI, video of the Unite 2014 talk on GI, and the Unite 2014 keynote (graphics part).

Let us know what you think!

The Graphics Team.
