
Update: Now included in 2021 LTS

The new Unity 2021.2 Tech Stream release is now available. As part of this release, we’ve included a number of new features and improvements dedicated to helping artists and designers optimize creative workflows and create powerful immersive experiences.

Create UI for games and applications with UI Toolkit

Building on the ability to develop custom UI for Unity Editor extensions, UI Toolkit in 2021.2 now supports the authoring of runtime UI. This means that creators can use UI Toolkit to author interfaces for games and applications directly within Unity. 

In this release, UI Toolkit integrates the font rendering technology of TextMesh Pro, so you can access advanced styling capabilities to render beautiful text, at any point size and resolution, without installing additional plug-ins or packages. 

The dedicated UI animation system also helps save valuable design time by letting you quickly author reusable transitions that can be applied to elements across your projects, reducing rework.

You can also create textureless UI efficiently, directly within UI Toolkit. Because you don't need to import images from Photoshop, you spend less time switching between tools and can iterate faster. Textureless UI also lowers the overall memory footprint and build size, helping your project scale more efficiently.
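To give a concrete idea of what runtime authoring looks like, here is a minimal sketch of driving a UI Toolkit interface from a MonoBehaviour. It assumes a UIDocument component whose UXML contains a Button named "start-button" and a Label named "title", plus a USS class called "highlighted" with an authored transition; all of these names are placeholders for illustration.

```csharp
using UnityEngine;
using UnityEngine.UIElements;

// Minimal runtime UI sketch: assumes a UIDocument on the same GameObject,
// with a UXML asset containing a Button named "start-button" and a Label
// named "title" (hypothetical names for illustration).
public class MainMenuController : MonoBehaviour
{
    void OnEnable()
    {
        var root = GetComponent<UIDocument>().rootVisualElement;

        // Query elements authored in UI Builder / UXML by name.
        var startButton = root.Q<Button>("start-button");
        var title = root.Q<Label>("title");

        title.text = "My Game";
        startButton.clicked += () => Debug.Log("Start pressed");

        // Toggling a USS class is one way to trigger an authored transition.
        startButton.RegisterCallback<MouseEnterEvent>(_ =>
            startButton.AddToClassList("highlighted"));
        startButton.RegisterCallback<MouseLeaveEvent>(_ =>
            startButton.RemoveFromClassList("highlighted"));
    }
}
```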

Learn more about UI Toolkit, how to get started, and when to use it over Unity UI here.

Streamlined Terrain workflows and new brushes

In 2021.2, the Terrain Tools package has come out of Experimental and is now Released. This means it has passed extensive testing and validation procedures, including the creation of documentation, changelog, and license files, so you can use the package with confidence in your next project.

As part of this release, new Terrain sculpting brushes let you bridge, clone, noise, terrace, and twist Terrain to better help you bring environments to life. Heightmap-based erosion tools (Hydraulic, Wind, and Thermal) are also now available. We’ve also improved material painting controls and general quality of life to streamline your authoring experience.

Improved SpeedTree 8 workflows in render pipelines

SpeedTree 8 vegetation can now be directly imported into the High Definition Render Pipeline (HDRP).

New Shader Graph-based shaders are now provided for both the Universal Render Pipeline (URP) and HDRP. They improve performance and visual quality by using new custom interpolators to interpolate normals, tangents, and bitangents, as well as the new per-material culling overrides. Note: The new shader is used by default only in HDRP; in URP, the existing handwritten shader remains the default for compatibility reasons.

Decals improved and now available for both URP and HDRP

Decals have been added to URP with two modes: screen space or DBuffer. The latter is less suited to mobile GPU architectures, but it gives access to surface properties for more advanced blending.

UX has been improved for decal projector placement as well, with pivot point and scale transform tools, Prefab support, and multi-select capability.
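Decal projectors can also be placed from script. The sketch below assumes URP's DecalProjector component and a decal-compatible material you supply; the property names and sizes shown reflect our understanding of the URP/HDRP projector API and should be treated as a starting point rather than a definitive reference.

```csharp
using UnityEngine;
using UnityEngine.Rendering.Universal; // DecalProjector for URP 12+

// Sketch only: spawns a decal projector at a raycast hit point. Assumes
// "decalMaterial" uses a decal-compatible shader (e.g., a Shader Graph
// Decal material).
public class BulletHoleSpawner : MonoBehaviour
{
    public Material decalMaterial;

    public void PlaceDecal(RaycastHit hit)
    {
        var go = new GameObject("BulletHole");
        // Point the projector into the surface so it projects along the normal.
        go.transform.SetPositionAndRotation(
            hit.point,
            Quaternion.LookRotation(-hit.normal));

        var projector = go.AddComponent<DecalProjector>();
        projector.material = decalMaterial;
        projector.size = new Vector3(0.2f, 0.2f, 0.5f); // width, height, projection depth
    }
}
```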

A normal blending option has been added to HDRP, improving the blending on non-flat shapes, and the full HDRP Decal Shader is now accessible from the Visual Effect Graph.

Lighting

Enlighten Realtime Global Illumination

This feature enables Unity creators to enrich their projects with indirect lighting that responds to real-time changes in light and material properties. Interactive visual feedback makes it possible to significantly reduce lighting design iteration times.  

In the 2021.2 release, we have reenabled support for Enlighten Realtime Global Illumination in HDRP and the Built-In Render Pipeline, and added support for it in URP. We’ve also extended support for Enlighten Realtime Global Illumination to next-gen and newly released tech such as Apple silicon, PlayStation 5, and Xbox Series X.

Progressive GPU Lightmapper tiling

This new feature lets you use the Progressive GPU Lightmapper for faster bakes at larger lightmap resolutions.

The tiling technique helps reduce GPU memory requirements by breaking down lightmaps into tiles sized to match the amount of GPU memory available. This feature reduces the risk of the GPU Lightmapper running out of memory and falling back to the CPU.

Light-speed light placement with Light Anchors

To make lighting for cinematics easier and more efficient, we have created the Light Anchor component, a dedicated tool to manipulate lights around a pivot point instead of world space. Various presets allow lighting artists to quickly place lights around a character or any center of interest.

Light Layers for URP

We have also added Light Layers support to URP, allowing you to mask lights in a Scene so that they only affect specific meshes within the same layer. This feature is now supported in both URP and HDRP.

Improved lighting in HDRP

Video using the Blender demo file “Classroom” by Christophe Seux

Get approximate dynamic real-time GI as a post-effect with the improved Screen Space Global Illumination (SSGI), which now falls back on probes and the sky when it can’t find samples while ray marching the screen-space buffers.

For your hero props and characters, specular occlusion has been improved when using Ambient Occlusion and Bent normal maps.

Finally, area lights now support fabric and are compatible with hair, which falls back on GGX lighting.

Rendering Debugger

The Rendering Debugger is now available both for URP and HDRP, allowing you to debug materials, passes, rendering effects, and volumes.

New Lens Flares system for URP and HDRP

Our new Lens Flares tools simulate the effect of lights refracting inside a camera lens or a human or creature eye. You can customize and apply flares, halos, and polygonal patterns (artifacts caused by aperture blades) to selected light sources in a scene to achieve an artistic effect. Lens Flares support occlusion and offer many procedural placement tools, so you don’t have to manually add and configure multiple layers of the same pattern.

Add cinematic-quality flares in a second by choosing from a collection of premade effects (sun, car lights, anamorphic, stylized, and so on) provided as additional samples in the HDRP and URP packages.
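Flares are typically set up in the Inspector, but they can also be attached from script. The following is a rough sketch using the SRP core lens flare component; the property names (lensFlareData, intensity) are our assumption of the component's API, and the flare data asset is one you author or take from the package samples.

```csharp
using UnityEngine;
using UnityEngine.Rendering; // LensFlareComponentSRP / LensFlareDataSRP (SRP core)

// Sketch: attaches an SRP lens flare to an existing light from script.
// "flareAsset" is a Lens Flare (SRP) data asset; property names are based on
// our understanding of the SRP core component and should be verified.
public class AttachLensFlare : MonoBehaviour
{
    public LensFlareDataSRP flareAsset;

    void Start()
    {
        var flare = gameObject.AddComponent<LensFlareComponentSRP>();
        flare.lensFlareData = flareAsset; // which flare elements to render
        flare.intensity = 1.0f;           // overall brightness multiplier
    }
}
```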

Perform live camera and face animation with our new cinematic companion apps

We recently released two companion apps on the App Store that enable artists to leverage powerful mobile AR data in their cinematic and animation pipelines. The companion apps connect to the Unity Editor through the new Live Capture Unity package.

You can find more information on how to get started with the Live Capture package and companion apps on our forum here.

Unity Face Capture enables you to preview and record real-time facial performances in the Unity Editor. By capturing facial expressions and head movements to control a character in real-time, it simplifies the process of adding realistic face animation to your characters, saving animators many hours of time and effort.

Key features offer the ability to:

  • Capture and animate facial expressions and head poses using ARKit
  • See a live preview of performances in the Editor
  • Set up per-character mapping and overrides
  • Apply recorded animations to any compatible rig
  • Use customizable settings to amplify or dampen the performance on a per-blend shape basis (see the sketch after this list)
  • Record audio and video during the face data capture for reference
  • Export the recorded animation as an FBX file to tweak in your preferred DCC
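The Live Capture package handles character mapping and overrides internally, but the per-blend shape amplify/dampen idea from the list above can be illustrated with Unity's standard SkinnedMeshRenderer API. This is only a conceptual sketch, not the package's own mapping system, and the blend shape names and multipliers are hypothetical.

```csharp
using UnityEngine;

// Illustration of per-blend shape amplification/dampening using the standard
// SkinnedMeshRenderer API. NOT the Live Capture package's own mapping system;
// it just shows the idea of scaling incoming weights per shape.
public class BlendShapeScaler : MonoBehaviour
{
    public SkinnedMeshRenderer face;

    [Range(0f, 2f)] public float jawOpenMultiplier = 1.2f;   // amplify
    [Range(0f, 2f)] public float browRaiseMultiplier = 0.6f; // dampen

    // Call with raw capture weights in the 0-100 range Unity uses.
    public void Apply(float jawOpenWeight, float browRaiseWeight)
    {
        int jawOpen = face.sharedMesh.GetBlendShapeIndex("jawOpen");       // hypothetical shape name
        int browRaise = face.sharedMesh.GetBlendShapeIndex("browInnerUp"); // hypothetical shape name

        if (jawOpen >= 0)
            face.SetBlendShapeWeight(jawOpen, Mathf.Clamp(jawOpenWeight * jawOpenMultiplier, 0f, 100f));
        if (browRaise >= 0)
            face.SetBlendShapeWeight(browRaise, Mathf.Clamp(browRaiseWeight * browRaiseMultiplier, 0f, 100f));
    }
}
```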

Unity Virtual Camera is a simple, intuitive tool for virtual cinematography. It enables you to preview and record natural handheld camera movements and lens controls in an app, while leveraging the power of the Unity Editor.

Use Unity Virtual Camera to:

  • Preview and record camera motion for films, animation, games, and other content
  • Block shots and discover your lens language
  • Create storyboards
  • Scout locations in virtual environments to explore, feel, and understand the space

Key features offer the ability to:

  • See results in real-time, both in-Editor and on the device
  • Play back your takes instantly
  • Adjust ergonomic controls for comfortable handling
  • Record everything in a single pass, or isolate elements for finer control
  • Control focus, zoom, depth of field, and more

If you have any questions, feedback, or feature requests on the Live Capture package or the companion apps, please let us know in the forums.

Visual Effects Graph

VFX Graph for URP and 2D on PCs and consoles

VFX Graph for URP is now officially supported for PC and consoles, after development teams fixed issues and added Lit particle support. We have improved mobile support, but compute support on mobile devices varies widely across brands, mobile GPU architecture, and operating systems. Only a subset of high-end mobile devices can be officially supported by the VFX Graph, and other devices might encounter issues. Unity’s built-in Particle System is still the recommended choice for a wide range of mobile applications.

Support for URP 2D Renderer has been added, allowing the mix of 3D VFX Graph effects and 2D world objects using sorting layers. 

New Shader Graph integration in VFX Graph

You can use any custom Shader Graph shader (except URP sprites and URP/HDRP decals) to target VFX Graph. These shaders can also use new lighting models like HDRP hair or fabric, or they can modify particles at the vertex level to enable effects like birds with flapping wings, wobbling particles like soap bubbles, and much more.

Bake Signed Distance Fields directly from Unity Editor or at runtime

No need to use third-party software and go back and forth to change your source assets when working with Signed Distance Fields.

The new Signed Distance Field Baker tool lets you quickly bake geometry directly into a 3D texture in the Editor (or at runtime using the API). The result is baked as a signed distance field that can be used within the VFX Graph, and the tool includes preview and debug windows plus the ability to save session data.
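Once an SDF has been baked (for example, saved as a Texture3D asset from the baker window), feeding it to an effect is a one-liner. The sketch below assumes a VFX Graph that exposes a Texture3D property named "SDF" (a placeholder name) used by its conform or collision blocks.

```csharp
using UnityEngine;
using UnityEngine.VFX;

// Sketch: feeds a pre-baked SDF (a Texture3D asset saved from the SDF baker)
// into a VFX Graph. Assumes the graph exposes a Texture3D property named
// "SDF" (hypothetical name).
public class AssignBakedSDF : MonoBehaviour
{
    public VisualEffect vfx;
    public Texture3D bakedSdf;

    void Start()
    {
        vfx.SetTexture("SDF", bakedSdf);
    }
}
```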

Graphics buffer for large data and advanced simulations

Video courtesy of Lewis Hackmans / Twitter @_hackmans_

More advanced users can now feed large data from C# or compute shaders into VFX Graph via Graphics Buffers, instead of baking to textures, to create complex custom simulations like boids, large data rendering, fluids, hair simulation, or crowds.
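As a minimal sketch of the workflow, the script below fills a structured GraphicsBuffer on the CPU each frame and binds it to an exposed GraphicsBuffer property. The property name "PositionBuffer" and the toy simulation are placeholders; in practice the data would come from your own compute shader or simulation, and the graph would read it with the Sample Graphics Buffer operator.

```csharp
using UnityEngine;
using UnityEngine.VFX;

// Sketch: pushes a CPU-side array of points into a VFX Graph through a
// GraphicsBuffer. Assumes the graph exposes a GraphicsBuffer property named
// "PositionBuffer" (hypothetical).
public class PositionsToVFX : MonoBehaviour
{
    public VisualEffect vfx;
    const int Count = 1024;

    GraphicsBuffer buffer;
    readonly Vector3[] positions = new Vector3[Count];

    void OnEnable()
    {
        buffer = new GraphicsBuffer(GraphicsBuffer.Target.Structured, Count, 3 * sizeof(float));
        vfx.SetGraphicsBuffer("PositionBuffer", buffer);
    }

    void Update()
    {
        // Replace with your own simulation (boids, fluids, crowds, ...).
        for (int i = 0; i < Count; i++)
            positions[i] = Random.insideUnitSphere * 5f;

        buffer.SetData(positions);
    }

    // Free the GPU allocation when the component is disabled.
    void OnDisable() => buffer?.Release();
}
```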

On the UX side, we improved attachment workflows, reworked the toolbar, and improved search filtering. One of the most-requested features was adding Transform to various shape blocks and operators to, for example, rotate a position circle block to make the circle shape face a new direction, or scale the shape of a sphere collision block on one axis to collide with an ellipse.

For more details on new features in VFX Graph, check the documentation of new features for VFX Graph.

Shader Graph

With the release of Unity 2021.2, we introduced some exciting new features in the Shader Graph, such as Surface Options support in the URP, Custom Interpolators, the ability to target the Built-In Render Pipeline, and categories on the Blackboard. These additions have greatly improved artist workflows and shader performance.

URP surface options allow you to expose more options from your Shader Graph shader in the Material Inspector. This allows for more powerful, flexible shaders and simplifies managing your shader library.

With custom interpolators, you can pass information from the Vertex Stage to the Fragment Stage, whether it is vertex data or the results of operations done in the Vertex Stage.

Blackboard categories enable you to group the properties in the Blackboard and expose them as expandable sections in the Material Inspector.

This was one of the most-voted features on our public roadmap: You can now author shaders that use Tessellation for HDRP from Shader Graph. This lets you create custom graphs that dynamically add detail to meshes, such as water waves, traces in snow or sand, smoothed geometry, or improved object silhouettes. In addition, you can compute custom motion vector values with procedural geometry (for example, generated hair strands) using additional velocity in Shader Graph.

The Built-In Render Pipeline now supports Shader Graph. Teams that are not yet ready to upgrade to URP or HDRP can access the power of Shader Graph for their projects. Certain features (such as XR) are not yet available in the Built-In Target, but creating graphs that support all three rendering backends will be hugely beneficial to many creators.

URP

In 2021.2, we have improved the support of Reflection Probes. We have added Reflection Probe blending to allow gradual fadeout of one reflection probe while fading in the others as a reflective object moves from one zone to another in the scene. This gradual transition avoids popping artifacts and provides a higher quality of reflection representation. 

Box Projection support has also been added to Reflection Probes. The Box Projection option allows you to create a reflection cubemap at a finite distance from the probe, thus providing a better presentation of reflections in indoor and confined environments.
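Both options can also be configured from script using the long-standing ReflectionProbe API, as in this small sketch for an indoor room (the dimensions are arbitrary, and probe blending itself is enabled on the URP asset):

```csharp
using UnityEngine;

// Sketch: configures a Reflection Probe for an indoor room at runtime.
// Box projection and blend distance are standard ReflectionProbe properties.
public class RoomProbeSetup : MonoBehaviour
{
    void Start()
    {
        var probe = GetComponent<ReflectionProbe>();

        probe.boxProjection = true;             // project reflections against the probe's box
        probe.size = new Vector3(10f, 4f, 10f); // match the room's dimensions
        probe.center = Vector3.zero;            // offset of the box relative to the probe
        probe.blendDistance = 1f;               // fade zone used when blending with neighboring probes
    }
}
```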

We have added support for Light Cookies to both local lights and baked Global Illumination. Light Cookies can be used to change the appearance, shape, and intensity of cast light for artistic effects, or to simulate complex lighting scenarios with minimal runtime performance impact.
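At its simplest, a cookie is just a texture assigned to a light. The sketch below uses the standard Light.cookie property, which the URP Light Cookies feature picks up; the window-pattern texture is a placeholder.

```csharp
using UnityEngine;

// Sketch: assigns a cookie texture to a light to shape its projection
// (for example, a window or gobo pattern).
public class ApplyLightCookie : MonoBehaviour
{
    public Texture windowPattern; // grayscale mask texture (placeholder asset)

    void Start()
    {
        var lightComp = GetComponent<Light>();
        lightComp.cookie = windowPattern;
        // For directional lights, cookieSize controls the projected tile size.
        lightComp.cookieSize = 10f;
    }
}
```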

In this release, we have included samples to help you understand the use cases of URP features, including camera stacking, and Renderer Features such as occluded geometry and Screen Space Ambient Occlusion (SSAO). Our new scene templates are meant to be used with the scene template system in the Editor. They provide a predetermined default setup to use when you’re creating a new scene, with settings and assets suitable for URP.

For more details on new features in URP, check the documentation page for new features in URP.

HDRP

New clouds and volumetrics for HDRP

To add clouds, you can now choose between procedural Cloud Layers (to use as a background) or Volumetric Clouds (to use as a background or to fly through). Both are easy to set up, and you can quickly tweak the default parameters to achieve different kinds of realistic clouds. Advanced users can access more settings and import their own maps for finer artistic control. You can tweak lighting parameters for different atmospheric conditions and cloud moods, such as the ambient light probe dimmer, Scattering Tint, Powder Effect Intensity, multi-scattering, and shadows. You can also control wind direction and speed, as well as shape and erosion effects on clouds.

We have also updated Local Volumetric Fog formatting and blending, including colored volume masks, an optional RenderTexture input, higher-resolution volume masks, an improved 3D Texture Atlas, and various samples to download from the HDRP package.

Time to grow hair and fur with HDRP

Existing “Approximate” lighting model (left), new “Physical” option (right)

For realistic hair and fur, the HDRP hair shader offers a brand-new “Physical” mode, which uses a Marschner/Disney-based lighting model, in addition to the existing “Approximate” Kajiya-Kay one. The new parametrization is easier to set up and more appropriate for photoreal strand-based rendering, exposing the physical parameters of hair, while the Kajiya-Kay model offers greater artistic control and faster performance but is harder to set up and of slightly lower quality.

This new model has also been added to the Pathtracer, which was used as a ground truth reference for our real-time shader.

As mentioned previously, rendering hair strands using Shader Graph is now possible using the new support for custom motion vector velocities.

Artist and designer highlight

While we’re on the subject, we recommend checking out an amazing new plug-in on the Asset Store by Daniel Zeller called Fluffy Grooming. The tool leverages improvements in Unity 2021.2 to let you groom dynamic hair, fur, and cards (for example, feathers) directly inside the Unity Editor.

Image courtesy of Daniel Zeller

HDRP performance boost

We’ve made many enhancements that respond directly to creator feedback, including improvements to HDRP.

Sometimes effects, shaders, or ray tracing are too heavy to run at 4K or even at 2K, and rendering two cameras at high resolution in VR is too constrained for high-end graphics. With dynamic resolution improvements and the support of three new cutting-edge upscalers – NVIDIA Deep Learning Super Sampling (DLSS), AMD FidelityFX Super Resolution (FSR), and our own Temporal Upscaler – you can boost your performance on PC, consoles, and desktop VR. 
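The upscalers plug into HDRP's dynamic resolution system, which you can drive from script. Below is a rough sketch of a frame-time-based scaler using DynamicResolutionHandler; it assumes dynamic resolution is enabled in the HDRP asset and on the camera, and the thresholds and 75% floor are arbitrary illustration values, not recommendations.

```csharp
using UnityEngine;
using UnityEngine.Rendering;

// Sketch: drives dynamic resolution from a simple frame-time heuristic.
// Requires dynamic resolution enabled in the HDRP asset and "Allow Dynamic
// Resolution" on the camera; thresholds here are arbitrary.
public class SimpleDynamicResDriver : MonoBehaviour
{
    [Range(50f, 100f)] public float minScreenPercentage = 75f;
    public float targetFrameTimeMs = 16.6f; // ~60 fps budget

    float currentPercentage = 100f;

    void Start()
    {
        // The callback returns the screen percentage HDRP should render at.
        DynamicResolutionHandler.SetDynamicResScaler(
            () => currentPercentage,
            DynamicResScalePolicyType.ReturnsPercentage);
    }

    void Update()
    {
        float frameMs = Time.unscaledDeltaTime * 1000f;
        // Drop resolution when over budget, recover slowly when under budget.
        currentPercentage += frameMs > targetFrameTimeMs ? -1f : 0.25f;
        currentPercentage = Mathf.Clamp(currentPercentage, minScreenPercentage, 100f);
    }
}
```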

NVIDIA DLSS offers fantastic visual quality thanks to advanced deep learning-based temporal upsampling that runs on a selection of NVIDIA GPUs. You can see it in action in hit titles such as LEGO Builder’s Journey and Naraka: Bladepoint.

AMD FSR and Unity’s Temporal Upscaler will run on all hardware supported by HDRP, including Mac and consoles (Xbox One and Xbox Series X|S, PlayStation 4 and 5). These upscalers have a smaller performance footprint, while still producing beautiful results at three-quarters of the final resolution.

Decals have been heavily optimized using Burst (15 times faster on the HDRP template), a new internal system called Renderer Lists automatically disables passes that are not used in a frame, and DirectX 12 performance is now much closer to DirectX 11 at all resolutions.

Finally, in response to feedback about the cost of additional cameras, we now offer a dedicated custom pass that provides a lightweight camera for all UI-related work.

Making cinematics and films with HDRP

To simplify camera animation, focus distance has been added to the physical camera parameters and, like most other physical camera properties, is now animatable.

Boost the quality of your offline videos when rendering trailers, cutscenes, or films: Unity Recorder now supports frame interpolation to produce broadcast-quality motion blur or record path-traced frames.

The HDRP Pathtracer now supports most HDRP shaders (fabric, StackLit, AxF, and the new hair lighting model) and volumetric scattering, and it has an increased maximum sample count.

For more details on new features in HDRP, check the documentation page for what’s new in HDRP.

How to get started

Discover more features and the latest improvements for artists and designers in the 2021.2 Tech Stream

You can use our project templates in the Unity Hub to start from environments already set up for your needs.

We’ve also compiled a collection of videos to help you get started.

What’s coming next?

Check our public roadmap, vote for features under consideration, give us details on your use cases, or request other features here.

Preregister for your free copy of the upcoming Unity Game Designer Playbook, a new guide to inspire and instruct game designers on prototyping, crafting, and testing gameplay in Unity.
