Unity 2018.1 marks the start of a new cycle that introduces a major upgrade to our core technology, which gives artists, developers and engineers the power to express their talents and collaborate more efficiently to make their AAA dreams a reality.
Let’s start with a few of the highlights, and then you can dig into the details of all the features. The first two highlights described below, the Scriptable Render Pipeline and the C# Job System, represent the first versions of two major features, which will continue to evolve to help you unlock beautiful graphics and increase the runtime performance of Unity.
While you go through the list of new features, you can download Unity 2018.1 here.
Scriptable Render Pipeline (SRP)
Available in Preview with Unity 2018.1, the new Scriptable Render Pipeline (SRP) places the power of modern hardware and GPUs directly into the hands of developers and technical artists, without having to digest millions of lines of C++ engine code.
SRP makes it easy to customize the rendering pipeline via C# code and material shaders, giving you maximum control without all the complexity and challenges of writing or modifying a complete C++ rendering pipeline.
We are also introducing two out-of-the-box render pipelines to fit your needs. The High-Definition Render Pipeline (HD RP) is for developers with AAA aspirations, and the Lightweight Render Pipeline (LW RP) is for those looking for a combination of beauty and speed, and it also optimizes the battery life for mobile devices and similar platforms.
The C# Job System & Entity Component System (ECS)
Combined with a new programming model (Entity Component System), the new runtime system enables you to take full advantage of multicore processors without the programming headache. You can use that extra horsepower, for example, to add more effects and complexity to your games or to add AI that makes your creations richer and more immersive.
Level design and shaders
Unity 2017.x introduced new features that help teams of artists, designers and developers build experiences together. We added powerful visual tools like Timeline, Cinemachine and a new FBX Exporter, which enables smooth round-tripping with Digital Content Creation tools like 3ds Max and Maya.
With Unity 2018.1, we are continuing our efforts to help artists, designers and developers collaborate more efficiently by making it possible to create levels, cinematic content, and gameplay sequences without coding. For example, new tools like ProBuilder/Polybrush and the new visual Shader Graph offer intuitive ways to design levels and create shaders without programming skills.
Unity 2017.2 introduced the Package Manager, an underlying core modular system and API that enables dynamic loading and updating of new Unity features in your projects. Unity 2018.1 builds on that with the newly released Package Manager User Interface, the Hub and Project Templates, all of which help you get new projects started faster and more efficiently.
Several of the features are available in packages. The idea is to make Unity more modular so that it’s easier for us to release features on an ongoing basis.
We use the "Preview" label for these new features to indicate that they are not recommended for production nor fully supported just yet. Previews offer you an opportunity to update, modify and experiment with features at an early stage as separate modularized packages, which you may later want to use in production.
Unity’s built-in rendering modes offer a compelling pipeline for creating a wide range of games. With the evolution and growing diversity of platforms (performance, architecture, form factors), however, we wanted to provide a more powerful and flexible rendering pipeline.
Available in Preview with Unity 2018.1, SRP opens the rendering pipeline to developers and technical artists: you customize it via C# code and material shaders, gaining maximum control without the complexity and challenges of writing or modifying a complete C++ rendering pipeline, and without having to digest millions of lines of C++ engine code.
Unity provides several built-in rendering modes, which are sufficient for the majority of smaller games. However, SRP allows you to go beyond what comes out-of-the-box to tailor the rendering process based on your specific needs and to optimize performance for the specific hardware of your target platform.
SRP offers a new way of rendering in Unity. We’re going from a black-box model to one where most things are in C#, a more open system where users can write their own pipelines or customize templates for their needs. We’re releasing two initial pipelines in 2018.1 in addition to the built-in rendering engine.
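To make this concrete, here is a minimal sketch of a custom pipeline. It assumes the 2018.1 experimental SRP API, where the pipeline types lived in the UnityEngine.Experimental.Rendering namespace; names and signatures may differ in other versions:

```csharp
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Experimental.Rendering;

// Asset that Unity instantiates the pipeline from (assign it in Graphics Settings).
[CreateAssetMenu(menuName = "Rendering/BasicPipelineAsset")]
public class BasicPipelineAsset : RenderPipelineAsset
{
    protected override IRenderPipeline InternalCreatePipeline()
    {
        return new BasicPipeline();
    }
}

// The pipeline itself: you decide exactly which passes run, in C#.
public class BasicPipeline : RenderPipeline
{
    public override void Render(ScriptableRenderContext context, Camera[] cameras)
    {
        base.Render(context, cameras);
        foreach (var camera in cameras)
        {
            context.SetupCameraProperties(camera);

            // A trivial "pass": just clear to the camera's background color.
            var cmd = new CommandBuffer { name = "Clear" };
            cmd.ClearRenderTarget(true, true, camera.backgroundColor);
            context.ExecuteCommandBuffer(cmd);
            cmd.Release();

            context.Submit();
        }
    }
}
```

Once an instance of the asset is assigned in Graphics Settings, Unity routes all rendering through the Render method, which is the hook you extend with culling, draw calls, and post-processing.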
You can learn more about the Scriptable Render Pipeline (SRP) and how to get started in our recent blog post.
For high-end visuals on PCs and consoles
The HD RP is a modern renderer that will support a limited set of platforms (PC DX11+, PS4, Xbox One, Metal, Vulkan — no XR support yet).
The HD RP targets high-end PCs and consoles and prioritizes stunning, high definition visuals. The tradeoff here is that HD RP will not work on less powerful platforms, and there will be some learning and re-tooling required.
The renderer is a hybrid Tile/Cluster Forward/Deferred renderer with feature parity between Forward and Deferred. Its features include volumetric lighting (in progress), unified lighting (the same lighting for opaque/transparent/volumetric), new light shapes (point lights now have line and rectangle options, spotlights now have box and pyramid options), and decals.
Fewer draw calls
The LW RP is a single-pass forward renderer that uses fewer draw calls. Using the LW RP will decrease the draw call count in your project compared to the built-in rendering pipeline. While it supports all platforms, it is an ideal solution for mobile and for performance-hungry applications like XR. The trade-off here is that, as with the HD RP, switching to the new SRP workflow involves a learning curve, and it’s worth keeping in mind that some third-party tools are not yet compatible with it.
The LW RP has its own process for rendering and therefore requires shaders written with it in mind. We have developed a new set of Standard Shaders that are located under the Lightweight Pipeline group in the material’s shader-selection dropdown. These include a Standard PBR shader, a non-PBR Standard shader with a simplified lighting model, a Standard Terrain shader and a Standard Unlit shader. It’s worth noting that all of Unity’s unlit stock shaders already work with the LW RP, including the legacy particle, UI, skybox, and sprite shaders.
Download the LW RP using the Package Manager, and learn more about the LW RP and how to get started by reading our recent blog post.
Templates provide pre-selected settings based on common best practices, depending on whether your project targets 2D, 3D, high-end platforms such as PC/consoles, or lightweight platforms such as mobile. That way, you don’t have to worry about setting up the basics, and you get a better out-of-the-box experience.
Templates ship with optimized Unity project settings, as well as some prefabs and assets to get you started. You don’t have to worry about changing many of the default settings in Unity when you start a new project because they are already pre-set for a target game-type or level of visual fidelity.
Not only does this make it faster to get started, it also introduces you to settings you might not otherwise have discovered, and to new features like the Scriptable Render Pipeline, Shader Graph, and the Post-Processing Stack.
Below is a list of the various templates you can choose from.
2D: For 2D projects that use Unity’s built-in rendering pipeline. Configures project settings for 2D, including Image Import, Sprite Packer, Scene View, Lighting and Orthographic Camera.
3D: For 3D projects that use Unity’s built-in rendering pipeline. Configures project settings for 3D, including updates such as setting the default color space to Linear and the default Lightmapper to Progressive.
3D with Extras: Similar to the 3D template, but with the added benefits of post-processing, presets and example content.
High-End (HD RP): For high-end graphics on platforms that support Shader Model 5.0 (DX11 and above). Uses the HD RP, a modern rendering pipeline that includes advanced material types and a configurable hybrid Tile/Cluster Deferred/Forward lighting architecture.
Lightweight (LW RP): For a focus on performance and projects that primarily use a baked lighting solution. Uses the LW RP, a single-pass forward renderer with per-object light culling; all lights are shaded in a single pass rather than in additional passes per pixel light. Using the LW RP decreases the draw call count in your project, making it an ideal solution for lower-end hardware.
Lightweight VR (LW RP): Focuses on performance when developing VR projects that primarily use a baked lighting solution. Uses the LW RP and requires a VR device to run.
Authoring shaders in Unity has traditionally been the realm of people with some programming ability. In 2018, we are changing this!
Shader Graph enables you to build your shaders visually using a designer tool — without writing one line of code. Instead, you create and connect nodes in a graph network with easy drag-and-drop usability. You can see the results immediately and then iterate, making it simple for new users to become involved in shader creation.
The Shader Graph system:
The Progressive Lightmapper offers great results for baked lights, and improves the workflow for lighting artists, enabling them to iterate quickly and predictably by providing progressive updates in the Unity Editor.
Originally released as a Preview feature in version 5.6, it has been improved with more features in each subsequent release. In 2018.1, it comes out of Preview and includes memory optimizations for baking large scenes.
As of 2018.1, the Progressive Lightmapper also helps power users via the Custom Bakes API. This enables access to data within the baking solution for the development of new lighting tools, such as the custom Occlusion Probes system used to create the first-person interactive experience in Book of the Dead.
The Post-Processing Stack enables you to apply realistic filters to scenes using professional-grade controls. The artist-friendly interface makes it easy to create and fine-tune high-quality visuals for dramatic and realistic effect.
Coming out of beta in 2018.1, we’ve added the most requested features and have fixed as many bugs as possible. We’re also improving our XR support by adding mobile-specific paths, volume blending, and a complete framework for custom user-effects.
This version of the Post-Processing Stack will be shipped as the first of many upcoming packages, which will give users the plug-in flexibility of an Asset Store pack, but with the update-ability of a core Unity feature.
In 2018.1 the Post-Processing Stack has been improved to feature higher-quality effects, automatic volume blending with a powerful override stack, and a flexible framework to write and distribute your own custom effects. It’s compatible with the LW RP, HD RP, and built-in rendering pipelines.
Dynamic Resolution was first introduced in Unity for Xbox One in 2017.3. Now we are bringing the same functionality to PS4. The feature helps users dynamically manage their GPU budget. For example, it may be desirable for a game to hit high resolutions (such as 4K) in some scenarios. At other times, however, it is preferable for the resolution to drop in order to allow for GPU performance to increase.
Users can select which render textures and cameras will participate in Dynamic Resolution from the Unity Editor and then scale the resolution of those items at runtime from a single script call. This combined with the information provided by the FrameTimingManager allows users to produce scripts that can automatically balance their GPU load by changing the resolution of their chosen render targets. Internally the system uses no more memory for render targets than would be allocated if Dynamic Resolution were not in use, and no significant CPU performance overhead is incurred when changing the resolution.
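As a sketch of such a balancing script, the following assumes a camera with dynamic resolution enabled and a hypothetical 16.6 ms GPU budget (one 60 fps frame); the ScalableBufferManager and FrameTimingManager APIs shown are the ones introduced alongside this feature:

```csharp
using UnityEngine;

public class DynamicResolutionController : MonoBehaviour
{
    // Assumes the camera's "Allow Dynamic Resolution" option is enabled.
    const double GpuBudgetMs = 16.6; // assumed budget for a 60 fps target

    void Update()
    {
        var timings = new FrameTiming[1];
        FrameTimingManager.CaptureFrameTimings();
        if (FrameTimingManager.GetLatestTimings(1, timings) > 0)
        {
            // If the GPU ran over budget last frame, drop the render scale;
            // otherwise render at full resolution.
            float scale = timings[0].gpuFrameTime > GpuBudgetMs ? 0.7f : 1.0f;
            ScalableBufferManager.ResizeBuffers(scale, scale);
        }
    }
}
```

A production script would smooth the scale over several frames rather than snapping between two values, but the single ResizeBuffers call is the whole runtime API surface.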
Note: Users should be sure to check that their titles can become GPU-bound before adopting Dynamic Resolution as it will be of no benefit in CPU-bound scenarios.
GPU Instancing now supports fetching of Global Illumination data for each instance. This can be achieved either by allowing Unity render loops to automatically batch LightProbe-lit or Lightmap-lit objects, or by manually calling the new APIs to extract the LightProbe data baked with the scene into a MaterialPropertyBlock object used later for instanced rendering.
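As a sketch of the manual path, the following assumes a material with GPU Instancing enabled and uses the light-probe APIs added in 2018.1 (CalculateInterpolatedLightAndOcclusionProbes, CopySHCoefficientArraysFrom and the LightProbeUsage.CustomProvided draw mode); exact overloads may vary by version:

```csharp
using UnityEngine;
using UnityEngine.Rendering;

public class InstancedProbeLit : MonoBehaviour
{
    public Mesh mesh;
    public Material material; // must have "Enable GPU Instancing" checked
    const int Count = 128;

    Matrix4x4[] matrices = new Matrix4x4[Count];
    MaterialPropertyBlock props;

    void Start()
    {
        var positions = new Vector3[Count];
        for (int i = 0; i < Count; i++)
        {
            positions[i] = Random.insideUnitSphere * 10f;
            matrices[i] = Matrix4x4.TRS(positions[i], Quaternion.identity, Vector3.one);
        }

        // Sample the scene's baked light probes once per instance...
        var sh = new SphericalHarmonicsL2[Count];
        var occlusion = new Vector4[Count];
        LightProbes.CalculateInterpolatedLightAndOcclusionProbes(positions, sh, occlusion);

        // ...and pack the results into a MaterialPropertyBlock.
        props = new MaterialPropertyBlock();
        props.CopySHCoefficientArraysFrom(sh);
        props.CopyProbeOcclusionArrayFrom(occlusion);
    }

    void Update()
    {
        Graphics.DrawMeshInstanced(mesh, 0, material, matrices, Count, props,
            ShadowCastingMode.On, true, 0, null, LightProbeUsage.CustomProvided);
    }
}
```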
This feature addresses a common lightmapping artifact: when a lightmap is divided into a number of charts and sampled, texel values from one chart can bleed into another if the charts are too close together. The new UV Overlap Visualization feature lets you immediately see which charts and texels are affected. It automatically identifies overlaps and enables you to make more informed decisions when solving such issues (e.g. by increasing chart margins).
Tessellation for Metal is a way to increase visual fidelity while using lower-quality meshes. It follows the DX11 hardware tessellation convention of using hull/domain shader stages. Existing HLSL shaders using this feature are cross-compiled and transparently transformed into Metal compute shaders, aiming to make the transition between platforms seamless. (Target graphics APIs each have a different underlying native approach to implementing tessellation.)
Sky Occlusion ships as an experimental feature in 2018.1. It improves graphical fidelity and realism by including the contribution of lighting from the skybox as part of ambient occlusion calculations.
We also added a new experimental C# interface to pass light information to the GI-baking backends.
We now provide the ability to use all the CPU cores on a device to perform the rigidbody simulation. This includes the discovery of new contacts, both the discrete and continuous island solvers, and broad-phase synchronization, all executed using the native Job System. To get started, simply go to the 2D Physics settings and, under “Job Options (Experimental),” check “Use Multithreading.” We have also provided a couple of test projects for job-based physics (used during the development of this feature), which are available on GitHub.
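If you prefer to toggle the option from code rather than the settings UI, the setting is also exposed as a script API; this sketch assumes Physics2D.jobOptions is available in your version:

```csharp
using UnityEngine;

public class Enable2DPhysicsJobs : MonoBehaviour
{
    void Awake()
    {
        // Script equivalent of the "Job Options (Experimental)" checkbox
        // (assumed API; check the Physics2D docs for your Unity version).
        var options = Physics2D.jobOptions;
        options.useMultithreading = true;
        Physics2D.jobOptions = options;
    }
}
```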
SpriteShape is a sprite layout and world-building tool that provides the ability to tile sprites along the path of a shape based on given angle ranges. Additionally the shape can be filled with a tiling texture.
The main advantage of the SpriteShape feature is the powerful combination of a bezier spline path with the ability to tile sprites adaptively or continuously. When tiling continuously, sprites assigned to given angles are automatically switched.
We’re working on a new 2D animation system, which will be released over multiple phases and is now available as a preview package.
In its first release, we have focused on developing tooling for rigging a sprite for skeletal animation (although similar, it is not an integration of Anima2D into Unity).
The tools include: Bind-pose editing, manual mesh-tessellation and skin-weights painting. A runtime component ties it all together to drive sprite deformation. Our goal is to allow users to create simple animated characters that consist of a single sprite.
An ongoing project
We will continue to build new features and workflows on top of this. This may include tooling to make the creation of multi-sprite characters more efficient and to support workflows for larger productions. This will allow you to create complex multi-sprite characters and potentially share rigs and animation clips across multiple characters.
The feature is now available as a preview package and you can get our sample project and documentation here. We're eager to hear what you think. Let us know which workflows we supported correctly, and what we missed, ideas for improvement, and more in our dedicated forum.
The Particle System now supports GPU Instancing, which enables many more particle meshes to be rendered with a much smaller CPU performance cost. The Particle System uses Procedural Instancing, which is explained in more detail here.
Instancing support has been added to the Particle Standard Shaders and will be enabled by default on all new content. In Unity 2018.1, you can manually enable it for content from older Unity versions simply by ticking the checkbox in the Renderer Module. It’s also possible to add particle-instancing support to your own shaders.
Below, you can see 10,000 sphere meshes using the old non-instanced technique, rendering at 8.6 fps, followed by 100,000 sphere meshes using the new instanced technique, rendering at 85 fps.
Unity 2018.1 adds some new options to the Velocity over Lifetime module, allowing you to make particles travel relative to a defined center point. By default, the center is aligned with the Transform, but can be overridden within the Module. Particles can be made to travel around the center point, using the Orbital parameters, and away/towards the center point, using the Radial parameters.
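The new Orbital and Radial parameters are also scriptable; a minimal sketch (the MinMaxCurve values shown are constants, but curves work too):

```csharp
using UnityEngine;

public class OrbitingParticles : MonoBehaviour
{
    void Start()
    {
        var ps = GetComponent<ParticleSystem>();
        var vel = ps.velocityOverLifetime;
        vel.enabled = true;

        // Revolve particles around the Y axis of the center point...
        vel.orbitalY = new ParticleSystem.MinMaxCurve(2f);
        // ...while drifting them toward the center (negative = inward).
        vel.radial = new ParticleSystem.MinMaxCurve(-0.5f);
    }
}
```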
All shape types in this module now support a texture. The texture can be used for:
There are two new ways to spawn sub-emitters in Unity 2018.1. The first is via the Trigger Module, which works in a similar way to how sub-emitters are spawned from the Collision Module. Simply choose Trigger as the sub-emitter type in the Sub-Emitter Module, and then, when conditions are met inside the Trigger Module (i.e. particles have entered the collision volume), the corresponding sub-emitters will be triggered.
The second new way to trigger sub-emitters is via script. We have added a new script API called TriggerSubEmitter, which can be used to trigger a sub-emitter for a single particle, a list of particles, or all particles. In the Sub-Emitter module, you can choose Manual as the spawn type, which tells the Particle System that this emitter will only be triggered via a call in script. It is also possible to use the existing types (Collision or Death) and add additional triggers for these sub-emitters via script.
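A minimal sketch of the script path, assuming sub-emitter index 0 has its spawn type set to Manual in the Sub-Emitter module:

```csharp
using UnityEngine;

public class ManualSubEmitter : MonoBehaviour
{
    ParticleSystem ps;

    void Start()
    {
        ps = GetComponent<ParticleSystem>();
    }

    void Update()
    {
        if (Input.GetKeyDown(KeyCode.Space))
        {
            // Fire sub-emitter 0 for every live particle in this system.
            // Overloads exist for a single particle or a list of particles.
            ps.TriggerSubEmitter(0);
        }
    }
}
```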
The Legacy Particle System continues to be a development burden for each Unity version where it is supported. New engine features, such as VR and multi-threaded rendering, require time spent on ensuring compatibility as Unity evolves, and there will, of course, always be new engine features requiring maintenance of the Legacy Particle System code.
This has prompted us to take the next logical step and retire the Legacy Particle System. Therefore, we have decided to remove its Script Bindings in Unity 2018.1.
It has been fully deprecated since Unity 5.4, and our analytics show almost non-existent usage. Our target is to fully remove the Legacy Particle System in Unity 2018.3.
If any of this will affect you, you have the following options:
While we can’t promise to solve every problem this causes, we will definitely listen to your concerns and do our best to mitigate any pain it may cause.
Weighted tangents allow animators to create animation curves with fewer keys and smoother curves by controlling the weight of each tangent. Once a tangent is set to Weighted, you can stretch it to affect the curve interpolation without adding any additional keys, for a smoother, more precise result.
The Animation team has added support for weighted tangents, along with Bezier handles, to all curve editing in Unity, which means you can also use this new feature with the Particle System.
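Weighted tangents are also exposed in the script API via the Keyframe struct; a minimal sketch:

```csharp
using UnityEngine;

public class WeightedCurveExample : MonoBehaviour
{
    void Start()
    {
        var curve = AnimationCurve.EaseInOut(0f, 0f, 1f, 1f);

        // Make the first key's outgoing tangent weighted and stretch it:
        // the curve interpolation changes without adding any keys.
        var key = curve.keys[0];          // keys[] returns a copy
        key.weightedMode = WeightedMode.Out;
        key.outWeight = 0.9f;             // pull the curve toward the tangent
        curve.MoveKey(0, key);            // write the modified key back

        Debug.Log(curve.Evaluate(0.5f));
    }
}
```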
We also added a simple, but oh-so-convenient, Zoom Control feature in the Animator Controller window!
As we announced in February, ProBuilder and its creators have joined Unity. With the ProBuilder, ProGrids and Polybrush tools, we are now offering integrated advanced level design in the Unity Editor at no additional cost. The package, which consists of ProBuilder, ProGrids and Polybrush, is included with all Unity subscription plans (Personal, Plus, Pro and Enterprise).
ProBuilder is a unique hybrid of 3D-modeling and level-design tools optimized for building simple geometry, but capable of detailed editing and UV unwrapping as needed.
You can use ProBuilder to quickly prototype structures, levels, complex terrain features, vehicles and weapons, or to make custom collision geometry, trigger zones, or nav meshes.
ProBuilder also includes tools for exporting your models, editing imported meshes, and a run-time ready API for accessing the ProBuilder toolset from your own code.
Also available, ProGrids gives you both a visual and functional grid, which snaps on all 3 axes. Working on a grid facilitates speed and quality, making level construction incredibly fast, easy, and precise. It is especially handy for modular or tile-based environments, but when combined with ProBuilder, it enables faster and more precise geometry construction for all types of work. Check the ProGrids Introduction and Tutorial to learn more. Scroll to the bottom of this blog post for information on the integration and availability of ProGrids.
Polybrush enables you to blend textures and colors, sculpt meshes and scatter objects directly in the Unity editor. Combined with ProBuilder, you get a complete in-editor level-design solution.
Finally, enjoy Unity’s seamless round-tripping with Digital Content Creation tools (like Maya) to further detail and polish your models.
Below are a few examples of how ProBuilder enables you to quickly prototype structures, complex terrain features, vehicles and weapons, or to make custom collision geometry, trigger-zones or nav meshes.
ProBuilder has been used in many made-with-Unity games. Check out the highlight reel video:
Here’s a quick overview of the many features in ProBuilder
Polybrush enables you to blend textures and colors, and sculpt meshes directly in the Unity Editor.
Polybrush is in beta, and just got a new feature in its latest iteration; it now allows you to scatter objects using highly customizable brushes.
Check the Polybrush Introduction and Tutorial to learn more.
As a package, ProBuilder, Polybrush and ProGrids give you a complete in-editor level-design solution that enables you to construct precise geometry faster.
ProBuilder is available via the new Unity Package Manager. We plan to integrate the other two parts of the package directly into Unity at some point in 2018, but right now, you can download Polybrush (Beta) and ProGrids for free on the Asset Store.
How to get started:
In the Unity Editor, go to Window > Package Manager, click All, select ProBuilder and click Install:
With our new high-performance multithreaded system, we’re rebuilding the very core foundation of Unity. The new system will enable your games to take full advantage of the multicore processors currently available — without the programming headache. This is possible thanks to the new C# Job System, which gives you a safe and easy sandbox in which to write parallel code. We are also introducing a new model to write performant code by default with the Entity Component System, and the Burst compiler to produce highly optimized native code.
With performance by default, not only will you be able to run your games on a wider variety of hardware, you’ll also be able to create richer game worlds with more units and more complex simulations.
Write very fast, parallelized code in C# to take full advantage of multicore processors
The trend in modern hardware architecture has been heading towards multiple cores to increase processing power over the more traditional solution of increasing core speed. The introduction of the C# Job System will help you fully leverage this increase in processing power.
It enables you to write fast jobified code in C# Scripts. It’s also safe as it provides protection from the pitfalls of multi-threading, such as race conditions and deadlocks.
Better performance across the board
The C# Job System enables better overall performance, especially as new Unity features like the Entity Component System (2018.1 preview) and our new Burst compiler (2018.1 preview) technology become available. The goal of all of these systems is to increase what is fundamentally possible in Unity in terms of performance, while still supporting existing workflows and allowing for a smooth technical transition.
What you’ll need to do
In order to achieve these performance gains, there are a few key changes you need to make to the way you write code in Unity. First, providing the CPU with clean, linear arrays of data to read from, rather than pulling from multiple locations in memory during calculation, allows for much faster performance. By taking an active role in memory management, we ensure that memory is managed in a way that optimizes performance. A new set of tools added to Unity’s API allows you to manage your data layout and the way memory is managed in an explicit and detailed way.
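As an illustration of these ideas, here is a minimal parallel job operating on linear NativeArray data; the job and container types shown are the ones the C# Job System introduces:

```csharp
using Unity.Collections;
using Unity.Jobs;
using UnityEngine;

public class JobExample : MonoBehaviour
{
    // A job is a struct: plain data plus an Execute method, no hidden state.
    struct DoubleJob : IJobParallelFor
    {
        [ReadOnly] public NativeArray<float> input;
        public NativeArray<float> output;

        public void Execute(int i)
        {
            output[i] = input[i] * 2f;
        }
    }

    void Start()
    {
        // NativeArray gives you the clean, linear memory layout described above,
        // with allocation lifetime managed explicitly.
        var input = new NativeArray<float>(1000, Allocator.TempJob);
        var output = new NativeArray<float>(1000, Allocator.TempJob);
        for (int i = 0; i < input.Length; i++) input[i] = i;

        var job = new DoubleJob { input = input, output = output };
        JobHandle handle = job.Schedule(input.Length, 64); // 64 items per batch
        handle.Complete(); // block until the worker threads finish

        Debug.Log(output[999]); // 1998

        input.Dispose();
        output.Dispose();
    }
}
```

The safety system validates the [ReadOnly] markers and job dependencies at schedule time, which is what protects you from the race conditions mentioned above.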
The Entity Component System is a way of writing code that focuses on the actual problems you are solving: The data and behavior that make up your game.
In addition to being a better way of approaching game programming for design reasons, using Entity Component System puts you in an ideal position to leverage Unity's Job System and Burst Compiler, letting you take full advantage of today's multicore processors.
With Entity Component System, we are moving from an object-oriented approach to a data-oriented design, which means it will be easier to reuse the code and easier for others to understand and work on it as well.
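As a rough sketch of the data/behavior split, the hybrid pattern from the early preview looked something like the following; the preview API changed frequently, so treat the type and method names as illustrative:

```csharp
using Unity.Entities;
using UnityEngine;

// Data: a plain component with no behavior of its own.
public class Rotator : MonoBehaviour
{
    public float speed;
}

// Behavior: a system that processes every entity matching the Group shape.
public class RotatorSystem : ComponentSystem
{
    struct Group
    {
        public Transform transform;
        public Rotator rotator;
    }

    protected override void OnUpdate()
    {
        float dt = Time.deltaTime;
        foreach (var entity in GetEntities<Group>())
        {
            entity.transform.rotation = entity.transform.rotation
                * Quaternion.AngleAxis(entity.rotator.speed * dt, Vector3.up);
        }
    }
}
```

Because the update logic lives in one system rather than being scattered across thousands of MonoBehaviour Update calls, it is both easier to reason about and a natural candidate for jobification.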
The Entity Component System ships as a preview package in 2018.1, and we will continue to develop and release new versions of it in the 2018.x cycle.
Burst is our LLVM-based compiler, which takes .NET IL and produces machine code using a new math-aware backend compiler technology. Burst takes C# jobs and produces highly optimized code that takes advantage of the particular capabilities of the platform you’re compiling for. So you get many of the benefits of hand-tuned assembler code across multiple platforms, without all the hard work.
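Opting a job into Burst is typically a one-attribute change; a hedged sketch (in the earliest previews the attribute was named [ComputeJobOptimization] and was later renamed [BurstCompile], so check your package version):

```csharp
using Unity.Burst;
using Unity.Collections;
using Unity.Jobs;

// The attribute asks Burst to compile this job's Execute method to
// optimized native code; the C# source is otherwise unchanged.
[BurstCompile]
struct ScaleJob : IJobParallelFor
{
    public NativeArray<float> values;
    public float factor;

    public void Execute(int i)
    {
        values[i] *= factor;
    }
}
```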
Burst Compiler ships as a preview package in 2018.1, and we will continue to develop and release new versions of it in the 2018.x cycle.
To help you get started, we’ve also provided a repository of examples that demonstrate using the C# job system to write systems at scale, for reference and sharing: C# Job System Cookbook.
For more information on how to build using the Entity Component System, have a look at these Entity Component System Samples.
Last fall at Unite Austin 2017, Unity and Autodesk announced a collaborative partnership to build more connected workflows between Autodesk 3D tools and the Unity engine.
Since then, we have improved the interoperability to benefit any game developer or artist who works on either side of a 3ds Max/Maya/Unity workflow.
Our goal is to provide an artist-friendly interface and workflow that allows you to safely merge your changes back into those assets to continue your work.
When roundtripping, assets are often edited and renamed, potentially changing their very nature. Now Unity will make sure that modifications made to the FBX by an external application can be remapped to the original with no loss of information.
Other improvements to the workflow and integration include Lights roundtripping, Animations roundtripping (including custom properties) and Blendshapes (experimental).
Version 6.0 of the remote Cache Server, which is the culmination of a six-month focused effort to elevate quality and performance, is now available. Cache Server makes creating with Unity faster by optimizing the asset-import process, either on your local machine or on a dedicated Local Area Network server. These improvements save time and speed up the development process for individuals and teams. Download Remote Cache Server now on GitHub.
In 2017.2, we introduced the first pillar of the new Package Manager, a more flexible and modular approach to managing Unity-developed features and assets that make up your projects. Initially only exposed as an API in previous releases, in Unity 2018.1, we’re introducing a new Package Manager User Interface. The new Package Manager UI will help you start projects more efficiently, and make it smoother and easier to install, update and enable new Unity features.
The Unity Package Manager UI improves the following aspects of your project-management workflow:
You can find the Package Manager in the Window menu and use it to install features such as Shader Graph, Post Processing, ProBuilder and the Lightweight and High Definition Render Pipelines.
Unity automatically defines how scripts compile to managed assemblies, and compilation times in the Unity Editor for iterative script changes increase as you add more scripts to a project.
In 2017.3, we introduced the ability to define your own managed assemblies based on scripts inside a folder. By splitting your project’s scripts into multiple assemblies, script compilation times in the editor can be greatly reduced for large projects. You can think of each managed assembly as a single library within the Unity Project.
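An Assembly Definition File is a small JSON asset placed in the folder whose scripts it owns; a minimal illustrative example (the assembly names are hypothetical):

```json
{
    "name": "MyGame.Gameplay",
    "references": ["MyGame.Core"],
    "includePlatforms": [],
    "excludePlatforms": []
}
```

Scripts under this folder compile into MyGame.Gameplay.dll, and only changes to that assembly (or ones referencing it) trigger its recompilation.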
In 2018.1, Assembly Definition File (asmdef) assemblies are now compiled on startup before any other scripts (Assembly-CSharp.dll and friends), and compilation does not stop at the first compile error.
All asmdef assemblies that successfully compile along with all their references are loaded before compiling the remaining scripts (Assembly-CSharp.dll and friends). This ensures that Unity packages are always built and loaded regardless of other compile errors in the project.
It also makes it possible for packages to have playmode test assemblies without modifying user project settings.
In the past, when adding play-mode tests, users needed to enable them in the project settings. This registered the test assemblies in the build, and it also meant there was none of the separation that C# developers normally employ between production code and test code.
In 2018.1, you can mark an assembly as referencing the test assemblies, and no other assemblies will reference them unless the old settings are used (for backwards compatibility). Predefined assemblies will not auto-reference these assemblies either.
We also added a new BuildOption: a normal build will not include assemblies marked this way; only the TestRunner will include and build test assemblies.
All of this makes it possible to have tests in projects without activating that setting.
Presets are assets containing the type of asset to which they apply and a list of property modifications (name/value pairs).
Presets can be easily applied to or created from any serializable object via the UI on the object inspector or from a public API method.
Objects do not keep a link to an applied preset. Thus, modifying a preset after it has been applied has no side-effects.
Each object type may have a single preset registered as its default via the new Preset Manager. Any time an object is created in the Unity Editor, the default preset for that object is automatically applied. Changing the default preset does not affect existing objects.
Presets only exist in the editor. There are no runtime API changes.
Presets expose API hooks that allow them to be applied immediately before import, and a new object-creation API lets editor code create objects with their default presets applied.
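To illustrate the public API described above, here is a minimal editor-script sketch. The menu path and the use of Light objects are our own example, not part of the API; it assumes a scene with at least two Lights:

```csharp
using UnityEditor;
using UnityEditor.Presets;
using UnityEngine;

public static class PresetExample
{
    // Capture the serialized state of one object into a Preset,
    // then apply it to another object of the same type.
    [MenuItem("Examples/Copy Light Settings")]
    static void CopyLightSettings()
    {
        var lights = Object.FindObjectsOfType<Light>();
        if (lights.Length < 2)
            return;

        var preset = new Preset(lights[0]);   // record name/value pairs from the source
        if (preset.CanBeAppliedTo(lights[1]))
            preset.ApplyTo(lights[1]);        // one-shot copy; no link to the preset is kept
    }
}
```

Because objects keep no link to an applied preset, later edits to the Preset asset leave `lights[1]` untouched.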
In 2018.1, we are adding a number of improvements to Timeline. With Timeline keyboard navigation, you can now use tabs and arrow keys to speed up your workflow by toggling between collapsing and expanding Tracks easily.
We also added the new Timeline Zoom bar which makes it easy to zoom in and out to get an overview of your Timeline tracks.
We are also introducing editing modes in Timeline.
When manipulating clips, zoom-based snap behavior keeps clips together with neighboring clips. To create a blend effect, you can release (relax) the edge-magnet behavior with a clutch key (Ctrl), which allows moved or trimmed clips to blend into neighboring clips without being impeded by the edge-magnet.
There are three modes: Mix, Ripple, and Replace.
Cinemachine comes packed with improvements in 2018.1, including the Cinemachine Storyboard, which enables you to set up the timing and basic animation of storyboards in Timeline as a function of a Cinemachine clip. This lets you use Unity from start to finish in the storytelling process, keeping your whole creative workflow in one place.
You can pace out your boards, do cross-fades, basic zooms and movement ‘Ken Burns’ style.
Get your story blocked in and your timing and shots working the way you want them to. Add audio to create a realistic feel and pull your story together. Once you're ready, you can use just one button to toggle between the storyboard and a Cinemachine camera, all the while keeping your editing intact.
There is no need to do your storyboard and previs edit in another tool; you can do it all inside Unity.
Other improvements include Package Manager integration, a camera-shake system, support for custom camera blend curves, and more.
It’s important to us to provide a great C# IDE experience to accompany the new C# features. To support the latest C# features and C# debugging on the new .NET 4.6 scripting runtime on macOS, we are replacing MonoDevelop-Unity 5.9.6 with Visual Studio for Mac starting from Unity 2018.1, as announced in January. This improves support for many of the exciting new features available in C# 6.0 and beyond as we move to the .NET 4.6 scripting runtime upgrade in Unity.
On Windows, we will continue to ship Visual Studio 2017 Community with Unity; it already supports the latest C# features and C# debugging on the new .NET 4.6 scripting runtime. MonoDevelop-Unity 5.9.6 will be removed from the Unity 2018.1 Windows installer, as it does not support these features.
Visual Studio Code also supports Unity through extensions that provide IntelliSense and the Unity Debugger Extension for debugging your Unity C# projects. See Unity Development with VS Code for details, and learn more about the implications and alternative IDEs in our blog post.
As always, we are looking for feedback. If you experience issues with Visual Studio or its integration in Unity, please report them at https://developercommunity.visualstudio.com/.
In 2018.1, we have added support for the IL2CPP scripting backend for Windows standalone and macOS standalone. This brings the CPU speed improvements of IL2CPP to both desktop players, and it enables third-party DRM technology to help with application code security.
The new scripting runtime is no longer experimental in Unity 2018.1. In Unity 2017.1, we shipped its first experimental preview. Throughout the 2017.2 and 2017.3 release cycles, many Unity users worked with this experimental scripting runtime and provided invaluable feedback. (Thanks, everyone!) We’ve also worked closely with excellent developers from Microsoft, both on the Mono and Visual Studio teams. As we’ve sorted out issues and corrected bugs, the modern scripting runtime has become more and more stable and is now ready for widespread use.
Upgraded compilers and runtimes provide improved stability, richer debugging, and better parity with a modern .NET architecture across all platforms. For example, C# 6 and new .NET APIs make Unity compatible with modern .NET libraries and tools.
Two profiles are available with the new scripting runtime: .NET 4.x and .NET Standard 2.0. Both of these profiles fully support .NET Standard 2.0 to enable the use of the latest cross-platform libraries. The .NET Standard 2.0 profile is optimized for small build size, cross-platform support, and ahead-of-time compilation. The .NET 4.x profile exposes a larger API surface and exists primarily for backward compatibility.
Google’s spatial audio SDK, Resonance Audio, is fully integrated in Unity 2018.1, enabling developers to create more realistic VR and AR experiences on mobile and desktop. With Resonance Audio in the Unity Editor, you can now render hundreds of simultaneous 3D sound sources in the highest fidelity for your XR, 3D, and 360 video projects on Android, iOS, Windows, macOS, and Linux.
A new, native, multiplatform XR API has been added in this release, laying the groundwork for a more extensible framework that initially targets abstraction over handheld AR SDKs. The API is designed to let developers build handheld AR apps once and deploy across multiple device types. As functionality is added to this API, we will eventually deprecate the ARKit plugin on Bitbucket as well as the experimental ARInterface project on GitHub. Support for handheld AR SDKs will ship as preview packages in the editor.
The Magic Leap Technical Preview is meant for anyone looking to get a glimpse at this exciting new platform. In addition to 2018.1 features, the Technical Preview includes a new platform under the Build Window targeting Magic Leap’s Lumin OS. The preview coupled with the Lumin SDK will also give you access to Magic Leap Zero and Magic Leap Remote, which allows for simulation of the hardware platform. Learn more.
Unity 2018.1 brings you support for Google’s Daydream standalone VR headset with WorldSense technology, enabling inside-out, six degrees of freedom (6DoF) tracking support for Daydream apps. Use it to get building now, and be ready when the hardware is released later this year!
With ARCore out of developer preview, you can now create high-quality AR apps for more than 100 million Android-enabled devices on Google Play. ARCore 1.1 for Unity also enhances the environmental understanding of your scene with oriented feature points, a new capability that allows you to place virtual content on surfaces near detected feature points, such as cans, boxes, and books. It also allows you to test and iterate your AR app in near real-time with ARCore Instant Preview. Learn more about all that ARCore brings to Unity 2018.1.
Whether you’re a VR developer who wants to make a 360 trailer to show off your skills, or a director who wants to make an engaging cinematic short film, your workflow just got easier. Unity’s new technology for capturing stereoscopic 360 images and video in Unity empowers you to create and share immersive experiences with an audience of millions on platforms such as YouTube, Within, Jaunt, Facebook 360, or Steam 360 Video.
Our device-independent stereo-360 capture technique is based on Google’s Omni-Directional Stereo (ODS) technology, which uses stereo cubemap rendering. We support rendering to stereo cubemaps natively in Unity’s graphics pipeline both in the Unity Editor and on PC standalone players. After stereo cubemaps are generated, the cubemaps are converted to stereo equirectangular maps, which is a projection format used by 360-video players.
Stereo 360 capture works in forward and deferred lighting pipelines with screen space and cubemap shadows, skybox, MSAA, HDR and the new Post-Processing Stack.
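As a rough sketch of how the pieces above fit together, a capture script can render each eye to a cubemap and then convert both into one stereo equirectangular map. This assumes you have assigned cubemap-dimension RenderTextures for the eyes and a 2D RenderTexture for the output; the component and field names are our own:

```csharp
using UnityEngine;

[RequireComponent(typeof(Camera))]
public class Stereo360Capture : MonoBehaviour
{
    public RenderTexture cubemapLeft;   // cubemap-dimension RenderTexture for the left eye
    public RenderTexture cubemapRight;  // cubemap-dimension RenderTexture for the right eye
    public RenderTexture equirect;      // 2D RenderTexture holding the final stereo frame

    void LateUpdate()
    {
        var cam = GetComponent<Camera>();

        // 63 is the face mask for all six cubemap faces.
        cam.RenderToCubemap(cubemapLeft, 63, Camera.MonoOrStereoscopicEye.Left);
        cam.RenderToCubemap(cubemapRight, 63, Camera.MonoOrStereoscopicEye.Right);

        // Each eye's cubemap is converted into its half of a stereo
        // equirectangular map, the projection used by 360-video players.
        cubemapLeft.ConvertToEquirect(equirect, Camera.MonoOrStereoscopicEye.Left);
        cubemapRight.ConvertToEquirect(equirect, Camera.MonoOrStereoscopicEye.Right);
    }
}
```

The resulting equirect texture can then be saved per frame or fed to a video encoder for publishing to a 360-video platform.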
You can find more info in this blog post.
In 2018.1, we added ARM 64-bit runtime support for Android based on IL2CPP technology. Previously, only 32-bit ARM or x86 builds for Android could be produced with Unity. Almost all current chipsets are 64-bit, but we have not been leveraging their potential advantages. The new support for running 64-bit Android apps will bring performance benefits, and games will be able to address more than 4GB of memory space.
Android Sustained Performance Mode sets a predictable, consistent level of device performance over longer periods of time without thermal throttling. It offers improved battery life and a smooth experience at the cost of some reduced performance. It’s based on the Sustained Performance API from Google. This setting has already been available for VR applications, and now you can enable it for non-VR applications as well.
Performance Reporting (included with Unity Plus and Unity Pro) now supports Windows desktop, joining support for macOS and mobile platforms (iOS and Android). Native crashes on Windows will now be reported for you to view and debug from the Unity Developer Dashboard. The dashboard is also being updated with new features for managing and viewing reports, as well as better performance for highly active projects.
As always, refer to the release notes for the full list of new features, improvements, and fixes. You can also provide us with feedback on our forums.
At GDC, we announced our new plans for Unity releases, which include what will be known as the TECH stream and the Long-Term Support stream (LTS). The TECH stream will consist of three major releases a year with new features and functionality. The LTS stream will be the last TECH stream release of each year and will roll over to the following year.
The TECH and LTS streams represent a major shift from the current approach of supporting each of the releases for a year. From now on:
Support for each respective TECH release will end when the following one goes live.
LTS releases will be supported for two years.
Other aspects of our approach to releases will also change:
From four to three releases: Instead of four feature releases, we are going to ship three TECH stream releases per year.
First release in spring: Each year, a TECH stream will begin in the spring, this year starting with 2018.1. This will be followed by summer and fall releases.
Frequency of bug fixes: Whereas the TECH stream will receive a weekly release with bug fixes, the LTS stream will receive regular bug fixes every other week.
From patches to updates: We are dropping the .p# suffix for our weekly patches because, due to our improved testing of these releases, we believe they will be suitable for everyone.
The first LTS release will be 2017.4, which is simply the latest 2017.3 release plus subsequent updates. The change in version number signifies the beginning of the new LTS cycle. So xxxx.1, xxxx.2, and xxxx.3 are TECH releases, and xxxx.4 is the LTS release.
The regular updates on both TECH and LTS streams will have continuous version numbering. For example, 2017.4.0 will be followed by 2017.4.1, 2017.4.2, 2017.4.3, and so on.
The chart below shows an example of how the streams will work, with the blue boxes representing the TECH streams and the green boxes representing the beginning of the LTS streams.
We also want to send a big thanks to everyone who helped beta-test, making it possible to release 2018.1 today. Thanks for wanting to be among the first to try all the new features and for all the great feedback.
Info on the 2018.1 sweepstakes winners
We have found and contacted all the 2018.1 beta sweepstakes winners and will send out the prizes to the lucky winners in the weeks to come.
Be part of the 2018.2 beta
If you aren't already a beta tester, perhaps you’d like to consider becoming one. You’ll get early access to the latest new features, and you can test whether your project is compatible with the new beta.
We've been listening to your feedback from our beta survey and as a result, we're planning to launch a number of new initiatives. We're going to expand these open-beta initiatives into a more formalized program, which will result in a more efficient and speedy QA process and, ultimately, a more polished end product.
You can get access simply by downloading our latest beta version. Not only will you get access to all the new features, you’ll also help us find bugs, ensuring the highest-quality software.
As a starting point, have a look at this guide to being an effective beta tester. If you would like to receive occasional emails with beta news, updates, and tips and tricks, please sign up.