2017.3 completes the 2017 cycle, introducing several new features and improvements across the board for both artists and developers.
We’re excited to share all the great new and improved features available today with Unity 2017.3. Before drilling down into the details, though, we wanted to look back at Unity 2017. (If you can’t wait, just jump straight to the What’s new section below.)
With the new Unity 2017 cycle, we doubled down on our effort to help artists, designers and developers create and collaborate more efficiently. Powerful visual tools like Timeline, Cinemachine and Unity FBX Exporter free artists to do more.
We continued to improve graphics quality and runtime performance to help you stay ahead of the curve on the latest emerging platforms (desktop, console, mobile, VR, AR, smart TVs) and to take advantage of the latest GPU and native Graphics APIs. A great example of this is the award-winning Adam demo-film series.
Other features, like the updated 2D tools and Unity Teams, help you get better results, faster. Finally, we’ve given you new ways to use powerful data (in the Ads, IAP and Live-Ops Analytics solutions) to optimize game performance in real-time and maximize your revenue.
Unity 2017.1, 2017.2 and 2017.3 delivered many key features supporting these goals. Here’s a summarized recap:
We’re also excited to share some of the Unity 2017.x productions released or in progress:
- Oats Studios / ADAM: The Mirror & Episode 3
- Spiraloid / Nanite Fulcrum
- KO_OP / GNOG
- Baobab Studio / Asteroid!
- EpicHouse Studios / Phased
- Cybernetic Walrus / Antigraviator
- Rebel Twin / Dragons Hill 2
- Slow Bros / Harold Halibut
- WITHIN / Life of Us
- Seriously / Best Fiends
- Monomi Park / Slime Rancher
- Videogyan / TooToo Boy
And finally, before jumping into the 2017.3 details, check out this highlight video of Unity 2017.x titles and some of the new and improved features:
As the year comes to an end, we are happy to announce that Unity 2017.3, the final release in our 2017 cycle, is now available. 2017.3 introduces several new features and improvements across the board for both artists and developers, and we’re particularly excited about sharing our improved toolset for creating interactive 360-video experiences.
We’re finishing 2017 strong with a huge change log of new features, including the following highlights:
Panoramic 360/180 video
We are particularly excited to bring you improvements to panoramic 360/180 and 2D/3D video workflows. You can now easily bring in various styles of 2D or 3D video in Unity and play it back on the Skybox to create 360-video experiences targeting standalone, mobile and XR runtimes.
Particle System improvements
Improvements include new Unlit and Surface particle shaders and ribbonized particle trails. These allow particles to be connected based on their age. Since each point on these ribbonized trails is represented by a particle, they can be animated, for example, by using them in conjunction with the Noise Module.
Script compilation - User-defined managed assemblies
You can now define your own managed assemblies based on the scripts inside a folder. By splitting your project's scripts into multiple assemblies, you can greatly reduce script compilation times in the editor for large projects.
Managed Memory Profiler support
You can now take advantage of Mono/.NET 2.0 support for the APIs required to take managed memory snapshots. This makes it possible to take memory snapshots directly inside the editor.
The updated Crunch Library
The Crunch Library can now compress DXT textures up to 2.5 times faster, while providing about 10% better compression ratio. But more importantly, the updated library is now capable of compressing ETC_RGB4 and ETC2_RGBA8 textures, which makes it possible to use Crunch compression on iOS and Android devices.
There is now support for HDR compressed lightmaps (BC6H) on PC, Xbox One and PlayStation 4. We also made a number of GPU instancing improvements, and we’re adding Dynamic Resolution as an engine feature debuting on the Xbox One platform with other platforms to follow later.
We are introducing Lighting modes for the Progressive Lightmapper (Baked Indirect, Shadowmask and Subtractive), LOD support with realtime probes providing a more intuitive workflow, and HDR encoding support for baked lightmaps for higher visual quality.
VR device info
To help you optimize VR experiences, you can now capture VR-device refresh rate, dimensions, aspect ratio, HMD-tracking and controller-tracking as part of device info and device status events.
Improvements include cloth self-collision and inter-collision technology and improved constraint painting.
We are introducing Playable scheduling, which allows you to prefetch data before it is actually played. The first implementation covers AudioClipPlayables, but in the future, scheduling will be used by other assets: audio, video and timeline. We added support for animating integer and enum component properties. We are also introducing a new "2D" mode button in the Animation Preview window. Finally, it is now possible to zoom, frame and autofit in the Animator window!
Xbox One X Support
We added support for the new Xbox One X console from Microsoft. Use Quality Settings to enable support for 4K HDR rendering, or use the extra power in other ways such as improving framerate or increasing graphical fidelity. Xbox One X support is available in all 2017.x versions of Unity.
Xiaomi: easily publish Android apps into China (Xiaomi) through the editor.
In November 2016, Unity and Xiaomi announced a partnership to help developers bring games to Xiaomi’s 200 million customers in China, the largest Android store in this region. Since then a number of early-adopter games have gone live on Xiaomi's store. At the end of November, the push-to-store service offering for Xiaomi's brand-new portal went live! To get started, check out the FAQ and getting started guides on this site.
Standard Events is now officially out of beta, and you can access it directly from the Unity Editor in 2017.3. We’re also introducing funnel templates, which together with Standard Events, allow you to create common funnels that can reveal key insights with just a few clicks.
Unity automatically defines how scripts compile to managed assemblies. Typically, compilation times in the Unity Editor for iterative script changes increase as you add more scripts to the project.
Now, you can use an assembly definition file to define your own managed assemblies based on the scripts in a folder. Separating project scripts into multiple assemblies with well-defined dependencies ensures that only the required assemblies are rebuilt when you make changes in a script. This reduces compilation times and is particularly useful for large projects.
Each managed assembly can be thought of as a single library within the Unity Project.
The figure above illustrates how to split the project scripts into several assemblies. Because the changed scripts are only in Main.dll, none of the other assemblies are recompiled. And since Main.dll contains fewer scripts, it compiles faster than Assembly-CSharp.dll.
Similarly, script changes in Stuff.dll alone cause only Main.dll and Stuff.dll to recompile.
You can read more about how to use assembly definition files in our feature preview blog post.
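As a rough sketch of what such a file looks like: an assembly definition is a small JSON asset (e.g. Main.asmdef) placed in the folder whose scripts should form the assembly. The field names below match the 2017.3 assembly definition format as we understand it; the assembly name and the referenced assembly are hypothetical examples matching the figure above.

```json
{
    "name": "Main",
    "references": [ "Stuff" ],
    "includePlatforms": [],
    "excludePlatforms": []
}
```

Every script in the same folder (and in subfolders without their own .asmdef) is then compiled into Main.dll, which is rebuilt only when its own scripts or its referenced assemblies change.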
Back in 2015, we released an experimental managed memory profiler with support for IL2CPP. The memory profiler started as a Unity Hackweek project and has since been released on BitBucket as an open-source project. See the memory profiler project page on BitBucket for details and a demo video.
We’ve added Mono/.NET 2.0 support for the APIs required by the memory profiler to take managed memory snapshots. This makes it possible to take memory snapshots directly inside the editor.
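A minimal editor-only sketch of requesting a snapshot is below. It assumes the 2017.x `UnityEditor.MemoryProfiler` API (the `MemorySnapshot` class with a `RequestNewSnapshot` method and an `OnSnapshotReceived` event, as used by the open-source memory profiler project); verify the names against your Unity version before relying on them.

```csharp
using UnityEditor;
using UnityEditor.MemoryProfiler;
using UnityEngine;

public static class SnapshotExample
{
    // Adds a menu item that captures a managed memory snapshot in the editor.
    [MenuItem("Tools/Take Memory Snapshot")]
    public static void TakeSnapshot()
    {
        MemorySnapshot.OnSnapshotReceived += OnSnapshotReceived;
        MemorySnapshot.RequestNewSnapshot();
    }

    static void OnSnapshotReceived(PackedMemorySnapshot snapshot)
    {
        // Unsubscribe so we only handle the snapshot we requested.
        MemorySnapshot.OnSnapshotReceived -= OnSnapshotReceived;
        Debug.Log("Snapshot received: " + snapshot.managedHeapSections.Length + " managed heap sections");
    }
}
```

The received `PackedMemorySnapshot` can then be fed into the open-source memory profiler UI, or inspected directly in your own tooling.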
The Transform Tool is a multitool that combines the functionality of the three standard tools: Move, Rotate and Scale. It is not meant to replace those tools, but rather to cover situations where you want all three available without having to switch back and forth between them.
With the pivot rotation set to “Global” mode, you can translate and rotate the GameObject along the global axes.
When the pivot rotation is in “Local” mode, you can also scale along local axes.
Screen Space Gizmo
If you keep the “SHIFT” key pressed, the gizmo enters Screen Space mode.
In this mode, you can translate and rotate in screen space.
When you keep the “CTRL” (Windows) or “Command” (macOS) key pressed, unit snapping is activated for:
When you keep the “V” key pressed, the gizmo enters Vertex Snapping mode.
This enables you to translate your GameObject so that one of its vertices is placed on a vertex of another object.
Cinemachine, our camera system for in-game cameras, cinematics, cutscenes, film pre-visualization and virtual cinematography, also includes a number of new features and improvements.
Other new features and improvements include dolly camera behaviors, a POV (first-person-shooter style) aim component, and much more.
For a complete list of all the features and improvements, check out our official forum thread.
The Cinemachine feature set is distributed via the Asset Store. Download the latest version to your project.
In Unity, it is possible to control lighting precomputation and composition in order to achieve a given effect by assigning various modes to a Light (Realtime, Mixed and Baked). Using Mixed mode lets you significantly reduce the realtime shadow distance, increasing performance. Higher visual fidelity can also be achieved, since shadows beyond the shadow distance are supported along with realtime specular highlights.
In Unity 2017.3, you can do the same thing with the Progressive Lightmapper choosing from among the following Lighting modes:
In Baked Indirect mode, Mixed lights behave like realtime dynamic lights with additional indirect lighting sampled from baked lightmaps and light probes. Effects like fog can be used past the realtime shadow distance, where shadowing would otherwise be missing.
In Shadowmask mode, Mixed lights are realtime, and shadows cast from static objects are baked into a shadowmask texture and into light probes. This allows you to render shadows in the distance while drastically reducing the number of rendered shadow casters, based on the Shadowmask Mode in the Quality settings.
In Subtractive mode, direct lighting is baked into the lightmaps, and static objects will not have specular or glossy highlights from Mixed lights. Dynamic objects are lit in realtime and receive precomputed shadows from static objects via light probes. The main directional light allows dynamic objects to cast a subtractive realtime shadow on static objects.
To try out the lighting modes in the Progressive Lightmapper, make sure that you have a Mixed Light in your scene and then, in the Lighting Window, select a Lighting Mode.
We’ve added the ability to generate lighting for level of detail (LOD) objects with realtime light probes in addition to baked lightmaps, offering a more intuitive workflow for authoring your lighting. The LOD system allows you to use lower-complexity meshes when the camera is far away and higher-complexity meshes when the camera is closer, reducing the rendering cost of faraway objects. When you use Unity’s LOD system in a scene with baked lighting and Realtime GI, the system lights the most detailed model of the LOD Group as if it were a regular static model: it uses lightmaps for the direct and indirect lighting, and separate lightmaps for Realtime GI.
To allow the baking system to produce real-time or baked lightmaps in 2017.3, simply check that Lightmap Static is enabled in the Renderer component of the relevant GameObject.
This animation shows how realtime ambient color affects the Realtime GI used by lower level LODs:
We’ve added support for HDR compressed lightmaps (BC6H) on PC, Xbox One and PlayStation 4 for achieving even better quality visuals. The advantage of using High Quality lightmaps is that they don’t encode lightmap values with RGBM, but use a 16-bit floating point value instead. As a result, the supported range goes from 0 to 65504. The BC6H format is also superior to DXT5 + RGBM combination as it doesn’t produce any of the banding artifacts associated with RGBM encoding and color artifacts coming from DXT compression. Shaders that need to sample HDR lightmaps are a few ALU instructions shorter because there is no need to decode the sampled values, and the BC6H format has the same GPU memory requirements as DXT5.
HDR lightmaps are easily enabled by setting the Lightmap Encoding option in the Player Settings to High Quality.
GPU Instancing was introduced in 5.6 to reduce the number of draw calls used per scene by rendering multiple copies of the same Mesh simultaneously, and thus significantly improving rendering performance.
There are a number of improvements to GPU instancing, including per-instance properties, which are now packed into a single structure data type, so that the instancing constant buffer contains only one array of such structures.
For most platforms, the instancing array sizes are now calculated automatically (the maximum allowed constant buffer size divided by the size of the above-mentioned structure type) and no longer need to be specified by the user. Instancing batch sizes are therefore improved on OpenGL and Metal, and instancing shader variants compile much faster and potentially cost less in terms of CPU-GPU data bandwidth.
The updated Crunch library, introduced in Unity 2017.3, can compress DXT textures up to 2.5 times faster with a 10% improvement in compression ratio. More importantly, the updated library is now capable of compressing ETC_RGB4 and ETC2_RGBA8 textures, which makes it possible to use Crunch compression on iOS and Android devices.
Crunch is a lossy texture compression format, which is normally used on top of DXT texture compression. Crunch compression helps to reduce the size of the textures in order to use less disk space and to speed up downloads. The original Crunch compression library, developed by Richard Geldreich, is available on GitHub. Support for Crunch texture format was first added in Unity 5.3 and in Unity 2017.3, we are now introducing an updated version of the Crunch compression library.
Textures compressed with Crunch are first decompressed to DXT and then uploaded to the GPU at runtime. Crunched textures not only use less space, but they can also be decompressed really fast. This makes the Crunch format very efficient for distributing textures. At the same time, Crunched textures can take quite a lot of time to compress, which is the major drawback of using Crunch (for a big project, compression of all the textures into Crunch format can take several hours in the Unity Editor).
This should help you reduce the build sizes of your mobile games, and thus make it easier to comply with App Store over-the-air size limitations, ultimately letting you reach a wider audience with your content.
To use Crunch compression on Android, iOS and tvOS platforms, you simply select either the “RGB Crunched ETC” or “RGBA Crunched ETC2” format for your textures in the Inspector window. If you enable the “Use Crunch Compression” option in the Default tab, all the textures on the Android platform will be compressed with ETC Crunch by default.
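The same choice can be scripted in an editor tool. The sketch below assumes the 2017.3 `TextureImporterFormat.ETC_RGB4Crunched` enum value and a hypothetical asset path; adjust both to your project.

```csharp
using UnityEditor;

public static class CrunchExample
{
    // Switches one texture's Android override to Crunched ETC via script,
    // equivalent to picking "RGB Crunched ETC" in the Inspector.
    [MenuItem("Tools/Crunch Example Texture")]
    public static void CrunchTexture()
    {
        // Hypothetical path; point this at a real texture asset.
        var importer = (TextureImporter)AssetImporter.GetAtPath("Assets/Textures/Example.png");
        var android = importer.GetPlatformTextureSettings("Android");
        android.overridden = true;
        android.format = TextureImporterFormat.ETC_RGB4Crunched;
        android.compressionQuality = 50; // Crunch quality, 0-100; higher = slower, better
        importer.SetPlatformTextureSettings(android);
        importer.SaveAndReimport();
    }
}
```

For alpha textures, the corresponding value would be `TextureImporterFormat.ETC2_RGBA8Crunched`.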
Below you can see examples of textures compressed with Crunch using default quality settings. Note that the artifacts on the final textures are introduced both by Crunch compression and selected GPU texture format (DXT or ETC).
If you are interested in knowing more about the updated Crunch texture compression library, visit this blog post, which includes more examples of size and performance comparison.
As its name suggests, Dynamic Resolution refers to dynamically scaling some or all of the render targets to reduce workload on the GPU. Dynamic Resolution can be triggered automatically when performance data indicates that the game is about to drop frames due to being GPU bound. In such a case, gradually scaling down the resolution can help maintain a solid frame rate. It can also be triggered manually if the user is about to experience a particularly GPU intensive part of gameplay. If scaled gradually, Dynamic Resolution can be virtually unnoticeable.
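A minimal runtime sketch of the manual path is below. It assumes the 2017.3 `ScalableBufferManager` API and a camera that allows dynamic resolution; the scale step and lower bound are arbitrary example values, and the decision of *when* the GPU is the bottleneck is left to your own frame-timing data.

```csharp
using UnityEngine;

public class DynamicResolutionExample : MonoBehaviour
{
    float scale = 1.0f;

    void Start()
    {
        // Render targets for this camera may be scaled at runtime.
        GetComponent<Camera>().allowDynamicResolution = true;
    }

    // Call this when your performance data indicates the game is GPU-bound.
    public void OnGpuBound()
    {
        // Step down gradually so the change is virtually unnoticeable.
        scale = Mathf.Max(0.5f, scale - 0.05f);
        ScalableBufferManager.ResizeBuffers(scale, scale);
    }
}
```

A matching `OnGpuHeadroom()` method stepping the scale back up toward 1.0 would complete the loop.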
The new video player, introduced earlier this year, made it possible to use 360 videos and make them truly interactive by adding CG objects, ambisonic audio, visual effects, and more.
In 2017.3, you can now bring a 2D or 3D 360-video into Unity and play it back on the Skybox to create standalone 360-video experiences targeting VR platforms.
Equirectangular 2D videos should have an aspect ratio of exactly 2:1 for 360-degree content, or 1:1 for 180-degree content.
Equirectangular 2D video
Cubemap 2D videos should have an aspect ratio of 1:6, 3:4, 4:3, or 6:1, depending on the face layout:
Cubemap 2D video
To use the panoramic video features in the Unity Editor, you must have access to panoramic video clips, or know how to author them. Check our documentation pages for a step-by-step guide on how to display any panoramic video in the Unity Editor.
Keep in mind that many desktop hardware video decoders are limited to 4K resolutions and mobile hardware video decoders are often limited to 2K or less, which affects the resolution of playback in real-time on those platforms.
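The basic wiring can be sketched as: the VideoPlayer decodes into a RenderTexture, a material using the built-in "Skybox/Panoramic" shader samples that texture, and the material is assigned as the scene skybox. Component and property names below follow the 2017.3 API as we understand it; the `_MainTex` property name is an assumption about the panoramic skybox shader.

```csharp
using UnityEngine;
using UnityEngine.Video;

public class PanoramicVideoExample : MonoBehaviour
{
    public VideoClip clip;          // an equirectangular 2:1 clip for 360 content
    public Material skyboxMaterial; // a material using the "Skybox/Panoramic" shader

    void Start()
    {
        // Render the video into a texture matching the clip's resolution.
        var rt = new RenderTexture((int)clip.width, (int)clip.height, 0);
        var player = gameObject.AddComponent<VideoPlayer>();
        player.clip = clip;
        player.renderMode = VideoRenderMode.RenderTexture;
        player.targetTexture = rt;

        // Feed the video texture to the skybox.
        skyboxMaterial.SetTexture("_MainTex", rt);
        RenderSettings.skybox = skyboxMaterial;
        player.Play();
    }
}
```

In XR, the skybox is rendered around the HMD, so this alone gives a head-tracked 360-video experience.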
Technical improvements introduced with the new developer preview include a C API for the Android NDK, the ability to pause and resume AR sessions (so users can continue tracking after the app is resumed), improved runtime efficiency, and an improved trackable and anchor interface across anchors, plane finding, and point clouds. Learn more about ARCore Developer Preview.
With Unity 2017.3 support for Vuforia 7, you can build cross-platform AR apps. Vuforia 7 introduces Model Targets, a new way to place digital content on specific objects using pre-existing 3D models. Also new is Vuforia Ground Plane, a capability that allows you to place digital content on a horizontal surface such as a floor or table. Ground Plane will support an expanding range of iOS and Android devices, taking advantage of platform enablers such as ARKit where available. Learn more about Vuforia in Unity 2017.
- Ground Plane (available for free): place digital content on floors or tables.
- Model Targets: recognize a new class of objects based on their geometry, used for example to place AR content on top of industrial equipment, vehicles or home appliances.
We are excited to welcome OTOY’s path-traced GPU-accelerated render engine for the Unity Editor. OctaneRender for Unity is available for free, or at $20 or $60 for packages that unlock more GPUs and OctaneRender plugins for leading 3D authoring tools. Born on GPUs, OctaneRender is an unbiased render engine, tracing each ray of light in a scene with physics-grade precision to deliver unrivaled photorealism in CG and VFX. OctaneRender for Unity works with Unity 2017.1 and above. Learn more about OctaneRender for Unity.
The Velocity over Lifetime module now contains a new curve, which allows particle speeds to be controlled over the duration of a particle’s lifetime. This makes it possible to change the speed of particles without affecting the direction they are traveling in. It also includes support for negative speeds, which is useful for creating gravity/vortex effects.
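A short sketch of the new curve in use, assuming the 2017.3 `speedModifier` property on the Velocity over Lifetime module:

```csharp
using UnityEngine;

public class SpeedOverLifetimeExample : MonoBehaviour
{
    void Start()
    {
        var ps = GetComponent<ParticleSystem>();
        var vel = ps.velocityOverLifetime;
        vel.enabled = true;

        // Each particle starts at full speed and slows to 20% over its
        // lifetime without changing direction; a curve dipping below zero
        // would reverse direction for gravity/vortex-style effects.
        vel.speedModifier = new ParticleSystem.MinMaxCurve(1f,
            AnimationCurve.Linear(0f, 1f, 1f, 0.2f));
    }
}
```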
We are releasing surface and unlit shaders for use with particles. They provide all the core particle functionality that you would expect, via a similar interface to our existing Standard Shader. It is now easy to configure blend modes, alpha testing thresholds, soft particle distances, and many more features. Support for blended flipbooks and distortion effects is also built in. The surface shader also makes it easy to support Normal Mapping and PBR lighting.
There is a new option in the Trails Module called Ribbon Mode, which connects particles based on their age into a continuous ribbon, rather than each particle leaving its own trail behind it. Because each point on the trail is represented by a particle, it is possible to use this mode to create animated trails, for example by using it in conjunction with the Noise Module.
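Enabling the mode from script is a two-line change, assuming the 2017.3 `ParticleSystemTrailMode` enum:

```csharp
using UnityEngine;

public class RibbonTrailExample : MonoBehaviour
{
    void Start()
    {
        var ps = GetComponent<ParticleSystem>();
        var trails = ps.trails;
        trails.enabled = true;
        // Connect all live particles into one ribbon, ordered by age,
        // instead of giving each particle its own per-particle trail.
        trails.mode = ParticleSystemTrailMode.Ribbon;
    }
}
```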
When using meshes in the Shape Module, it is now possible to use meshes whose primitive type is Line or LineStrip. This makes it easy to spawn particles along lines, instead of only on the surface of a mesh.
Unity 2017.3 adds support for 32-bit mesh index buffers, finally dropping the constraint that meshes have 65,536 or fewer vertices. This support is fully integrated with Particle Systems, meaning you can spawn particles on the surface of these new, larger meshes. You can also render particle meshes with 32-bit indices, although this is likely to have a high performance cost.
We’ve also made some improvements to the editor in order to improve your workflow when authoring effects:
There are a number of improvements to Cloth physics, including better in-editor tools for authoring Cloth with more of PhysX internals exposed. This gives you more options for collisions, self-collisions, etc.
You can now paint constraints onto the cloth using a brush-based method. You can also enable intercollision to make different cloth objects collide with each other.
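The new collision behaviors can also be configured from script. This sketch assumes the self-collision properties on `Cloth` and the intercollision properties on `Physics` as named in the 2017.3 scripting reference (verify against your version); the distance and stiffness values are arbitrary examples.

```csharp
using UnityEngine;

public class ClothCollisionExample : MonoBehaviour
{
    void Start()
    {
        var cloth = GetComponent<Cloth>();

        // Keep this cloth's own particles at least 0.1 units apart.
        cloth.selfCollisionDistance = 0.1f;
        cloth.selfCollisionStiffness = 0.2f;

        // Global settings letting different cloth objects collide with each other.
        Physics.interCollisionDistance = 0.1f;
        Physics.interCollisionStiffness = 0.2f;
    }
}
```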
You can find more information on how to get started using the cloth improvements here.
There is now support for Facebook's new segmented upload feature for Game Room. Previously, a single file with a maximum size of around 250 MB was uploaded to Facebook. Now, only the parts that have changed are uploaded (divided into segments of 10 MB).
Unity has supported Facebook’s gaming desktop app, Game Room, since Unity 5.6. Supporting the Windows-native client allows players to experience both web games and native games built exclusively for the platform. Learn more about how to reach up to two billion users on Facebook.
With Oculus introducing its new Oculus Dash menu and UI in its latest Rift update, we have included Oculus Dash depth support and exposed camera depth information, so that Oculus Dash can partially occlude world-space overlays.
We’ve added the possibility to use Unity’s terrain trees in VR experiences. Check the documentation for details on populating your world with trees.
In 2017.2, we delivered Stereo Instancing for PC-based VR platforms; we’re now expanding it to PSVR. Stereo Instancing is an exciting rendering advancement that unlocks hardware optimizations for rendering stereo images. This means that developers will be able to do more while maintaining a smooth framerate.
In order to focus our efforts on the latest versions of DirectX, Unity 2017.3 (both the Unity Editor and the Standalone Player) no longer supports DirectX 9 (which was originally released in 2002). You can read more about how to continue supporting Windows XP on our blog.
Finally, as originally announced in October 2017, Unity no longer includes Samsung Tizen and SmartTV support from version 2017.3. However, Unity still provides 12 months of support, including patches and security updates, starting from the release of 2017.2 (October 12, 2017). To get the latest info on Tizen and SmartTV for Unity, visit the partner page for Samsung or contact Samsung directly at firstname.lastname@example.org.
Unity Teams features are now available for purchase, so teams of all sizes can save, share and sync their projects easily within Unity. Unity Teams Basic is free and is great for small teams that are just getting started. More features and capacity are available with Unity Teams Advanced, which starts at $9 per month. You can add more team members, cloud storage, or more automated builds, paying only for what you need. Details at unity3d.com/teams.
If you’ve been using Unity Teams features (Collaborate, Cloud Build), you can upgrade to a paid subscription by January 9, 2018 to prevent interruption in service. Furthermore, it’s a great time to upgrade to Plus and Pro.
Unity Analytics Standard Events help you track users’ behavior and answer critical questions specific to your game. They are a curated set of events focused on five different areas of user experience: Onboarding, Application, Progression, Engagement, and Monetization. With 2017.3, Standard Events are now built right into Unity.
Along with this release, we are excited to introduce you to funnel templates.
Together with Standard Events, funnel templates allow you to create common funnels that can reveal key insights with just a few clicks.
Learn more about Standard Events.
For Plus and Pro users, LiveStream now displays instantaneous metrics broken down over the preceding 30 minutes. IAP revenue is displayed when verified, making income reporting more accurate. There are also options for filtering New Installs, Sessions, and Verified IAP Revenue in the Activity view.