Discover how we took an idle RPG game to the next level with Unity’s native 2D tools.
Back in Unity’s 2019 release cycle, we realized our vision of empowering 2D artists and creators with a complete suite of 2D tools. The release of our 2D packages included character skeletal animation and Inverse Kinematics (IK), level design with tilemaps, spline shapes and pixel art tools. Check out our 2D website for an overview.
Our 2D team has since optimized those workflows and refined the graphics technology: the 2D Renderer inside the Universal Render Pipeline. There’s no better way to put these tools to the test and see how they can make your 2D visuals shine than by exploring a new sample project. Dragon Crashers is now available for free on the Asset Store.
Dragon Crashers is an official sample project made in Unity 2020.2 that showcases Unity’s native suite of 2D tools and graphics technology. The gameplay is a vertical slice of a side-scrolling Idle RPG, popular on mobile platforms today.
While the party of heroes auto-attacks their enemies, you can trigger special attacks simply by tapping or clicking their avatars.
The sample project has been tested on desktop, mobile and web platforms.
In addition to the information shared in this article, please join us for our online Dragon Crashers overview webinar on April 14 at 11:00 am EST (5:00 pm CET) for key tips and a live walkthrough from our global content developer, Andy Touch.
Make sure you have Unity 2020.2 or 2020 LTS to get the project on the Asset Store. First, start a new 2D project, then go to Package Manager > My Assets to import Dragon Crashers. You will see some Project Settings pop-up messages; accept them all.
If you encounter any issues, please let us know in the 2D dedicated Dragon Crashers forum.
Once the project is imported, you should see a new option in the menu bar that offers shortcuts to the project’s scenes. Select Load Game Menu and press Play to try it.
We recommend using high-definition display settings in the Game view, such as Full HD (1920×1080) or 4K UHD (3840×2160).
Our party of heroes and base enemies are diverse, and decked out with different outfits, accessories and variations. However, they are all bipeds that have a similar build.
To avoid animating every single one of them with their respective 2D IK constraints, we created a mannequin. The animator used this mannequin, while the character artist created unique skins and accessories for the characters.
The Mannequin Prefab (PV_Character_Base_Bipedal.prefab) was used to create Prefab variants for each character. The only difference in those variants is the Sprite Library Asset, where we swap the visual appearance of the biped character.
All of the character Sprite Library Assets have the same Category and Label to define the body parts and their variants. For example, the knight and skeleton enemies both have a category named “mouth,” with sprite variants labeled as “mouth open,” “mouth teeth” and “mouth normal” used during animation.
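Because every character shares the same Categories and Labels, a single script can swap visuals at runtime through the 2D Animation package’s SpriteResolver component. This is a minimal, illustrative sketch (the component reference and method names like `OpenMouth` are assumptions, not taken from the project):

```csharp
using UnityEngine;
using UnityEngine.U2D.Animation;

// Sketch: swap a body part by Category and Label at runtime.
// "mouth" / "mouth open" follow the naming convention described above.
public class MouthSwapper : MonoBehaviour
{
    [SerializeField] SpriteResolver mouthResolver; // on the character's mouth GameObject

    public void OpenMouth()
    {
        // Resolves the sprite registered under this Category/Label pair
        // in whichever Sprite Library Asset this character's variant uses.
        mouthResolver.SetCategoryAndLabel("mouth", "mouth open");
    }
}
```

Since each Prefab variant points to its own Sprite Library Asset, the same call shows the knight’s mouth on the knight and the skeleton’s mouth on the skeleton.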
To apply the animations to all characters, ensure that each character’s visual asset or PSB has a similar rig. In other words, they must have bones named in the same way, attached to parts of the body of the same Category and Label. To save time, you can copy the mannequin’s skeleton (or reference character bones), and paste it to the different characters. This option is available in the Skinning Editor, part of the Sprite Editor.
The Prefabs include features that make the characters more lively, like Inverse Kinematics and Normal and Mask maps for improved integration in the 2D lit environment.
There’s no need to set your level design in stone so early in the process during prototyping. The worldbuilding 2D tools included in Unity enable you to have fun designing levels, and then easily iterate on them. The Tilemap Editor and Sprite Shape help automate tasks, such as setting up colliders to conform to object or terrain shapes, whereas the Scene view is your playground to make the game more exciting and aesthetically pleasing.
The most important aspect is to have all your “brushes” ready in the Tile Palette, which can contain repeatable tiles, animated tiles, isometric or hexagonal tiles, or even GameObjects – all rendered performantly by a single Tilemap Renderer. For all the elements in the grid, refer to the Palette_GroundAndWalls Tile Palette.
Another often overlooked feature that can be useful in level design is Sprite Draw Mode. Tiled sprites used for background layers can cover a large scene area with a small sprite to create a nice parallax effect.
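Setting a sprite to tiled draw mode takes just two properties on the SpriteRenderer. A hedged sketch (the size values here are illustrative, not from the project):

```csharp
using UnityEngine;

// Sketch: make a small background sprite tile across a large scene area.
public class TiledBackground : MonoBehaviour
{
    void Start()
    {
        var sr = GetComponent<SpriteRenderer>();
        sr.drawMode = SpriteDrawMode.Tiled;  // repeat the sprite instead of stretching it
        sr.size = new Vector2(40f, 5f);      // cover a wide area with one small sprite
    }
}
```

Combined with a parallax script on each background layer, this keeps texture memory low while filling the whole camera frame.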
A Tilemap grid might not be the most practical solution for adding more organic-looking objects or spline-based elements to your project. Instead, we recommend a spline-based tool, such as 2D Sprite Shape, which draws much like vector drawing software. Use it for background props or as part of the gameplay. The SpriteShape Renderer enables you to efficiently render many sprites attached to the spline or border of your shapes. See Prefab P_MineCartTracks_A to observe how the tracks are drawn with the spline line, while the supporting structure artwork is made from the fill texture of the Sprite Shape Profile.
Prefabs P_Bridge and P_MineCartTracks_B are other examples that demonstrate how a Sprite Shape border doesn’t need to be a simple line, but can represent more elaborate elements – in this case, a bridge and a rail track.
With the 2D Renderer, use the Sprite-Lit shader for advanced lighting effects. Take full advantage of these effects by giving your sprites Secondary Textures.
Normal maps can be added in the Sprite Editor. These RGB images encode the XYZ direction each pixel faces, telling 2D lights how strongly to affect it. Mask maps can also be harnessed by the 2D Renderer data asset (RenderData_2D.asset in the project) as part of the Light Blend Styles feature. The Light Blend Style called “Fresnel” accentuates the rim light around characters and sprites; to achieve the effect, it reads the R channel of each sprite’s Mask map. Since this project uses only one Light Blend Style, the three channels – R, G and B – are identical (black and white), which makes authoring the Mask maps more convenient.
Shader Graph is frequently used in the demo to animate props without taxing the CPU. You can observe elements like wind moving the spiderwebs (P_SpiderWeb_Blur prefab), crystals glowing (P_Crystals_Cluster), as well as the lava flowing animation (P_Lava_Flowing_Vertical), which leverage a flow map texture to control the direction of the main texture’s UV coordinates. The flow texture uses the colors red and green to indicate the XY direction that pixels follow in every frame. Open the SubGraph FlowMap to learn how to achieve this effect.
In the same dragon battle scene, there is another shader animation technique called “animated alpha clipping,” which creates smooth animation from a single texture by revealing only the pixels within a specific alpha range each frame. Visual effects like the lava splatter (ParticleSystem_Splatters) and hit animation (P_VFX_HitEffect) were made using this technique with Shader Graph.
The art style of the demo was created with 2D lights – and their many possibilities – in mind. As you can see, the sprites enhanced by handcrafted Normal maps and Mask maps are relatively flat. Some sprites, like the tilemap floor, are grayscale, meaning they are colored using the Color option on the Tilemap Renderer combined with the lit areas of the environment.
Real-time 2D lights allow you to spend more time in the final game scene in Unity Editor. Observing the direct results while composing your scene with lights and objects, or even being able to play the level as you go, allows you to better establish the desired mood and atmosphere for your game.
Additionally, you can increase immersion by animating those elements. For example, the P_Lantern_HangingFromPost Prefab shows how to attach a light to an animated object, or how to give the witch character a staff with Sprite-Lit particles.
Another benefit of using 2D lights in your project is the ability to reuse elements. Environments and props can look very different depending on the lighting conditions, which allows you to recreate many different levels with the same sprites.
All seven characters, regardless of whether they are heroes or villains, inherit their core structure from the base Unit Prefab and use the same behavior code. To differentiate values between characters, we used Scriptable Objects for different ‘blocks’ of unit-based values. Hard-coded values make it difficult for non-programmers to balance the game and cause gameplay to be rigid, so we set up unit values such as ‘Attack Damage Amount,’ ‘Ability Cooldown Time in Seconds’ and ‘Unit Health’ in Scriptable Objects, so that anyone working on the project can make quick adjustments. Those value changes are then handled dynamically by the gameplay code.
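A unit-stats “block” like the ones described above can be sketched as a ScriptableObject. The class, menu and field names below are illustrative assumptions; the project’s actual assets may be organized differently:

```csharp
using UnityEngine;

// Sketch: a data block of unit values that designers edit as an asset
// in the Inspector, with no code changes required for balancing.
[CreateAssetMenu(menuName = "DragonCrashers/Unit Stats")]
public class UnitStatsData : ScriptableObject
{
    public int attackDamageAmount;
    public float abilityCooldownTimeSeconds;
    public int unitHealth;
}
```

Each Unit Prefab then references one of these assets, so two units can share behavior code while reading entirely different numbers.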
Each Unit Prefab has a core ‘UnitController’ script that acts as the unit’s ‘brain’ and handles internal-prefab script references and behavior sequencing. When the Dragon is attacked, for instance, the ‘UnitController’ executes related behavior events, such as transitioning to a flinch animation, playing a roar sound effect and reducing the Dragon’s health. These core behaviors adhere to the concept of encapsulation and only handle their own respective purposes and tasks. For example, ‘UnitHealthBehaviour’ only handles health logic, such as setting and reducing a unit’s health values and reporting important event callbacks like ‘HealthChanged’ or ‘HealthIsZero.’ The ‘UnitController,’ in turn, sends values to ‘UnitHealthBehaviour’ based on the scenarios and actions that occur during gameplay.
In some instances, systems external to Units need to be updated when a specific event happens. Delegates are used for these setups.
For example, when the Witch receives damage from an attack and ‘UnitHealthBehaviour’ raises the ‘HealthChanged(int healthAmount)’ event, the subscribed ‘UIUnitHealthBehaviour’ updates the Witch’s health bar according to that value.
Using Delegates allows us to isolate and test areas without dependencies on other systems. For example, we tested the performance of the pop-up Damage Display Number System in a separate scene, without needing to simulate the battle gameplay.
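The event pattern described above can be sketched in a few lines. This is a minimal illustration with hypothetical internals; only the event names come from the article:

```csharp
using System;

// Sketch: health logic raises events; subscribers (e.g. UI) react without
// the health code knowing anything about them.
public class UnitHealthBehaviour
{
    public event Action<int> HealthChanged; // fired whenever health changes
    public event Action HealthIsZero;       // fired once health reaches zero

    int health = 100;

    public void TakeDamage(int amount)
    {
        health = Math.Max(0, health - amount);
        HealthChanged?.Invoke(health);  // e.g. UIUnitHealthBehaviour updates the bar
        if (health == 0) HealthIsZero?.Invoke();
    }
}
```

Because subscribers attach from the outside, a test scene can hook a dummy listener to these events and exercise the system with no battle running.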
Unity’s Timeline feature was used in two areas: Linear cutscenes and each Unit’s ability sequences.
The linear cutscenes take place at the beginning and end of a battle. They handle sequencing for a variety of areas, such as camera transitions (using Cinemachine), character animations (using Animator), audio clips, particle effects and UI animations. Each track was bound to the relevant scene instance.
A Timeline Signal was embedded at the end of the intro Cinematics to invoke a Unity Event when the Cutscene is finished. This ‘signals’ when to begin the battle gameplay logic.
Timeline was used to create prefab-embedded ability sequences for each unit. This enables each Unit to have its own special abilities associated with its character, similar to champion abilities in a MOBA game.
Each unit contains two ability timelines: one ‘basic’ auto-attack and one ‘special’ manually activated attack. The ‘UnitAbilitiesBehaviour’ script handles the logic for both ability timelines in terms of the ability currently playing, the ability sequence queue and starting/stopping ability cooldowns (depending on high-level gameplay logic, like whether the intro cutscene is playing or a unit has died). Ability Timeline Tracks are bound to local systems of the Unit Prefab, including the character’s Animator for attack animations and Particle Systems for VFX. This allowed both the programmer and artist to create, play back and iterate on a Unit’s specific ability in isolation using Prefab Editing Mode before applying the changes to each instance of the Unit Prefab in the game.
Timeline Signals mark the moment an ability applies a value modifier to a target Unit’s health. When the Knight swings his sword, for example, we want the damage applied as soon as the sword reaches a critical point in the animation, rather than at the beginning or end of the swing. As animation and VFX timing was iterated on during development, the artist could reposition the ‘Ability Happened’ signal to the new desired point in the sequence in a very quick workflow, without relying on the programmer to change any values in the code.
This also allowed us to add multiple ‘Ability Happened’ signals in a continuous attack, such as the dragon breathing fire at the group of heroes.
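On the code side, a Timeline Signal reaches gameplay through a SignalReceiver component, whose UnityEvent calls a public method. A hedged sketch of such a listener (the class, method and damage value here are illustrative, not from the project):

```csharp
using UnityEngine;

// Sketch: a method wired, via a SignalReceiver's reaction, to the
// "Ability Happened" Timeline Signal described above.
public class AbilitySignalListener : MonoBehaviour
{
    [SerializeField] int damagePerHit = 12; // assumed; normally read from data

    // Assigned in the SignalReceiver's Inspector as the reaction
    // for the "Ability Happened" signal asset.
    public void OnAbilityHappened()
    {
        Debug.Log($"Ability landed: apply {damagePerHit} damage to the target.");
    }
}
```

Placing several signal markers on one Timeline simply invokes this method once per marker, which is how a continuous attack can deal damage repeatedly.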
Senior global content developer Andy Touch hosted a webinar with an in-editor demonstration of the character pipeline workflow used to create the project, unpacking each step of the process.
As a token of appreciation for exploring Dragon Crashers with us, we would like to share a set of wallpapers, Zoom backgrounds and other visuals to inspire you throughout your 2D game dev journey. Get the Dragon Crashers backgrounds here.
For those starting a new 2D project, there are already some great guides on the blog and forums. If you’re new to the tools, we recommend checking out the 2D web page, the 2D Tips Lightning Round blog and its presentation for useful tips. For even more, check out a deep dive into our skeletal animation system here, or our previous project, Lost Crypt, and its corresponding webinar. As always, we also recommend perusing our latest docs – and, of course, the 2D Renderer section – for more information on specific features and APIs.