In this guest post, Everguild’s team discusses the studio’s upcoming collectible card game for Steam, iOS, and Android – Warhammer 40,000: Warpforge – and how they leveraged the best elements from past successes, elevating them with the latest development tools.
We are Isabel Tallos (co-founder and art director) and Cesar Rios (game director), leading the development of Warhammer 40,000: Warpforge at Everguild. For this ambitious project, the studio’s third and largest to date, we’ve built on the best elements of Warhammer The Horus Heresy: Legions and elevated them with the latest development tools and a great deal of craftsmanship.
Warhammer 40,000: Warpforge aims to become a reference for quality, depth, and innovation among digital collectible card games. One of its most striking features is the approach our team has taken to depict the scenarios where battles take place.
Most card games use a top-down perspective, but in Warpforge all units become “physical” tokens in battle. They fight in astonishing scenarios that immerse players in the experience of leading an army into battle across the Warhammer 40,000 universe.
In this deep dive, we’ll examine the design and technical challenges that led to these battle backgrounds, and the tools and processes we used to create them.
The approach we take in Warpforge draws on our experience with Horus Heresy: Legions, our previous card game in the Warhammer 40,000 universe. Because Legions was originally a mobile game, its scenarios and game logic were built in 2D, faking perspective to create the illusion of 3D. This kept the game lightweight and able to run on low-spec devices.
However, building the scenarios and logic in 2D imposed significant limitations when creating VFX for the different cards and abilities, and even more so when animating or adding VFX to the background itself.
To make the scenarios for Warpforge as stunning and immersive as the team wanted, and to take the cards’ VFX to the next level, it was essential to create them in 3D. Moving to 3D let us stop faking positions, scales, and rotations and simply do things in a more natural way, which led to better-quality VFX and faster production times.
From the start, we wanted our 3D scenarios to maintain the exact same visual aspect as our amazing 2D concept art. With the usual 3D asset production workflow (concept, modeling, unwrapping, and texturing), each step introduces small deviations, so the final result drifts away from the original concept art.
Since the game will be released on mobile, rendering high-fidelity 3D assets on less powerful devices risks performance problems, like frame rate or memory issues, and longer loading times. Add to this the extra production time of executing this workflow properly for each scenario, and it was clear that we needed a different approach.
After numerous attempts with different approaches, the main breakthrough was the idea of using camera projection mapping. This technique, also known as spatial augmented reality, consists of “projecting” 2D textures over 3D surfaces or objects, creating a “projected texture.” It makes flat surfaces appear to have depth, creating the illusion of very detailed 3D objects while using extremely simple geometric shapes. It also allows for some limited camera movement to further convey the impression of being immersed in a fully 3D space.
With this approach, we managed to get all the benefits of a 3D scenario without most of the drawbacks. It allowed us to faithfully keep the art style and level of detail of the concept art, without having to recreate it through 3D assets. It also requires very little processing power at runtime, allowing the game to run perfectly even on low-spec devices. Plus, the end result requires much less work and cost than creating full 3D environments.
Compared to a full 3D environment, a scenario built with camera projection mapping doesn’t allow for a lot of camera movement, so it’s clearly not a valid solution for many types of games. For a card game like Warpforge in which the camera is mostly static, though, it is a fantastic solution.
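To make the idea concrete, here is a minimal Python sketch of the math behind camera projection mapping, assuming an idealized pinhole camera (the field-of-view and aspect values are illustrative, not the game’s actual settings). The key point is that each vertex’s texture coordinate is simply its position projected through the camera into screen space, which is why the painting lines up exactly when viewed from the capture viewpoint:

```python
import math

def project_to_uv(point, fov_y_deg=60.0, aspect=16 / 9):
    """Project a camera-space point (x, y, z with z > 0 in front of
    the camera) to texture UVs in [0, 1], the way camera projection
    mapping assigns a painted image onto simple geometry."""
    x, y, z = point
    if z <= 0:
        raise ValueError("point is behind the camera")
    f = 1.0 / math.tan(math.radians(fov_y_deg) / 2.0)  # focal scale from FOV
    ndc_x = (f / aspect) * x / z  # normalized device coords in [-1, 1]
    ndc_y = f * y / z
    return ((ndc_x + 1) / 2, (ndc_y + 1) / 2)  # remap to UV space
```

Note that any two points on the same ray from the camera receive the same UV, so the illusion holds only near the original viewpoint; moving the camera too far reveals the flat geometry, which is the limitation mentioned above.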
The process of generating a scenario has several steps. For Everguild, it starts with the creation of a placeholder scenario directly in Unity using simple primitives. It consists only of a floor plane and some vertical cubes to help find the camera perspective that best matches the desired gameplay.
Once everything is set up correctly, we capture an image from the camera perspective to send to the painting software. Using the Unity FBX Exporter, we export the 3D placeholder scenario, including the camera position and lens parameters, so we can import it into Blender.
Taking the exported image as a perspective reference, the concept artists draw the scenario. They have total creative control, without restrictions of any kind, because whatever they paint will be translated 1:1 into the game. The concept artists draw not only the scenario itself, but also the visual effects (VFX), which the VFX artists will later use as reference, and even as textures. In this step, it is essential to organize the file properly in layers, always painting in full whatever lies behind foreground objects, so that the layers can be exported separately later.
Once the concept art is ready, it is brought into Blender, where each element is projected onto a simple 3D object. This projection technique eliminates the need for the 3D artist to laboriously create custom UVs for every object, since the camera projection mapping automatically calculates them. The 3D models are then exported back to Unity. Here, using custom Editor tools, a final pass takes place where the various texture layers are combined into a cohesive atlas texture.
This process not only optimizes memory usage and the game’s size, but also minimizes the batch count. Now, the scenario looks exactly like the concept, but we can take it a bit further.
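As an illustration of that atlas-building step, the following Python sketch shows a naive “shelf” packing strategy of the kind an Editor tool might use to place separate texture layers into one atlas. This is a simplified stand-in for illustration, not Everguild’s actual tool:

```python
def pack_shelf(sizes, atlas_width):
    """Place (w, h) rectangles into one atlas using a shelf strategy:
    fill a row left to right, and start a new row (shelf) when the
    current one is full. Returns ((x, y) positions, atlas height).
    Combining layers this way means one texture bind per scenario,
    which keeps the batch count low."""
    positions = []
    x = y = row_h = 0
    for w, h in sizes:
        if x + w > atlas_width:  # no room left on this shelf
            x, y = 0, y + row_h  # start a new shelf below
            row_h = 0
        positions.append((x, y))
        x += w
        row_h = max(row_h, h)  # shelf is as tall as its tallest rect
    return positions, y + row_h
```

Real atlas packers sort rectangles and add padding to avoid bleeding between layers, but the principle is the same: many source layers, one texture at runtime.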
After creating the 3D scenario concepts, it’s time to bring them to life with a range of different tools. For example, a camera flyover at the start of the match helps provide a sense of depth and immersion, though the path needs to be carefully chosen to work around the limitations of the projection mapping technique.
We use a combination of Shader Graph and Render features to craft an array of captivating effects, including dynamic water, blurred planar reflections, vortex portals, and more. These effects are seamlessly integrated with particle effects using both Shuriken and VFX Graph. VFX Graph is used for implementing more complex effects. However, since it relies on compute shaders, we always ensure the availability of a fallback version for devices that don’t support it.
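The blurred planar reflections are a good example of keeping these effects mobile-friendly. A common low-cost approach is a separable box blur: blur the reflection texture horizontally, then vertically, instead of sampling a full 2D kernel. The Python sketch below shows the one-dimensional pass on a single row of intensities; it is a stand-in for the shader version, not the actual Shader Graph implementation:

```python
def box_blur_1d(row, radius):
    """Sliding-window average along one axis, clamped at the edges.
    Running this pass horizontally and then vertically approximates
    a 2D blur at a fraction of the cost: O(radius) samples per pixel
    per pass instead of O(radius^2) for a full 2D kernel."""
    n = len(row)
    out = []
    for i in range(n):
        lo, hi = max(0, i - radius), min(n, i + radius + 1)
        out.append(sum(row[lo:hi]) / (hi - lo))  # average the window
    return out
```

In a shader this would run on the reflection render target, but the windowed-average logic is identical.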
Each effect is meticulously designed to align with the original vision from the concept artist, ensuring a cohesive and immersive experience.
After a painstaking research process involving the design, art, and technical teams, we’ve developed an approach to creating 3D scenarios which meets all of our core criteria: stunning aesthetics, a sense of immersion, an efficient development workflow, and strong performance on all devices.
We believe there are many games which could benefit from these processes, particularly those with limited camera movement, and we hope this post proves useful to other developers. Most importantly, we hope players will enjoy diving into the grim, dark universe of Warhammer 40,000: Warpforge when it’s released.
The game is currently in its closed alpha phase and due for release on Steam, iOS, and Android before the end of the year. To learn more about the multiplatform release, check out Everguild’s recent case study. Read more Made with Unity stories straight from the developers here.