This year at AWE 2019, Chandana ‘Eka’ Ekanayake, co-founder and studio director of Outerloop Games, walked through the development process behind his game, Falcon Age. We had the chance to sit down with Eka to learn more about the specific tools he used. And now, we share his advice with you.
Falcon Age is Outerloop Games’ fifth Unity-based game as a team. They’ve shipped a mix of mobile, VR, and non-VR titles, and they find Unity’s toolset ideal for rapid prototyping and testing out concepts, especially for a small team like theirs.
Falcon Age started as a VR prototype after we saw footage of golden eagles and other birds of prey. We spent time with a real-life falconer and studied bird movement. Raising a bird and learning falconry sounded like interesting game mechanics to explore in VR, and the bonding aspect of having a falcon as a pet hadn’t been done before.
We knew it would be more work to create both a VR and a non-VR version, but we felt the game could be designed to work either way. The biggest considerations were locomotion, character-to-object interaction, and falcon interaction. With VR motion controls, the player has full control of their hands; to make that work in non-VR, we had to animate those interactions and map them to a standard game controller. Unity made it easy for us to set up multiple character controllers that enable and disable in real time based on the hardware the player is currently using.
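The controller-swapping idea can be sketched outside Unity. This is a plain-Python illustration, not Outerloop's actual C# code; the class and function names are hypothetical. The pattern is simply: enable exactly one character-controller rig for the hardware that is present, and disable the rest.

```python
# Hypothetical sketch of hardware-based controller switching.
# In Unity this would be MonoBehaviours toggled via enabled/SetActive;
# here we model just the selection logic.

class CharacterController:
    def __init__(self, name):
        self.name = name
        self.enabled = False

def select_controller(vr_headset_present, controllers):
    """Enable exactly one controller rig; disable all others."""
    active_name = "vr" if vr_headset_present else "gamepad"
    active = None
    for c in controllers:
        c.enabled = (c.name == active_name)
        if c.enabled:
            active = c
    return active

controllers = [CharacterController("vr"), CharacterController("gamepad")]
```

Calling `select_controller` each time the hardware state changes keeps the two control schemes mutually exclusive without duplicating gameplay code.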
We started with a basic structure using Animation Layers and gradually expanded it to multiple layers with animation Parameters that listen for various events during gameplay. For things like the bird’s headlock (when the bird keeps her head still while her body moves around), we used a system of blend poses that we transition between depending on the bird’s orientation. We also use partial animation blending for the legs, jaw, and blinking, plus full-body overrides for special moves like fist-bumping the player.
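The blend-pose idea can be illustrated with a small sketch. This is plain Python, not Unity's animation system, and the pose angles are hypothetical: given the bird's head yaw, find the two discrete "look" poses that bracket it and weight them linearly.

```python
# Illustrative blend-pose weighting (hypothetical pose angles in degrees).
# The animation system would feed these weights to the two bracketing poses.

def look_blend_weights(yaw, pose_yaws=(-90.0, -45.0, 0.0, 45.0, 90.0)):
    """Return {pose_yaw: weight} for the two poses bracketing `yaw`."""
    yaw = max(pose_yaws[0], min(pose_yaws[-1], yaw))  # clamp to pose range
    for lo, hi in zip(pose_yaws, pose_yaws[1:]):
        if lo <= yaw <= hi:
            t = (yaw - lo) / (hi - lo)  # 0 at lo, 1 at hi
            return {lo: 1.0 - t, hi: t}
```

Because the weights always sum to one, the head pose stays continuous as the bird's orientation changes, which is what keeps the headlock effect smooth.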
The falcon uses basic physics for flying, and high-level navigation uses an auto-generated 3D navigation graph. We run A* over the nav graph with the Unity Job System to keep it performant. For object avoidance, the bird cycles through different raycasts on different frames while steering around and over objects in the world. To tune the flying, we exposed tweakable values in the Inspector that we adjust in real time.
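A* over an explicit nav graph looks roughly like the following. This is a minimal single-threaded Python sketch of the algorithm, not the team's jobified Unity implementation; the graph representation is an assumption for illustration.

```python
import heapq
import math

def a_star(graph, start, goal, pos):
    """A* over an explicit nav graph.
    graph: dict node -> iterable of (neighbor, edge_cost)
    pos:   dict node -> (x, y, z), used for the straight-line heuristic."""
    def h(n):
        return math.dist(pos[n], pos[goal])  # admissible: never overestimates

    open_heap = [(h(start), start)]
    g = {start: 0.0}          # best known cost-so-far per node
    came = {}                 # back-pointers for path reconstruction
    closed = set()
    while open_heap:
        _, node = heapq.heappop(open_heap)
        if node == goal:
            path = [node]
            while node in came:
                node = came[node]
                path.append(node)
            return path[::-1]
        if node in closed:
            continue
        closed.add(node)
        for nb, cost in graph[node]:
            ng = g[node] + cost
            if ng < g.get(nb, math.inf):
                g[nb] = ng
                came[nb] = node
                heapq.heappush(open_heap, (ng + h(nb), nb))
    return None  # goal unreachable
```

Moving this search into the Job System is mainly a matter of giving each query its own working buffers so searches can run off the main thread without blocking gameplay.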
We’re fans of keeping our AI fairly simple, layering simple behaviors together to meet gameplay needs and feed our animation system. The falcon reacts almost instantly to player calls and commands, but when she’s flying around she’ll get bored and go hunting, collect items for the player, or land at a point of interest. If she’s hurt, she’ll fly back to the player and find a spot where the player can see her and tend to her wounds. For our ground-based creatures and enemies, we use Unity’s navmesh, and we use the same navmesh for player navigation.
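The layered-priority idea behind behaviors like these can be sketched as a handful of simple rules where the first matching condition wins. This Python sketch is hypothetical; the field names, threshold, and behavior labels are illustrative, not Outerloop's actual AI code.

```python
# Hypothetical priority-layered behavior selection for the falcon.
# Higher-priority rules (being hurt, player commands) shadow lower ones.

def choose_behavior(bird):
    """bird: dict with keys 'hurt', 'player_command', 'seconds_idle'."""
    if bird["hurt"]:
        return "return_to_player"      # fly back so the player can heal her
    if bird["player_command"]:
        return bird["player_command"]  # calls/commands override idling
    if bird["seconds_idle"] > 10.0:
        return "hunt"                  # bored: hunt, collect, or perch
    return "idle"
```

Keeping each rule trivially simple makes the combined behavior easy to debug: any surprising action traces back to exactly one condition.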
To speed up our early prototype while we were figuring out the gameplay, we used a lot of Unity Asset Store artwork like rock primitives, clouds, and props. It’s a lot faster to iterate on the design when we can use readily available content from the Asset Store. For the full development, we used other Asset Store items like texture tools, various libraries for sound, terrain tools, and even hat assets for the falcon. The Asset Store is invaluable for small teams like ours to speed up development time.
Social media is a great way to test out ideas. We posted different short gameplay clips, images, and gifs from Falcon Age early on. What resonated best were animations of the baby bird: gifs of her fist-bumping and otherwise interacting with the player were shared the most. My advice would be to test things out, find what resonates, and share more content related to what people find interesting about your game.