In a positive shift toward inclusivity, an increasing number of game developers are prioritizing accessibility as an integral aspect of their creations. Unity is dedicated to providing support to help developers achieve their accessibility goals. As Leah Skerry presented during GAconf USA 2023, Unity has been actively working on mobile screen reader support, marking the first of many accessible runtime features slated to enhance gaming experiences in the coming years.
Because projects developed with Unity render their own graphical user interface (GUI), mobile screen readers were previously unable to work with Unity-made content. When an Android or iOS user opened a game made with Unity while a screen reader was running, there was no way to interact with it until the screen reader was turned off. Allowing users of all abilities to enjoy gaming on their mobile devices has been one of our main objectives, so let’s dive into what this support means for Unity developers today.
A screen reader is a form of assistive technology that conveys on-screen content in a nonvisual way, such as speech or braille. Mobile devices running Android and iOS have built-in screen readers – TalkBack and VoiceOver, respectively. This form of assistive technology is essential to people who are blind, and also useful to people with low vision, low literacy, or cognitive disabilities.
For mobile devices, screen readers use a text-to-speech (TTS) engine to translate on-screen information into speech. They can be used to navigate the UI by either touch or gestures.
Older games made with Unity are, by default, incompatible with screen readers. For a screen reader to navigate such an application, it has to receive information about which elements are accessible, where they are placed on the screen, what role they have, and how a user can interact with them. In other words, we needed a way to tell the screen reader, for example: there is a label at this position with this particular text; there is a button at that position with that particular text; and when the button is activated, this function should be called.
Starting with Unity 2023.2 Tech Stream, and improved in 2023.3 Tech Stream (now known as Unity 6 Beta), developers can convert their GUI into data that a mobile screen reader can use to navigate and interact with a Unity game. The API was designed to be independent of any particular GUI system, so it can be used by anyone developing a game with Unity – no matter what technology they use to implement their GUI. Non-GUI elements can also be represented as screen reader elements.
The screen reader API is a simple hierarchy of data structures containing the information a screen reader needs in order to allow interaction with each GUI element. Each node in the hierarchy typically represents an accessible element in the game, featuring a label (the first thing the screen reader reads when the node is focused), a position on screen, sometimes a value, and extra information that helps the screen reader describe the element – such as whether it’s a button or a toggle, or whether it’s disabled.
The order of nodes in the accessibility hierarchy is what defines the order in which the screen reader navigates a screen: sibling nodes (nodes at the same level of the hierarchy) are read in order, and a parent node is read before its children.
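To make this concrete, here is a minimal sketch of building such a hierarchy. It is based on the `UnityEngine.Accessibility` API introduced in Unity 2023.2 (`AccessibilityHierarchy`, `AccessibilityNode`, `AssistiveSupport`); the exact member names and signatures may differ between Tech Stream releases, so treat this as an illustration rather than a drop-in implementation.

```csharp
using UnityEngine;
using UnityEngine.Accessibility;

// Sketch: exposing a simple menu to the screen reader.
// API shapes are assumptions based on the Unity 2023.2 docs.
public class AccessibleMenu : MonoBehaviour
{
    void Start()
    {
        var hierarchy = new AccessibilityHierarchy();

        // A header node: "Main Menu" is read first when it gains focus.
        AccessibilityNode title = hierarchy.AddNode("Main Menu");
        title.role = AccessibilityRole.Header;
        title.frame = new Rect(0, 0, Screen.width, 100); // screen position

        // A sibling button node, read after the header because it comes
        // later at the same level of the hierarchy.
        AccessibilityNode playButton = hierarchy.AddNode("Play");
        playButton.role = AccessibilityRole.Button;
        playButton.frame = new Rect(0, 120, Screen.width, 80);

        // Make this the hierarchy the screen reader navigates.
        AssistiveSupport.activeHierarchy = hierarchy;
    }
}
```

Because the hierarchy is plain data, the same pattern works whether the underlying visuals come from UI Toolkit, Unity UI, or a custom renderer.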
Currently, the initial implementation of this API works only with mobile devices running Android or iOS. According to our product strategy, we are also considering extending support to macOS and Windows, both of which have native screen reader capabilities, and to desktop web browsers as well. While game consoles are not inherently accessible platforms, we are looking into what is possible for them, too.
Unity recognizes the significance of ensuring that every gamer, regardless of ability, can fully engage with all that the world’s developers create. This latest capability underscores our commitment to fostering an inclusive and enjoyable gaming industry for all players. Our Accessibility Team is just getting started and has a lot more to do – check our roadmap, along with the one for UI Systems, to learn more.
The APIs mentioned are documented in the Scripting API section of the Unity Manual; get started with the AssistiveSupport class. Additionally, we’ve put together a GitHub repository with a practical example (LetterSpell, pictured in the banner at the top) showing how to implement screen reader support in your Unity application or game, alongside extra AccessibilitySettings usage examples.
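A common first step is reacting to the screen reader being turned on or off. The sketch below assumes the `AssistiveSupport.isScreenReaderEnabled` property and `screenReaderStatusChanged` event from the Unity 2023.2 Scripting API; consult the documentation for the exact signatures in your Unity version.

```csharp
using UnityEngine;
using UnityEngine.Accessibility;

// Sketch: watching screen reader status to decide when to build
// (or tear down) the accessibility hierarchy. Member names are
// assumptions based on the Unity 2023.2 docs.
public class ScreenReaderWatcher : MonoBehaviour
{
    void OnEnable()
    {
        AssistiveSupport.screenReaderStatusChanged += OnStatusChanged;
        // Handle the case where the screen reader was already running
        // before this component was enabled.
        if (AssistiveSupport.isScreenReaderEnabled)
            OnStatusChanged(true);
    }

    void OnDisable()
    {
        AssistiveSupport.screenReaderStatusChanged -= OnStatusChanged;
    }

    void OnStatusChanged(bool enabled)
    {
        // For example: build the accessibility hierarchy when the
        // screen reader turns on, and release it when it turns off.
        Debug.Log(enabled ? "Screen reader enabled" : "Screen reader disabled");
    }
}
```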
We’d love to hear how you’re using our tools to support accessibility. Show us the amazing things you’re working on in Unity and send us feedback directly in the Accessibility forum.