Toxic behavior can turn a potentially awesome gaming experience into a nightmare. When insults, trolling, and derogatory remarks are thrown around, many gamers simply peace out and leave the game behind. This player exodus can hurt a game's community and longevity.
Studios dedicate enormous amounts of resources and people to make in-game communities safer, but the tools required to foster healthy, fun, and engaged player communities haven’t been up to the task – until now.
Safe Voice from Unity Gaming Services (UGS) identifies disruptive and toxic behaviors exhibited by players in your game. Harnessing advanced machine learning and deep learning techniques, Safe Voice analyzes voice communications so you can identify and address toxic behavior effectively and efficiently. It helps you understand which behaviors are negatively impacting your player experience and gives you the insights you need to create communities that are safe by design.
Current anti-toxicity solutions lack contextual understanding of interactions between players. They often rely on speech-to-text or transcription-based analysis, which falls short in distinguishing actual harmful behavior from harmless banter and fails to identify players who use alternate forms of disruption, like loud noises. Words alone are easily misinterpreted, which further limits the effectiveness of assessing intent.
Many solutions also rely solely on player reports to take action on toxicity. This places the burden on players to actively report violations and ignores instances where players might just leave the game, never to return.
Safe Voice overcomes these limitations by using context-aware AI technology. It can detect audio disruptions, like loud music, as well as toxic behaviors, then classify these into more than a dozen categories, including obscenities, threats, insults, identity attacks, problematic speech, and verbal attacks. Safe Voice analyzes unique voice characteristics like tone, loudness, pitch, and emotion to deliver nuanced insights on both session- and player-based metrics.
You can customize which interactions are monitored, rely on player-initiated or proactive situational triggers, and prioritize reports based on what matters most to your communities.
Moderation can be resource-intensive, especially for games with large player bases. This can lead to delays in addressing player reports, slower response times, and constantly overwhelmed moderation teams.
Safe Voice’s AI-powered detection and categorization automates formerly manual processes and provides results in near real-time so you can take action faster. You can customize Safe Voice to your team’s specific needs to help your moderation squad be more efficient.
Safe Voice kicks in when players flag instances of toxic or disruptive behaviors. You can also proactively monitor activities in your game like player mutes, exits, or when a vulnerable player is in a session. The service categorizes and prioritizes sessions that need attention based on your preferences, so no legitimate player report or serious violation is lost.
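The real trigger and prioritization settings live in the UGS dashboard, but the underlying idea of weighting triggers and ranking sessions can be sketched in a few lines. Everything below (trigger names, weights, function names) is a hypothetical illustration, not the product's actual API:

```python
# A toy sketch of trigger-based session prioritization. The trigger names
# and weights here are illustrative assumptions only.
TRIGGER_WEIGHTS = {
    "player_report": 3,             # player-initiated flag
    "vulnerable_player_present": 3, # proactive situational trigger
    "player_mute": 2,
    "player_exit": 1,
}

def session_priority(triggers):
    """Score a session by summing the weights of the triggers it fired."""
    return sum(TRIGGER_WEIGHTS.get(t, 0) for t in triggers)

def rank_sessions(sessions):
    """Order session IDs from most to least urgent.

    `sessions` maps a session ID to the list of triggers observed in it.
    """
    return sorted(sessions, key=lambda sid: session_priority(sessions[sid]),
                  reverse=True)

queue = rank_sessions({
    "session-a": ["player_exit"],
    "session-b": ["player_report", "vulnerable_player_present"],
    "session-c": ["player_mute", "player_exit"],
})
print(queue)  # most urgent session first
```

The point of a scheme like this is that a serious combination of signals (a report plus a vulnerable player) rises to the top of a moderator's queue even when the queue is long.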
Some game genres have a higher likelihood of toxic behavior due to their competitive nature, anonymity, or the intensity of player interactions. Perceptions of toxicity can also vary based on individual experiences and community culture.
Your toxicity detection tools should align with the norms and culture of the community you want to create. With Safe Voice, you can build keyword lists to target or deprioritize specific words or phrases and customize toxicity threshold scores to make sure your coverage is as unique as your game and community. This contextual analysis is critical to keeping the essence of your game alive while developing a safe environment for your players.
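Keyword lists and threshold scores are configured in the UGS dashboard, but the principle of keyword-aware thresholding can be shown with a small sketch. The keywords, threshold values, and function below are assumptions for illustration, not Safe Voice's real configuration:

```python
# Illustrative only: a community tunes a baseline threshold, a list of
# phrases to always surface, and a list of accepted banter to deprioritize.
TOXICITY_THRESHOLD = 0.7                  # flag utterances at or above this score
TARGET_KEYWORDS = {"trash", "uninstall"}  # community-specific phrases to surface
DEPRIORITIZED_KEYWORDS = {"gg", "rekt"}   # accepted banter in this community

def should_flag(utterance: str, toxicity_score: float) -> bool:
    """Decide whether an utterance deserves moderator attention."""
    words = set(utterance.lower().split())
    if words & TARGET_KEYWORDS:
        return True  # targeted keywords are always surfaced
    if words & DEPRIORITIZED_KEYWORDS:
        # recognized banter must clear a stricter bar before it is flagged
        return toxicity_score >= TOXICITY_THRESHOLD + 0.2
    return toxicity_score >= TOXICITY_THRESHOLD
```

In a competitive shooter, "rekt" might be friendly trash talk to deprioritize; in a game aimed at younger players, the same word might belong on the target list. That per-community tuning is what contextual analysis buys you.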
Safe Voice’s detailed reporting dashboards give your team a thorough, nuanced understanding of your community. The overview dashboard shows trends over time and surfaces the behaviors that drive the most disruption in your game. You can see a detailed breakdown of player attributes and behaviors and categorize players as toxic or vulnerable. The data is updated in near real-time and reflects your custom priorities.
Today’s largest online communities, like those in Valorant and Rainbow Six: Siege, rely on Unity Voice Chat (Vivox) to connect their players through in-game communication across platforms. With Safe Voice, Voice Chat users can now add a layer of safety to their players’ gaming experience.
We invite you to join us in creating the game communities of the future, where safety and inclusivity are top priorities. Whether you’re experiencing toxic behavior in your game or want to be proactive in establishing a safe environment from the start, Safe Voice can help.
Chat with us in the forums about Safe Voice and how we can help with any toxicity you’re navigating in your game communities.