
Toxicity plagues games – no matter how good they are. Driven by a small minority of offenders, the culture of a game can easily tip, eroding its community and driving players away – possibly forever. Modulate has set out to change that – and we are excited to partner with them to support their mission.
By leveraging Modulate’s insights on player behavior, our matchmaker can now actively reduce toxicity by forming matches that foster healthier, more engaging gaming experiences.
Toxicity impacts games
No matter how cozy or competitive your game is, toxicity can strike you at any time. The very tools built to foster interaction in your community can be turned into tools of exclusion, degradation, and harassment. Voice chat, intended to let players interact with each other naturally, can be used by a vocal minority of trolls and exploiters to poison your game.
The outcomes are grim. Churn, reputational damage, and a community in distress can call your entire game philosophy into question. How can we foster a true sense of community if players run the risk of abuse?
Modulate’s insights can foster pro-social behavior
Modulate specializes in detecting and combating toxicity by proactively flagging harmful behavior in your community directly to moderators. ToxMod, Modulate’s voice chat moderation solution, listens to players’ voice chats to identify instances of toxic behavior as defined by your game’s Community Guidelines and Code of Conduct. Within seconds, ToxMod flags and shares analysis of player voice chat interactions in a central dashboard so your moderator and player experience teams can take action before toxicity drives other players away. How Modulate’s research suggests doing this has us absolutely fascinated:
Being fully customizable, ToxMod lets you decide what types of behavior are allowed in your games and align enforcement with your community guidelines. Moderators can review audio clips within ToxMod and determine if any given player interaction does in fact violate your Code of Conduct. Sure, moderators can send warnings, mute, or even ban particularly egregious offenders, but what other tools can developers and player experience managers use to prevent the spread of toxicity and encourage positivity?
Turns out, most players are not toxic by nature. According to Modulate’s aggregate customer data, only about 6% of players in a game’s ecosystem create 90% of toxic interactions. These players have toxic tendencies that they express freely when they feel no restraint. Even worse, when they are paired up with others of similar inclination, they get a perfect echo chamber in which to reinforce their negative behavior and team up as bullies. Research has shown the impact of online echo chambers in amplifying harassment behaviors.
That is where matchmaking comes in. Pairing potentially toxic players with outspoken, positive players suppresses their toxic tendencies and makes the game more positive overall.
Idem’s role: Setting up the right matches
Pairing up players in an effective manner to maximize game enjoyment is at the heart of what we do at Idem. In the end, matchmaking is all about reducing friction – whether it is friction introduced by differences in skill level, by high latency, or by the toxicity of players. As we just saw, pairing up the right combination of people on a team is what makes the game enjoyable. That is why the integration of Modulate and Idem makes perfect sense: there is no better place to regulate team setup than right at the matchmaker.
The integration ensures Modulate’s insights are used as intended. For each player entering the queue, Idem pulls their toxicity score. The matchmaker then forms teams in a way that best mitigates toxic behavior, isolating toxic players from potential echo chambers. That way, Modulate’s insights can be acted upon in the way their research suggests is most effective. And if toxic players need a bit of extra motivation, they can be penalized with lower matchmaking priority or excluded from matchmaking outright.
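To make the idea concrete, here is a minimal sketch of toxicity-aware team formation. The type names, the score range, and the formTeams and queuePriority helpers are hypothetical illustrations under assumed conventions, not Idem’s actual implementation; the point is simply to spread high-toxicity players across teams so they never cluster, and to push repeat offenders further back in the queue.

```typescript
// Minimal sketch of toxicity-aware team formation. All names here
// (QueuedPlayer, toxicityScore, formTeams, queuePriority) are illustrative
// assumptions, not Idem's actual implementation.

interface QueuedPlayer {
  id: string;
  toxicityScore: number; // assumed range: 0 (clean) to 1 (highly toxic), from moderation data
}

const TOXICITY_THRESHOLD = 0.7; // above this, a player is treated as high-risk

// Spread high-toxicity players across teams so they never cluster together.
function formTeams(queue: QueuedPlayer[], teamSize: number): QueuedPlayer[][] {
  // Assign the most toxic players first, in descending order of score.
  const sorted = [...queue].sort((a, b) => b.toxicityScore - a.toxicityScore);

  const teamCount = Math.floor(sorted.length / teamSize);
  const teams: QueuedPlayer[][] = Array.from({ length: teamCount }, () => []);

  // Round-robin assignment: consecutive high-toxicity players land on
  // different teams, breaking up potential echo chambers.
  sorted.slice(0, teamCount * teamSize).forEach((player, index) => {
    teams[index % teamCount].push(player);
  });

  return teams;
}

// Optional penalty: high-risk players get lower queue priority (matched later).
function queuePriority(player: QueuedPlayer): number {
  return player.toxicityScore > TOXICITY_THRESHOLD ? 0.5 : 1.0;
}
```

In practice a matchmaker would weigh this alongside skill and latency constraints rather than using toxicity alone; the round-robin spread above is just one simple heuristic for keeping high-risk players apart.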
Getting started with ToxMod is now even easier
For studios looking to integrate ToxMod into their game, this partnership makes getting started simple: Idem’s matchmaker has ToxMod natively integrated, and with an easy-to-use API and ready-built integrations into major backends (Pragma, Accelbyte, Beamable, etc.) as well as Steam and Unity, getting Idem’s matchmaker into your game is a walk in the park.
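To give a feel for what wiring this up might look like, here is a hypothetical enqueue call in which a toxicity score travels with the matchmaking request. The endpoint URL, payload shape, and field names are assumptions made purely for illustration and are not Idem’s or Modulate’s actual API.

```typescript
// Hypothetical sketch only: the endpoint, payload shape, and field names
// below are illustrative assumptions, not Idem's or Modulate's real API.

interface EnqueueRequest {
  playerId: string;
  gameMode: string;
  toxicityScore?: number; // supplied by the moderation backend, if available
}

async function enqueuePlayer(request: EnqueueRequest): Promise<void> {
  const response = await fetch("https://matchmaker.example.com/v1/queue", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(request),
  });

  if (!response.ok) {
    throw new Error(`Enqueue failed with status ${response.status}`);
  }
}

// Example usage:
// await enqueuePlayer({ playerId: "player-123", gameMode: "ranked", toxicityScore: 0.82 });
```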
Talking hard numbers: The benefits
Using anti-toxicity tooling will have a measurable impact on a number of dimensions, ranging from player satisfaction to the dry KPIs of revenue and churn. For good reason, major games like Call of Duty, GTA, and Among Us VR trust Modulate and their ToxMod platform to moderate their communities.
Read more about how Modulate helped Call of Duty to reduce churn, decrease toxicity exposure by 50%, and improve player engagement by over 25%: