
How AI technology is finally making online gaming less toxic

  • Online gaming communities often fail to make players feel safe and included due to toxic text and voice chat lobbies.

  • AI chat moderation flags potentially toxic behavior so it can be reviewed and addressed by a team of human moderators.

  • AI voice chat moderation is already used in popular multiplayer games like Call of Duty, Among Us, and GTA Online.

Spending time in a game lobby or chat channel can be miserable. Some communities have become so toxic that they drive players away before they even get into the game. Some publishers believe AI could be the answer to making their games accessible to all players.

Gaming has a toxicity problem

I come from an old-school gaming background, so most of my formative years were spent playing single-player games or couch competitions on a console. Trash talking was the order of the day, but you knew your limits when you were on the couch next to your friends, and no one ever took trash talking personally. The Internet has changed all that.

While you can play local multiplayer games with Steam, you will usually play online in a lobby. If you play online multiplayer shooters like Valorant or Call of Duty, you will quickly realize the extent of the toxicity problem that many online shooters face. Data from 2022 suggested that Call of Duty had perhaps the most toxic fanbase of any game, making the game look bad in the eyes of new fans. The franchise also has a notorious history of harmful behavior, including the infamous “swatting” trend from a few years ago.

Activision, the publisher of Call of Duty, was unhappy with this reputation and decided it was time to change things.

Gamers have been looking for non-toxic games for a long time, but most gaming companies don’t know how to approach the toxicity problem among their player bases. Call of Duty recently implemented a system that uses AI to detect toxic behavior in lobbies and deal with players spreading negativity. The results have been surprisingly effective.

How the AI handles toxic players


So you’re in a game lobby, and someone starts hammering you with trash talk. It escalates into disgusting language and then outright insults. You report the player, but you don’t even know if anything will be done about it.

This is the reality of many online gaming spaces these days. According to Activision, more than four million accounts have been subject to enforcement actions for toxic behavior since January 2024, yet the behavior persists.

Activision’s old approach wasn’t working, so they decided to shift gears.

The company previously partnered with Community Sift for text-based moderation, which seemed to work well. However, in console-based lobbies most communication happens over voice, so voice-based moderation is necessary. That’s where the company’s AI-powered voice chat moderator comes in.

The moderator, known as ToxMod, comes from the company Modulate.ai and has a lot of power within the Call of Duty ecosystem. ToxMod uses vocal nuances, speech patterns, accents, and other cues to determine whether a user is saying something toxic to other lobby members. The moderator can issue warnings and report player accounts for human enforcement and banning.
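
To make that flow easier to picture, here is a minimal sketch of how a pipeline like this might combine several voice cues into a single score and decide what to do with a clip. This is not Modulate’s actual code or API; the signal names, weights, and thresholds are illustrative assumptions.

    # Minimal sketch (not ToxMod's real code or API): combining a few voice
    # signals into one toxicity score and routing the clip accordingly.
    # Signal names, weights, and thresholds are illustrative assumptions.
    from dataclasses import dataclass

    @dataclass
    class VoiceClipSignals:
        slur_likelihood: float    # 0-1, from a phrase/keyword classifier
        aggression: float         # 0-1, from tone, volume, and speech-pattern cues
        listener_reaction: float  # 0-1, how negatively other players respond

    def toxicity_score(s: VoiceClipSignals) -> float:
        # Weighted blend of the signals; a real system would tune these empirically.
        return 0.5 * s.slur_likelihood + 0.3 * s.aggression + 0.2 * s.listener_reaction

    def route_clip(s: VoiceClipSignals) -> str:
        # Decide between doing nothing, warning the player, or escalating
        # the clip (with its log) to a human moderator for enforcement.
        score = toxicity_score(s)
        if score >= 0.8:
            return "escalate_to_human_review"
        if score >= 0.5:
            return "issue_in_game_warning"
        return "no_action"

    # Example: loud, sustained abuse that other players react badly to.
    print(route_clip(VoiceClipSignals(0.9, 0.7, 0.8)))  # escalate_to_human_review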

What games currently run the AI chat moderator?

Titles like Among Us and Grand Theft Auto Online have both implemented the AI chat moderator. GTA Online players first experienced ToxMod in beta testing in December 2023, but many users were concerned about their privacy. Rockstar, the game’s publisher, told players that it was simply testing the system and would consider implementing it in 2024. To date, however, there haven’t been many updates on the AI moderation front.

Among Us also implemented the system in its VR version in April 2024. These popular games have made ToxMod a go-to option for dealing with questionable speech and keeping gaming lobbies acceptable to everyone. It’s a step forward in helping online multiplayer games shed their toxic past and become accessible to more people.

What gets flagged and what gets enforced


According to ToxMod, the AI voice moderator uses emotional signals to determine whether something offensive has been said. It also uses the reactions of other players in chat to help identify offensive statements.

The system is designed so that humans have the final say: ToxMod sends detailed logs of the audio conversation to a human moderator, who can then issue a ban if they find that the conversation violates the code of conduct. It is a reporting tool designed to help locate the problem, rather than a fully automated moderation system.
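
As a rough illustration of that human-in-the-loop design (an assumption-heavy sketch, not Activision’s or Modulate’s real tooling), the flagged clip travels as a report with its log attached, and only the moderator’s decision turns it into a ban:

    # Hypothetical sketch of the human-in-the-loop step described above:
    # the AI only files a report with a detailed log, and a human moderator
    # makes the final call. Class and field names are assumptions.
    from dataclasses import dataclass

    @dataclass
    class FlaggedReport:
        player_id: str
        conversation_log: str   # detailed log of the flagged audio conversation
        ai_confidence: float    # how confident the model is (0-1)
        decision: str = "pending"

    def human_review(report: FlaggedReport, violates_code_of_conduct: bool) -> FlaggedReport:
        # Only the moderator's judgment turns a flag into an enforcement action.
        report.decision = "ban" if violates_code_of_conduct else "dismiss"
        return report

    # The AI files the report automatically, so the player never leaves the match.
    report = FlaggedReport("player_123", "[flagged voice chat excerpt]", 0.91)
    print(human_review(report, violates_code_of_conduct=True).decision)  # ban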

So why is ToxMod so important if there are already ways for users to submit reports on toxic players? Activision’s problem is that going to the report screen takes players out of the game. When this happens, they are less likely to want to continue playing. ToxMod’s automation keeps players immersed by creating the report for them.

Many of us are used to simply muting problem players and moving on, while ToxMod ensures that a report is made so that the instigator can be dealt with.

How has ToxMod helped Call of Duty?

While it’s easy to assume that an AI voice moderator could make games like Call of Duty less toxic, having concrete numbers is much better. In this case, Activision estimates that voice chat was up to 50% less toxic in North America for Call of Duty: Modern Warfare III and Call of Duty: Warzone. The same source mentions an 8% reduction in repeat offenders and an overall 25% reduction in toxicity on the platform.

From the numbers, it is evident that ToxMod has had a positive impact on the chat ecosystem. However, this should be taken with a grain of salt: AI voice moderation tends to have trouble with certain regional accents, so it is essential that a human issues the bans to avoid wrongful punishments.

Despite everything, some players are unhappy about having been punished over voice chat. No matter how effective AI voice moderation is, mistakes can still be made. We think this is a great year to discover what’s new in Call of Duty, especially with the reduced toxicity.

Building the future of gaming we want

Artificial intelligence has many uses, from powerful image generation on your PC to serving as a personal trainer. It’s only natural that someone would find a way to moderate offensive voice chat in multiplayer games to make them more welcoming.

I would like to see a multiplayer gaming community where people support each other and don’t abuse other players out of prejudice. There’s a time and place for trash talking, but there’s a line where it gets nasty and abusive.

If AI voice chat moderators are the way to get us to a place where gaming is fun again, I’m all for implementing them.