According to Unity’s new Toxicity in Multiplayer Games Report, the percentage of players who report witnessing or experiencing toxic behavior increased from 68 percent in 2021 to 74 percent in 2023. Half of the players surveyed say they regularly encounter toxicity in games.
However, the report contends that players are finding ways to fight back against the abuse they encounter. The vast majority of players – 96 percent – say they have either blocked or muted an abuser, left a game, or made use of in-game reporting options. This compares to 66 percent in 2021.
“In the battle against toxicity, multiplayer gamers are not passive bystanders,” states the report. “In the past year, nearly all players have responded to toxic behavior at some point, demonstrating their willingness to take action.”
The report includes a stark warning to game companies that fail to tackle toxicity – two thirds of players say they are “likely” to stop playing a game if they experience toxic behavior, while 74 percent say they would not try a new game if it has a reputation for toxicity.
People who work in games are well aware of the problem. Game developers are more likely than players to say they’ve noticed a rise in toxicity over the past year – 53 percent, compared to 32 percent of players.
Game companies, like Microsoft, are taking steps to thwart online assholes. Earlier this year, the company introduced a driving license-style points system to deter and punish toxic behavior. Although the system has its flaws, it’s a step in the right direction for a sector that has long been bad at policing its own spaces and enforcing rules.
Consumers are looking to game companies to set the pace. Players assign most of the responsibility for safety to game studios, attributing 45 percent of the responsibility for protection to themselves. Four out of five players agree that protection from toxic behavior should be a priority for game developers, a slight increase on the 2021 report.
“Developers and players see game safety as a shared responsibility, and they agree that developers, players, and the broader industry all have a role to play in building safe game environments,” states the report.
The report is based on interviews with players in South Korea, the United States, and the United Kingdom. The results are mostly in sync across the three countries, although U.S. players are more likely to believe that the responsibility to manage toxicity lies with players themselves (51 percent), compared to just over 40 percent in the other two countries.
Developers say they can see the benefits of reducing in-game toxicity – 98 percent believe that players will respond positively to changes. They claim that less-toxic games will lead to increased interaction between players (59 percent), continued engagement (57 percent), higher purchase levels of in-game items (41 percent), and a higher likelihood of recommending the games to friends and family (48 percent).
Around the world, players define toxic behavior according to their own experiences. Cheating or intentionally disrupting games is the most common form of toxicity, followed by hate speech, posting inappropriate content, abusing other players (often about the way they play), harassment, extremism, and predation.
The report features a sad selection of quotes from players who have witnessed and suffered abuse. One Korean man defined toxicity as: “People who continue to create and spread slander and abuse among the community.” An American woman said: “Use of inappropriate language and slurs.” A British man said: “Targeting individual players on a regular basis in order to harass, break down or humiliate them.”
Different kinds of games attract varied levels of bad behavior. Players say they are more likely to experience toxicity in shooting games (51 percent), sports games and racers (49 percent) and battle royales (44 percent) than card and deck building games (23 percent), puzzlers (29 percent) and adventures (30 percent).
Players are clear about the features and oversight they want to see in multiplayer games. A man in the UK said that there ought to be “clear rules for players’ behavior in the game to prevent malicious behavior or inappropriate communication that harasses players”. A woman in the U.S. said she expects “responsible community moderators who can interact with players”.
Unity has its own stake in effective moderation. The report highlights the company’s own Safe Voice AI tool which it claims gives studios and moderators “the insights they need to take action and tackle toxicity in their games”.
The report says that AI is bound to play a bigger role in moderation as the technology sharpens. “The rise of toxic behavior has created a growing need for ways to moderate more quickly and effectively. Technology is helping to address this challenge, enabling both proactive monitoring and player-initiated actions. By collecting evidence of toxicity and giving context about incidents, AI-powered tools are empowering players and human moderators to take action and make informed decisions.”
Some 68 percent of players say they are willing to be audio recorded if it helps fight toxicity. “Recording comes with the perks of holding players accountable and having solid proof of behavior,” states the report. However, 89 percent of players “have concerns about being recorded,” saying they want to be assured that their information won’t be misused (43 percent) or sold (31 percent), that it’s safe from data leaks (34 percent), and that it won’t be used to target them for non-moderation purposes (31 percent).
Most developers (69 percent) believe that players are fine with being recorded, but only to reduce toxicity. 87 percent of developers agree that players need to feel that audio recording is only being used for the safety of the community, and to root out bad actors.
The report concludes: “The gaming industry must empower creators with solutions to tame disruptive behaviors and promote positive and engaging communities within games. Addressing toxicity requires a collaborative approach that involves tech companies, studios, publishers, platforms, players, and other stakeholders. Understanding why and how toxic attitudes develop is crucial for the continued growth and success of online gaming.”
You can download the full report here.