Online and multiplayer games are notoriously toxic places. As various legislators eye ways to crack down on toxic behavior, game companies are beginning to understand the urgent need to put their house in order.
This week, Microsoft released its Xbox Transparency Report, which details the company’s efforts to mitigate toxic behavior, often through the use of content moderation tools.
The report states that the company notched up a 16.5x increase in “proactive enforcements,” which it defines as “when we use our portfolio of protective technologies and processes to find and manage an issue before it is brought to our attention by a player”.
The vast majority of those enforcements went against accounts that were “inauthentic” or involved in cheating. Others were hit with bans or suspensions for posting adult sexual content, vulgarity, profanity, harassment and bullying.
At this week’s GamesBeat Summit, a panel spoke on the subject of “How to do trust and safety right before you’re forced to do so.” One panelist, David Hoppe, is a partner at Gamma Law, specializing in the gaming and tech sectors. He pointed out the many efforts taking place to legislate against companies that tolerate – or fail to address – toxic gaming spaces.
The California Age-Appropriate Design Code Act is set to come into force this summer, but, as Hoppe stated, “there are also ruminations going on at the federal level, and there are seven additional states besides California that are considering similar laws”. He added: “Without a doubt, if we were to come back five years from now, it’s going to be a completely different environment for regulation of content and communications among users.”
Poison players
Companies like Microsoft are working on AI tools that will do the job of spotting toxic accounts before they have a chance to poison other players’ experiences. The 2022 Xbox report states that player reports are down 34 percent from the same period in 2021, which the company says is due to its tools intercepting toxicity at an early stage.
The report states that “player reports … are often first evaluated by content moderation technologies to see if a violation can be determined, with the remainder reviewed by human content moderation agents for decision-making.”
In the last year, Microsoft says that it widened its definition of toxic behavior. “We increased our definition of vulgar content to include offensive gestures, sexualized content, and crude humor … This policy change, in conjunction with improvements to our image classifiers, have resulted in a 450 percent increase in enforcements in vulgar content.”
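The report does not describe how that triage works internally, but the general pattern it sketches, where automated classifiers evaluate a report first and ambiguous cases fall through to human moderators, looks roughly like the following. This is a minimal, hypothetical illustration in Python; the classifier, threshold, and data structures are invented for the example and are not Microsoft’s actual tooling.

```python
# Hypothetical sketch of a classifier-first moderation triage pipeline.
# Names, thresholds, and structures are illustrative, not Microsoft's system.
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class PlayerReport:
    report_id: str
    content: str

@dataclass
class Decision:
    report_id: str
    action: str       # e.g. "content_removed", "no_violation"
    decided_by: str   # "classifier" or "human"

def triage(report: PlayerReport,
           classifier: Callable[[str], float],
           threshold: float = 0.9) -> Optional[Decision]:
    """Return an automated decision when the classifier is confident,
    otherwise None so the report is queued for human review."""
    score = classifier(report.content)  # estimated probability of a policy violation
    if score >= threshold:
        return Decision(report.report_id, "content_removed", "classifier")
    if score <= 1 - threshold:
        return Decision(report.report_id, "no_violation", "classifier")
    return None  # ambiguous: escalate to a human moderator

def moderate(reports, classifier, human_review):
    """Classifier handles clear-cut cases; humans decide the remainder."""
    return [triage(r, classifier) or human_review(r) for r in reports]
```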
Carrot and stick
Eve Crevoshay, executive director of Take This, a mental health advocacy, research and training nonprofit focused on gaming, spoke on the GamesBeat panel and called for companies to take a “carrot and stick” approach to stamping out toxicity, particularly white supremacist rhetoric, which is not uncommon in gaming spaces.
Crevoshay said: “There are norms and behaviors and ideologies [in gaming spaces] that have become really common. I don’t mean that they’re ubiquitous, I mean that they are a small but very loud problem. And that loudness means that it has become normalized.”
She pointed out that these norms include “misogynist white supremacist, neo-Nazi and other xenophobic language along with harassing and mean behavior”. She said that the game industry has yet to fully come to grips with the problem. “We haven’t really wrapped our heads around what it means to design for positive outcomes. We are starting to do that … But right now, we see really high incidences.”
A report last year from the Anti-Defamation League found that “nearly one in ten gamers between ages 13 and 17 had been exposed to white-supremacist ideology and themes in online multiplayer games. An estimated 2.3 million teens were exposed to white-supremacist ideology in multiplayer games.”
Richard Warren, another panelist at GamesBeat, is a partner at Windwalk, a consultancy dedicated to building gaming communities, often on Discord. He said that it’s difficult for game companies to manage communities on third-party apps, where young gamers often congregate, but that there are ways to influence the conversation. Companies, he said, should go about “setting a culture around self moderation inside communities, promoting people that are doing good deeds inside the community.”
For Crevoshay, the real danger is that toxic behavior becomes learned behavior, especially among youngsters. As it spreads, it shuts newcomers out of gaming spaces, which is bad for game companies.
“Kids are learning toxic behaviors because the environment is so filled with it,” she said. “It’s a real concern. And people are frozen out of games because they don’t feel comfortable in them. So we know it’s a limiting factor in the business.”
She added: “It’s ubiquitous, and it’s harmful to people. It propagates through communities, which are not effectively moderated.”
While organizations like Take This and some game companies are working to remedy the problem, time is running out. “We’re armed with the ability to say, ‘okay, we have the tools inside this industry to address this. We can change this’. But meanwhile the regulatory landscape is really heating up. People are starting to say, it’s time to put the hammer down.”
Photo credit: Andre Hunter at Unsplash
Colin Campbell has been reporting on the gaming industry for more than three decades, including for Polygon, IGN, The Guardian, Next Generation, and The Economist.