It’s no secret that online gaming has a massive harassment problem, but the good news is that more and more publishers and developers are taking steps to fight back against the trolls. Electronic Arts is among them: at its EA Play event ahead of E3 2019, the publisher held its first-ever Building Healthy Communities Summit, inviting 200 members of its community-based Game Changers program to discuss what can be done to alleviate the awful toxic behavior in games today.
The company outlined its efforts on its corporate blog today, while acknowledging that according to its partner Ditch the Label, “nearly 60 percent of players have been bullied in an online game. That’s just not acceptable.” EA stressed that bullying and toxicity in general “is getting more intense,” and that’s why it’s opened the discussion with members from its online community across 20 different countries. It’s a problem that requires a global effort and a collective push from the entire industry.
“While this negative experience has largely been normalized up until this point, this is a time to challenge the status quo in pursuit of a safer, more inclusive gaming experience,” said Adam Tanielian, EA’s senior director of global community engagement.
EA stressed that it’s already taken action in the past to fight toxicity.
“Over the past three years we have worked to reach youth through campaigns like Gamers Unite for Equal Play, which last year reached almost 110,000 young people in the UK and the US. We also introduced an updated public reporting tool available to all players. While we have made strides, we believe with our player communities’ help, we can continue to improve the player experience,” Tanielian continued.
During the summit, EA held a number of closed-door sessions with its community to gather critical feedback on the toxicity problem, a process it likened to playtest feedback during game development. Sessions covered topics such as toxicity detection and technology; EA’s inclusion framework and how it can “help developers create games that reflect the diverse communities they engage through games”; and the factors that lead to or even encourage toxicity, including “how a zero-tolerance policy might affect change across EA games and the industry at large.”
Coming out of the summit, EA has made a number of commitments, including forming a council to “provide ongoing feedback into EA programs, policies, and platforms, including additional avenues for community feedback,” and a promise to explore new toxicity tools and in-game features “to more easily manage and effectively report disruptive behavior in our services.” Moreover, the publisher will now inform the community with quarterly reports about the progress being made and any new initiatives that are being started.
EA’s Building Healthy Communities Summit immediately follows the company’s “Play to Give” pledge, which provided a total of $1 million to three nonprofits working to build equality worldwide and fight back against bullying. It’s certainly encouraging to see one of the industry’s top publishers seeking to lead by example in the fight against toxicity and for inclusivity and diversity.
That being said, if EA wants to do the right thing by its community, it’s going to have to carefully evaluate who it partners with in the influencer crowd as well. As noted by GamesIndustry.biz, the streamer Guy “DrDisrespect” Beahm was paid for his appearance at EA Play and then went on to see his E3 badge revoked and his Twitch channel suspended for bringing a camera crew into a public bathroom where he apparently filmed himself and others.
Slimy influencers aside, EA’s recent initiatives are representative of a larger fight against toxicity throughout the industry. We’ve seen Blizzard and Ubisoft Montreal introduce ways to combat harassment in Overwatch and Rainbow Six Siege, respectively, while Microsoft has continually pushed for inclusivity and accessibility through its Gaming For Everyone program. Not only that, but we’ve witnessed the rise of technological solutions as well, as companies like Spirit AI have created tools that can weed out toxic behavior in both text chat and voice chat.