With Rainbow Six Siege and Overwatch combating hate speech and harassment, what are others doing to tackle these issues?
Online gaming has a massive harassment problem.
We’ve all encountered it at some stage -- name-calling, teasing, the insidiousness of racism and homophobia. This poor behavior can be enough to push us away from a game, its community, or even away from online gaming as a whole. That is incredibly damaging not only to the player, but also to a game’s brand if word of player abuse reaches other, innocent players.
Now, it’s at a stage where publishers, developers, and the platform holders are demanding a lot better from their communities going forward. Two examples of this come from Blizzard and Ubisoft Montreal, who respectively instituted their own ways to combat harassment within Overwatch and Rainbow Six Siege.
In Overwatch, an endorsement system was introduced that lets players compliment one another by endorsing other players on their team for skill, sportsmanship, and being a good teammate. Blizzard has also introduced Looking for Group (or LFG for short), which lets players team up with people of similar interests. LFG groups have long existed in games, officially or unofficially (such as the LFG subreddit for Destiny), but with Overwatch, the hope is that an official LFG feature will help reduce harassment levels in the game, especially for women.
Meanwhile, Rainbow Six Siege has swung banhammers all over the place for abusive players. Players have received automatic bans without warning for using racist or homophobic language in text chat. After two suspensions, a third will warrant an investigation where the account in question could end up permanently banned from playing per its Code of Conduct.
But what other developers and publishers are taking the fight to the harassers in question?
GameDaily reached out to a handful of developers and publishers behind the biggest online games in the industry today, including the Big Three platform holders, to gather insight on their current policies on online harassment in their games, their approach to combating it, and how they plan to prevent harassment and hate speech in their upcoming games. As we put this piece together, we contacted Sony, Microsoft, Bethesda (Fallout 76, The Elder Scrolls Online, The Elder Scrolls Legends), Psyonix (Rocket League), Epic Games (Fortnite), and EA (FIFA 19, Battlefield V, Anthem), in particular.
We also requested comment from Nintendo, Rockstar Games (Grand Theft Auto Online, Red Dead Online), Activision (Call of Duty: Black Ops 4, Destiny 2: Forsaken), and Bluehole (PUBG). We were particularly interested in talking to Blizzard about the new systems in place for Overwatch and whether it would bring these features to its other online games (Hearthstone and Heroes of the Storm). Unfortunately, none of these companies responded to our numerous requests.
That silence could be considered massively problematic for Nintendo, considering it has just launched Nintendo Switch Online. Without any knowledge of its code of conduct for hate speech and abuse online, we’re still in the dark about how it will take action on those things in the infancy of a new online service -- one consumers will have to pay for to get the best out of the console.
We also followed up with Ubisoft about how its systems used in Rainbow Six Siege will continue to combat the increasingly problematic players, in addition to how it can carry those forward to Ubisoft’s ever-evolving online GaaS (games-as-a-service) catalogue, including the upcoming The Division 2.
Toxicity in Rainbow Six Siege
As Ubisoft continues to clamp down on hate speech and harassment within Rainbow Six Siege, its online policies get examined thoroughly on a regular basis.
“We are constantly looking at improving the experience of players within our game worlds,” senior director of media and community relations at Ubisoft, Michael Burke, told GameDaily.biz.
“For instance, we have pushed multiple updates to the Rainbow Six Siege code of conduct in the past year and a half. Ubisoft has a great culture of sharing knowledge and best practices across studios, so teams from different games also learn from each other's experiences. They frequently discuss what worked well for them and what didn't, and how best practices for improving the player experience can be applied to other games.”
Ubisoft isn’t the only one to run at abusers with a stick. Rocket League developer Psyonix has told GameDaily that it’s building a brand new “real-time language filter” for the game, which it hopes to fully implement by the end of the year.
“This new filter system will initially filter out inappropriate Tournament names, as well as Party chat once cross-platform parties are enabled in the Rocket League client,” said Devin Connors, community manager for the San Diego-based studio.
“This [feature] will eventually apply to all forms of player communication within the game, and our goal is to have this enabled in player chat by the end of 2018.”
Toxicity in the biggest games on the planet
As a game’s popularity rises, so do the chances of toxic elements growing within its player base. No game is safe from it, but when you think of the biggest games in the world, there are none bigger than Fortnite right now.
When asked for comment on its anti-harassment stance, developer Epic Games promptly pointed to the code of conduct on its website. It’s brief, but it runs on four core principles:
- Respect other players
- Play fairly and within the rules of play
- Keep account information safe and private
- Good luck and have fun!
Although Epic is a bit more jovial with its code of conduct, Rockstar Games definitely isn’t. Grand Theft Auto Online, five years after its original launch on PS3 and Xbox 360 and then subsequently on PS4, Xbox One, and PC, still drives revenue for the company and its parent Take-Two Interactive. And with Red Dead Online set to launch soon after the October 26 release of Red Dead Redemption 2, the industry will be watching with a careful eye to see what lessons are brought over from GTA Online to RDR Online.
But Rockstar’s code of conduct, while not as brief and peppy as Epic’s for Fortnite, is as stern as you’d expect from a company that has to protect the image of properties worth hundreds of millions of dollars (or in GTA’s case, billions). Rockstar’s not messing around either.
Paragraph 3 in its CoC reads: “You will not use the Online Features to create, upload, or post any material that is knowingly false and/or defamatory, inaccurate, abusive, vulgar, obscene, profane, hateful, harassing, sexually oriented, threatening, invasive of one's privacy, in violation of any law, or is inconsistent with community standards.”
And adds in paragraph 8: “You will not impersonate any other individual or entity in connection with your use of the Online Features.”
Ubisoft faces many of the same pressures as Rockstar when dealing with online harassment. It’s vital that the French publisher protects its properties and image, especially as a public company that only this past March managed to fight off a very public and ugly two-year hostile takeover attempt by Vivendi.
Michael Burke said that when it comes to bringing the anti-harassment measures from Rainbow Six Siege - one of the biggest competitive shooters in the industry, if not the biggest - to other in-house Ubisoft games, such as Ghost Recon Wildlands, The Division (and its upcoming sequel), and For Honor, it’s a collaborative process worldwide.
“Ubisoft studios across the world work closely with each other to exchange best practices and discuss how to adapt effective anti-toxicity features used in some of our games to other brands when needed and relevant.”
With Siege itself, though, the message is clear: as long as you don’t grief or harass, everyone is welcome.
“We feel strongly that Rainbow Six Siege should be a welcoming environment for everyone, and as such, will continue to work on improving our anti-toxicity measures over the coming years,” he said. “In the short term, we are finalizing more robust muting options, allowing our players to take more control over their gaming experience.”
Toxicity on the platform side
But as third-parties continue to fight the toxicity inherent in their communities, there is still an onus on the first-parties. Sony, Microsoft, and Nintendo have a responsibility as platform holders to work on keeping harassment and toxic behaviour off PlayStation Network, Xbox Live, and Nintendo Switch Online, respectively.
Nintendo didn’t respond to multiple requests for comment for this article. In fact, when searching online for its code of conduct, the most recent one we could find was from 2015, back when the Wii U was still a thing. To the best of our knowledge, there is no updated CoC that reflects the Nintendo Switch or Nintendo Switch Online.
Sony, for its part, has laid out its expectations in a series of blog posts. “While you are certainly entitled to think whatever you like, we must ask that players always adhere to the Community Code of Conduct to help us make PSN as safe and enjoyable as we can for everyone,” said Sony’s Luke Mears in one of them. “The Code of Conduct applies to everything you say or do on PlayStation Network including but not limited to: the Online ID you create for yourself; the images and text within your profile; anything you create or share within games.”
In a list of dos and don'ts, Sony’s Code of Conduct directs players not to stalk, bully, discriminate against, or abuse other users, and not to be disruptive or threatening. It goes on to say that suspensions and bans can be handed out depending on the severity of the rule broken. The company will email the suspended user within 24 hours of the offense to tell them they’ve been suspended from PSN and for how long. A full ban can also be issued for the most severe rule-breaking.
But suspension or even a permanent ban from the PlayStation Network isn’t the only punishment Sony dishes out to bad apples. Your console itself can be banned, cutting off any kind of online access on it.
“Console suspension means that your PlayStation system has been stopped from accessing PlayStation Network for a set amount of time. A ban means that you will not be able to access PlayStation Network at any point in the future.
“No users may use PlayStation Network with their local account on the system.”
PlayStation “will not refund you for any unused period of subscriptions or any unused wallet funds in line with the SEN Terms of Service” should you gain a permanent system ban.
Microsoft, whose first-party lineup takes a more online-centric, multiplayer-driven approach with the likes of Gears of War, Halo, and Forza, provided a statement to GameDaily.biz outlining its approach to Xbox Live in the Xbox One era.
“We do not tolerate harassment of any kind,” a company spokesperson told me. “At Microsoft, we believe that everyone has the right to create, play, and share their opinions about games without the fear of being a target of violence, harassment, or threats. Our code of conduct helps ensure everyone has a safe, secure, and enjoyable experience and any activity that violates these terms [will not be tolerated].”
Its code of conduct outlines that anyone who engages in certain behaviors (including but not limited to harming or harassing others, such as encouraging violence against people; screaming, intimidation, and bullying; or doxxing fellow players) or creates user-generated content on Xbox Live that would harm others (including but not limited to hate speech or threats towards “people who belong to a group, including groups based on race, ethnicity, nationality, language, gender, age, disability, veteran status, religion, or sexual orientation/expression”) can face content deletion, restrictions, or even a permanent ban depending on the severity of the rules broken.
But Microsoft’s no-tolerance approach for abusers goes beyond a mere code of conduct. The company has been running its Gaming for Everyone initiative, where its mission statement includes making the Xbox platform a better place to play for everyone. And at his DICE 2018 keynote earlier this year, Xbox boss Phil Spencer said the industry needed to do better in regards to making minorities and women feel more welcome in online gaming spaces.
Toxicity within live games
Since starting out as a PlayStation Plus freebie, Rocket League’s popularity has exploded with millions of sales across five platforms, and the game has become an esports staple.
Rocket League has long had a chat ban system in place: if toxic players harassed others or used abusive language, they’d be banned from the game’s chat (except for private matches or online party chat).
The developer also detailed its ban practices for the game last year, including the then-new language ban system, under which a player who uses a banned word a certain number of times is automatically subject to a ban. The length can vary from 24 hours all the way to a permanent ban.
“These policies are the cornerstone of our current Language Ban system and how it operates, as we aim to keep our game a safe, friendly environment for all of our players,” said Connors.
Connors gave a post-mortem talk on the language ban system at GDC, which you can watch online. He also told GameDaily.biz that Psyonix’s policies are reviewed on a regular basis.
“Our player behavior policies are discussed frequently -- typically multiple times per month during broader, mass action/ player behavior sync-ups between our Community, Customer Care, and Analytics teams,” he said. “Additional input from our Design, Legal, Online Services and Publishing teams is sought when significant changes need to be discussed or made.”
With its ban system for abusive players, Rainbow Six Siege is helping to lead the push against toxic behaviour and harassment. But at the end of the day, said Ubisoft’s Michael Burke, it’s all about having fun with your favorite game and treating each other with kindness and respect.
“We encourage all players to treat each other with respect,” he said. “We are constantly working with our teams and partners to find ways to reduce toxicity and improve players’ experience, whether by implementing chat filters or by suspending or banning players that violate a game's code of conduct. Generally speaking, a game's code of conduct prohibits the use of any language or content deemed dangerous or toxic, whether on forums, via the in-game chat, or over VOIP.”
Potential toxicity within upcoming games
Bethesda and Electronic Arts have the community’s eyes on them as they gear up for their next major launches: Fallout 76 and Anthem.
Bethesda’s experience with online games like The Elder Scrolls Online, free-to-play shooter Quake Champions, and card game The Elder Scrolls Legends has given it valuable insight into how to build positive communities. But with Fallout 76, the industry will be watching the publisher very closely to see how it deals with harassment and hate speech in one of its marquee franchises from its flagship studio.
That said, Bethesda has already detailed some of the ways it’ll deal with the dirty players.
At Quakecon this year, game director Todd Howard revealed that players who continue to provoke others will be marked as wanted, appearing on the map as a red star, and will be hunted by other players.
The provoking player also won’t be able to see other players coming for them on the map, and if they end up getting killed, the player who killed the troll will be rewarded with caps taken straight from the troll’s own stash. That gives an incentive to either play nice or face consequences that are not only detrimental to the harasser but also advantageous and fun for the players hunting them. It’s very clever game design.
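The incentive loop Howard described can be sketched as a small simulation. Bethesda hasn't published implementation details, so the class names, the provocation threshold, and the bounty amount below are all hypothetical stand-ins:

```python
# Hypothetical sketch of the wanted-player mechanic Todd Howard described:
# repeat provokers get flagged, lose sight of approaching players, and pay
# the bounty out of their own caps when killed. All names and numbers here
# are illustrative, not Bethesda's actual implementation.

PROVOCATION_THRESHOLD = 3  # offenses before a player is flagged as wanted
BOUNTY = 50                # caps taken from the wanted player's own stash

class Player:
    def __init__(self, name: str, caps: int = 100):
        self.name = name
        self.caps = caps
        self.provocations = 0

    @property
    def wanted(self) -> bool:
        # A wanted player shows up as a red star on everyone else's map.
        return self.provocations >= PROVOCATION_THRESHOLD

    def can_see_hunters(self) -> bool:
        # The wanted player loses the ability to see players coming for them.
        return not self.wanted

def claim_bounty(hunter: Player, target: Player) -> int:
    """On killing a wanted player, transfer the bounty from the troll's
    own caps to the hunter; wanted status then clears."""
    if not target.wanted:
        return 0
    paid = min(BOUNTY, target.caps)
    target.caps -= paid
    hunter.caps += paid
    target.provocations = 0  # the slate is wiped once the bounty is paid
    return paid
```

What makes the design clever is visible even in this toy version: the punishment is funded by the offender and paid to other players, so the system recruits the community to enforce it rather than relying solely on moderators.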
“We take community conduct and player safety extremely seriously and operate a zero tolerance policy towards hate speech and online harassment,” said Bethesda in a broad statement to GameDaily.biz.
“Anyone experiencing or witnessing this type of behaviour in our games should use the in-game reporting tools featured in all of our online games to alert our Customer Service team. Our Customer Service team investigates every issue they receive, and they take appropriate action whenever necessary, including — but not limited to — temporary suspension, banning, and the permanent closure of forum and game accounts, if the circumstances warrant it.”
Bethesda doesn’t have a code of conduct for its games, but the community code of conduct for its forums is a good indicator of what it deems hate speech, harassment, and more. Similar rules are in place for The Elder Scrolls Online.
EA, on the other hand, has three big launches on its collective hands. Blockbuster sports game FIFA 19 went live on September 28; Battlefield V, arguably the most anticipated shooter this year, is set to launch in November after a delay from its original October date. And then there’s Anthem, the brand new IP from RPG powerhouse BioWare, set to launch in February next year.
Although it didn’t provide a general statement outlining how it would specifically deal with harassment and hate speech for those games, EA provided GameDaily.biz with links to its code of conduct that outline how it perceives harassment in its titles.
EA’s definition of harassment and hate speech is as follows: “Behavior or content that is harmful, offensive, defamatory, obscene, harassing, threatening, hateful, degrading, intimidating, discriminatory, or otherwise breaks the rules in our User Agreement.”
“Be kind to one another,” says one of its rules of conduct. “Don't harass, embarrass, or threaten other players. This includes sending messages to them repeatedly. If they don't respond, they may not want to talk.”
It gets a lot more stern in the legal terms of service provided by the company.
In section 6.5 under its rules of conduct section, it says that users agree they will not “harass, threaten, bully, embarrass, spam or do anything else to another player that is unwanted, such as repeatedly sending unwanted messages or making personal attacks or statements about race, sexual orientation, religion, heritage, etc. Hate speech is not tolerated.”
A similar point is made for its UGC (User Generated Content). “Contribute UGC or organize or participate in any activity, group or guild that is inappropriate, abusive, harassing, profane, threatening, hateful, offensive, vulgar, obscene, sexually explicit, defamatory, infringing, invades another's privacy, or is otherwise reasonably objectionable.”
And in section 6.8, users are warned not to “publish, post, upload or distribute any content, such as a topic, name, screen name, avatar, persona, or other material or information, that EA (acting reasonably and objectively) determines is inappropriate, abusive, hateful, harassing, profane, defamatory, threatening, hateful, obscene, sexually explicit, infringing, privacy-invasive, vulgar, offensive, indecent or unlawful.”
Toxic gamers have run roughshod in games for far too long, to the point that they’re not only ruining decent communities but actively pushing players away. And instead of leaning heavily on community managers to pull communities out of their toxic goo, publishers and major platforms continue to devise new ways to build inclusivity and bring down the banhammer on the problematic folks.
Sony and Microsoft continue to shape the platform experience on PlayStation and Xbox in the same way, though the fact that Nintendo has not published a newer, up-to-date code of conduct for Nintendo Switch Online is incredibly concerning.
There’s even talk of South Korea, one of the biggest competitive gaming countries in the world, making harassment in online games punishable by law.
And while these aren’t bulletproof ways to keep toxic gamers out - these players will exist one way or another despite the best efforts of developers and publishers - it’s a massive indicator that the industry is continuing to seek out ways to provide a safe, harmless, and fun experience for players, regardless of the genre of game in question.