Content with "scripted or simulated" violence is no longer subject to age restrictions.
In a policy enforcement blog post, Google has announced that it is relaxing its stance on scripted and video game violence on YouTube. No longer will videos that contain “scripted or simulated violent content found in video games” be age-restricted, meaning you won’t be required to verify your age when viewing such content.
“We know there’s a difference between real-world violence and scripted or simulated violence, such as what you see in movies, TV shows, or video games, so we want to make sure we’re enforcing our violent or graphic content policies consistently,” the post reads.
There is a caveat, however, according to one YouTube employee: "To clarify, we may still age-restrict gaming videos if violence is the sole focus – more graphic scenes like dismemberment, decapitations, showing of human corpses with these severe injuries may be age-restricted, while less graphic content may be approved."
Content creators and YouTube gaming influencers are the audience most likely to be affected by the updated guidelines. In the past, videos featuring significant amounts of violence risked being demonetized, meaning the video’s creator could not receive payment from advertisers. YouTube’s policing algorithm could automatically flag a video for being too violent, which has been a point of contention between Google and content creators for some time.
For Doron Nir, CEO of streaming services provider StreamElements, Google’s update to the game violence guidelines represents a first step toward a more discerning platform.
“There is an obvious difference between real and animated violence, so it makes sense to have a different criteria for what is acceptable,” Nir told GameDaily. “Taking this step is a smart first move, but there are other ways gaming content needs to be parsed, such as M-rated versus E-rated games, so it will be interesting to see if that is being addressed with these new changes.”
On Twitter, reaction to the changes has been almost universally positive. Even Ed Boon, co-creator of the Mortal Kombat series, tweeted praise for the new rules. Violent games like Mortal Kombat 11 were the impetus for the original age-restriction guidelines in the first place.
“Hopefully this is good news for Streamers/YouTubers who feature M-rated video games,” Boon said.
“Finally something good for creators. Hopefully Mortal Kombat players won't have to use filters over their footage anymore,” read one reply.
Dani Plays, another YouTuber, also tweeted praise:
“Really happy to see YouTube treating video game violence the same way as movies now. There's a lot of gaming content creators, including me, who's been hit by this age-restriction unfairly and automatically too. This is a great step the right way YouTube!”
Some content creators might ask why it’s taken this long for YouTube to address these concerns, and argue that age-gating videos is an arbitrary step that does little to deter viewers. But compliance with the Children’s Online Privacy Protection Act of 1998 goes a long way in shaping corporate policies. Given Google’s relaxing of its guidelines, it may not view COPPA as the be-all and end-all when it comes to filtering the content available on its platform.
YouTube has long been trapped between content creators and the need to avoid liability for damage done to malleable, young minds. That balancing act is not an enviable one, to be sure, but it’s become incredibly common in this digital age. Hopefully, these guideline updates point to a more harmonious future for the platform.
(Updated 12/3/2019 with comments from StreamElements)
For more stories like this one delivered straight to your inbox, please subscribe to the GameDailyBiz Digest!