YouTube said on Wednesday it would remove videos that deny factual catastrophes such as the Holocaust ever happened and stop sharing ad revenue with channels that skirt too close to its rules, a major policy reversal as it fights criticism that it provides a platform for hate speech and harassment.
The streaming service, owned by Alphabet Inc’s Google, said it was taking aim at videos claiming school shootings and other “well-documented violent events” are hoaxes. It also will remove videos that glorify Nazi ideology or promote groups that claim superiority to justify discrimination.
In addition, video creators who repeatedly brush up against YouTube’s hate speech policies, even without violating them, will be removed from its advertising revenue-sharing programme, YouTube spokesman Farshad Shadloo said.
YouTube has for years stood by allowing diverse commentary on history, race and other fraught issues, even if some of it was objectionable to many users.
But regulators, advertisers and users have complained that free speech should have its limits online, where conspiracies and hate travel fast and can radicalise viewers. The threat of widespread regulation, and a few advertiser boycotts, appear to have spurred more focus on the issue from YouTube and researchers.
In a blog post, the company did not explain why it changed its stance but said, “We’ve been taking a close look at our approach towards hateful content in consultation with dozens of experts in subjects like violent extremism, supremacism, civil rights and free speech”.
YouTube acknowledged the new policies could hurt researchers who seek out objectionable videos “to understand hate in order to combat it”. Several independent journalists also criticised the policy for targeting their work. The policies could also frustrate free speech advocates who say hate speech should not be censored.
Other types of videos to be removed under YouTube’s new rules include conspiracy theories about Jews running the world, calls for denying women civil rights because of claims they are less intelligent than men, and some white nationalist content, Shadloo said.
YouTube said creators in the revenue-sharing programme who are repeatedly found posting borderline hate content would be notified after too many such postings and could appeal their termination. The company did not immediately respond to questions about what the limit on such postings would be.
Recently, YouTube has faced controversy over its refusal to remove videos from conservative commentator Steven Crowder, in which he uses homophobic slurs to describe Vox reporter Carlos Maza. YouTube said Crowder had not told people to harass Maza, and that the primary point of his videos was to offer opinion, so they did not violate YouTube’s anti-harassment policies.
Criticism of the decision poured out online. YouTube later said it had removed Crowder’s ability to make money on YouTube.
Crowder did not immediately respond to the Associated Press news agency’s request for comment but posted a video on Twitter saying his channel is not going anywhere.