© Pixabay / germany_photography
YouTube will delete thousands of accounts after banning "supremacists", conspiracy theorists and other harmful accounts, it has said.

The decision was made after an in-depth review of its rules on hateful content, YouTube said. While it has always banned hate content in general, the site has allowed some specific kinds of harmful videos - such as those promoting Nazi ideology or claiming 9/11 did not happen - to remain hosted.

Those videos, as well as other kinds of "supremacist" content, will now be officially banned.

"Today, we're taking another step in our hate speech policy by specifically prohibiting videos alleging that a group is superior in order to justify discrimination, segregation or exclusion based on qualities like age, gender, race, caste, religion, sexual orientation or veteran status," it wrote in a blog post.

That is expected to lead to the removal of thousands of accounts as it goes into place, though that could take some time. "We will begin enforcing this updated policy today; however, it will take time for our systems to fully ramp up and we'll be gradually expanding coverage over the next several months," its announcement read.

It did not give any specific examples of accounts that would be removed.

It noted that some of those accounts are useful to researchers, and said it would work on ways of making sure they stay available. It also said the change would not affect videos that discuss "pending legislation, aim to condemn or expose hate, or provide analysis of current events".


Comment: Aside from the problematic issue of "who decides what is hateful?", who decides which accounts are "useful to researchers"? The fact is that the safest place for hateful content to be is out in the open, so people can see it, research it if they so choose, and act accordingly. Pushing it out of the public space doesn't make it cease to exist, it only pushes it underground. Remember that 'sunlight is the best disinfectant'.


It will also alter its algorithm so that certain kinds of misleading and harmful videos, such as those promoting fake miracle cures or the flat Earth hoax, will stop being recommended in YouTube's "up next" sidebar. It will also promote more authoritative videos to try to discourage people from being tricked by those stories.


Comment: And who decides what constitutes "more authoritative"? Your average doctor thinks IV vitamin C is a "fake miracle cure," but soft-censoring videos about it may actually cost lives. The same could be said for thousands of other topics.


It has already trialled the system in the US, where it said it has found success. It said it will bring it to more countries by the end of the year, as well as tuning the algorithm so that it is more efficient and can spot more content.

It also said it would work harder to stop YouTube users promoting harmful content from receiving ad money. Channels that "repeatedly brush up against" its hate speech policies will be suspended from the company's partner programme.

"The openness of YouTube's platform has helped creativity and access to information thrive," its blog post concluded. "It's our responsibility to protect that, and prevent our platform from being used to incite hatred, harassment, discrimination and violence.

"We are committed to taking the steps needed to live up to this responsibility today, tomorrow and in the years to come."

The change comes on the same day that the company said it would not remove videos in which one of its stars attacked another user over his sexuality, using a series of anti-gay slurs. Strangely, although the company now explicitly bans videos that encourage discrimination or segregation based on sexuality, it made no reference to that high-profile case in its blog post, and did not say that it would change its position.

YouTube has been repeatedly criticised for its relatively lax approach towards various kinds of harmful content, including those on the far-right. That criticism became even more prominent in the wake of the Christchurch shooting, when it and other video sites failed to quickly remove videos of the mass murder.

As such, the site has been repeatedly accused of not only permitting but also encouraging extremism, by playing host to often violent and niche accounts.

But right-wing channels also make up a significant part of YouTube's channels and their viewers. Earlier this year, Bloomberg reported that far-right videos were one of the site's most popular categories.


Comment: It would be interesting to see what videos Bloomberg is referring to as "far-right". It's likely they would include not-even-right-wing video producers like Joe Rogan, Tim Pool or Dave Rubin; a complete miscategorization.


The decision also comes amid increasing scrutiny from conservative politicians over whether YouTube has a bias against right-wing creators. As with Twitter and Facebook, the company has been criticised for undermining free speech and being unfair towards its conservative users, despite there being no evidence of those accounts being discriminated against.