A test of a new feature that asks users to report friends they suspect of becoming extremists has been greeted with horror by Facebook users. It's a bizarre precedent that we should all be concerned about.

Over the past few days, some Facebook users have reported seeing prompts asking them if they are concerned that someone they know might be becoming an extremist. Other users are being notified that they may have been exposed to extremist content while innocently reading political articles or watching videos on the platform.

Screenshots of the alerts have surfaced on social media.


Understandably, the move has shocked users. US Representative for Colorado Lauren Boebert sarcastically tweeted, "Facebook just warned me that I may have been subjected to extremist content and asked me to report anyone I may know that is becoming an extremist. I have more than 200 coworkers I need to report."


In a bid to calm concerns, Facebook issued a statement about the testing. It read, "This test is part of our larger work to assess ways to provide resources and support to people on Facebook who may have engaged with or were exposed to extremist content, or may know someone who is at risk."

Ultimately, what this means is that, if the policy is implemented, Facebook will quite literally encourage people to report their friends for thought crimes. Even worse, Facebook will punish its own users who may have committed wrongthink.

It's quite the dystopia Facebook is leading us into, and it raises the question: has society, as a whole, reached the point where reporting friends is deemed acceptable? And are we becoming docile and easier to control?

For example, individuals should be free to express their thoughts on issues such as mass migration on Facebook without fear of consequences. It is worrying that what could be interpreted as codes of political correctness are being implemented across the platform.

This is especially problematic as big tech giants like Facebook are now the new public square, and the immense power they hold means they behave like monopolies. This raises the important question of why they are involving themselves in the political opinions of their users.

Facebook's claim that it is aiming to clamp down on "extremist content" through new policies is questionable. In an era dominated by woke thinking, "extremist content" could mean anything from a mildly offensive remark to a crude joke. The term is too subjective and vague, and the platform provides little indication of what it considers to be "extremist content".

Of course, most of us do not want to witness extremist content or hateful behaviour. However, policing people in this way is a slippery slope, especially in a society where we value freedom of speech and diversity of opinions.

This is especially pertinent as Facebook is a powerful platform used by more than a third of the planet's population, which has given founder and CEO Mark Zuckerberg massive influence and power. He essentially controls the timelines and newsfeeds of 2.7 billion people. Ultimately, it could be that, in future, one must fully comply with Facebook's policies on speech, presumably determined by Zuckerberg, in order to be allowed to use the platform at all. This is a chilling prospect and precedent.

Thankfully, the bizarre behaviour is being called out by some prominent people. Pink Floyd's Roger Waters slammed Zuckerberg during a recent press conference, describing Facebook's policies as an "insidious movement... to take over absolutely everything." He's absolutely right.

The truth of the matter is that Zuckerberg, who is now the fifth richest man in the world, probably has more influence over what information the masses are able to read and watch than any publication or media entity.

Facebook's sharp algorithms can direct its billions of users towards any idea, organisation or partisan ideology in an instant. This power is immense. And its ability to effectively censor anyone, no matter how influential, has been seen in its treatment of President Trump.

It's unclear how Facebook went from being a fun, quirky site for socialising to such a powerful player in the age of information. By the day, it seems ever stranger that tech oligarchs want to control what we post on their platforms by policing content.

What's next? Could Facebook socially engineer the timelines of people it suspects of being 'extremists', in order to nudge them into changing their personal views? Who knows? Who thought we'd be where we are now?

If you don't believe that Facebook's latest test policy is a dangerously worrying precedent in a free society in 2021, then it is time to wake up.