Content warning: This story contains discussion of serious mental health issues and racism.

The panic attacks started after Chloe watched a man die.
She has spent the past three and a half weeks in training, trying to harden herself against the daily onslaught of disturbing posts: the hate speech, the violent attacks, the graphic pornography. In a few more days, she will become a full-time Facebook content moderator, or what the company she works for, a professional services vendor named Cognizant, opaquely calls a "process executive."
For this portion of her education, Chloe will have to moderate a Facebook post in front of her fellow trainees. When it's her turn, she walks to the front of the room, where a monitor displays a video that has been posted to the world's largest social network. None of the trainees have seen it before, Chloe included. She presses play.
Comment: There's obviously a lot going on with this story. Facebook's content moderators are exposed to the dregs of the world's society on a consistent basis. That's going to be traumatizing in one way or another for most people.
It's true that suffering can also be a mechanism for growth. This is part of what SOTT was designed for. We look at the worst of the worst every day. However, we do for the most part stay away from graphic photos, videos, etc. It's not wise to stare into that abyss. But there can be value in processing what is happening throughout the world and working to understand the very many influences at play. We also work to understand the flaws in our thinking, the mechanisms of healing, and how our connection with others is essential for getting through this mess. This is very different from 'content moderation,' where there is just exposure and arbitrary rules to dance around. And it is all being done in strict isolation.
These people are having their psyches repeatedly wounded with explicit imagery, and then they are being tasked with ruling on what thoughts and information are permissible. This is a horrible combination! Most people don't have the interest to understand themselves or the world. It's unlikely the executives at Facebook have such a drive; the same goes for the average content moderator. And yet, they are at the forefront of regulating information for the users of their technology.