Monday, 25 October 2021

THE HARD LIFE OF FACEBOOK MODERATORS


The other day I stumbled upon some videos that YouTube suggested to me, so I clicked and watched them. What I learned in those ten minutes is hard to believe, but it’s reality, the cruel reality of social media.

In the first video there’s a man who is in disguise, because Facebook didn’t even allow him to talk about those kinds of things.

He’s a Facebook moderator: he reviews the disturbing content that users report so that we don’t have to see it. The content ranges from hate speech, racism, bullying and animal abuse to torture, terrorism, pornographic material, violence, self-harm, murder, suicide and child abuse. Nothing pleasant to watch.

Have you ever reported a post or video on social media, got the notification “Thank you for reporting. This post has been removed” and then assumed it was a bot? Well, it was not. All the content we report gets into the hands of moderators, who have to watch it and decide whether it violates the community guidelines, which are very strict rules. Moderators spend an average of 8 hours a day sitting in a chair, watching traumatizing material and just pressing one of two buttons: yes or no.

It’s hard to believe that there are actually people who have to see all that disturbing material for 8 hours a day. But it’s real, and it’s an actual job, a job that has catastrophic effects on the people who do it, who are often forced to quit and are left with permanent psychological scars.

Many of them also recall having to take a break every once in a while to process what they had just seen, and then prepare themselves for whatever horrible content came next.

The man in the first video works from 6 pm to 2 am, then comes back home and sleeps. Or at least tries to. Having to watch horrible and deeply upsetting content inevitably causes nightmares and lasting trauma. He says he wakes up in the middle of the night thinking about what he had to see.

Moderators choose this job, but, of course, they aren’t aware of what it is really going to involve. They are told that “there might be disturbing material to watch”, but they would never imagine something like child abuse, murder or self-harm.

The woman in the second video says she told the company that she could not bear the content she had to see, but that the company didn’t support her or the other moderators at all.

They are told that they can be shifted to a different type of content, or that counselling support is available 24 hours a day, but it’s not enough to erase things that, once seen, are imprinted in their minds forever. Also, many of them recall going to the counsellor and being told to quit the company and find a proper psychologist. So they’re basically left to fend for themselves; nobody cares about them.

Many of them, after quitting, struggled with trauma and even turned to drugs or alcohol.

That’s the sad life of a content moderator, and the dark side of social media that no one should have to see.

Maria, 4scB

VIDEO 1




VIDEO 2
