
‘I didn’t eat or sleep’: a Meta moderator on his breakdown after seeing beheadings and child abuse

This post was originally published on this site.

When Solomon* strode into the gleaming Octagon tower in Accra, Ghana, for his first day as a Meta content moderator, he was bracing himself for difficult but fulfilling work, purging social media of harmful content.

But after just two weeks of training, he found that the scale and depravity of what he was exposed to were far darker than he had ever imagined.

“The first day I didn’t come across any graphic content, but gradually I started coming across very graphic content like beheadings, child abuse, bestiality. When I first came across that ticket I was very shocked. I didn’t even look at my computer because it was very disturbing for me.

“But gradually I started normalising what happened because I became used to it. I even started to enjoy seeing people beheaded, child abuse, pornography, suicide. I asked myself, is that normal? And I replied, it’s not.”

Solomon, who arrived from his home in east Africa in late 2023, said he would “never forget the day” he came across a person being gradually skinned alive. “The system doesn’t allow us to skip it … we have to look at it for 15 seconds, at least.”

Another video featured a woman from his home country screaming for help in his native language while several people stabbed her.

He says the videos became more and more disturbing. Some days there would be no graphic videos; then something would start trending and, on the same day, 70-80% of the videos would feature graphic content. He felt himself gradually becoming “out of humanity”.

In the evenings, he would return to the shared flat provided by his employer, the outsourcing company Teleperformance, with “no room for privacy and many problems with water and electricity”.

When Solomon learned that a childhood friend had been killed, his already fragile mental health unravelled. He broke a window and a mirror in frustration, leading Teleperformance to suspend him until he felt better.

He spent the next two weeks home alone. “I started developing depression. I didn’t eat or sleep, I drank day and night and smoked day and night. I was not this kind of person before,” he said.

Solomon attempted suicide, and was admitted to a psychiatric hospital where, he said, he was diagnosed with major depressive disorder with suicidal ideation. He was discharged after eight days, at the end of 2024.

Teleperformance offered to transfer him to a lower-paid job, but he feared he would not earn enough to survive in Accra. He asked for compensation for the harm and for longer-term psychological care to be covered, but instead Teleperformance sent him back to his home town, which is in the midst of an insurgency.

“You’re using me and throwing me. They treated me like a water bottle – you drink the water and throw the bottle away,” Solomon said after his dismissal.

He said he had held a professional job in his home country, adding: “Before coming here I was so happy and peaceful.”

Another moderator, Abel*, said he, too, had had his contract terminated for standing up for his friend Solomon and for the rights of other employees.

He said he had told Teleperformance: “You’re not treating him well.”

“They just put him in the house. He stayed alone and he was telling me he’s scared being alone all the time, it was giving him severe stress, so he started going to the company, [saying] ‘I want to stay with you guys, I want to be in office, I’m scared.’”

Abel, too, had struggled with his mental health because of the content. “I didn’t know the nature of the job, actually. I didn’t realise I’d see people skinned alive and porn videos as my daily job … This is the first time I’d heard of content moderators … I used to be spooked when I saw blood, but now I’ve become numb. Gradually I’ve seen it altering my character … I’m struggling. Not to exaggerate, it’s 100% changed me.”

He said his colleagues often sat around drinking coffee and discussing disturbing content, including religious colleagues sharing their feelings of shame.

He has come to fear raising such issues with the wellbeing coach, having seen his disclosures later referred to by his team leader. When he said he no longer wished to use the wellbeing services, which he believed were “for research on us”, he was challenged.

A Teleperformance spokesperson said: “Upon learning of his depression as a result of his friend’s death, we conducted a psychological assessment and determined that the employee was no longer well enough to continue providing moderation services.

“Instead, we offered him a different non-moderation role which he declined, insisting that he wanted to continue in this moderation role. Given this was not an option, his position ended and he was provided compensation according to his contractual agreement.

“During his employment and afterward, we continued offering the employee psychological support, which he has repeatedly declined. At the suggestion of his brother, so that family could help provide the employee with support, and with the approval of medical counsel, we provided the employee with his flight back to Ethiopia.

“We have continued to offer the employee psychological support in Ethiopia; however, he has declined the support. Instead, he has tried to extort Teleperformance for money under threat of going to the media.”

*Names have been changed to protect identities


https://www.theguardian.com/technology/2025/apr/27/meta-moderator-on-the-cost-of-viewing-beheadings-child-abuse-and-suicide
