Moderators seek compensation after years of exposure to online violence
The vast majority of Facebook and Instagram users never see disturbing content depicting torture, child abuse or murder on these platforms. That's thanks to content moderators, who continuously review posts and delete inappropriate images and videos. These moderators are often employed by subcontractors in countries of the Global South. Their work is extremely stressful and has drawn criticism for years.
Meta, which owns Facebook and Instagram, and its African subcontractors have faced numerous lawsuits over these practices. According to the British daily The Guardian, lawyers are currently preparing a lawsuit against Majorel, a company contracted by Meta to moderate content.
Content moderators working for Majorel in Ghana's capital, Accra, told The Guardian that they have suffered from depression, anxiety, insomnia and substance abuse, which they believe is a direct result of their work. They also say the psychological support offered to help them process disturbing social media content was inadequate.
Teleperformance, which owns Majorel, reportedly denies these accusations. According to The Guardian, the company employs its own mental health professionals, who are registered with the local supervisory authority. DW asked Majorel for comment but received no reply. British NGO Foxglove is …
© Deutsche Welle
