2023 Excellence in Technology Reporting, Large Newsroom finalist

Content Moderation

About the Project

Two blockbuster investigations by Insider’s Tekendra Parmar demonstrated how failures in content moderation can have dire consequences, both for the general public and for those tasked by platforms like Meta and TikTok with reviewing reams of disturbing posts.

Since 2017, when Facebook/Meta was implicated in the slaughter of Rohingya Muslims in Myanmar, Meta has cited its Trusted Partner Network as a bulwark against the spread of hate speech and dangerous misinformation in conflict zones. These partners are contracted by Meta to provide local and linguistic expertise. But how effective has this program been, really?

Parmar’s investigation into Meta’s Trusted Partner program in Ethiopia revealed how the social network ignored, or was catastrophically delayed in reacting to, alarms sounded by its own local experts — even as two violent conflicts raged in Ethiopia and hate speech flourished on Facebook. Through documentary evidence and sensitive source building, the investigation revealed a pattern of inaction: Partners sometimes waited months for a response to posts they had urgently flagged, while hateful and violent posts that called out members of ethnic groups by name were often allowed to remain.

Specifically, Insider’s investigation revealed that in the lead-up to the November 2021 murder of Meareg Amare, a Tigrayan chemistry professor, one of Meta’s Trusted Partners repeatedly alerted Meta to threatening posts about Amare. But Meta failed to act. Amare’s murder is now the subject of a landmark $1.6 billion lawsuit against Meta in Kenya.

The Ethiopia investigation was the latest in Parmar’s series of stories highlighting serious and underexplored failures in content moderation.

The second story revealed the trauma faced by rank-and-file social media content moderators and exposed profound gaps in worker protections globally. Parmar was among the first journalists to deeply examine the treatment of content moderators working for TikTok, the fastest-growing social media company. He interviewed more than a dozen content moderators who broke their non-disclosure agreements to describe the harsh working conditions and mental health trauma they experienced because of their work. Routinely exposed to graphic videos and posts, this army of invisible laborers often suffered symptoms of acute psychological stress.

Parmar also used Insider’s “As-Told-To” format, commissioning a series of first-person essays that allowed TikTok moderators to speak directly to readers.