The 74’s Mark Keierleber spent more than a year investigating the ramifications of the Minneapolis School District’s decision to sever its relationship with city police in the wake of George Floyd’s murder. It quickly became clear that ending the practice of policing children was not one of them. Mark used public records and data analyses to expose a previously unknown student surveillance apparatus that grew larger and more pervasive in place of uniformed officers during the pandemic’s school closures and after.
Mark obtained 1,300 incident reports from the first six months of the pandemic generated by Gaggle, a private, for-profit company the district hired to monitor students’ online communications and flag potentially dangerous or alarming content. His analysis of that massive data trove revealed that only 25 percent of incidents were reported to district officials on school days between 8 a.m. and 4 p.m., meaning the vast majority of student surveillance occurred at night, on weekends, on holidays and during the summer, a significant expansion of schools’ traditional oversight of student speech.
Gaggle’s customer base has exploded during the pandemic — it now monitors 5 million students in more than 1,500 districts nationwide — and the company sells itself as a savior of kids in mental and emotional distress. But there is no hard research showing that this type of surveillance is beneficial, and, as Mark’s reporting documents, there are deep questions about whether it violates student privacy and whether it may actually chill young people from seeking help.
Mark’s investigation pulled back the curtain on how Gaggle’s AI-driven system and its hourly-wage human content moderators probe young people’s thoughts, conversations and writings. One such student was a 13-year-old transgender boy who shared intimate details about his mental health in a class assignment, only to have them wind up in the hands of district security, exacerbating his anguish. By connecting with students and families caught in Gaggle’s surveillance web, Mark was able to reveal that the company’s expensive technology was not only invasive but often inept.
Mark’s ongoing reporting reached well beyond Minneapolis. In May, he further exposed the discrepancies between how Gaggle represents its operations publicly — and to Congress — and what those practices actually look like on the inside. He tracked down multiple former content moderators who described cursory, impersonal hiring practices; insufficient safeguards for students’ sensitive data, including nude photos; a quota-driven work culture that prioritized speed over quality; and frequent exposure to explicit content that left some moderators traumatized.
“I went into the experience extremely excited to help children in need,” one former employee told Mark. “I realized that was not the primary focus of the company.”
The project raised urgent questions: If children can be policed not only physically but also digitally, what privacy remains? The impact reached the halls of Congress, and the issues it surfaced resonate not only in the United States but globally. The work was republished by multiple publications and media outlets, an impressive effort investigating technology-based business practices.