The Markup challenges technology to serve the public good. Our journalism gives people control over the technology affecting their lives. We accomplish this in three ways.
1. Data- and Software-Driven Reporting
Using nearly 100 accounts, The Markup conducted the first field audit of Instagram’s content moderation algorithms and found that they routinely limited the reach of posts supporting Palestine (specifically nongraphic images of war, captions, comments, and hashtags) and denied users the ability to appeal.
Our uniquely comprehensive “digital book ban” investigation used data to reveal how web filters across 16 U.S. school districts in 11 states kept students from doing homework, exacerbated inequities, and discriminated ideologically—blocking suicide-prevention resources for LGBTQ+ teens while keeping anti-LGBTQ+ sites available.
Finally, The Markup’s testing revealed that NYC’s AI chatbot told business owners to break the law. We asked the bot dozens of questions and found it was frequently wrong, advising visitors to discriminate in housing and to take workers’ tips. Business owners also reported receiving false information in response to their own questions.
2. Tools that Give People Superpowers
After The Washington Post reported that X was throttling links to competitors, The Markup published a tool that lets anyone check whether X throttles a given link. Readers ran hundreds of tests and found delays on links to Patreon, WhatsApp, and Messenger, leading to a second story in which Patreon creators explained how the delays hurt their income.
Readers have also used our real-time privacy inspector, Blacklight, to scan over 13 million websites for trackers, exposing abusive tracking by OB-GYNs, online pharmacies, and edtech companies. We recently upgraded Blacklight to scan EU sites, which are subject to more stringent privacy laws; to compare mobile and desktop versions of a site; and more.
3. Partnership Between Communities and Newsrooms
In late 2022, The Markup exposed how major internet providers systematically give the worst deals to the lowest-income and least-White neighborhoods—a data-driven investigation that went on to win a Philip Meyer Award. Since then, we’ve equipped affected communities to fight that discrimination: teaching readers how to find a better internet deal and how to fact-check companies’ claims to the FCC (over 5,000 readers did, including the Detroit Documenters), and publishing “magic spreadsheets” that let anyone analyze internet speeds for disparities (a Chicago community youth group did, presenting its findings to the mayor). Following our work, the FCC approved rules against digital discrimination, and Los Angeles became the first city to outlaw it entirely.
We sought even closer collaboration with communities for our reporting on misinformation, producing stories based on what more than 30 Vietnamese immigrants in Oakland, California, told us they needed. We reported that most news-related Vietnamese-language YouTube videos are made by influencers translating misinformation, amplified the work of a 67-year-old Vietnamese grandmother who translates articles from mainstream outlets to combat that misinformation, and created a guide for second-generation immigrants on how to talk to loved ones about misinformation. Finally, we put on two workshops for the community: one on misinformation and one on how to spot deepfakes.
Award judges said the winning work was outstanding, actionable journalism that, given its subject matter, could not have come at a more consequential moment.