2024 Excellence in AI Innovation finalist

We Asked AI to Map Our Stories. Here are the Results.

Organization
THE CITY

Award
Excellence in AI Innovation

Program
2024


About the Project

When THE CITY underwent a website migration in 2023, its chief product officer, Scott Klein, and AI and Data Fellow Tazbia Fatima took the opportunity to archive past stories and run an experiment: mapping where each of our stories took place across New York City. They wanted to better understand THE CITY’s coverage and determine whether we were fulfilling our reporting mission to all New Yorkers.

Beyond holding ourselves accountable to our mission, the experiment was also a chance to learn about the latest advancements in large language models and whether they could correctly audit our coverage. They asked OpenAI’s ChatGPT to read all of our story files and tell us where each story took place. They then plotted the results on a map.

They used the NYC Department of City Planning’s Neighborhood Tabulation Areas to define the neighborhoods. On the map, the darker a neighborhood’s shade of blue, the more of our reporting is connected to that area. ChatGPT was able to pick a specific “place” for 2,750 out of the 4,159 stories we published from our April 2019 launch through September 2023. We were also able to match 2,129 stories to a given neighborhood.
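Matching a story’s coordinates to a Neighborhood Tabulation Area comes down to a point-in-polygon test against the NTA boundaries. A minimal pure-Python sketch of that step (the square “neighborhood” below is made-up illustration data, not real NTA geometry, and a real pipeline would use a GIS library and the full boundary file):

```python
def point_in_polygon(lon, lat, polygon):
    """Ray-casting test: does (lon, lat) fall inside the polygon?"""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Count edges that a horizontal ray from the point crosses
        if (y1 > lat) != (y2 > lat):
            x_cross = x1 + (lat - y1) * (x2 - x1) / (y2 - y1)
            if lon < x_cross:
                inside = not inside
    return inside

# Toy "neighborhood" square -- illustrative only, not real NTA geometry
ntas = [{
    "ntaname": "Example NTA",
    "polygon": [(-74.0, 40.70), (-73.9, 40.70), (-73.9, 40.75), (-74.0, 40.75)],
}]

def match_story(lon, lat, ntas):
    """Return the name of the first NTA containing the point, or None."""
    for nta in ntas:
        if point_in_polygon(lon, lat, nta["polygon"]):
            return nta["ntaname"]
    return None
```

In practice a library such as geopandas would do this spatial join in one call, but the underlying test is the same.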

The traditional software approach to identifying the location of a story is to write code that reads through each article, analyzes the text, and then extracts the key location from it.

To pull out the words and phrases that mention a place, such code would perform “entity extraction,” part of a field called natural language processing. Using a technique called “named entity recognition,” it identifies geopolitical entities (GPEs), such as countries, cities or states, and non-GPE locations, such as mountains and bodies of water.

But when they tried a cookie-cutter implementation of named entity recognition, they found it could not recognize New York City’s boroughs, neighborhoods and landmarks. It could only recognize a location if it was an explicitly named place.
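One common workaround for that gap is to supplement a generic NER model with a gazetteer of local place names and look them up directly in the text. A rough sketch of the idea (the gazetteer entries here are a tiny hand-picked sample for illustration, not THE CITY’s actual pipeline):

```python
import re

# Tiny hand-picked gazetteer -- a real one would load every NTA and borough name
NYC_GAZETTEER = {
    "astoria": ("Astoria", "neighborhood"),
    "bushwick": ("Bushwick", "neighborhood"),
    "the bronx": ("The Bronx", "borough"),
    "queens": ("Queens", "borough"),
}

def extract_places(text):
    """Return (canonical name, kind) pairs for gazetteer terms found in text."""
    found = []
    lowered = text.lower()
    for term, (name, kind) in NYC_GAZETTEER.items():
        # Word-boundary match so "queens" doesn't fire inside other words
        if re.search(r"\b" + re.escape(term) + r"\b", lowered):
            found.append((name, kind))
    return found
```

A lookup like this recognizes boroughs and neighborhoods a generic model misses, but it still can’t weigh context, which is where the large-language-model approach below differs.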

This is where OpenAI’s ChatGPT came in handy. ChatGPT, like other large language models, has a kind of “common” knowledge and context of the real world. When they tried using ChatGPT, they found it was able to read a story and return its location, including New York neighborhoods, boroughs and geographic coordinates, in a matter of seconds.
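The entry doesn’t publish the exact prompt, but the workflow it describes, sending each story’s text to the model and getting structured location fields back, might look roughly like this. The prompt wording and the field names are assumptions for illustration; only the message-building and response-parsing steps are shown, with the actual API call left out:

```python
import json

# Hypothetical prompt -- the project's real wording was not published
PROMPT_TEMPLATE = (
    "Read the following article and reply with only a JSON object containing "
    '"place", "neighborhood", "borough", "latitude" and "longitude" '
    "for where the story takes place.\n\nArticle:\n{article}"
)

def build_messages(article_text):
    """Chat-style message list suitable for a chat-completion API call."""
    return [{"role": "user", "content": PROMPT_TEMPLATE.format(article=article_text)}]

def parse_location(reply_text):
    """Parse the model's JSON reply into a dict; None if it isn't valid JSON."""
    try:
        return json.loads(reply_text)
    except json.JSONDecodeError:
        return None
```

In a real run, `build_messages` would feed each of the 4,159 story files to the chat API one at a time, and `parse_location` would collect the coordinates that the map is built from.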