The Invisible Neighbor

Show Notes

AI lives in my city, and probably yours too. What does it mean for our neighborhoods, the people in them, and where data lives? Host Bridget Todd looks at how AI is changing our neighborhoods and who is behind it. She meets people bringing hidden tech systems into full view of citizens and decision makers.

Nat Palmer helped lead a campaign in Chicago to stop the city from using AI-powered surveillance that brought armed police to their neighborhoods.

What will data storage look like in the future? Marina Otero Verzier is an architect and researcher from Spain who examines sustainable new designs for data storage, from your living room to outer space.

Linda Dounia Rebeiz is an artist in Dakar, Senegal, who didn’t like what she saw when she prompted generative AI to show her what her home city looks like. So she built her own training dataset.

Transcript

Bridget Todd: If you invite me over, I’ll pull up an app on my phone to figure out the best way to get to your house.

It’s still how I navigate my own city, Washington DC.

But a lot of times, people still beat technology when it comes to getting to know cities.

Linda Dounia Rebeiz: If I go to, I don’t know, Lisbon or Paris or whatever, I can probably just open Google Maps and I will have the city at my disposal. In Dakar, you need to know somebody who lives here. This is like a city refusing to be known by the Internet.

Bridget Todd: That’s Linda Dounia Rebeiz. She’s an artist in Dakar, Senegal, who is using AI to show her city some love.

Hey everyone, I’m Bridget Todd, and this is IRL, the award-winning podcast brought to you by Mozilla Foundation with PRX.

In this episode, we explore how AI can help or hinder us in our cities.

First up, we make a stop in Chicago, where communities fought back against AI that’s used to detect gunfire.

In 2021, a 13-year-old boy named Adam Toledo was shot by police on Chicago’s West Side. They came in response to a gunfire alert.

Nat Palmer: And they were out there. They were shooting guns. You know, like, let’s not act like that isn’t a thing, but like white kids shoot guns in the country all the time.

Bridget Todd: That’s Nat Palmer, a community organizer in Chicago.

Nat Palmer: So, like, they were out there shooting guns, and ShotSpotter heard it, and sent the police there.

Bridget Todd: ShotSpotter is an AI security system that uses acoustic sensors to listen for gunshots. The company behind it is called SoundThinking.

Nat Palmer: And then if they think someone’s shooting, like they come in there ready to shoot and they did that. And that really pushed a lot of folks to the edge, especially, like, this was right off, like, George Floyd and a year after the 2020 uprisings. And, you know, thankfully, like, organizing doesn’t stop after the uprisings, like, no matter how small they are. So, you know, people picked it up and were like, “No, we need to get this out of here. This can’t happen again.”

Bridget Todd: Nat discovered a coalition called STOP ShotSpotter on Instagram, and decided to get involved.

Nat Palmer: Folks realized that cops were sent there due to ShotSpotter. And folks started organizing around it like, “Yo, this is really whack.”

Bridget Todd: Three years later, that movement has spread across the United States. You see, SoundThinking makes tens of millions of dollars across more than 160 cities, including mine. Thousands of acoustic sensors are installed above city streets, in schools, hospitals, and public housing complexes, mostly in the U.S. When the sensors detect the sound of a gunshot, they alert a control center operator.

Nat Palmer: After the AI says it is or isn’t a gunshot, it goes to someone in an office, and then they look at the sound waves and listen to the sound waves. And then like, yep, this is probably a gunshot. Nope, this is probably not a gunshot. But yeah, so it’s like AI reliant, but then a person at the end of the day is still, like, yes or no.

So it’s like all of these things that are being told to us are like these like complex systems that are based on science. Like, a lot of these like criminal or forensic analysis things, they’re just based on things that aren’t necessarily, uh, proven or legitimate. But, because they’re sold to us as public safety, we just gotta keep using them.

Bridget Todd: Here’s the thing. SoundThinking says they have a 97 percent accuracy rate. But there’s evidence to the contrary. In Chicago, the MacArthur Justice Center found that ShotSpotter was triggering 87 police calls a day.

That’s tens of thousands of deployments over a year. Thousands of people stopped and frisked. And yet, not even 10 percent of those alerts were linked to gun crime. So, even though the data isn’t always accurate, police are constantly summoned to the mostly Black and Brown neighborhoods where the sensors are.

Yeah, that doesn’t sound scientific to me either. So, were there sensors in or around or, like, near the neighborhoods where you live?

Nat Palmer: Yeah, I mean, once they pointed them out to me, I was seeing them everywhere. And now, like, I live in the Austin neighborhood, which is a West Side Chicago neighborhood. And yeah, they’re everywhere. They’re down the street from me. Like, I can’t really walk more than a block without seeing them.

Bridget Todd: So for somebody who was reading about the uptick in violent crime in D.C. who was like, “Oh, this is going to be something that helps,” what can you tell them to help them understand the fact that maybe it’s not going to help?

Nat Palmer: Usually what I go into and will say is just this is something that calls police after the fact. It doesn’t do anything to try and address things that could have prevented gun violence. Using ShotSpotter is less about actually keeping people safe, and it’s more about, like, “We need to send the police over.”

Do they always make the scene safer? A lot of people don’t feel they do. So, my offering to folks who, like, feel that this tool is a way to keep us safe is, does it really create safety? Or are we just sending police to handle another issue because we want an easy answer? There is not going to be an easy answer to gun violence.

Bridget Todd: Last year, Chicago’s mayor announced the city would be ending its contract with ShotSpotter. The community’s campaign to stop it worked.

Nat Palmer: We helped keep this in conversation for three years. We helped change and shift the understanding of surveillance as common sense public safety among a lot of people.

Bridget Todd: So what is public safety to you, in light of saying, okay, it shouldn’t just be surveillance and cops?

Nat Palmer: I think that public safety is resources for all of my people. Resources specifically meaning, like, let’s update these books in some of these schools. Let’s make sure every school in Chicago has a librarian. Let’s make sure our kids have fully funded sports, music and arts programs. Let’s make sure that we have a thorough crisis response that doesn’t send people with guns.

Safety isn’t just sending someone with a gun and some chains to lock people up. Safety is, like, I’m able to walk down my neighborhood and say hi to everyone. Which I do feel comfortable doing. But like, not everyone does. And they deserve that.

Bridget Todd: Why do you think, given all that, that this technology is still so widespread around the United States, in the face of this kind of evidence that you all have collected?

Nat Palmer: What does public safety mean to a company who makes millions off of our people dying? It’s just weird to me.

Bridget Todd: What advice do you have for people who are interested in fighting AI surveillance tech in their city, like from Chicago to DC and anywhere and everywhere in between?

Nat Palmer: If you’re like me and you want to build a new world where public safety is not based on exploitation, don’t use data arguments. We’re not fighting for a more efficient ShotSpotter. We’re not fighting for a more efficient police response.

We are fighting for a brand new way to keep our people safe. So, get into your community, figure out how people feel about public safety. And I promise you, you will find all of the understanding you need around why surveillance is not safety.

Bridget Todd: By the way, we reached out to SoundThinking for comment but didn’t hear back. Stick around. We’ll be right back. And we’re back. AI is everywhere now, right? Not just up in the Cloud. Down here on the ground. And in our cities. And more AI means more data centers — those giant warehouses full of computers that crunch the data to make our chatbots chat.

Marina Otero: We imagine that all the digital infrastructures are floating over our heads. But AI and in general, every aspect of the digital world needs vast amounts of physical infrastructures, not only cables, but also huge infrastructures, buildings that are called data centers. And those consume incredible amounts of energy and water, emit CO2 and heat.

Bridget Todd: Marina Otero is an architect from Spain. She researches the impact of digital infrastructures worldwide, and currently teaches at Columbia University in New York.

Marina Otero: The weirdest thing I saw in a data center — it was wonderful — it was the surplus heat coming from servers was used for growing mealworms. And those mealworms were later fed to chickens. So I was very impressed by this combination of, you know, machines and living organisms.

Bridget Todd: So, I happen to live near one of the biggest data center hubs in the U.S., and they look like nondescript office buildings. You wouldn’t have any idea what goes on there. They almost have a militaristic vibe, not so welcoming.

Marina Otero: Every time that we are connected to our phones, that we are watching TV, like, that is streaming, all the time we are actually connecting with data centers. Those are the places where all the information is stored and is processed.

The reason why cities want to have these infrastructures is because generally the closer they are to us, the faster the connection is. So it’s a question of latency. So that’s one of the reasons. The other is that with the development of artificial intelligence, many countries don’t want to be left behind and they are constructing new infrastructures to be able to train and work with AI.

Bridget Todd: Marina says there are well over 10,000 data centers around the world. And right now construction is booming. I bet there’ll be one close to where you live, too. Environmental groups everywhere say companies are not transparent enough about just how much energy and water they’re using.

Marina Otero: If a new data center were to be built in Madrid, where I live, I would be very concerned about the water usage, especially in places that are affected by drought, like Spain and Chile and many other places in the world. These infrastructures consume millions of liters of water. And most of the time, populations are not aware. But also governments tend to overlook these questions because they prefer to have investments, and they very easily give licenses to companies like Google, Microsoft, and Amazon.

Bridget Todd: It’s possible to design data centers that have their own sources of energy. But that rarely happens. And even then, the scale of water pumped through pipes around the servers and other equipment to keep them from overheating is incredible. Just 60 data centers can consume the same amount of water in a year as New York City. In Chile, Marina mediated policy talks between environmental activists, the government, and tech companies around the capital, Santiago.

Marina Otero: It is fundamental that architects start to understand and also intervene in the design of data centers. The reason is that, so far, data centers have been primarily designed by engineers. We don’t have anything against engineers, but the question is that most of the time data centers are constructed as architectures that have no relation to the environment around them although they very much depend on them and depend on their water and the energy, et cetera.

So this is where I think architecture comes in to reimagine how these infrastructures could be, you know, part of the cities, could be much more integrated into our daily lives.

Bridget Todd: For example, Marina says residual heat from data centers could be used more often to warm homes or greenhouses. And here’s a fun fact: At the Paris 2024 Olympics, one of the swimming pools was heated by a data center.

Marina Otero: In one of my trips, I went to Sweden, and in the north of Sweden near the Arctic Circle is LuleĆ„. And in the city is one of the most important Meta data centers, and many of their data centers are located there because of the cold temperatures. Because the amount of energy that you have to put into the system in order to cool down the server room is minimal because the surrounding climate environment is already very cold. So, LuleĆ„ is also the site of one incredible institution called RISE where new prototypes for data centers are being built. So, when I visited, I saw some of the most amazing prototypes and one of them was the reuse of surplus heat from servers to, you know, grow worms, but there was also the possibility of reusing surplus heat for growing vegetables, creating a circular economy in the region.

Bridget Todd: Marina has visited data centers that float on a river. She even talked to a company with a prototype for a data center that orbits in outer space. But lowering the environmental impact of data centers also means thinking differently about how we use data and computing power.

Marina Otero: Some of the designs that I also found more exciting are decentralized designs. For instance, a company, Carnot, has developed a small data center that you can have in your living room. So instead of having these incredible huge warehouses full of servers, you have something that is very similar to a heater that is able to heat your living room. And it does so because it’s a small data center and the functioning of the server is actually generating heat for free for you, in exchange for having this data center in your living room.

So there are so many different possibilities right now. The question is that none of them are perfect, unfortunately. All of them have some sort of externalities that we have to address. And, to be honest, I always tend to advocate for consuming less data.

I think one of my dreams will be to develop a data center that is managed by a neighborhood, a community. So, with this idea of the small micro data centers that could be distributed in different spaces in the city, in our living rooms, in different public buildings, I could imagine that data could be something that is managed by a population instead of being in the hands of big corporations.

Bridget Todd: I could imagine that, too. We need to be thinking collectively about digital infrastructure. After all, what defines a city are its people. And data? Yep, that’s all about people, too.

Linda Dounia Rebeiz: Every time I travel and I go to beautiful cities around the world, I just want to come home.

Bridget Todd: Linda Dounia Rebeiz lives in Dakar, Senegal, a country of 18 million people on the coast of West Africa.

Linda Dounia Rebeiz: It’s the biggest city in Senegal, but it’s still so chill compared to other cities that I’ve visited. And I think the ocean has a lot to do with that. You can almost always go to the beach. Like if something is bothering you, you just take a five-minute walk and you’re in the ocean. Literally in the ocean, right?

Bridget Todd: Back when generative AI was becoming popular, Linda was playing with image generators like DALL-E and Stable Diffusion. An image generator is a generative AI model that creates images based on text prompts. Linda was interested in what is behind an image that gives it meaning. With an AI generator, that can be a bit opaque. She was already training AI models as an artist, but this was different.

Linda Dounia Rebeiz: Maybe this is a common experience, but the first thing I tried to do was to just prompt for things that I was familiar with to see how good it was at representing them.

I literally just wrote “Dakar,” and then I sent in the prompt, like, that was the only prompt. And the images were cartoon images and it was this, like, dusty road and palm trees. But the architecture was like, I was, like, this is nothing — I’ve never seen anything like this in Dakar. It would always look — it didn’t look like there was any activity. Like, it looked like someone tried to make a building and then stopped halfway through.

There was a lot of, for lack of a better word, like, decay, a lot of buildings going bad, roads not being good, even in the embodiments, like the people, the cars, everything looked rusty and old and bad looking, in general.

Bridget Todd: But this isn’t at all what Dakar looks like.

Linda Dounia Rebeiz: Yeah, like, I think if you prompt for Dakar, you should see, like, bustling streets, like, very colourfully dressed people. Like, people pay attention to how they dress here, it’s a very stylish city. Like, there’s lots of really amazing brands, contemporary brands, that come out of Senegal, like, that dress famous people all around the world. So you should definitely see a lot of vibrance in like how the streets are occupied.

Bridget Todd: When a city or a population is invisible online, it’s also going to be less recognizable in generative AI outputs. Linda realized that there just weren’t enough current images of Dakar in the datasets used for training. The result is a skewed representation of the city, based on decades-old images of a place that no longer exists.

Linda Dounia Rebeiz: We are starting to see more content coming from countries that are not at the center of tech development and tech evolution. But we’re sort of presented with the historical artifact of the internet with AI, which is that for the most part, a lot of the internet was very skewed towards very specific communities. What AI is trained on today is almost like a time machine. Like, it’s an archive of who we’ve been online. So, if you think of archives that have been digitized, and that have been put on the internet, then you realize that the problem doesn’t start with the internet. The problem starts with way, way before that. It’s how we collect data and how we store it, and where we care to do that in the first place.

Bridget Todd: Because she wanted a better representation of the real, vibrant and colorful place Dakar is now, Linda developed her own dataset of images for an art project called “Blur Theory”. She blends her own photos and sketches of Dakar with synthetically generated ones. The results are impressionistic portraits of her city.

Linda Dounia Rebeiz: If I can’t prompt for my city and have an accurate depiction of what it looks like, or at least close, what does it say about how others perceive where I live? What does it say about people who have come around, taken pictures, put it online? I try not to throw blame around. So, it’s not just on the people training these models. It’s also on us, people living here, to realize that we just don’t have a lot of that data online. And maybe there are national archives with photographs sitting somewhere waiting to be digitized. Or maybe the models are just biased, even if they have the right data, they just project certain biases on how places like Dakar are perceived in the world.

Bridget Todd: A lot of Linda’s art is about exploring alternate realities using AI. But she also imagines a different future for AI itself.

Linda Dounia Rebeiz: If one person is to make a decision of how do you build a dataset that would represent an entire city, you’re bound to run into these very complicated questions. What do I show? What don’t I show? But if you open it up to more people to do it, you get that multiplicity of perspectives. It shouldn’t be one person doing this kind of work or one entity or one company.

What I want for the future of AI is more data agency. In the sense that I want people to be able to determine how they want to be perceived and to have control over how they want to be perceived.

Bridget Todd: Large, mainstream AI models depend on gathering as much training data as possible. Linda says smaller, more specific models would reduce bias. And she has a message for Silicon Valley.

Linda Dounia Rebeiz: You’ve decided that you have created technology that is so-called universal. And it couldn’t be further from the truth. And that’s always been the case. So, I think the philosophy and the ideals and the narratives that accompany tech need to be examined and unpacked and also debunked, right? For people to know that using Midjourney, using DALL-E, whatever, comes with an asterisk of a strong leaning and bias towards a particular way of life, and a particular kind of person, particular cities.

I would compare homegrown AI models to tending to gardens. And it’s a very fun and relevant analogy because we’re now in the phase where the internet, everything looks the same. And so I think we’re in like this critical point in AI’s evolution where we can decide whether it’s these massive industrial farms that take over most arable land on Earth, or it’s lots of really beautiful gardens, some weird, some interesting, some beautiful, with different kinds of, like, species of flowers as opposed to just, like, monoculture. Like, an AI monoculture is just such a boring future, if not just, like, downright dangerous. It’s so funny. We have so many examples from ecology that should apply to how we build tech but, yeah, we should learn from how monoculture has destroyed our ecologies and our cultures to not make that same mistake with AI.

Bridget Todd: Think about it. What would you plant in your AI garden? The time for us to plant the seeds together is now. After talking to Linda, I went to ChatGPT and wrote “Washington, DC” as a prompt. And it generated an image of a bunch of U.S. government buildings as seen from above. And, I mean, ok, that’s not totally wrong. But that’s certainly not my DC. My DC is a colorful, dynamic, vibrant city full of people from all countries, all walks of life. And that’s how I want my AI to be. Thanks for listening to IRL. For more about our guests, check out our show notes, or visit IRLpodcast dot org.