From Snapchat filters to Apple’s Face ID, biometric technology plays a growing role in our everyday lives. What do we actually give up when we upload our face to these apps? Steven Talley shares his experience as a victim of mistaken identity. Joseph Atick, a forefather of facial recognition technology, reckons with its future. We head to China, where biometric data is part of buying toilet paper. And artist Adam Harvey investigates how racial bias seeps into big data sets.
Today, more than half of US adults are recorded in police facial recognition databases. For more on the far-reaching impact of facial recognition tech, check out our blog.
Casey: All right. Well, I’m checking this thing out. It’s got a lot of plastic around it, but it’s got one of those sweet little tabs to pull. Opening the box, it’s got a picture of the phone in the nice new background. Turning it on now.
Veronica Belmont: I’m Veronica Belmont. This is IRL: Online Life is Real Life, an original podcast from Mozilla.
Casey: Oh, cool. It’s got a little … Like a little emoticon kind of looking around a circle as it fills up, and it just says, “How to set up Face ID. First, position your face in the camera frame. Then move your head in a circle to show all the angles of your face.”
Veronica Belmont: My friend Casey just unboxed her brand new iPhone X. It’s the model that has facial recognition technology built in. Your face is now your password.
Casey: Ooh, okay. All right, it works.
Veronica Belmont: Apple says there’s a one-in-a-million chance that someone else’s face could unlock your phone. Great odds if you value security and privacy.
Casey: Kind of seeing some of the things that you can do with the facial recognition have kind of brought me over to like … I think that a lot of times, to be honest, I favor really cool technology over kind of being worried about privacy and that kind of stuff. If I’m not looking at the phone, it won’t unlock, but that doesn’t mean it can’t be misused in the future, right?
Veronica Belmont: Of course, there’s a bunch of stories out there now of hackers fooling the software with complicated 3D printed masks, or of kids unlocking their parent’s phones because they look alike. It’s a bit like real life. I’m sure you’ve been mistaken for a different person once or twice. It’s definitely happened to me before. Imagine this happening to you.
Steven Talley: I had already gone to bed and received a knock on the door. We had a screen door there, so I could see an individual sitting on the other side, and he basically had mentioned that he had hit my vehicle, which was parked outside of the house.
Veronica Belmont: In 2014, Steven was living an ordinary life as a financial broker in Denver. After answering the door, he followed the man outside.
Steven Talley: I’m crouching down to inspect the damage where he’s showing me he hit me, and I see these two cylindrical objects, which were flash bangs, and it was like a boom, boom, boom, boom, boom, boom, boom. Something like that. And I can kind of see, again, this gang of people. They looked like a bunch of Army men. They had big guns. I could see that. Obviously, I was scared to death.
Veronica Belmont: Steven had no idea what he’d done to bring this on. He couldn’t know that all of this was happening because he looked like someone else.
Steven Talley: I thought maybe I was being robbed, but then they proceeded to basically beat me. On my lower back, my legs, my rib cage. There was a gentleman wearing the ubiquitous FBI windbreaker jacket we all see on TV.
Veronica Belmont: In the months before, a couple of bank robberies had taken place in Denver. There was a video clip from a security camera, and it played on the local news. Three people who thought it could be him phoned in a tip. His ex-wife told a detective the photographs he showed her looked like her ex-husband, so the cops came for him.
Steven Talley: And he said basically I was being arrested for two armed bank robberies and assaulting a police officer.
Veronica Belmont: Steven spent months in jail before his lawyer proved it wasn’t him. Proved he was at work when the robberies took place. They let him go. A year goes by and then he’s arrested again. This time, the cops were sure it was him.
Steven Talley: I was told that they had taken my picture, I think it was my mugshot, and they compared it with either the stills or the video from the second bank robbery, and it was reviewed by experts in the FBI division that works with facial recognition analysis, and the agent came up with a match. Facial recognition analysis.
Veronica Belmont: Facial recognition analysis. It can be done by people, as it was in this case at the FBI. It can be done with video cameras and computer software. Both can get it wrong. Yet, it’s really easy to convince yourself that what you’re looking at is who you’re looking for.
Steven Talley: They were trying to use it almost like they do with fingerprints, saying that you know the fingerprints don’t lie, and that was their attitude.
Veronica Belmont: They were wrong. More evidence proved he wasn’t the suspect. Again, he was a free man, but the damage was done. You can’t keep a job in the finance industry when you’ve been accused of robbing a bank.
Steven Talley: You know, it’s ruined my financial career. I’ve lost my security licenses, and I’ve lost touch with my two kids, who I haven’t seen in almost three years now, because of this incident.
Veronica Belmont: Because of what’s happened, Steven Talley is currently homeless. He’s suing for $10 million. In Steven’s case, the FBI relied on a specialized forensic team trained to examine facial images for patterns, but more than ever, we’re turning to computers to do the heavy lifting. When we hand over that kind of data to technology, we’re trusting the powers that be with terabytes of metadata that we’ve built up over the years. As these incredible new tools begin to influence both our offline and online lives, is your face still yours? Where does your identity end and someone else’s intentions begin? Just like my pal Casey and her new iPhone, we can unlock our smartphones just by looking at them now. We use our faces online to turn ourselves into flower-crowned anime characters or talking poop emojis. Ever take a moment to think about how this kind of technology even gets built? You have to train the machine to learn what a face is. One way researchers do this is by scraping the internet for copyright-free face pics, from sites like Flickr, for example. Or by purchasing massive databases where this work has already been done. That means if your face is on the web, it could be in one of these databases.
Adam Harvey: To think that your private moments, your vacation photos, your graduation photos, are being used to help a security company train facial recognition, I think people should be aware of that.
Veronica Belmont: This is Adam Harvey.
Adam Harvey: I’m an artist and researcher looking at computer vision and the implications of facial recognition technology. Who’s using my face, and what are they doing with it? That’s the question that nobody really has an answer to.
Veronica Belmont: Last October, Adam built an installation that invited people to think about that question. It was part of an exhibition called The Glass Room. It was held in London and that’s where we caught up with him. Here’s how his project works. You stand in front of a camera and a monitor. Your face is scanned and then that scan is quickly compared to a database.
Adam Harvey: The faces are from a database called MegaFace, which is the largest publicly available facial recognition training set. It contains 672,000 identities and 4.7 million photos.
Veronica Belmont: We wanted to see this in action, so we chatted up a few people as they gave it a spin.
Lily Ames: Wow. You’re actually getting a lot of matches.
Emrick: I am? Oh, these are all people that look like me? I see it putting some ethnicity into it and pulling up people that maybe it thinks my cultural background might be. A guy who’s of Islamic faith, and I’m not Islamic. I looked like that when I was young.
Chelsea Jiang: Match confidence, high. Best match: a girl whose eyes are smaller than mine, nose as wide as mine, and mouth bigger than mine.
Felix Trench: Oh, hello.
Lily Ames: What do you see?
Felix Trench: This shaven-headed man with strong neck tattoos. I have no, no tattoos.
Lily Ames: So what do you think of your matches so far?
Chelsea Jiang: It’s just Asian. It’s the conception that Asians all look the same.
Sarah Fitzhenry: Mostly female coming up now, though. It seems to have figured out that I’m not a male. It’s definitely unnervingly intelligent in terms of what it can pick up. Like, it knows my race, basically, with no other cues apart from my face.
Woman Speaker: A lot of girls with similar facial types of brown hair, brown eyes, big smiles. I got a beauty queen in there, so that’s a victory.
Lucy Rojas: I think it’s like they are taking all the power, all the information and we aren’t … We don’t know what happened with it, and it’s really scary. I feel we don’t have control of anything.
Veronica Belmont: One thing that comes up when talking about this tech is whether or not these systems have biases built into their algorithms. You heard some of that from the people interacting with Adam’s project. The man compared to images of, as he puts it, “Islamic-looking men.” Or the woman pointing out the “All Asians look the same” stereotype.
Adam Harvey: I tell people that facial recognition is really racial recognition, plus some additional metadata.
Veronica Belmont: That hidden bias is something that comes up often. Have you tried Google’s Arts and Culture app? It has a feature where it matches your face with portraits hanging in museums around the world. Google says it created this to encourage people to interact with art more regularly. Legit intentions and yeah, it’s pretty fun too.
Jennifer Aikman: Hi, my name’s Jennifer Aikman and my racial background is Caucasian. Well, I downloaded the app, and I wasted a lot of time trying to adjust my face, and adjust the lighting, and to function without filters for once in my life. The portrait is of Democritus, which is this ancient Greek philosopher, but he’s got this terrible expression. It looks like he’s just fed on human flesh or something. It’s very, very weird. The reason I matched with this guy, other than going into a spiral of low self-esteem, is I’m pale. I have dark hair. He’s got arched eyebrows and so do I. I don’t know, maybe I’m just really deluded, but maybe I do kind of look like this, because my daughter was like, “You guys look so much alike,” so it was hurtful.
Greg Deal: My name is Greg Deal. I’m an indigenous person. I’m a member of the Pyramid Lake Paiute Tribe in Nevada. The main match I got was Head of a Man by Vincent Van Gogh. My face, I think, is fatter than the image itself. His hair’s short. His ears actually kind of stick out a bit as well, so that’s also not like my face.
Malachi Stewart: My name is Malachi Stewart. It’s just a picture of an African-American woman with a hat on her head and a hoop earring. It looked nothing like me. My skin tones are really, really different, and facial features are really different. The only thing that was similar about us was that we were Black, and it kind of looked like a slave picture.
Veronica Belmont: So when I tried the app, I got 39% matched with a work of art that looks very much like Beatrice Arthur from The Golden Girls. Amazing person, fantastic comedian. I’m 35, though, so I have to say that stung. That stung a little bit. As fun as it is to match your selfie to a painting, some of us are wondering if it’s just a sneaky way for Google to capture our faces and use them to train their face-detecting machines. Well, when you use the app, it actually tells you this isn’t the case. Google says the data won’t be used for anything other than to match you to artwork, and that the app stores your selfie only for the time it takes to make that match. Good to know. But then again, does it matter if our photos and faces already exist on other Google platforms? To quote a tweet from actor Alyssa Milano: “Anyone suspicious of just surrendering your facial recognition to Google? Or are we confident that they already have that at this point?” Racial biases and software hiccups aside, this technology is being deployed around the world. In China, it’s showing up in the strangest places and headed towards a future you might find alarming.
Nancy Lee: I’m standing in the yellow sign on the floor that says, “Stand right here,” and that green square is recognizing my face.
Veronica Belmont: The woman you’re hearing is Nancy Lee. She’s in Beijing and she’s standing in front of a facial recognition terminal.
Bejan Siavoshyi: And she’s leaning in so it gets her full profile.
Veronica Belmont: And that’s Bejan Siavoshyi. He’s watching Nancy’s face get scanned by the machine.
Nancy Lee: Oh, okay it worked.
Bejan Siavoshyi: Nancy needs this machine to register her face so that she can get some toilet paper.
Veronica Belmont: Did you just say, “Toilet paper?”
Bejan Siavoshyi: Yeah, I did. It looks like a … What is that, like arm’s length? What would you say that is about?
Nancy Lee: This is barely enough if you wanted to take a…
Veronica Belmont: Okay. All right. Hang on a second, Bejan. You did say this was going to be weird, but scanning your face for toilet paper? Why is this a thing?
Bejan Siavoshyi: Locals were raiding public washrooms because they were stocked with free toilet paper, so to deter thefts …
Veronica Belmont: TP dispensing face scanners.
Bejan Siavoshyi: Yeah, you get it now.
Veronica Belmont: Devious. Okay. Well, leaving aside the call of nature nightmares this can pose, if Chinese citizens are used to using facial recognition for something like that, then by this point the technology must be becoming widespread?
Bejan Siavoshyi: Oh, it’s everywhere. You might also know that in China there’s something like 690 million smartphone users and more and more facial technology is being baked into the process.
Veronica Belmont: Like what?
Bejan Siavoshyi: Like for instance right now you can use face recognition to make sure that the cab you hailed is a registered driver with the app you hailed it through.
Veronica Belmont: That actually, that actually makes sense.
Bejan Siavoshyi: There’s a KFC in Hangzhou that’s testing a payment system where you smile into a camera and out pops your meal.
Veronica Belmont: So you pay with your face?
Bejan Siavoshyi: You straight up pay with your face.
Veronica Belmont: In the background of all of this, can we talk about these much bigger and more serious conversations about the government using this technology? I mean, it’s not just all taxi drivers and fast-food.
Bejan Siavoshyi: Yeah. When you really consider what’s coming, it’s nothing. I mean, you might have heard about the massive face ID database that China’s building.
Veronica Belmont: A little bit. What’s going on there?
Bejan Siavoshyi: Well, it is as big as it sounds. The database will contain the faces of every single Chinese citizen. That is 1.3 billion of them. I mean, they’re planning a system that can match a person’s face to his or her photo ID within three seconds and with 90% accuracy, and they want this all in place by the year 2020.
Veronica Belmont: What are they even telling people that they need this for?
Bejan Siavoshyi: Well, I mean, the main reason is for public security and there’s definitely a case for that, but the flip side of that is people are worried that the system could be used to stamp out what the government considers dissent.
Veronica Belmont: But what do you think, Bejan? Given how much facial recognition technology seems to be everywhere there, do you think that people are more willing to accept this kind of surveillance than perhaps somewhere else?
Bejan Siavoshyi: I think on a security level there won’t be too much pushback, but if this system starts to really impede privacy, then I do think that people will take issue in some way.
Veronica Belmont: That database China is building will have a lot of material to play with. There are more than 170 million security cameras in China, and millions of them have facial recognition technology built right in. Another 450 million cameras are coming in the next few years. They’re calling it Skynet. A little on the nose, folks. When our governments and corporations start getting grabby with our faces, the line between wow and whoa can get pretty blurry.
Joseph Atick: It is not just face recognition technology under control anymore. It has now become face recognition technology out of control.
Veronica Belmont: This is Joseph Atick and he’s been worrying about where all this face tech is going since pretty much the beginning. Good reason for that.
Joseph Atick: I am credited as one of the inventors of the face recognition technology in the early 90s and one of the founding fathers of the biometric industry.
Veronica Belmont: Back in the 1990s Atick noticed that the simple human act of recognizing faces was easy for a baby, and near impossible for a computer. So he wondered …
Joseph Atick: What if a computer could recognize me? What if a door could recognize me? What if all of these barriers in our life could start recognizing who we are, securely and conveniently? What would they do? Of course, today, all of this is happening, and it’s exciting to see. Essentially, almost 25 to 30 years of work went into getting us to this point, but back then, that was a dream.
Veronica Belmont: Joseph and his team celebrated the birth of facial recognition, but just as Dr. Frankenstein found out the hard way, technological dreams have a funny way of turning into nightmares.
Joseph Atick: In fact, no sooner had the excitement set in about the discovery of how face recognition by a computer works than we started to recognize the flip side of the coin. The power of a computer to recognize you when you perhaps don’t want to be recognized.
Veronica Belmont: And then his own computer spooked him into reconsidering what he’d done.
Joseph Atick: I still remember one day when I came into the lab, and I turned on the light, and I walked into my office not even thinking about face recognition, and of course my computer was on, and the camera was on, and I heard this metallic voice which announced, “I see Joseph.”
Computer: I see Joseph.
Joseph Atick: And that immediately gave me shivers in my spine, because I was not aware, I was not thinking that it was going to do that.
Veronica Belmont: Since then, he’s watched as facial technology has become more and more influential. And for him, more and more unsettling.
Joseph Atick: The reason it’s out of control is because we as the consumers, we as the people online, are feeding this massive beast with the ability to recognize us all. I never imagined the day would come where we would be volunteering our facial images to teach a machine to recognize us anywhere, at any time. The linkage between your offline personality and your online personality is now possible. Not only can cameras on the street recognize me, they can link me to my online activity. They will know exactly how much I’m worth. They know what kind of things I like to buy, and obviously they can start targeting me for nefarious reasons, or for commercial reasons. I mean, is that the world we want to live in?
Veronica Belmont: Joseph Atick put his heart and soul into this tech, but it sounds like he’s wondering if he should’ve put more thought into what he built. I’m Veronica Belmont, and this is IRL: Online Life is Real Life. So yes, this face tech stuff can inspire that Big Brother-is-watching feeling, but as with any digital-age marvel, it all comes down to human choices. What we do with it. Sometimes it can bring light to dark places.
Emily Kennedy: There was one case in particular where there was a law enforcement agent who came across a news article about a young girl who was missing and she was believed to be sold online for sex.
Veronica Belmont: This is Emily Kennedy. She’s the CEO of a company called Marinus Analytics. Their facial recognition software is called FaceSearch.
Emily Kennedy: And so he had known that we had released FaceSearch just a couple days before, so he took her photo that was in the news article, just a simple photo of the girl, and he uploaded it to FaceSearch, and within a few seconds he received a 93% match. They were able to confirm that it was her, and they made an appointment with her, and were able to go out pretty immediately, rescue her, and also arrest two of her traffickers for human trafficking.
Veronica Belmont: So does it see a lot more than the human eye can see, or is it just more of a speed kind of thing?
Emily Kennedy: We’ve seen cases where, you know, a human might not have put the two together, but when FaceSearch gives you that result, and then you look into it, and you say, “Yep, everything else matches,” then you can kind of see where it might be the same person.
Veronica Belmont: Sometimes being found, being identified, is what matters most. Even something as simple as a selfie can be an important act of self-expression. Glynnis MacNicol has spent a lot of time thinking about women and selfies. In particular, how they’ve become a crossroads for self-expression and corporate interests.
Glynnis MacNicol: I think it allows girls and women to be authors of their own stories, to decide what image they want to put out there, what they want to say about themselves, and how they want to say it. And I think that is very new in the history of humankind essentially and very, very powerful. So I think that’s an important development that gets trivialized in conversations about how selfies promote narcissism, or they’re silly.
Veronica Belmont: The point is, my selfies, my rules. At least, I think it is. She wonders if while we’re having all this fun on the selfie stage, we’re not noticing the creep behind the curtain.
Glynnis MacNicol: So I think in terms of who are you handing your image over to, and how does it benefit them, and how does it benefit you, is an important question we need to continually be asking ourselves.
Veronica Belmont: Are you saying then that we need to get our faces off the internet altogether?
Glynnis MacNicol: Yes. Everyone get your faces offline. Yes, I can’t … What evidence is there that this is a good idea? I mean, really? Is there literally any evidence that this is going to benefit us? Let me ask you, why would you post a selfie?
Veronica Belmont: This is … Yeah, this is something I think about a lot actually, because I have noticed on Instagram, and Facebook, that I get a ton more likes or hearts if I post a selfie, versus a photo of something I just think is cool.
Glynnis MacNicol: Of course.
Veronica Belmont: It seems to create this feedback loop of, “Well, I know I’m going to get more attention if I post a selfie, and that feels good, but I also want to be able to get the same amount of accolades, I guess, for something that is not my face, that I think is cool and actually want to share.”
Glynnis MacNicol: You do because we all know if you do take the picture, you’re going to get that immediate rush of everybody loves you, and it’s so … I think it’s dangerous and complicated.
Veronica Belmont: Well, and now our smartphones aren’t just for selfies. We actually can use our face to unlock them. It’s a very seamless kind of process.
Glynnis MacNicol: What are we giving up for convenience? Really? Is this really necessary to open your phone with your face? Is it that much of a trial to type in six numbers? I don’t think so. I don’t think that … I think we’re trading off a lot for convenience and novelty, and I think we’ve traded it off so quickly and without a great deal of thought, so yeah.
Veronica Belmont: We are social animals and we’re hardwired to share. In an online world, that means sharing a lot of ourselves, even our faces, and not just once, but thousands of times. It means reveling in Instagram filters. It means liking snaps of your girlfriend’s new haircut and scrolling endlessly through your ex-boyfriend’s photos. And yes, it also means worrying about the mountains of data that pile up in the meantime. Data that trains machines to see us better and better. Navigating all that with happy faces, sad faces, angry faces, smiling-cat-with-heart-eyes faces, it’s what it means to have an online identity. So here’s looking at you, kid, whether you realize it or not. Hey, remember how I mentioned Adam Harvey’s exhibit was part of something called The Glass Room? Think of The Glass Room as a tech shop crossed with a public service announcement and an art gallery. It’s a gateway to talking about having a healthier online life, and Mozilla was a sponsor. Adam’s project was called MegaPixels. We took a few pictures of the installation so you can check out what it looked like. Find those and more info about The Glass Room in the show notes for this episode on our website, IRLPodcast.org. IRL is an original podcast from Mozilla, the nonprofit behind the all-new Firefox browser. I’m Veronica Belmont. See you online. Until we catch up again, IRL.
Lily Ames: Can I just get your consent on here that you’re okay, us using your voice for the Mozilla podcast?
Speaker 21: Voice only?
Lily Ames: Yeah, we-
Speaker 21: Okay. You can use my voice only.
Lily Ames: Yeah. Okay.
Speaker 21: Thank you.
Lily Ames: No photo?
Speaker 21: No.
Lily Ames: Okay, no photo.
Speaker 21: We’ve learned something here.