Season 5: Episode 1
Privacy policies: most apps and websites have them, buried away somewhere. These legal documents explain how companies collect, use, and share your personal data. But let’s be honest, few of us actually read these things, right? And that passive acceptance says a lot about our complicated relationship with online privacy.
Charlie Warzel is an Opinion writer at large for the New York Times. You can get more insights from him about privacy online when you sign up for the Times’ Privacy Project Newsletter.
If you’d like to learn more about privacy policies and their impact on our youth, check out Jenny Afia’s article on tech’s exploitative relationship with our children.
Parker: So basically you have a little square and you’re a little cube, and you try and take up other people’s space.
Lila: Can you die soon so I can, like, go after?
Parker: How dare you!
Manoush Zomorodi: Parker and Lila are buddies. Parker.
Parker: My name is Parker.
Manoush Zomorodi: Is 11 years old. And Lila.
Lila: I’m Lila.
Manoush Zomorodi: Is 10 years old. Together they like to play games on their tablets.
Parker: Oh I’m right next to the king, and the king’s the person who has the most space.
Manoush Zomorodi: The game might be free to play, but these kids are paying for it by watching ads.
Lila: Okay, my turn. My turn, thank you, thank you. Oh wait, no. I don’t …
Manoush Zomorodi: And just by downloading the app they risk sharing a lot of data about who they are.
Manoush Zomorodi: Privacy policies. Something most of us agree to without even reading. Parker and Lila haven’t read the policy for this game, so we asked them to.
Parker: Protection of your personal data is fundamental to us. With your prior consent we are likely to collect and process the following data and-
Manoush Zomorodi: In the game the players move a small cube around and try to grab valuable virtual territory before the other players do. And while they do that, the app is grabbing valuable personal data. Data that some people might consider private.
Parker: -surname, forename, age, gender, email address, profile photos, hobbies, friend list. IDFA publicity identifiers? Identifiers for iOS devices and GAID for Android devices. So that’s kind of creepy.
Lila: But I hope it’s not to do anything bad.
Parker: It’s probably not going to happen, but I mean-
Lila: Yeah, I mean. Yeah.
Manoush Zomorodi: The game makers are not evil hackers who will take over the world, at least I don’t think they are, but they did build a product. They offer it for free, and now they do what they can to make money from it. And that means pushing ads, and that means pulling user data. And using that data to target us with more ads, or even just packaging up the information and selling it to other companies.
Manoush Zomorodi: Is it okay that all that personal information is being collected? Or is it a massive invasion of your privacy? How you feel about these questions speaks to how we define privacy in the internet age. Do we care about it, or don’t we? Because by definition privacy is an intimate thing, right? For some people, online, it means total anonymity. Others simply want more say in how their data is collected or used. And for others still, it doesn’t mean much at all. And that’s what makes it so hard to agree on. When we say the phrase online privacy, what do we even mean?
Manoush Zomorodi: I’m Manoush Zomorodi, and welcome back to IRL, an original podcast from Firefox.
Rowenna Fielding: That’s part of the issue, and the word policy as well is a real pain in the backside because people think of policies as documents which have to be boring and have to be written in corporate jargon, legalese. And have to be there to protect the organization if it gets caught doing something wrong.
Manoush Zomorodi: Rowenna Fielding calls herself a professional data protection nerd. She helps British companies craft their privacy policies. I should say, a lot of companies are trying to make their policies more readable and user friendly: Google, Uber, Microsoft, Twitter, Facebook. But these are still really long, chunky blocks of text. I mean, who is going to read all of these? Rowenna Fielding, that’s who.
Rowenna Fielding: I reckon I must have read hundreds, if not thousands in my life so far. This is how I spend my free time, I’m really sad.
Manoush Zomorodi: No, you’re not sad at all. It is thanks to brave heroes like Rowenna that there is hope that privacy policies can be easy to understand and written more honestly. She can also help us navigate these policies and teach us how to spot red flags. There are many, but here are a handful to consider.
Rowenna Fielding: I’ve got three that I see most often, so the first is saying, we may do such and such with your information. And that basically leaves the reader to play a guessing game. Are you doing this, or are you not? If you’re only sometimes doing it, what are the circumstances under which you would or would not do it?
Rowenna Fielding: The second one is having basically a series of lists. So here is a list of data that we process, and then here is a list of purposes we process it for. And then here is a list of lawful bases we might rely on. And again, as a reader if I’m looking at that I kind of have to guess which ones apply to me and my data, and which ones don’t.
Manoush Zomorodi: Her third red flag has to do with the word, purpose.
Rowenna Fielding: Where organizations say things like, for marketing purposes, or for HR purposes. Or even for legal purposes. Or my absolute favorite, for record keeping purposes. I mean none of those are purposes, they’re activities. They don’t really tell me what’s being done with my information, and why and how.
Lorrie Cranor: The actual purpose of the notice, which is to be transparent and to inform the individual, is completely lost.
Manoush Zomorodi: This is Lorrie Cranor, she’s a professor at Carnegie Mellon and directs the CyLab Security and Privacy Institute.
Lorrie Cranor: I think before the internet when people talked about having control over their own privacy, they were talking about really physical manifestations. You could close a door or put down the window shades, choose not to fill out a paper form, or lower your voice.
Lorrie Cranor: When users click through, they haven’t read or understood what they’re consenting to. So no, it’s not really a meaningful or informed consent.
Manoush Zomorodi: So it’s kind of like you’re incentivized to ignore the information, right? The company wants you to click yes, and you want to use this new app as quickly as possible so, in a heartbeat, the deal between the two of you is done. You’re rewarded for giving in without learning what you might be giving up.
Lorrie Cranor: I think meaningful consent would explain to you what is being collected, why it’s being collected, what it’s going to be used for.
Manoush Zomorodi: So let’s imagine how much more useful it could be. You open the app and the first thing it does is tell you in simple efficient language things like, when you use this app, we will know where you are. Or, we package your data and sell it to other companies. Or, when you leave this website we will track where you go next and follow you across the web.
Manoush Zomorodi: And you’re given a real choice. Do you consent, or do you say no?
Lorrie Cranor: Now the problem is that, given the amount of data collection that’s happening these days, you would spend an awful lot of time looking at this information and having to make lots of decisions. And so I think we need a balance between providing all of the information, and making it really easy for people to provide or withhold consent without having to spend all their time doing it.
Charlie Warzel: You know I think a lot of people hear the word privacy and their eyes glaze over.
Manoush Zomorodi: Charlie Warzel is a writer at large for the New York Times.
Charlie Warzel: But it’s really just that we haven’t been thinking about it the right way, I’m convinced. And we haven’t really been conditioned to think of it as an everyday topic.
Manoush Zomorodi: Charlie’s been trying to understand our complicated relationship with privacy, data, and online and offline life. He’s part of the Times series called The Privacy Project. But you’ve got to wonder if privacy, that word, even covers it all. Is privacy a good word for this stuff?
Charlie Warzel: The short answer to that is, no. It’s an impoverished word. I think it’s very similar to the term climate change, right? It’s basically something that is so big and all encompassing that you can never really experience all of it at once. That trying to describe it, you’re always going to fall short of really conveying the actual understanding.
Manoush Zomorodi: Happy anniversary to us, right?
Charlie Warzel: I know, it’s absolutely crazy. But it’s such a quaint artifact of a different time on the internet. It’s very much like, we might have some information on you, but don’t worry we hold your privacy in the highest regard. And really all you’re doing is entering in some search terms so it’s not a big deal. And to watch the evolution of that is really to watch the evolution of the modern commercial internet, right? Like the policy starts to get a little longer, and a little more lawyered, and a little less friendly and jovial. And a little more concerned with the ways in which they are not liable for certain parts of your information.
Charlie Warzel: They start to talk about, yes well we might share certain bits of your information with third parties in order to improve an experience. Then it’s, we might share certain bits of this information with third parties, sort of as a way to sell ads and help promote our free services. At one point it goes from we might be taking some of this location data, to, we use all your location data.
Charlie Warzel: And so it really just shows how this idea of surveillance has become omnipresent as the result of these wonderful black mirror devices that we keep in our pockets all day. So you sort of see the evolution of the modern internet, how it’s not only become so dependent on an advertising machine, but how that advertising machine is really based off of the most granular real time information about you.
Manoush Zomorodi: The way that I often think of privacy is this concept of invasiveness. I’ll give you an example. One of my listeners said that she was concerned that she had a drinking problem. And so she was Googling to figure out what, did she have a drinking problem? Should she maybe visit an AA, or Alcoholics Anonymous? And the next time that she went on Facebook she started getting targeted with ads from local liquor stores.
Manoush Zomorodi: And, right, that’s the only reaction there can be: ugh. That’s when I think data, personal information, and geo-targeting go horribly wrong.
Charlie Warzel: It really brings up that idea of the Google search term. We actually had Google’s CEO, Sundar Pichai, write an op-ed for The Privacy Project in which he sort of defends Google’s business model and the way that they do things. And one of his examples was that for the last 20 years, Google has actually allowed people to ask questions of the service that they would never ask in real life. You know, you might not have walked up to your doctor 30 years ago and said, I have a drinking problem, I’m worried, I want to find some help.
Charlie Warzel: And so his argument is, that is actually a mode of privacy. And I actually think that is a good point, but it’s not a fully formed point, in that it doesn’t take into account the rest of the ecosystem, right? It doesn’t take into account the entire advertising ecosystem that’s built up around it. The entire platform and social networking industry that revolves around our data, and the way that stuff is passed. And yes, it’s passed anonymously, but like that example proves, it can be easily connected to you. And it can feel really invasive.
Manoush Zomorodi: I think what you’re saying is, you can’t find religion on privacy on your own, dear tech companies. We’ve seen Europe pass new laws, and California will have a new privacy law next year. But are we at the point where there’s a realization that it’s beyond Google? It’s beyond Facebook. It’s beyond each company. It’s a system now that needs oversight.
Manoush Zomorodi: Right. Okay, so let’s step away from privacy policies now and open this conversation up a bit wider. I want to hear more about how privacy and data can clash. What are your favorite examples of how data has invaded someone’s privacy in a strange or surprising way?
Charlie Warzel: Sure. Yeah, absolutely. An example of this that really caught my eye was a guy, a computer programmer from Philadelphia, who installed a Nest thermostat in his home and tied it to this program called If This Then That, which is basically an internet-of-things service that links different things together. So his rule was: anytime my Nest goes into away mode, which means the person is no longer in the house, turn off all the lights in my house. All the smart lights.
Charlie Warzel: Which is a nice little thing. A couple of months ago he moved out of his home, and he reset the thermostat, or so he thought. The If This Then That program, though, which he didn’t really touch, because he thought it was no longer going to work, was still working. And it was pinging him every time people left the house. So he could actually see, every single time, when the people who bought his house, or his apartment, from him were gone. He didn’t want it, but it was this real invasion of privacy.
Charlie Warzel: Another example is this sort of, it speaks to the push and pull of privacy and privacy legislation that I think is actually really interesting. A privacy advocate I spoke with was telling me about a 2014 law that Louisiana passed over student data privacy. Which I believe, and I’m just kind of talking off the cuff here, but I believe it mentioned that if you had a student’s name you couldn’t have more than one piece of identifiable data linked to them without the parent’s permission.
Charlie Warzel: And that sounds like something that’s good, right? You don’t want any of that publicly available, you want to protect these students. But the law was so stringent and so broad that schools couldn’t publish student yearbooks, because a yearbook has a photo and a name. They couldn’t announce kids’ names at football games when they scored a touchdown, because that was a piece of identifying information paired with their name. They couldn’t put the batting average of the shortstop on the high school team in the program, or something like that.
Charlie Warzel: It’s this kind of crazy situation. And that is sort of funny and quaint, but the flip side is that, for certain students, they also couldn’t release GPAs and transcripts for scholarships.
Manoush Zomorodi: Wow.
Charlie Warzel: So I think what it highlights to me is this idea that our information is everywhere. And it is incredibly important, it is part of our identities, it has such consequence in our life. And putting any kind of constraint on that is going to have all these ripple effects and unintended consequences.
Manoush Zomorodi: To me we’re at this very crucial moment where things could go either way. The public could either decide, okay, we’re not going to stand for this, or they could just say, well, like you said, you can’t put the toothpaste back in the tube, so here we go. And this is just the world we live in. And oh, did you order dinner yet, by the way? You know what I mean?
Charlie Warzel: Right. It’s a great point, and it’s something I’m trying to really keep abreast of. I mean, the most staggering thing: I was on a trip in Cambodia earlier this year and was speaking to people there about Facebook, just sort of in passing, not for any reporting reasons. And they mentioned the Cambridge Analytica scandal, which I thought was really interesting. And they said, your data …
Charlie Warzel: This was very striking to me, they said, your data is very valuable. You should be very mad that it was taken. But I would be pleased if someone stole my data because then it would mean that it was valuable. I don’t have any money, I don’t have any clout in the world, I don’t have any influence. I live in this country that has less influence. My data is worth nothing. And it was this really striking sort of -
Charlie Warzel: I mean obviously kind of heartbreaking thing to hear, but I think it speaks to the difference in attitudes, right?
Manoush Zomorodi: Yeah.
Charlie Warzel: I think that this intersects our lives in so many ways. If you are a person of color, of a certain demographic in a certain area of the country, you may not know that you care about privacy, but you certainly care about police sort of constantly tracking you throughout a city, or tracking your license plate wherever you go to make it that much easier to have a traffic stop. If you are an undocumented individual, you absolutely care about the ability for ICE to partner with companies that have large license plate databases, which then partner with camera companies, which then partner with law enforcement, right? You care greatly about that.
Manoush Zomorodi: That’s Charlie Warzel, writer at large for the New York Times. And like Charlie says, maybe privacy is an impoverished word for what we are wrestling with online. Maybe, as he also says, it’s like climate change, just too big to wrap our heads around. Maybe we need to keep it simple, bring it back to basics. So here’s one last thought to consider. It’s from Jenny Afia, a privacy lawyer in England and a member of the Children’s Commissioner’s Digital Taskforce.
Jenny Afia: I’ve got two small children, a five year old and a three and a half year old.
Manoush Zomorodi: And watching her own kids grow up, Jenny sees how privacy is fundamentally part of personhood.
Jenny Afia: When my daughter was really young, at 18 months, she found some chocolate biscuits in a friend’s handbag and went off down the corridor to eat them furtively by herself. So there was some sort of innate desire for privacy. I think many parents don’t take children’s right to privacy seriously enough. For example, they will just overshare photos of them. And I don’t know how these kids will feel when they are 18 and there are photos of them in the bath still available on the internet.
Jenny Afia: And that goes to the heart of privacy for me. It’s not necessarily that you’re doing anything wrong at all, or that you have anything to hide, but we all should have a sphere of our life where we’re not on stage or being scrutinized. Where we’re just able to develop, grow relationships, make mistakes, and do stupid but not illegal things. And if we get rid of our privacy it’s going to have a massive impact on our ability to develop as humans.
Manoush Zomorodi: If the internet was a big shiny coin you’d have privacy on one side and data on the other. Right now, our data is worth a lot of coin to a lot of companies. But privacy, it’s priceless. It’s a necessary part of a healthy functioning society. And more of us need to have these conversations about what privacy is, what it means to us. Together we decide what we’re okay with giving up. And what, we’re just not okay with.
Manoush Zomorodi: One way to get started is to look at some of those privacy policies that we’ve all been so good at ignoring. I mean Parker’s doing it, she’s one of the kids we heard from earlier.
Manoush Zomorodi: Go to IRLpodcast.org. Because an internet that respects privacy is an internet that is more human. And that is the theme that’s going to carry us through this season of the podcast. We are asking big questions about the impact the internet is having on privacy, on democracy, on climate change, and of course on our everyday lives. How can we ensure the internet puts people first? Let’s explore that together.
Manoush Zomorodi: I’m Manoush Zomorodi, and this is IRL, an original podcast from Firefox. And I’ll talk to you again in a couple of weeks. Thanks so much for listening.