The “Privacy Policy” Policy

Season 5: Episode 1

Privacy policies: most apps and websites have them, buried away somewhere. These legal documents explain how companies collect, use, and share your personal data. But let’s be honest, few of us actually read these things, right? And that passive acceptance says a lot about our complicated relationship with online privacy.

In the Season 5 premiere of IRL, host Manoush Zomorodi speaks with Charlie Warzel, writer at large for the New York Times, about our complicated relationship with data and privacy, and the role privacy policies play in keeping things, well, confusing. You’ll also hear from Parker and Lila, two young girls who realize how gaming and personal data intersect. Rowenna Fielding, a data protection expert, walks us through the most efficient ways to understand a privacy policy. Professor Lorrie Cranor explains how these policies have warped our understanding of consent. And privacy lawyer Jenny Afia explains why privacy is a fundamental part of being human.



Published: June 14, 2019

Show Notes

Charlie Warzel is an Opinion writer at large for the New York Times. You can get more insights from him about privacy online when you sign up for the Times’ Privacy Project Newsletter.

If you’d like to learn more about privacy policies and their impact on our youth, check out Jenny Afia’s article on tech’s exploitative relationship with our children.

This IRL podcast episode referenced several privacy policies, and we encourage you to read them. To start, here’s Firefox’s privacy policy. You’ll see that Firefox’s business model is not dependent on packaging your personal info. And we hope you’ll find that our policy is easy to read, fully transparent, and specific.

The other privacy policies referenced in this episode include:

Transcript

Parker: So basically you have a little square and you’re a little cube, and you try and take up other people’s space.

Lila: Can you die soon so I can, like, go after?

Parker: How dare you!

Manoush Zomorodi: Parker and Lila are buddies. Parker.

Parker: My name is Parker.

Manoush Zomorodi: Is 11 years old. And Lila.

Lila: I’m Lila.

Manoush Zomorodi: Is 10 years old. Together they like to play games on their tablets.

Parker: Oh I’m right next to the king, and the king’s the person who has the most space.

Manoush Zomorodi: The game might be free to play, but these kids are paying for it by watching ads.

Lila: Okay, my turn. My turn, thank you, thank you. Oh wait, no. I don’t …

Manoush Zomorodi: And just by downloading the app they risk sharing a lot of data about who they are.

Parker: You know at the beginning of every game it’s like, oh yeah I agree with the terms of privacy policy.

Manoush Zomorodi: Privacy policies. Something most of us agree to without even reading. Parker and Lila haven’t read the policy for this game, so we asked them to.

Parker: Protection of your personal data is fundamental to us. With your private consent we are likely to collect and process the following data and-

Manoush Zomorodi: In the game the players move a small cube around and try to grab valuable virtual territory before the other players do. And while they do that, the app is grabbing valuable personal data. Data that some people might consider private.

Parker: -surname, forename, age, gender, email address, profile photos, hobbies, friend list. IDFA publicity identifiers? Identifiers for iOS devices and GAID for Android devices. So that’s kind of creepy.

Lila: But I hope it’s not to do anything bad.

Parker: A lot of that stuff they do not need, and I don’t really get why. I feel like maybe it’s like some evil corp. Like it probably isn’t, but some evil corporation, really they just download a free game and it’s just like, this is like the best game in all the universe and it’s free. And it’s just like, hey just click yes on our privacy policy and then you can play. And people are just like, yeah. And then they’re just like evil hackers and they take over the world.

Parker: It’s probably not going to happen, but I mean-

Lila: Yeah, I mean. Yeah.

Manoush Zomorodi: The game makers are not evil hackers who will take over the world, at least I don’t think they are, but they did make a product. They offer it for free, and now they do what they can to make money from it. And so that means pushing ads, and that means pulling user data. And using that data to target us with more ads, or even just packaging up the information and selling it to other companies.

Manoush Zomorodi: Is it okay that all that personal information is being collected? Or is it a massive invasion of your privacy? How you feel about these questions speaks to how we define privacy in the internet age. Do we care about it, or don’t we? Because by definition privacy is an intimate thing, right? For some people, online privacy means total anonymity. Others simply want more say in how their data is collected or used. And for others still, it doesn’t mean much at all. And that’s what makes it so hard to agree on. When we say the phrase “online privacy,” what do we even mean?

Manoush Zomorodi: So as we launch the new season of IRL, let’s take a look at this word, privacy. And learn how we can make sense of what it means to us online and offline. And we’re going to start where it shows up the most, that annoying place, the privacy policy.

Manoush Zomorodi: I’m Manoush Zomorodi, and welcome back to IRL, an original podcast from Firefox.

Manoush Zomorodi: Firefox has always taken your privacy seriously. Their lightning-fast browser, for instance, doesn’t collect user data and automatically blocks ad trackers. All that, and a privacy policy that’s easy to read and puts you first. Find out more at firefox.com.

Manoush Zomorodi: Some of us think, and there have been surveys that show this, that when a site has a privacy policy it means the company behind the site is protecting the user’s privacy. Well, that is actually rarely the case. A privacy policy is a legal document that spells out how a company collects, stores, uses, and shares your data.

Rowenna Fielding: That’s part of the issue, and the word policy as well is a real pain in the backside because people think of policies as documents which have to be boring and have to be written in corporate jargon, legalese. And have to be there to protect the organization if it gets caught doing something wrong.

Manoush Zomorodi: Rowenna Fielding calls herself a professional data protection nerd. She helps British companies craft their privacy policies. I should say, a lot of companies are trying to make their policies more readable and user friendly: Google, Uber, Microsoft, Twitter, Facebook. But these are really long, chunky blocks of text. I mean, who is going to read all these? Rowenna Fielding, that’s who.

Rowenna Fielding: I reckon I must have read hundreds, if not thousands in my life so far. This is how I spend my free time, I’m really sad.

Manoush Zomorodi: No, you’re not sad at all. It is thanks to brave heroes like Rowenna that there is hope that privacy policies can be easy to understand and written more honestly. She can also help us navigate these policies and teach us how to spot red flags. There are many, but here are a handful to consider.

Rowenna Fielding: I’ve got three that I see most often, so the first is saying, we may do such and such with your information. And that basically leaves the reader to play a guessing game. Are you doing this, or are you not? If you’re only sometimes doing it, what are the circumstances under which you would or would not do it?

Rowenna Fielding: The second one is having basically a series of lists. So here is a list of data that we process, and then here is a list of purposes we process it for. And then here is a list of lawful bases we might rely on. And again, as a reader if I’m looking at that I kind of have to guess which ones apply to me and my data, and which ones don’t.

Manoush Zomorodi: Her third red flag has to do with the word, purpose.

Rowenna Fielding: Where organizations say things like, for marketing purposes, or for HR purposes. Or even for legal purposes. Or my absolute favorite, for record keeping purposes. I mean none of those are purposes, they’re activities. They don’t really tell me what’s being done with my information, and why and how.

Rowenna Fielding: The actual purpose of the notice, which is to be transparent and to inform the individual, is completely lost.

Manoush Zomorodi: Rowenna wants us to look out for weaselly, wishy-washy language in a privacy policy. She sees these as signs that the company behind it isn’t being transparent about its data collection and use. In which case, you may want to reconsider your relationship with that company. But it’s not really that simple, is it? I mean, I use lots of apps and services every day, I depend on them. I have not read their privacy policies, and I’m not going to stop using them.

Lorrie Cranor: There was actually a study that showed that just seeing a link to a privacy policy made people feel good, and they never looked at it.

Manoush Zomorodi: This is Lorrie Cranor, she’s a professor at Carnegie Mellon and directs the CyLab Security and Privacy Institute.

Lorrie Cranor: I think before the internet when people talked about having control over their own privacy, they were talking about really physical manifestations. You could close a door or put down the window shades, choose not to fill out a paper form, or lower your voice.

Manoush Zomorodi: Lorrie brings up another important point about why when we’re online we tend to leave the window shades rolled up. Let me give you an example. Okay so you install a new app on your phone and then you open it. And there’s this popup and it asks you to agree to the app’s terms and conditions and the app’s privacy policy. You know what I’m talking about. We all do it.

Lorrie Cranor: When users click through, they haven’t read or understood what they’re consenting to. So no, it’s not really a meaningful or informed consent.

Manoush Zomorodi: So it’s kind of like you’re incentivized to ignore the information, right? The company wants you to click yes, and you want to use this new app as quickly as possible so, in a heartbeat, the deal between the two of you is done. You’re rewarded for giving in without learning what you might be giving up.

Lorrie Cranor: I think meaningful consent would explain to you what is being collected, why it’s being collected, what it’s going to be used for.

Manoush Zomorodi: So let’s imagine how much more useful it could be. You open the app and the first thing it does is tell you, in simple, efficient language, things like: when you use this app, we will know where you are. Or: we package your data and sell it to other companies. Or: when you leave this website, we will track where you go next and follow you across the web.

Manoush Zomorodi: And you’re given a real choice. Do you consent, or do you say no?

Lorrie Cranor: Now the problem is that, given the amount of data collection that’s happening these days, you would spend an awful lot of time looking at this information and having to make lots of decisions. And so I think we need a balance between providing all of the information and making it really easy for people to provide or withhold consent without having to spend all their time doing it.

Manoush Zomorodi: That information exists of course, it’s in the privacy policy you just agreed to without reading. But I think Lorrie’s right, it is harder to control our privacy because the internet subsists on data. We are drowning in it. It feels like it’s everywhere and yet, it’s intangible. It’s so much easier to just try and move on with your day.

Charlie Warzel: You know I think a lot of people hear the word privacy and their eyes glaze over.

Manoush Zomorodi: Charlie Warzel is writer at large for the New York Times.

Charlie Warzel: But it’s really just that we haven’t been thinking about it the right way, I’m convinced. And we haven’t really been conditioned to think of it as an everyday topic.

Manoush Zomorodi: Charlie’s been trying to understand our complicated relationship with privacy, data, and online and offline life. He’s part of the Times series called The Privacy Project. But you got to wonder if privacy, that word, even covers it all. Is privacy a good word for this stuff?

Charlie Warzel: The short answer to that is, no. It’s an impoverished word. I think it’s very similar to the term climate change, right? It’s basically something that is so big and all encompassing that you can never really experience all of it at once. That trying to describe it, you’re always going to fall short of really conveying the actual understanding.

Manoush Zomorodi: I mean I think, you just sort of laid out just how complicated this issue of privacy is. And that it is at the intersection of so so so many various different topics. So I want to, let’s start with the companies themselves. Google’s first privacy policy came out I mean two decades ago, 20 years ago. And when we look at what that original policy sort of covered, how do you feel like it compares to today’s policies, what they are, and maybe what they should be?

Charlie Warzel: Yeah. I mean Google’s privacy policy from I think it was May 1999, so we’re really actually at like the 20 year anniversary of that.

Manoush Zomorodi: Happy anniversary to us, right?

Charlie Warzel: I know, it’s absolutely crazy. But it’s such a quaint artifact of a different time on the internet. It’s very much like, we might have some information on you, but don’t worry we hold your privacy in the highest regard. And really all you’re doing is entering in some search terms so it’s not a big deal. And to watch the evolution of that is really to watch the evolution of the modern commercial internet, right? Like the policy starts to get a little longer, and a little more lawyered, and a little less friendly and jovial. And a little more concerned with the ways in which they are not liable for certain parts of your information.

Charlie Warzel: They start to talk about, yes well we might share certain bits of your information with third parties in order to improve an experience. Then it’s, we might share certain bits of this information with third parties, sort of as a way to sell ads and help promote our free services. At one point it goes from we might be taking some of this location data, to, we use all your location data.

Charlie Warzel: And so it really just shows how this idea of surveillance has become omnipresent as the result of these wonderful black mirror devices that we keep in our pockets all day. So you sort of see the evolution of the modern internet, how it’s not only become so dependent on an advertising machine, but how that advertising machine is really based off of the most granular real time information about you.

Manoush Zomorodi: The way that I often think of privacy is this concept of invasiveness. I’ll give you an example. One of my listeners said that she was concerned that she had a drinking problem. And so she was Googling to figure out what, did she have a drinking problem? Should she maybe visit an AA, or Alcoholics Anonymous? And the next time that she went on Facebook she started getting targeted with ads from local liquor stores.

Manoush Zomorodi: And it’s just, right, that’s the only reaction there can be is like, ugh. That’s when I think data, personal information and geo-targeting go horribly wrong.

Charlie Warzel: It really brings up that idea of the Google search term. We actually had Google CEO Sundar Pichai write an op-ed for The Privacy Project in which he sort of defends Google’s business model and the way that they do things. And one of his examples was that for the last 20 years Google has actually allowed people to ask questions of the service that they would never ask in real life. You know, you might not walk up to your doctor 30 years ago and say, I have a drinking problem, I’m worried, I want to find some help.

Charlie Warzel: And so his argument is, that is actually a mode of privacy. And I actually think that is a good point, but it’s not a fully formed point, in that it doesn’t take into account the rest of the ecosystem, right? Like it doesn’t take into account the entire advertising ecosystem that’s built up around it. The entire platform and social networking industry that revolves around our data, and the way that stuff is passed. And yes, it’s passed anonymously, but like that example proves, it can be easily connected to you. And it can feel really invasive.

Manoush Zomorodi: I think what you’re saying is, like, you can’t find religion on privacy on your own, dear tech companies. We’ve seen Europe pass new laws, and California will have a new privacy law next year. But are we at this point where there’s a realization that it’s beyond Google? It’s beyond Facebook. It’s beyond each company. It’s a system now that needs oversight.

Charlie Warzel: Yeah, I think that’s exactly it. I think it’s a system. And I think that anyone who believes that … Well first of all you can’t put the toothpaste back in the tube on this stuff, right? Everyone has sort of woken up now to this idea that there are consequences to this. That the internet is real life, so to speak. And you can’t put those two decades since the first Google privacy policy, you can’t put that back. That information has been passed around, it’s been sliced and diced and targeted and re-targeted. And put into a database which has probably gotten hacked and has spilled out into the internet in many different ways.

Manoush Zomorodi: Right. Okay, so let’s step away from privacy policies now and open this conversation up a bit wider. I want to hear more about how privacy and data can clash. What are your favorite examples of how data has invaded someone’s privacy in a strange or surprising way?

Charlie Warzel: Sure. Yeah, absolutely. An example of this that really caught my eye was a guy, a computer programmer from Philadelphia, who installed a Nest thermostat in his home and tied it to this program called If This Then That, which is basically an internet-of-things service that links different things together. So his rule was: anytime my Nest goes into away mode, which means the person is no longer in the house, turn off all the lights in my house. All the smart lights.

Charlie Warzel: Which is a nice little thing. A couple of months ago he moved out of his home, and he reset the thermostat, or so he thought. The If This Then That program, though, which he didn’t really touch because he thought it was no longer going to work, was still working. And it was pinging him every time people left the house. So he could actually see, every time, when the people who bought his house, or his apartment, from him were gone. He didn’t want it, but it was this real invasion of privacy.

Charlie Warzel: Another example sort of speaks to the push and pull of privacy and privacy legislation, which I think is actually really interesting. A privacy advocate I spoke with was telling me about a 2014 law that Louisiana passed over student data privacy. I believe, and I’m just kind of talking off the cuff here, that it mentioned that if you had a student’s name, you couldn’t have more than one piece of identifiable data linked to them without the parent’s permission.

Charlie Warzel: And that sounds just like something that’s good, right? Like you don’t want any of that publicly available, you want to protect these students. But the law was so stringent and so broad that schools couldn’t publish student yearbooks, because a yearbook has a photo and a name. They couldn’t announce kids’ names at football games when they scored a touchdown, because that was a piece of identifying information with their name. They couldn’t put the batting average of the shortstop on the high school team in the program, or something like that.

Charlie Warzel: It’s this kind of crazy situation. And that is sort of funny and quaint, but the flip side is that, for certain students, they also couldn’t release GPAs and transcripts for scholarships.

Manoush Zomorodi: Wow.

Charlie Warzel: So I think what it highlights to me is this idea that our information is everywhere. And it is incredibly important, it is part of our identities, it has such consequence in our life. And putting any kind of constraint on that is going to have all these ripple effects and unintended consequences.

Manoush Zomorodi: To me, we’re at this very crucial moment where things could go either way. Like, the public could either decide, okay, we’re not going to stand for this. Or they just say, well, like you said, you can’t put the toothpaste back into the tube, so here we go. And this is just the world we live in. And oh, did you order dinner yet, by the way? You know what I mean?

Charlie Warzel: Right. It’s a great point, and I think that it’s something I’m trying to really keep abreast of, right? I mean the most staggering thing, I was on a trip in Cambodia earlier this year and was speaking to people there about Facebook. Just sort of in passing, not for any reporting reasons. And talking to them. And they mentioned the Cambridge Analytica scandal, which I thought was really interesting. And they said, your data …

Charlie Warzel: This was very striking to me, they said, your data is very valuable. You should be very mad that it was taken. But I would be pleased if someone stole my data because then it would mean that it was valuable. I don’t have any money, I don’t have any clout in the world, I don’t have any influence. I live in this country that has less influence. My data is worth nothing. And it was this really striking sort of -

Charlie Warzel: I mean obviously kind of heartbreaking thing to hear, but I think it speaks to the difference in attitudes, right?

Manoush Zomorodi: Yeah.

Charlie Warzel: I think that this intersects our lives in so many ways. If you are a person of color, of a certain demographic in a certain area of the country, you may not know that you care about privacy, but you certainly care about the ability to have police sort of constantly tracking you throughout a city. Or tracking your license plate wherever you go to make it that much easier to have a traffic stop. If you are an undocumented individual, you absolutely care about the ability for ICE to partner with companies that have large license plate databases that then partner with camera companies, that then partner with law enforcement, right? You care greatly about that.

Manoush Zomorodi: That’s Charlie Warzel, writer at large for the New York Times. And like Charlie says, maybe privacy is an impoverished word to explain what we are wrestling with online. Maybe, as he also says, it’s like climate change, just too big to wrap our heads around. Maybe we need to keep it simple, bring it back to basics. So here’s one last thought to consider. It’s from Jenny Afia, a privacy lawyer in England and a member of the Children’s Commissioner’s Digital Taskforce.

Jenny Afia: I’ve got two small children, a five-year-old and a three-and-a-half-year-old.

Manoush Zomorodi: And watching her own kids grow up, Jenny sees how privacy is fundamentally part of personhood.

Jenny Afia: When my daughter was really young, at 18 months, she found some chocolate biscuits in a friend’s handbag and went off down the corridor to eat them furtively by herself. And so there was some sort of innate desire for privacy. I think many parents don’t take children’s right to privacy seriously enough. So for example, they will just overshare photos of them. And I don’t know how these kids will necessarily feel when they are 18 and there are photos of them in the bath still available on the internet.

Jenny Afia: And that goes to the heart of privacy for me. It’s not necessarily that you’re doing anything wrong at all, or that you have anything to hide, but we all should have a sphere of our life where we’re not on stage or being scrutinized. And we’re just able to develop and grow relationships and make mistakes and do stupid but not illegal things. And if we get rid of our privacy it’s going to have a massive impact on our ability to develop as humans.

Manoush Zomorodi: If the internet was a big shiny coin you’d have privacy on one side and data on the other. Right now, our data is worth a lot of coin to a lot of companies. But privacy, it’s priceless. It’s a necessary part of a healthy functioning society. And more of us need to have these conversations about what privacy is, what it means to us. Together we decide what we’re okay with giving up. And what, we’re just not okay with.

Manoush Zomorodi: One way to get started is to look at some of those privacy policies that we’ve all been so good at ignoring. I mean Parker’s doing it, she’s one of the kids we heard from earlier.

Parker: I got a new computer for my 11th birthday recently, and I read the whole privacy policy. And it was really long, so I feel better about it now, but …

Manoush Zomorodi: If privacy policies were presented more obviously and transparently, it would help consumers make more informed choices. And maybe even hold companies accountable for their behavior. Likewise, if companies realized their customers actually read these things, it could compel them to review how they do business. Head to the show notes for this episode and find links to the policies we talked about. And if you want to see what a good privacy policy can look like, check out Firefox’s while you’re there.

Manoush Zomorodi: Go to IRLpodcast.org. Because an internet that respects privacy is an internet that is more human. And that is the theme that’s going to carry us through this season of the podcast. We are asking big questions about the impact the internet is having on privacy, on democracy, on climate change, and of course on our everyday lives. How can we ensure the internet puts people first? Let’s explore that together.

Manoush Zomorodi: I’m Manoush Zomorodi, and this is IRL, an original podcast from Firefox. And I’ll talk to you again in a couple of weeks. Thanks so much for listening.