Democracy and the Internet

Season 5: Episode 2

Part of celebrating democracy is questioning what influences it. In this episode of IRL, we look at how the internet influences us, our votes, and our systems of government. Is democracy in trouble? Are democratic elections and the internet incompatible?

Politico’s Mark Scott takes us into Facebook’s European Union election war room. Karina Gould, Canada’s Minister for Democratic Institutions, explains why they passed a law governing online political ads. The ACLU’s Ben Wizner says our online electoral integrity problem goes well beyond a few bad ads. The team at Stop Fake describes a massive problem that Ukraine faces in telling political news fact from fiction, as well as how they’re tackling it. And NYU professor Eric Klinenberg explains how a little bit of offline conversation goes a long way to inoculate an electorate against election interference.



Published: July 1, 2019

Show Notes

Early on in this episode, we comment about how more privacy online means more democracy offline. Here’s more on that concept from Michaela Smiley at Firefox.

Have a read through Mark Scott’s Politico reporting on Facebook’s European election war room.

For more from Eric Klinenberg, check out his book, Palaces for the People: How Social Infrastructure Can Help Fight Inequality, Polarization, and the Decline of Civic Life.

And, find out more about Stop Fake, its history, and its mission.

Transcript

Manoush Zomorodi: This spring, about a week before the polls opened in the European Union elections, a handful of journalists were given a tour of a unique office space.

Mark Scott: Okay. So, we’re walking into this relatively nondescript, dimly lit room. On your left hand side, there’s a European Union flag and a poster board with a bunch of “go get ‘em tiger” messages to help these mostly 20-something coders, engineers, right out of central casting, to motivate them.

Manoush Zomorodi: On this tour is Mark Scott, chief tech correspondent for Politico.

Mark Scott: There’s about 40 of them, mostly male, mostly white.

Manoush Zomorodi: The mostly male, mostly white engineers in this room, they’re working for Facebook.

Mark Scott: How they divided it is, all these 40 people work on specific country desks. So, there’s U.K., Germany, Slovakia, Lithuania, and all of the people in the room, they collectively speak all of the 24 national EU languages.

Manoush Zomorodi: And collectively, they’re monitoring online content ahead of the election.

Mark Scott: On the sides of the walls, there are humongous flat screen TVs, and they are showing in real time what is going on in social media, mostly in the Facebook world. So, that’s WhatsApp, Instagram, and Facebook itself. But also, there’s metrics around what is sort of the chatter on social media.

Manoush Zomorodi: Facebook set up this room, this war room, to find and take down posts with false information, to delete fake accounts, terminate bots, anything that violates the platform’s rules and is clearly designed to interfere with the election.

Mark Scott: Yeah, I mean, to be honest, for something that is so pivotal to democracy, checking to see if disinformation or so-called fake news is being spread, it was pretty mundane. You want to hear like people dropping and running across the room, we found this from Russia or whatever. That just doesn’t happen. It’s one of these things where the movies make it seem a lot more sexy. But, in reality, there’s a lot of looking at a screen and checking to see what’s going on.

Manoush Zomorodi: A war for the future of democracy is playing out online, and we’re starting to see just how big a battle it really is. On one side you have politicians and governments talking to their citizens, trying to inform them on the issues and courting their vote. On the other side, individuals and foreign agents weaponize the same communication tools to spread misinformation. Or they target us with provocative messages to divide us from each other, maybe even influence how we vote. In both cases, social media is a powerful vehicle to get their message across. And, with their EU election war room, Facebook is trying to fix a problem that they are at least partly responsible for creating.

Mark Scott: Everyone wants this to be better. No one wants illegal advertising or political messages to get through without being flagged.

Manoush Zomorodi: Mark gives them credit for trying to protect the integrity of the EU election, but he also thinks the war room is as much about trying hard as it is about looking good.

Mark Scott: When you look at the efficacy of the Facebook war room, as they like to call it, I think some of it is public relations. It’s good to show people doing something on this front.

Manoush Zomorodi: And he wonders if, in the end, this is even a battle that can be won.

Mark Scott: I think, right now, no government or private company, be it Facebook or anyone else, has the power or frankly ability to stop bad actors, or digital tricksters, or whoever it is from spreading false narratives or disinformation that may sway voters at some point.

Manoush Zomorodi: We’re releasing this episode on a week when both Canada and the U.S. celebrate the founding of their countries. But, even as we party, countries around the world worry that democracy itself is at risk. So, is democracy in trouble? Are elections and the internet incompatible? I’m Manoush Zomorodi, and this is IRL, an original podcast from Firefox.

The Firefox Browser has built-in tracking protection. That makes it harder for politicians, advertisers, and disinformation disseminators to find you. And with the free Facebook Container extension, you can isolate your Facebook session from everything else you do online. More privacy means more democracy. Learn more at Firefox.com.

There’s nothing inherently wrong with political ads. They are, after all, a form of free expression. And, as long as there have been democracies and elections, there’s probably been deceptive advertising, disinformation campaigns, and misinformation problems. What’s new is how precisely you can target potential voters. Now, there can be so many tailored ads that learning to separate truth from falsehood becomes impossible. When we can’t track who is sending what to whom, we’re asking for trouble.

This is one reason why democracies around the world worry that ad targeting and social media can erode election integrity and help blur the line between good information and false. Canada is worried. Citizens head to the polls in a federal election this fall. And a report published by the Canadian Centre for Cyber Security says foreign interference is likely. So, the government passed a new bill making it harder for foreign money to find its way into the election campaign.

Karina Gould: It also did one thing that’s pretty unique in the world so far, in that it regulated online platforms for the first time in the election space, by requiring all political ads that run from July 1st until election day to go on an online ad registry.

Manoush Zomorodi: Karina Gould is a member of Canada’s Federal Liberal Cabinet.

Karina Gould: I am the Minister of Democratic Institutions and the member of Parliament for Burlington.

Manoush Zomorodi: The Canadian government felt motivated to pass this law after observing electoral interference elsewhere.

Karina Gould: Really, this is based on the fact that what we saw in the 2016 U.S. presidential election, but also in the Brexit referendum campaign and other elections around the world, is that this was one of the avenues of choice for malicious foreign actors to try and interfere in domestic elections.

Manoush Zomorodi: Karina is referring to the Cambridge Analytica scandal. The company harvested Facebook user data to target potential voters with political ads. Here’s why she thinks this law was necessary.

Karina Gould: It actually reflects the spirit of the legislation when it comes to political ads in the offline world. When you and I are watching TV, if we’re watching the same channel, we’re going to be seeing the same ads. If we’re opening the newspaper, we’re going to be seeing the same ads. But, when you’re online, they are so micro-targeted that I might not be getting the same content that you’re getting, and that might skew my understanding of the idea. But it’s also important for people to be able to see who is behind these advertisements, also know that it’s an advertisement, it’s paid content, it’s not organic, and to really be able to look and understand who’s behind that and what their objectives are when they’re communicating with them.

Manoush Zomorodi: Have you heard from the tech companies in terms of their response to this?

Karina Gould: Yes. So, I would say that they were not very happy about it, is probably putting it mildly. Facebook quite quickly came out and said they would be doing the registry. They’d be ensuring that political ads can take place on their various platforms. Google came out quite quickly to say that they would just ban political advertisements in Canada because, according to them, it was too difficult to do the registry. And my feeling is, it’s not my job to legislate what Google wants me to legislate, it’s to legislate what’s in the best interest of Canadians.

Manoush Zomorodi: I want to ask you, so this law covers ads. What do you do though about the misinformation problem that still exists, even if Google decides not to allow any paid-for political advertisements?

Karina Gould: In terms of what the government of Canada can do, I think we have to be really careful. Because it’s one thing for a Canadian citizen who is online and sharing their opinion or sharing something that they think is true and it’s not. They have a right to share that information. I mean, we have freedom of expression in this country. But we also want to be careful that we’re not, as government, seen to be the arbiters of good and bad information. And so, this is where civil society plays a really important role and where media plays a really important role to be informing Canadians as to the information that’s coming across their screens and that they’re consuming.

Manoush Zomorodi: Yeah. I think, as someone with the word democracy in your title, you must think about that concept in a very sort of values-based and idealistic way. And I think it makes one even wonder whether the online space is good or not so good for democracy in this context.

Karina Gould: Yeah, it’s really interesting. And look, I’m an optimist and I do think that there’s a lot of value in the online space. But, just like we have rules and norms in the offline world, we should be applying those online as well.

Manoush Zomorodi: So… anyone wanting to run a political ad on Facebook in Canada will need to have it added to a trackable registry. And at the end of June, Twitter announced it will also have a registry in place for the election period this fall. Until then, Canadian political ads are banned from the platform.

Helpful, absolutely. And Canada’s law bans foreign-backed political advertising. And that helps too.

But the law… it only applies to paid ads. There are plenty of other ways politicians, organizations and, yes, foreign agencies, can try to target voters and influence the election. There’s email, and texting, for instance. And anyone is still free to make and upload their own content to their platform of choice.

Ben Wizner thinks this law is just another form of content moderation and, all things considered, he says the lawmakers aren’t thinking big enough.

Ben Wizner: I’m the director of the Speech, Privacy, and Technology Project at the American Civil Liberties Union here in New York. I do think that governments need to be more assertive, but I think it’s the wrong answer to the wrong question. It’s the wrong answer because content moderation is really a drop in the bucket of the problems that we’re talking about.

Manoush Zomorodi: I mean, people seem very worried about this idea of political ads, and you are actually saying, maybe we’re worrying too much about their impact. I mean, it was never proven actually in any way that Cambridge Analytica, that those tactics actually did anything to persuade anyone to vote a certain way or not. Right?

Ben Wizner: I think that the elections of 2016, Trump being elected in the U.S., Brexit taking place in the U.K. were so traumatic to people that there was this need to ascribe it to some new or outside factor. Our news ecosystem has been polluted by talk radio for a generation, long before there was ever a Facebook. So many of my fellow citizens here have had their brains poisoned by television and radio propaganda for a long time. And now, we’re going to say that a bunch of people in a troll factory in St. Petersburg who were posting ads in bad English are the ones who we should be existentially afraid of? It just doesn’t compute for me.

Ben Wizner: We should be worried about Facebook and Google. We should be worried about them for a lot of reasons. They tend to really increase our polarization by responding to what we want rather than what we need. People love fake news. They adore it. People want that kind of content. We should not be getting most of our news and information from platforms that deliver on those base desires. But it seems to me that that is the urgent project.

Manoush Zomorodi: So, how do we get to a point where there are checks and balances? If it’s not regulation, how do we have some sort of agreement where we know when we have crossed the line and gone too far?

Ben Wizner: So, let’s be optimistic for just a second here. In the same way that I think the Snowden revelations really gave us this rare generational moment to look at the dangers of mass surveillance by governments, maybe 2016 is what was needed for people across the American political spectrum to recognize that we may not be getting such a good deal from this business model, from this world of big tech where we get everything for free, and it’s so convenient, and there’s no cost, but maybe there is a cost. And so, that kind of wake up call is what’s necessary. This is the conversation we need, even if we haven’t yet arrived at the solution that answers all of these questions that we’re asking.

Manoush Zomorodi: That is very optimistic. It means like we’re in a good place. It’s a tough place, but it’s a good place in some ways.

Ben Wizner: We will see what comes out of it. I don’t want to be too optimistic. The tech companies are spending more money in Washington and Brussels than oil companies. These are still some of the most powerful corporations that the world has ever seen, and they’re not just going to voluntarily disarm and break themselves up. It’s going to be a huge political fight that’s going to take a long time.

Manoush Zomorodi: This idea of being able to target a voter specifically on topics that speak to them, or make them worried, or all of those things, if you were running for office, would you feel comfortable with parsing, and slicing, and dicing all the voters out there and targeting like that?

Ben Wizner: Let’s say they shouldn’t have access. Now, are we going to come up with a way that we can say that Nike and Walmart can have this information but the Beto O’Rourke campaign can’t have this information? It’s just not a workable set of rules for us to come up with. And, if you looked at the articles about Barack Obama’s 2012 campaign, the same kinds of tactics were celebrated. They were using people’s Facebook friendship networks in order to figure out which voters to target and how. Now, obviously, all of those tools have gotten much, much more sophisticated. But, from a sort of policy standpoint, it’s not so easy to draw a line between persuasion, which is politics, and manipulation.

Part of this is going to depend on our own resilience. Resilience has always been the best response to propaganda. There is no way in free societies for us to control what messages our citizens are going to hear from different sources, but they can have a stronger base of civic education and values, in order to kind of withstand that.

Manoush Zomorodi: Ben Wizner makes it pretty clear he would love to see social media platforms held to account for their role in spreading misleading information and prioritizing more clicks over better content. It may be that companies, like Google’s YouTube, are paying attention. In June, the video platform said they were expanding their effort to slow the spread of, as they call it, “borderline content and harmful misinformation.” YouTube’s algorithm will be adjusted to limit promotion of this kind of content. It will also recommend videos from more trustworthy sources like top news channels on the platform.

Let’s pick up on that last argument that Ben was making. He believes an important part of this fight to protect democracy is making sure that citizens are well informed and share common values. In fact, civic institutions are ramping up efforts to teach people how to tell good information from misinformation from disinformation. An example of that is the Stop Fake Organization in Ukraine.

Ruslan Deynychenko: My name is Ruslan Deynychenko, and I am one of the co-founders of the Stop Fake fact-checking project from Ukraine, from Kiev.

Yevhen Fedchenko: My name is Yevhen Fedchenko, and I am chief editor and one of the co-founders of StopFake.org.

Manoush Zomorodi: Ukraine, of course, sits right next door to Russia. The two countries are not on good terms. Russia annexed part of the country in 2014. Ruslan and Yevhen say Ukraine faces constant Russian disinformation campaigns.

Ruslan Deynychenko: Of course, when we have elections, the quality and quantity of fakes are absolutely amazing.

Manoush Zomorodi: Ukraine’s presidential election took place in March and April. A 41-year-old comedian named Volodymyr Zelensky challenged incumbent President Petro Poroshenko, but that’s not the only challenge Poroshenko faced.

Ruslan Deynychenko: It was like the whole spectrum of lies, basically starting with calling Mr. Poroshenko an alcoholic, a liar. They repeated constantly that he killed his brother, when his brother died in a car accident many, many years ago.

Yevhen Fedchenko: We’ve seen examples that basically you can fake anything, starting with textual information, videos, photos, fake experts, fake think tanks, forged and fake governmental documents, books. So, we need to explain all those things to people and provide them with as many examples as possible. And to show them that they really should be much more critical in terms of what they consume and from what sources.

Manoush Zomorodi: Stop Fake’s efforts don’t stop there.

Yevhen Fedchenko: What we can do is to teach people to become fact checkers themselves. We have a special section where we basically say any one of you can become a fact checker. You can use the same instruments we are using. You can use the same approaches.

Ruslan Deynychenko: It’s very easy to produce a fake because you need just five minutes to create a fake story; your fantasy is the only limit. And it sometimes takes us days or weeks to find the truth and to find the evidence that it’s not true.

Manoush Zomorodi: Find out more about Stop Fake at StopFake.org, or look for a link in this episode’s show notes. If you’re wondering what happened to President Poroshenko, he lost the election by a landslide. 73% of votes cast went to Volodymyr Zelensky, the comedian. The extent of his political experience before was, well, playing the role of president in a TV show. It’s like if Julia Louis-Dreyfus left HBO’s Veep to run against Donald Trump and won.

Stop Fake uses the Internet to reach fellow citizens in Ukraine directly. In effect, the website becomes a sort of digital public space, a forum where people can trust each other’s motives and work together to safeguard democracy. It’s important to be able to find safe spaces like this online. Yet, however useful a site like Stop Fake may be, maintaining public spaces offline can help inoculate citizens against deceptive advertising and false information. It’s something Eric Klinenberg explains in his book-

Eric Klinenberg: Called Palaces for the People: How Social Infrastructure Can Help Fight Inequality, Polarization, and the Decline of Civic Life.

Manoush Zomorodi: Eric is also a sociology professor at New York University.

Eric Klinenberg: Social infrastructure refers to the physical places and the organizations that shape our interactions. But social infrastructure shapes the quality of our democratic culture because it can determine whether we have meaningful interactions with other people in general, it can determine whether we are open to or closed off from neighbors and strangers, and it can shape our capacity to start dialogues with people who don’t agree with us on everything.

Manoush Zomorodi: Eric wonders if the way we engage and disengage from online discussions makes us vulnerable to influence.

Eric Klinenberg: We tend to interact with people who are very far away from us and who quickly come to represent some archetype. And so, it’s easy for us to get hostile, and aggressive, and make threats, or be dismissive without really engaging the idea or the person. So, in some ways we are susceptible to targeted ads that are hate mongering and stigmatizing of other people. At the same time, what really influences our behavior and changes our minds are the conversations that we have with people we know and trust.

When we’re with other people in real life, we tend to take in the reality of their humanity in a different way. I’m not saying in any way that we have an ideal public sphere and that we form our opinions based solely on reason and evidence. But I do think the kind of social infrastructure we have shapes the conversations that we have access to.

For instance, my son plays on a soccer team that draws kids from all over the region. And, on the weekends, I spend a lot of my time on or around soccer fields with a bunch of other parents who have nothing to do with one another except for that their kids are on the same team and love the same sport. And, on many occasions, we’ve been able to talk things through with each other. I don’t know if people vote differently because of those conversations, but it feels much more like meaningful political dialogue, and it happens because we’re around a soccer field and not an iPhone.

My point here is not that we just need to build a library or a park. That would be naïve. But it is my view that we need to start establishing some sense of a common conversation with shared standards of things like evidence and logic. And I don’t see how we start to do that if we don’t have some shared physical spaces where we can spend time together, hopefully in a sustained way, and speak and listen to one another.

Manoush Zomorodi: In and of itself, online data is benign until you use it. Targeted political ads can be good or they can be bad, just depends on who’s doing the targeting and if we can even tell where it’s coming from. In that confusion, misinformation and disinformation can spread.

And it’s worth remembering, we still don’t know if ad targeting or false information actually works to get people to vote for specific candidates. What we have seen, certainly here in the U.S., is that it can sow doubt among citizens, make them question the basic pillars of democracy, like fairness and equality. The goal of those looking to cause confusion might be less about turning elections one way or another and more about creating chaos and division.

In the meantime, citizens need to remind each other, the best way to defend democracy is to stay informed. Question the source of your information. Engage your friends and your family in conversation. Politics should not be a taboo topic at the dinner table. And, of course, when it’s you and your country’s turn, go vote and let everyone online know that you did. It’s the easiest and the best way to celebrate and defend democracy.

I’m Manoush Zomorodi, and I’ll speak to you again in a couple of weeks. This is IRL, an original podcast from Firefox.