Free Speech, Limited?

Season 1: Episode 7

Recent events like the Charlottesville, VA rally have revealed the Internet’s role in helping spread IRL threats and violence. Leaders in the tech world have taken varying positions on protecting free speech while also reducing hate speech online. Should tech companies regulate who says what on the Internet?

Published: September 18, 2017

Show Notes

Freedom of speech is important, online and off. And, it’s also important that free speech not infringe on the freedom of others. Tell us: what can regular internet citizens do to address this issue? How can we all accelerate the pace of change for a more free, civil and healthy Internet?

Transcript

Veronica Belmont: Just a heads-up, today’s episode contains a word or two some people may find offensive. This is IRL, Online Life is Real Life, an original podcast from Mozilla. I’m Veronica Belmont.

News clip: A horrifying scene in Charlottesville as this car plowed into a crowd of people. The driver then backing up and, witnesses say, “dragging at least one person.” Unite the Right organizers …

Veronica Belmont: In August, a protest turned deadly in the American town of Charlottesville, Virginia. After organizing online, a large group of torch-wielding white supremacists descended on the town to protest the removal of a Confederate statue. They had assembled lawfully. They had every right to be there and to express their opinion, and so did a large group of counter-protesters. Things got ugly, and in the end a woman named Heather Heyer was dead, hit by a car driven by one of the racist protesters. Laws that govern free expression differ depending on what country you live in. In the United States, freedom of speech is considered a fundamental virtue of democracy. There’s a reluctance to restrict a person’s right to say what’s on their mind, and yet there’s also a belief that free speech should not infringe on the freedom of others. After Charlottesville, this tension manifested itself online, with the neo-Nazi website the Daily Stormer in particular. The site was accused of further inciting or encouraging violence. It was all too much for a group of tech companies, including GoDaddy, Google, and Cloudflare. They pulled their services, and, just like that, the Daily Stormer was offline, exiled to the dark web. Some will say this was an act of common decency, but it also brings up an important question: when large companies own the publication platforms and the plumbing that make the internet accessible to all of us, should they have the power to control our right to speak? That’s what today’s episode is exploring. I have three guests joining me for this conversation. There’s Jillian York. She’s with the Electronic Frontier Foundation and runs the website onlinecensorship.org. She’s in Berlin, Germany. Hi, Jillian.

Jillian York: Hi!

Veronica Belmont: Also, here in the studio with me in San Francisco is Brandi Collins. She’s a Senior Campaign Director with Color of Change, an online racial justice organization. Hi, Brandi.

Brandi Collins: Hi, there!

Veronica Belmont: And Anil Dash, who’s joining us from New York. He’s a tech entrepreneur and CEO of Fog Creek Software. Hi Anil!

Anil Dash: Hello, glad to be here.

Veronica Belmont: Okay, to start us off, I’d like you to listen to someone who watched the Charlottesville news footage from a perspective very different from ours. His name is Brad Galloway. When Brad was younger, he was part of a skinhead street gang. Here’s what he said he noticed most about the alt-right protesters.

Brad Galloway: If you look at Charlottesville, I was watching the faces of the people that were in person offline at the rally. You can see the ones that are really, really involved, and there’s a lot of confused faces I noticed, too, there. I would say those are maybe some guys that were online, noticing the advertisements for this rally, and they agreed online to say, “Yeah, yeah! That’s a great idea! We should do that.” Then, all of a sudden, they’re actually there in person going, “Whoa! This is different than online.” Right?

Veronica Belmont: Brad is saying something interesting, though, about how online and offline hate intersect. Anil, how do you react to what Brad’s saying here, about people having to confront, offline, the reality of what they’re saying online?

Anil Dash: I think this is something that a lot of our digital platforms have built into them: they distance us from the humanity of the people we’re talking to, or talking about. I think there is a point, hopefully not usually as dramatic as being a white supremacist, where a lot of people feel, “Oh, this thing I say online isn’t real; it doesn’t count.” It’s sort of a game. I do think there are a lot of circumstances where, as soon as things get personal, immediate, human, face-to-face, decent people, who may have only made a mistake instead of committing to being their worst selves, may confront that and have a reckoning with it. I think we’ve all had minor versions of that. I mean, I’ve had, whatever, the usual back-and-forth online debate with somebody on social media, and I’d say, “Why don’t you give me a call?” And I’d put my phone number out there. All of a sudden, you get a very, very different tone. I think we’re seeing a version of that here.

Veronica Belmont: Brandi, what questions were you asking after Charlottesville?

Brandi Collins: I’m not sure I was asking a lot of questions after Charlottesville. I think I was demanding a lot of answers, because this isn’t something that was unpredictable, right? Things that came out afterward showed that there were actually online discussions being had anticipating violence in Charlottesville. So this is not just out of the blue, where people were like, “Hey, you know what? I think I’ll take in a nice Confederate flag rally and not expect anything to go down.” This is part of a continuum of violence that we’ve seen throughout our history, one that has particularly been ramped up in recent months and years.

Veronica Belmont: Jillian, do you think that there’s any chance that some of these protesters realized that their online hate speech had consequences beyond what they’d considered?

Jillian York: Yeah, you know, I understand what Anil was saying. I really understand that idea of being a different person online than you are offline, but a lot of people do come out with stuff. I do it all the time; I say stuff on Twitter that I probably wouldn’t have the guts to say in person. But of course, I think we have to separate that problem from the actual issue of ideology here, which is really the root of this.

Veronica Belmont: I think that’s also kind of at the root of this whole conversation: the difference between hate speech and free speech. And Anil, I know that’s not super well defined. Where’s your line?

Anil Dash: I think it depends on context, right? There are things that are legal to say. But, specifically, almost all of this speech happens on commercial platforms, right? They’ve already gone well past simply saying, “Everything that’s legal is allowed on our platforms.” There are many, many things that are legal that they don’t allow. They’ve built technologies to constrain speech for business reasons. Now, the question is what they consider worthy of their protection as a business versus not. That’s a declaration of the values of those companies. It’s as simple as that! And to the point about free speech, the organizers of a lot of these white nationalist and white supremacist rallies very deliberately and explicitly talk about how they want to use the rhetoric of free speech as a tactic for getting their message out there. That’s something we should be mindful of: this is what they see as an exploitable value of ours that they can use to grow their base, grow the number of people sending them money, grow their events. What I think is we haven’t really, really reckoned with the fact that it’s used as a tool for advancing causes that all of us would agree are destructive.

Veronica Belmont: That actually ties in really well with a comment from Matthew Prince, who’s the CEO of Cloudflare. Cloudflare is a service that protects websites from hackers. Vice actually interviewed him about his decision to kick the Daily Stormer off of their service. He admits that his decision was arbitrary, but, well, let’s take a listen to what he said to Vice’s Managing Editor, Michael Moynihan.

Matthew Prince: I am not shedding a tear that that content isn’t online anymore, but one of my fellow employees came up to me the day that we took it offline and said, “Hey, is this the day the internet dies?”

Michael Moynihan: There was no due process. You woke up one morning and you said, “This is bad, and I’m going to do something about it.”

Matthew Prince: I am deeply concerned that I had the authority and the power to wake up one morning and say, “You know what? I’m done. These guys! I’m sick of this! Fuck ‘em!” They’re off the internet.

Michael Moynihan: Do you have too much power?

Matthew Prince: What we need to have is a conversation about where the right place is for tech companies to be regulating the internet.

Veronica Belmont: Brandi, what do you have to say about that?

Brandi Collins: I think for me there’s always this huge question about the role of Silicon Valley in the climate that we’re seeing right now. I think Anil hit the nail on the head when he said that when it comes to white nationalism or domestic terrorism, there seems to be a lot more allowed than in other spaces. For a long time, Silicon Valley has been super hesitant to actually do anything. We had been having hard conversations with these platforms for months before Charlottesville around some of the things we were seeing online and the internet world that’s being created. In theory, the internet is supposed to be this really global place where anything is possible. Yet what we see, in the way that the bits and bytes and algorithms are developed, is that it’s actually creating more siloed, more radicalized spaces that are then bleeding offline and having these deep consequences. I think what you hear him going through is Silicon Valley now waking up to, “Oh! We do have a role to play here. But what is that role?” That, I think, is something they still seem to be struggling with.

Veronica Belmont: Yeah, it seems like the tech industry, which I am a part of, has created these platforms, yet we don’t fully understand the repercussions of the rules we make around them. Jillian, it sounds like he admits here that his decision creates a problem, but he still stands by what he did. How much of a problem is it really?

Jillian York: I really understand what he’s saying, because I also didn’t have any sympathy. I mean, I’m in Germany; the word that best describes what I experienced that day was Schadenfreude. I wasn’t upset at all to see them kicked off these platforms, and yet I’m a free speech advocate; this is my job. I think what’s really interesting is that we’re seeing this conversation come up around the issue because these white supremacists got censored. Last year Twitter kicked off more than 200,000 accounts that were allegedly supporting terrorism, with no transparency around what that was, who those people were, or what exactly they were saying. And I’ve seen people have their accounts removed for, and I’m doing air quotes right now, “supporting terrorism,” when what they were actually doing was mocking terrorism or engaging in counter-speech. And so, really, what Matthew raises there, I agree with. The question is, okay, if we are to agree that companies have to do this (and you know it is their bottom line; they’re corporations, and they have the right to regulate speech as they see fit as it stands right now), then we need to be putting processes in place so that users have the right to appeal, and so that companies are transparent about who exactly they’re taking down.

Anil Dash: I want to be a little skeptical, too, about what Matthew said, because I run a software company, we host content on our platforms, and I’ve run social media platforms and used Cloudflare, their product. There are a couple of things that come out of this. One is, all of a sudden he’s saying, “We have so much power. How do we deal with this power? What’s our transparent process? What do we do about this?” This has been true from day one of that company’s existence, right? And I say this as a customer and somebody who likes their services. They have always had that power, and the question is why they didn’t reckon with it before. But I think in Silicon Valley they’ve been fighting really, really hard, for decades now, to take powers that were formerly delegated to regulation, to policy, to governance, and put them in their court, and say, “We control this. We control this at the software layer, at the technology layer, and we don’t want to have regulators or policymakers be in charge of this, even though they have the mature process that’s been around for, in some cases, centuries.” And as soon as it gets hard and tricky and they’re being held accountable, they’re like, “Whoa, whoa, whoa, let’s put this back in the courts and back with the lawmakers; all of a sudden we don’t want to be in charge of this stuff.” And then, as soon as it gets challenging to moderate, regulate, oversee, engage, be literate enough in these issues, they’re like, “Well, we can’t afford to do that; we want the public to pay for doing that.” It’s like, well, you made all the money off of us; some of that should go toward dealing with the repercussions.

Veronica Belmont: Some argue that decisions like this cross the line; that it’s not a company’s business to decide what is and isn’t allowed to be said or done on their platforms or by using their services. I want you to hear this clip from Nick Lim. He’s the CEO of BitMitigate, and they’re actually a lot like Cloudflare. We talked to Nick about why he made the decision to offer their services to the Daily Stormer after all of this went down.

Nick Lim: First, I’m Asian, so I’m a bit of a target of their hate speech, so I think it’s clear to see that I definitely don’t agree with it. I think it’s a stupid idea. But frankly, it’s not my place to decide for all people, and I just provide technological infrastructure, to provide free speech for all people, as long as it’s legal. I think that if I hadn’t stepped up, though, it would have been real concerning, because it would have been a significant point in history where the corporations really dictated the course of content and communications on the internet.

Veronica Belmont: So Jillian, what do you have to say to that? Does this count under free speech in your mind?

Jillian York: You know, I’m torn, because at the same time I do agree with his point that I don’t want these unelected, undemocratic companies making these decisions. When we’re talking about Facebook and Twitter, I mean, look at these companies. They have very little diversity within them. We’re talking about companies that are less than 25% women, and when you break that down across different races and whatnot, there’s really not a lot of diversity in Silicon Valley in the first place. And you’ve got these people making decisions, not just for Americans but for the rest of the world as well, and then exporting those rules. Just to throw a non-hate-speech example in there: Facebook for a long time has said, “Men are okay topless; women are not.” That’s a decision that to me is very American, and very male, frankly. It’s not a decision that I’m comfortable with, and not a decision that most Europeans I talk to about this are comfortable with. And so they’re exporting their values, and I think that kind of shows through in this. So no, I don’t want companies to be making these decisions. I don’t want private actors who are unaccountable to me making those kinds of decisions about speech.

Veronica Belmont: And Brandi, is he right? How hands off should technology companies be on these issues of free speech?

Brandi Collins: I think it’s interesting that he says that we’re at this critical juncture, because I think we’ve flown past that. Corporations are already making decisions around what type of content gets put in front of us. As he was talking, I was thinking about a quick example. I have three computers: one for work, one for personal use, and then one that’s sort of for work, but that I use to look at hate groups, basically. And so I ran this experiment for myself: what if the machine thinks I’m a white male with maybe white nationalist leanings? What kind of internet experience do I have? And I’ve got to tell you, it’s a radically different experience than the one I have on my regular computer. The type of content that’s put in front of me, the sort of ads that I get for guns, the different news stories that look to reinforce this ideology, it’s all set in front of me. It’s created this world where there’s not even an opportunity to see the other side of that.

Anil Dash: Yeah. And I do want to talk, too, about the way they sort of talk out of both sides of their mouth. I think Nick Lim is a perfect example of this, where his first reaction to the fact that he was hosting the Daily Stormer, and these are quotes, was that he found it “really entertaining” and that it “got publicity” for his company. The thing to keep in mind is he offered to host the Daily Stormer for these white supremacists after they were kicked off of other services. So their credential was that they were too extreme and had incited too much violence for other services in Silicon Valley to find them acceptable, and that was why he took them in. We talk around it, because we can, to the earlier point, fall back on the rhetoric of free speech, and fall back on, “Well, I want to make sure that I’m not the one that’s the arbiter.” Well, the arbiter is the person who takes them in and gives them shelter, right? When I used to host other kinds of content, I came close to that myself. Like, gosh, who am I to say what’s good, what’s bad, what’s right, what’s wrong? And the truth is, we all are. We all set our boundaries; we all are accountable and responsible for what we do. And certainly to seek out the worst of the worst and say, “I want to give you sanctuary,” is to subsidize them. That has a higher bar. That has an element of accountability and culpability that we treat as equivalent to “I didn’t know this was here, and nobody had flagged it and nobody reported it,” which is a very different circumstance.

Veronica Belmont: This is IRL, Online Life is Real Life, an original podcast from Mozilla. I’m Veronica Belmont. I want to jump back a little bit into this idea of corporate spaces on the web as platforms for our speech. We talked to internet law professor and writer Jonathan Zittrain for his perspective on this, and he thinks it’s more an issue of whether or not what you have to say will actually be heard by anyone.

Jonathan Zittrain: It’s easier to get a message out today than it’s ever been in the course of humanity. Think Facebook, or Twitter, or Medium.com, but often these platforms are also in the business of steering people who want to get the content, to point them in one direction or another. You’re free to post a note on Facebook, but it might well be a tree that falls in a forest, that no one hears. If Facebook should choose to promote that in other people’s feeds, or Twitter chooses to make a tweet something that appears high up in people’s Twitter stream, that can direct a ton of attention, and that then, without any influence by a government, suggests the kind of power that just a few platform hosts might have in steering people to some content over others. And that’s harder to describe as outright censorship or even blocking of speech, not only because there’s not maybe a government involved in this instance, but because nothing’s being cut off, it’s just what’s being emphasized that of course by its nature will de-emphasize everything else. And these are the kinds of battles that have been fought before, but with smaller stakes and over a more distributed media environment. And that’s something we have yet to really contend with.

Veronica Belmont: I think this is especially fascinating given that, Brandi, you discussed how you went online as a completely different person and experienced almost a completely different internet, just based on the information you were putting out there into the world. So how much do you think this could become a bigger problem: that as social platforms keep maturing and keep deciding what content deserves more attention, the free expression of smaller groups and marginalized individuals gets buried under the weight of everything else?

Brandi Collins: I absolutely think that we’re going in this direction. And added to that, with the ability to pay more to get more exposure, more and more you see these platforms become less what they were intended to be and more a place for the elite.

Jillian York: The queer community has been complaining about this for a long time. Facebook doesn’t necessarily understand the difference between uses of reclaimed words and uses of slurs. They have responded to this, but there was sort of a systematic takedown of posts that contained the word “dyke.” Several of the people I spoke to who experienced this said that the post they had put up that was taken down had maybe 100 people who had access to it. They were friends-only. They didn’t suspect that anyone within their friend group had reported it, and so their suspicion, and again, this is not confirmed, was that it was in some way algorithmic.

Veronica Belmont: How fair is it to even suggest that tech companies have to bear this responsibility of protecting free speech at all costs? Aren’t these companies allowed to make money and grow based on their own missions and values? I mean, there’s this bottom line that they have to look out for. Anil, what do you think?

Anil Dash: I think if we want to foist responsibility for these things onto anyone, it should probably be organizations that are among the wealthiest and most powerful institutions that have ever existed in history. Fortunately, that describes Facebook and Google and Apple and all these other companies, so it isn’t unfair to ask these incredibly wealthy companies to do a little bit more. The problem is, they’ve never even been asked. I think there’s this interesting thing where the same company, the same people who say, “We can put a self-driving car on the moon and we can build any technology you can imagine,” as soon as you talk to them about, “Well, can you make a community where people aren’t openly hostile towards one another and the voices of the marginalized aren’t squashed down?” And they’re like, “Wow, that’s science fiction. That seems too hard for me.”

Veronica Belmont: It really does feel like they’re throwing their hands up in the air and going, “No, this is beyond us. This is impossible.”

Anil Dash: “Who can conceive of such a thing? It can’t be done.”

Veronica Belmont: “Who could even think of it?”

Jillian York: “We’re just neutral technology companies. We can’t do anything about it.”

Anil Dash: That’s right.

Brandi Collins: Yeah, we at Color of Change, with our partners at the Center for Media Justice and others, have been in conversations with Facebook for well over two years around bullying in their community. I think our first entry point was some smaller groups we were working with: there were hate groups on Facebook posting people’s information, where they work and where they live, and people were showing up with guns. Again, this stuff has been happening for a while, and the whole time there’s been resistance around that, and the onus has always been on us to figure it out, instead of on these billion-dollar companies to get it right.

Veronica Belmont: This is the thing that baffles me the most about the free speech argument, and Jillian, maybe you can help me with this, because I’m a podcaster; I don’t know. But tell me where I’m right and where I’m wrong here: free speech kind of ends when you incite violence against a person or a group, right?

Jillian York: I do think that it’s really important that we draw the line between what is incitement and what is hate speech, and I think that hate speech is a much trickier subject. What is just hateful speech, speech that is nasty but doesn’t necessarily have those kinds of consequences? Speech that hurts feelings, sure, but doesn’t have the same kind of violent consequences, and then speech that is actually dangerous, speech that can call for genocide, speech that can call for violence against individuals. But unfortunately, what’s happened, in Silicon Valley, at least, is that they’ve kind of put all of this stuff under the bucket of hate speech, and then they’re trying to grapple with all of these different things. They’re trying to enforce rules that they don’t even have to have in the first place. But then when it comes to the really, really serious stuff that has consequences, that’s when they seem to throw their hands up the most, and I’m baffled by it, honestly.

Anil Dash: I definitely had those points where we were, like Facebook, “Oh, we have to take down these people’s profile pictures because they’re showing breastfeeding, and that’s an issue.” And then on the same platform, we had people who were self-avowed Nazis and really toeing the line about what they were doing to threaten people, and we’ve got, “Gosh, well, that’s just free speech.” I mean, I was not senior enough to actually have authority over it, but I was amenable to those arguments. After a while, it started to feel like, “Am I nuts here? This seems totally backwards. I don’t care about somebody showing a picture of breastfeeding, and I care a lot about somebody threatening someone’s life over their identity.” What I realized later was that the primary consideration was really stuff like advertising. Most of the large networks are ad-supported, and the advertisers have these rules around what they’ll show. The thing that we often forget is, a lot of the most hateful, hurtful groups, the most threatening groups online, are very, very sophisticated. They know how the networks work. They understand the boundaries of things like advertisers’ [inaudible 00:24:38], so they’ll flag their stuff as, “Don’t put ads on this,” and, “This is adult content,” or, “It’s private,” or whatever. And then as far as the hosting companies, the platforms, are concerned, they’re like, “Oh, okay. Well, then we don’t need to censor that. We don’t need to worry about that, because it’s not gonna affect our ad dollars.” Meanwhile, the other people, who are sharing pictures of their kids or something, and they happen to be breastfeeding in them, they want to be out there and they want to be sharing their content with their friends. They want to be discoverable to the world, and all of a sudden they’re the ones where the advertisers are freaked out, or ostensibly would be, because the algorithm is the thing detecting, “Is this the sort of thing an advertiser might take offense to?”

Veronica Belmont: #freethenips, Anil. Gotta free the nips. I’m Veronica Belmont, and this is IRL, a podcast from Mozilla. I’m changing pace a little bit here, to something we touched on earlier. It’s a question about how the technologies we’ve created can accidentally block free speech without even meaning to, and I think Jillian brought it up especially with the LGBTQ community. One example, of course, is when YouTube introduced parental controls to its service a few years ago. They meant well, as so many companies do, but in the process a bunch of content ended up being accidentally restricted. Tegan Quin from the rock group Tegan and Sara found that some of their music videos had actually been completely blocked, and we asked her about it.

Tegan Quin: The moment that I read online that some of our videos were being restricted under the new parental controls on YouTube, I was horrified. As I started searching through YouTube trying to find our music videos, I realized that a lot of our videos had been blocked, specifically a song called Alligator and a song called I Was a Fool. Neither has any content that could’ve possibly been seen as offensive. One is us dancing around in snowsuits with a bunch of other people in snowsuits. It wasn’t suggestive. It wasn’t queer. I mean, the only thing queer in the videos was us. I also at that time spoke to some other LGBTQ artists whose content was being blocked. What most intrigued me was that their record labels or managers were telling them not to call out YouTube, because they were afraid that YouTube would, I don’t know, create some sort of list of people who had gone after them or whatever. I was like, “That’s ridiculous.”

Veronica Belmont: Ultimately, YouTube realized what went wrong and took steps to fix the problem. But Jillian, here you have band managers and record labels allegedly telling artists to self-censor so they don’t stir the pot and end up on some kind of blacklist. What does that say to you about the challenges that we all still face in understanding how free speech does and doesn’t work online?

Jillian York: Yeah, no, that’s such a tough question. I’m still picking myself up off the floor because I love Tegan and Sara so much.

Veronica Belmont: Band girl squee. Yeah, totally.

Anil Dash: Who doesn’t? You would have to be pretty terrible not to.

Veronica Belmont: I’m swooning over here.

Jillian York: No, and I think this has been a question for a long time. Just to bring in my international perspective on this, there’s one thing a lot of these companies like to use as an excuse … and I know Anil’s been talking about the business side, and Brandi brought that up as well. But if you look at Facebook’s community guidelines around sexuality, and I have these almost memorized, which is why I’m focusing on that and not YouTube, they’re basically saying, “We don’t allow nudity, because then Saudi Arabia’s going to block us.” I know that maybe that is hyperbole, but I’ve heard from insiders that they have in fact gotten those kinds of threats before, if they don’t do this or that. I think that it says a lot, and it concerns me that we’re ending up with this really flattened version of what free speech is.

Veronica Belmont: Brandi, self-censorship like this really does predate the internet as well, but do you wonder if the way the web works today makes this a bigger or different problem than previously?

Brandi Collins: Yeah. As I was listening to that, I was reminded, “Oh, this is why I don’t post prolifically online as much, because there doesn’t seem to be room for nuance online.” There’s all of this space around how we have this conversation about monitoring free speech online. I always feel weird about having a conversation about free speech as a woman and a person of color, because I feel like free speech in the US is only free to those who can afford to pay the cost.

Veronica Belmont: Anil, what do you think of that, of what Brandi and Jillian are saying?

Anil Dash: It’s very telling that there’s this huge chilling effect of people wondering, “What are these networks, what are these giant social media companies and tech companies gonna allow me to publish?” Part of it is, the content creators, whether they’re making videos or writing or whatever else they’re doing, feel dependent. If you don’t have a presence on Facebook, you don’t have a presence on YouTube, can you be out there in the world? Are you even gonna exist? Are you gonna risk getting silently squelched by these platforms? You can’t afford to take that risk, and certainly I don’t want Saudi Arabia’s standards to dictate the world’s standards for how we communicate with each other.

Veronica Belmont: That’s a really good point, yeah.

Anil Dash: I wish that Saudi Arabia would at least bar white supremacists, because then the social networks here in America would presumably have to follow. But I think that there could be some sort of-

Veronica Belmont: Possible solution.

Anil Dash: Yeah, exactly. My other solution was that the major record labels could record each of the common forms of hate speech as pop singles and copyright them, and then whenever anybody uses those slurs, they would have to take it down because it’d be violating the record labels’ copyright. But I don’t know if that would work.

Veronica Belmont: This is outside the box thinking. I like it. We’re coming up with ideas.

Anil Dash: See? See, I’m about solving problems.

Veronica Belmont: We’re just spitballing this.

Anil Dash: Exactly.

Veronica Belmont: There’s this disconnect we can see in how we understand free speech online versus free speech offline. Where do you want this conversation to go from here, for all of us? Jillian, what do you think?

Jillian York: On a more specific prescription, I want to see these companies start to put in place due process. If you are a user who is wrongly taken down or a marginalized group that’s affected by bad algorithms and things like that, there’s no way that you can appeal that, and that’s something that I think that companies could’ve done yesterday and just have no will to do.

Veronica Belmont: Anil?

Anil Dash: I want to see direct action from ordinary citizens. It takes a long time to pull these things off. I look at analogies like the slow food movement or the organic food movement. At the time, they were sort of derided as wild-eyed extremists, and “nobody ever cares about this stuff.” Now you walk into Whole Foods and it tells you the farm your apple was raised on and how far it flew to get to your store. Change can happen, and sometimes it takes years or decades, but there is a very, very large audience of people starting to think of information and our digital diets in a similar way: as something we can take control of, have agency over, and expect to be sourced ethically. Everybody who cares about Fair Trade coffee beans should certainly care about the working conditions of the people who build the technologies they use, and the effects that those technologies have on their communities.

Veronica Belmont: Brandi, what do you think?

Brandi Collins: I think all of that is right. I really want to see corporations feel urgency to get this right before people die. It’s really unfortunate that it takes someone getting run over by a car before these corporations act prudently. I really love these mass mobilization efforts, and I absolutely agree with Anil: I would love to see what that looks like, in terms of adding that push and pressure on corporations that forces them to reevaluate what they’re willing and able to do for the sake of their bottom line.

Veronica Belmont: Figuring out whether we should or shouldn’t place limits on free expression online isn’t easy. Even with today’s episode, I feel like we’ve really only scratched the surface. There is a lot more to this, and more viewpoints to consider. Take the perspective of Nick Lim, the CEO of BitMitigate we heard from earlier. He’s not alone in believing that democracy thrives when everyone is free to share ideas and opinions, and that free speech must be protected at all costs.

Nick Lim: Well, the fact of the matter is, you can’t stop an idea. You can only change its form. For example, the Daily Stormer is still online. They’ve been pushed off a lot of domain registrars, but their content is still online on the dark web, and it actually is now unregulatable and unstoppable. So it’s only gotten stronger in that regard. I think a lot of people’s efforts, while well intentioned, have been extremely counterproductive, and now the platform is substantially more advanced.

Veronica Belmont: But there’s no absolute answer. I’m not sure that there ever can be. At least we’re able to have these conversations and try to strike a balance. There are people in countries around the world who have never had the freedom to speak their minds, and I want you to hear one last thought from internet pioneer and Mozilla chair Mitchell Baker. She has a perspective on how this conversation fits into the evolution of the web as a whole.

Mitchell Baker: I think we’re just incredibly early in understanding what networking humanity means. I guess in some ways, it’s not surprising that these questions of social norms and what’s decent and what’s civil behavior, and what does a society expect, and how does it both protect individuals expressing themselves and protect individuals from violence, are coming up all over again because we have these new tools and technologies of such powerful nature. I don’t, for any minute, think that the label “free speech” means that a community interested in civility and health and non-violence is required to do nothing. That person’s right to free speech does not mean that a community can’t build norms or can’t associate or can’t decide, “That is unacceptable behavior. I don’t want to have anything to do with it,” and figure out how to actually move forward.

Veronica Belmont: So what do you think? Freedom of speech is important, online and off. Should it have limits? Who draws those limits? Tell me what role you think us regular internet citizens should play to help create a more free, civil, and healthy internet. Head to the IRL podcast website and check out the show notes to find ways to reach out: irlpodcast.org. A big thank you to my guests today for having this conversation and giving us lots to think about. Brandi Collins is with Color of Change. Jillian York is with the Electronic Frontier Foundation and the website onlinecensorship.org. And Anil Dash is the CEO of Fog Creek Software. That’s a wrap on the first season of IRL. If you’ve just discovered us, there are six other action-packed episodes for you to indulge in, and their themes play well with some of what we talked about today. We sent delicious cakes to online trolls, and spoke with hackers about our insecure web. I had a private investigator dig up what he could about me online, and it was truly a serious case of TMI. We even destroyed a children’s toy to protect us from online spies, not at all because she was totally creepy. Lots to learn and lots to enjoy, so look us up online or in your podcast player and dive into the archive. Meanwhile, we’re going to take a bit of a break to clear our caches, update our passwords, and refresh our browsers before coming back for season two. Stay subscribed and stay tuned, and I’ll be back before you know it. IRL is an original podcast from Mozilla, the nonprofit behind the Firefox browser. I’m Veronica Belmont. Thanks for listening, and I’ll see you online until we catch up next season, IRL.