This week on Too Embarrassed to Ask, Recode’s Kara Swisher and The Verge’s Lauren Goode ask: Should internet services be allowed to kick hate groups off their systems? To help answer that, they invited Matthew Prince, the CEO of CloudFlare, who recently booted a white supremacist group from CloudFlare’s service, and Cindy Cohn, the executive director of the Electronic Frontier Foundation, who posted in opposition to CloudFlare’s actions.
You can read some of the highlights from their discussion here, or listen to it in the audio player above. Below, we’ve posted a lightly edited complete transcript of their conversation.
If you like this, be sure to subscribe to Too Embarrassed to Ask on Apple Podcasts, Spotify, Pocket Casts, Overcast or wherever you listen to podcasts.
Kara Swisher: Hi, I’m Kara Swisher, executive editor of Recode.
Lauren Goode: I’m Lauren Goode, senior tech editor at The Verge.
KS: You’re listening to Too Embarrassed to Ask, coming to you from the Vox Media podcast network. This is a show where we answer all of your embarrassing questions about consumer tech.
LG: It could be anything, like, “Will the tech industry ever be able to solve its diversity problems?”
KS: No.
LG: “How do I protect my accounts and my identity online?”
KS: You cannot.
LG: Kara’s password is 12345.
KS: It is not.
LG: Just telling everybody right now. “Who will the new CEO of Uber be?”
KS: Me.
LG: Kara, I feel like it’s you. Is it you?
KS: It’s me.
LG: Oh, they’re getting the CEO they deserve.
KS: Exactly. I’m going to take some names, knock some heads.
LG: That’s going to be fun, you and Frances Frei.
KS: Yeah, right.
LG: The pair of you, jeez. “Why does Kara hate the Google bike so much?”
KS: I just was at Google yesterday.
LG: We talked about the bikes in our bike-sharing podcast last week.
KS: They were there again and they’re still multicolored and irritating. They irritate me.
LG: Why are you so bothered by the Google bikes?
KS: I don’t know, but I had the same lizard feeling of wanting to destroy them. Anyway, so send us your questions, we do read them all. Find us on Twitter and tweet them to @Recode or to myself or Lauren, with the #TooEmbarrassed.
LG: We also have an email address, TooEmbarrassed@recode.net. Just a friendly reminder that embarrassed has two R’s and two S’s because we want to get your emails.
Today on Too Embarrassed to Ask we’re joined by two guests. First up is Matthew Prince, the CEO of CloudFlare, which is a content distribution network and we’ll talk about what that means. If you’ve been following the news recently, you might know that CloudFlare publicly cut off its support for the neo-Nazi website the Daily Stormer after what happened in Charlottesville. That was a more complicated and controversial decision than you might think. Matthew, welcome to Too Embarrassed to Ask.
Matthew Prince: I am happy to be here.
LG: We’re also joined by the executive director of the Electronic Frontier Foundation, Cindy Cohn. She recently co-authored a blog post for the EFF which criticized the moves being made by CloudFlare and other tech companies, saying that these are dangerous moves and that the precedents being set now can shift the justice of those removals. Cindy, welcome to the show and thank you for being here.
Cindy Cohn: Thank you.
KS: All right, so this is what we’re not ... We’re not trying to have too much of a debate here, but actually, there are different points of view, and in Silicon Valley it’s really been a big debate. Not just here with the firing of James Damore at Google, but all kinds of issues around free speech and the limits of it, and other things. Let’s first get to CloudFlare. Explain very briefly what you guys do, and then what prompted you to take this action.
MP: Sure. So, CloudFlare isn’t a household name.
KS: No.
MP: We run what is internet infrastructure at some level. We sit behind the scenes and make internet applications faster and we protect them from a wide range of cyber attacks. Our customers range from some of the biggest institutions in the world, down to small businesses, individual blogs, and we see about 10 percent of all internet requests flow through our network. While we have just shy of 10 million customers, we see about 2.8 billion people go through our network. You both have probably used our network hundreds of times in the last 24 hours. When we’re doing our job, we ... You don’t notice anything except for a slightly faster internet experience.
Most of the content that flows through our network is not anything controversial, but every once in a while people who are pretty vile and pretty repugnant sign up for us — we have a free version of the service. The Daily Stormer was a neo-Nazi website and it had signed up for us quite some time ago. We get 20,000 new customers signing up every day, so it wasn’t something that came on our radar until about three months ago, when we became aware that people were submitting abuse reports to us. The way that we would handle that is to pass them on to the host of the content, so that they could make a determination whether that had violated one of their principles or one of their rules.
KS: So you were taking a very hands-off approach. You would get these complaints and you would say, “We’re not addressing them, we’re giving them to the host site.”
MP: Yeah. So, what we had always thought was that true north for us was that, if you think about it from the perspective of law enforcement, the job of law enforcement should be no harder because CloudFlare exists, but it shouldn’t be easier, either. We shouldn’t be used as an additional point of control in the network. It’s important that ... We had said for quite some time that if we fire a customer, the content doesn’t disappear from the internet, it just becomes slower and a little bit more vulnerable to attacks.
What was happening in the case of the Daily Stormer, though, was that they were doing something that we had actually never contemplated, which was just actually really pretty evil. They were going after the individuals that had submitted those complaints and harassing them, stalking them, threatening them in various ways. That was the point at which they got on my radar screen.
KS: So, you’ve been contemplating ... I think you wrote in a blog post that got a lot of attention, besides calling ... You woke up in a bad mood and decided they shouldn’t be allowed on the internet and you cut them off.
MP: Yep.
KS: Then you said, “Because I think he’s an asshole.”
MP: Yeah.
KS: I think that’s the word you used, correct?
MP: Yeah. That was an email that I sent to our team, and it was really ... At some level it was a change in our policy. I think the sort of final straw for me was exactly a week ago: I woke up and on Twitter there were screenshots of people who are on this site, alleging that we supported them. They said, “The senior echelons of CloudFlare, they’re one of us.” Then it just became a major distraction to us.
While we wanted to have a conversation about where the right place was for the internet to be regulated or content to be controlled, that was just getting lost in the noise of, “Why do you support Nazis?” The reality was that we thought the content was absolutely repugnant, but we had traditionally thought that it wasn’t the right place for us as an infrastructure company to be making choices on what content was allowed and not allowed online. So, on Wednesday, we deviated from our policy, and I think that’s the exception that proves how important it is for us to have conversations [about this].
KS: I’m going to get to Cindy in a second about deviation, but what caused you to deviate suddenly?
MP: I woke up in a bad mood and I was sick of these guys.
LG: To be clear, too, we’re taping this on a Wednesday, but by the time you hear this podcast, for our listeners, it might be a little bit after the fact.
MP: Yep.
LG: But you’re saying this is exactly a week ago.
MP: That’s correct.
LG: From this point you woke up and ...
MP: And said, we need to have a conversation about where the right place for regulation to happen is. We can’t start to have that conversation until we deal with this issue, but one of my favorite sayings is from a former congressman from Oklahoma named JC Watts who said, “When you’re explaining, you’re losing.”
We found ourselves getting calls from reporters saying, “Why are you supporting Daily Stormer?” We would say, “Well, here’s our position. We think that it’s important for us to be neutral.” People would say one of two things: either, “How can you support these people? These are Nazis,” and they’d write a story about how evil we were to support Nazis; or they’d say, “Oh, that makes sense,” and then they wouldn’t write anything at all.
So, what we weren’t having was a public debate like we did internally, and we thought that that public debate was really important. I’m extremely happy that Cindy wrote what she wrote and that we’re here talking about this. I think it’s an important issue that we need to think about as technology companies, but also as internet consumers, content creators, regulators, law enforcement, where is the right place in the infrastructure stack for editorial decisions to be made?
KS: So, Cindy, you were like, “No, no, no, no.” A lot of them did it, not just CloudFlare, but GoDaddy ...
LG: Google refused Daily Stormer as a customer.
KS: And have now thrown them off.
LG: Facebook.
KS: Lots of people.
LG: Discord, GoFundMe, they’ve all taken steps to ban white nationalist content. Your response has been, I’m going to put words in your mouth, “This is a slippery slope.”
CC: I think it’s really dangerous. It’s not just a slippery slope that we’re starting, it’s a slippery slope we’ve been on for 17 years. The whole time I’ve been at EFF, I’ve been trying to help people who are trying to speak online and who face adversaries who don’t come to them but instead go upstream: who go to their domain name host and try to get the whole domain thrown off, or who go up to where CloudFlare sits, which is in some ways a little to the side but even higher in the infrastructure, and try to silence voices they don’t like. The vast majority of those cases are not Nazis.
LG: Give us an example.
CC: Well, we represented a group called the Yes Men. They’re a parody group and they did some pointed political criticism of De Beers diamonds. The Chamber of Commerce lodged a complaint against them, went upstream and tried to get the Yes Men thrown offline. We’ve worked with activists who are active in the oil and gas fights about where drilling should happen. Again, Shell Oil got mad at them, went upstream, got their website kicked off the internet for a while until we got it put back on again.
So, the moment where this is about Nazis comes, to me, very late in the conversation, and in some ways it’s not the best example, given where the vast majority of these take-down requests land. Again, I want to differentiate between deep-infrastructure companies and the frontline companies that really have a direct relationship with their users, like Facebook, Twitter and those guys. I think it’s very different once you start going up the chain, because what the upstream companies do is take down the whole website; they can’t just take down the one bad article. So, the whole Recode website comes down because you guys say something that pisses off some billionaire.
LG: Kara never does that.
CC: It never happens, right?
KS: Not yet.
CC: But that’s a very, very big tool and that’s why these companies, including Matthew’s, have a right to decide who they’re doing business with, but we urge them to be really, really cautious about this. We need companies to think about this, not as something they’re doing in the one time because of a very public, ugly, horrible speech, I have no love lost for the people at the Daily Stormer or any neo-Nazis. But, is this a tactic that we’re willing to say is a good tactic because it’s going after people we hate now? Because I can guarantee you that in the past, and in the future, it will be used against causes that you love.
KS: So, what is that like, defending Nazis? Because I think the ACLU also had pushback from some of its donors and stuff like that. What happens, is it just you say the word neo-Nazi and everyone’s like, “Okay, let’s go after that?” What’s it like defending them?
CC: Well, I don’t defend them.
KS: No, but what’s it like having to defend that? Or the principle?
CC: Well, I think that if we don’t believe in laws and processes, and we throw them away every time somebody gets angry at us, we will end up on the backside of this. As a Jew, we’ve had our voices silenced over and over again through time. The idea that there needs to be rule of law, and careful processes, and the ability to make corrections when you do the wrong thing, is vital to voices on all sides of this debate. That’s why the First Amendment works the way it does, and that’s why there are international rules about freedom of expression. This idea that freedom of expression is an American concept is not true; it’s in all of the international covenants, as well.
I have no love lost for these particular people. I don’t believe that anybody should listen to them, or that they deserve a platform, but I worry about endorsing a tactic, at a time when our emotions are high, that has been and will be used against us. I understand the people who can’t hear very well what we’re saying about the rule because Nazis make them so angry, and I have a lot of sympathy for those folks. But as somebody who cares about free speech online, and who sees that the vast majority of small, powerless people get to speak because the internet is an open platform, far more than in regulated industries and other sorts of situations, I want to protect that space. So be as angry as you want, but don’t embrace a rule that’s going to bite you ...
LG: Talk about this idea of free speech because ...
KS: Both of you.
LG: Yeah, for both of you. People use that phrase a lot when they’re talking about online content moderation, when in reality a tech company is not beholden to a citizen the same way that a state or a government is in terms of protecting the freedom of speech. I guess, Matthew, where do you draw the line? And Cindy, where do you see the responsibility of these tech companies being?
MP: I think that there’s a conversation we need to have about when technology companies get to such scale that they effectively become the public square. In this case, the reality of publishing something on the internet, if it’s at all controversial, requires you, at this point, to have a network at the scale of CloudFlare’s, or Google’s, or Facebook’s, or Microsoft’s, with the resources to withstand the attacks, and hacking, and everything that tries to take you down. If you don’t have that, it’s really difficult for you to be online at all.
What I worry about — which, I think, echoes what Cindy worries about — is that there’s almost a cabal of effectively 10 tech CEOs and tech executives who are making decisions on what can and cannot be online. Those decisions are somewhat arbitrary. In Google’s case, Google stopped the Daily Stormer from using their registrar service, but Google runs a lot of other services: They have an ISP, Google Fiber; they have a DNS service, Google DNS; they have the market-dominant browser, Chrome. They have search, obviously.
KS: Search, yeah.
MP: Google didn’t withdraw the Daily Stormer from any of those things. They could have ...
KS: Right. They could take it out of search. I was thinking ...
MP: They could take it out of search.
KS: They technically can.
MP: Yeah. They block things on Chrome all the time. Malware, for instance: You get warnings like, “This is a dangerous site, don’t go to it.” They could block this with filters. They could stop DNS requests. Yet, they chose not to do that. I’m not saying that decision was wrong. In fact, if anything, I think that the decision to pull them from the registrar service seems incredibly dangerous because, as Cindy said, that’s an all-or-nothing decision, a global decision that you’re either on or off. So, that’s a very dangerous place for content to be regulated.
But it does show that this is a nuanced conversation. I analogize to a pre-internet world: Imagine that Ma Bell is listening in on your phone calls and decides at some point you’re talking about something that doesn’t serve their political, moral or economic interests, and they unplug you from the phone network. That doesn’t feel right.
I think we need to have a conversation about, if we’re going to regulate content online, where is the right place to regulate it? I’m the son of a journalist. I believe deeply in free speech; that’s what we talked about around the dinner table, and I think it’s one of the things that makes this country so powerful, but it doesn’t have the same force around the rest of the world. What does, on the other hand, is the idea of due process: the idea that there is a set of rules that you should be able to know going in and that will be followed. I don’t think that the tech industry has a system of due process.
KS: No, it doesn’t. It’s totally arbitrary.
MP: We didn’t follow principles of due process in this case.
KS: No, you just decided.
MP: That’s right. That’s one of the things that, again, I think we need to question, should that be something that we follow? We need to have that public debate and that conversation.
LG: So, Cindy, nobody can regulate these companies.
KS: The reason they’d take you off search is because Larry Page thinks you’re an asshole. If Larry Page woke up and said, “I think they’re assholes, I think I’ll just remove them from search,” he can do that.
CC: He can, and I think that that’s one of the reasons why we do need to have the public conversation. I should give a small ... Just something for your listeners to know.
KS: Yeah.
CC: EFF has represented CloudFlare in the past. They fought a gag order in a National Security Letter case for many years, under seal and in secret (we couldn’t tell anybody), but it has recently been unsealed. I just wanted to ...
MP: And pretty amazing. We got that order when we were a 30-person company.
KS: What did they want?
MP: Records on customers. Making the decision that we were going to stand up and sue the federal government, to protect a customer you could have looked at and said, “This is not a very nice customer,” because the process was wrong ... We could not have done that without the EFF’s support.
CC: Yeah, so I just want to make sure that’s on the record. I actually think Matthew did the wrong thing here, but it sparked a really important conversation, and we do have a relationship ...
KS: What would you have had him do if he feels like this? Like, “Eh, I’m not going to help this guy.”
CC: Well, I think it’s not his decision. I really think the idea that the guy who rides BART next to me is going to be able to tell people all around the world who the good guys and the bad guys are in all of these situations is a fantasy. They’re not going to be able to get it right.
We have processes by which we put certain organizations onto a sanctions list, where companies cannot do business with them, you can’t do business with the North Korean government, you can’t do business with ISIS, there is a process by which you can put things beyond the pale. I worry that that process gets misused, too, but at least there’s a process there. It’s not, somebody had a stomach ache and decided to throw somebody off the internet.
The internet is too important to be subject to the whims of this small group of people. They’re going to get it wrong, they’re going to get played, and I don’t think that anybody should have that kind of power. That’s what Matthew said in his blog post.
KS: But they are not a public entity.
CC: They’re not.
KS: Let me think about ... This is kind of an ...
LG: Like a utility company or like a government.
KS: They’re beholden to nobody.
LG: Right. How do you think they should be regulated?
CC: Well, I don’t know that they ... I think ...
KS: And who can?
CC: Yeah. I think that, again, most every country around the world has certain rules about people who are beyond the pale.
LG: North Korea.
CC: North Korea and all those kinds of things. Those kinds of rules are very much present all around the world, and there is a process by which that happens. One of the things that’s going on here is that the U.S. government is not stepping up and taking the steps that are within its power to try to rein in the actual violence on the ground. A woman got killed, people got beat up, and we are not seeing the actual forces in our society that are supposed to respond to those kinds of things responding. So, people are turning to the tech companies because maybe they’ll do something.
LG: Right.
CC: I think that we’re also turning to speech about violence because we’re feeling helpless about the violence itself.
KS: Their power is amplified by social media. They do start to attract people that show up at these things; they organize online.
CC: But I think that if the people who were engaged in beating people up in Charlottesville were actually getting arrested, and we were actually doing, say, the things we do about gang violence to some of these organized, armed, violent people, people wouldn’t feel such a pull to deal with the speech part of it, right?
KS: Yeah.
CC: I think that this is a byproduct of the hate, not the hate itself. We’re solving for the byproduct because it feels like that’s someplace where somebody can do something because the people who we’ve elected and are running things are not stopping the actual violence.
I think that you can get distracted by that, but the focus ought to be back on the violence. Whether you’re talking about gang ordinances or weapons rules — which we have in some states, different ones in different states — if you’re talking about those kinds of things and really being serious about enforcing them, as we are with things like gangs and criminal conspiracies, you would see a lot less pressure on the companies to step in and fill this void.
KS: Right. You’d have to acknowledge, though, that a lot of the organization happens online, a lot of the social amplification. Social media’s become weaponized in many ways. So, say you’re the Facebook manager who created the Groups thing, and you see them using it. Your first instinct is, “You can’t use this. I didn’t make this for you. I didn’t make it to do this.” Or, if you’re at Twitter, I can imagine the discussion. I know what happened within Twitter, I know what happened within Facebook, I’ve been at dinners; they’re practically weeping at what’s happened with their products. So, should they do nothing? Or what would you imagine they would do? Both of you. First you, and then you talk about it.
LG: Well, it’s also worth just quickly noting, too, that back in 2011 during the Arab Spring, when social media was credited with a lot of the uprising that happened there, it was almost portrayed in a more positive way.
KS: It was.
LG: This is a similar kind of thing in the sense of people just using it as a tool to gather.
KS: But a different outcome.
CC: Yeah, I think that’s right. I mean, look, these tools that help people gather together can be used to do good things and to do bad things. The interstate highway made bank robbery a lot easier; it also made it easier for us to go from here to there. These kinds of systems that help people connect and talk to each other can always be used for good and bad. I think that internet infrastructure companies, the deep-infrastructure companies, are never going to have the structure or the ability to not get played and to do this right. They’re always going to be running from headline to headline, and they’re going to get it wrong.
When you’re talking about Twitter and Facebook, the companies that really have direct user relationships, they’re beginning to develop better tools, but they get it wrong. YouTube just started using its AI to try to identify extremist content. You know what they took down? War crimes investigators who were putting together evidence for the war crimes ...
LG: They’ve had those problems.
CC: They all had those problems and it’s why I really urge caution if they’re going to go into this. We have some principles that EFF plus a bunch of international organizations put together called the Manila Principles, which are about when intermediaries can be held responsible for speech of other people. If they’re going to get involved in taking this kind of stuff down, what are the processes that have to be in place?
It looks like due process. It looks like a lot of the things that we have the government do before they’re going to come after you, and those kinds of systems are not in place right now. Over the summer, Facebook took down a whole bunch of queer activists because it was keyword searching for “dyke” and “fag.” They were looking for evil speech and they ended up taking down good speech.
For the entire 17 years that I’ve been at EFF, I’ve worked with women in breastfeeding forums who were trying to talk about latching on, who were trying to show pictures of how to do this right. They consistently get flagged as inappropriate, or obscene, or pornographic.
KS: Right.
CC: So, I worry about trying to lodge this with these companies. They’re going to get it wrong, they’re going to over-block. Right now, all the incentives are toward taking things down, not toward putting things back up. There’s no due process. If you ever try to put something back up again, unless you’re a celebrity, it’s very hard to get them to reconsider a decision. If we’re going to lodge this responsibility in the companies — and I’m not saying they should do nothing — you have to take it on seriously, you have to build in real processes and you have to stick to them.
KS: Okay, before we get to this, do you regret doing this? Because you got ...
MP: Totally.
KS: You regret having to write that blog post or what do ...
MP: No, no, no, no, no. I am confident we made the right decision in the short term because we needed to have this conversation.
KS: Right.
MP: We couldn’t have this conversation until we made that determination, but it is the wrong decision in the long term.
KS: Okay.
MP: So, echoing what Cindy said, infrastructure is never going to be the right place to make these sorts of editorial decisions, but we need to draw distinctions between the different layers in the tech stack. Just because a Facebook or a Twitter is making editorial determinations on who can and cannot use their platform, I’m not sure that a Level 3 or an AT&T or a CloudFlare should be doing that. We need to have that public legitimacy so that when the mob comes to Level 3 and says, “Block this stuff” — because they will — or when they come to Google and say, not just at the registrar, but now block people from finding it in search, block people from getting to it in Chrome, we as a public are able to say, “Whoa, whoa, whoa. We had this conversation; there is sort of a social contract. It’s creepy if Ma Bell is listening in on the phone calls and determining who can and cannot be online. CloudFlare made that mistake once, let’s not let them make it again.”
CC: I want to add just one thing. We’re talking about this as if the tech company CEOs or the tech companies are going to make this decision and it will just rest with them. The governments of the world are going to notice that tech companies can and do exercise their right to kick people off whenever they don’t like them, and they are going to show up with court orders, pressure, the ability to kick you out of their country, and start getting you to implement things like ... things about whole groups of people. If you think this is always going to be a private decision, you haven’t been paying attention.
The governments have been doing this for a long time, so you’re going to end up with Tibetans not having a voice on the internet because the Chinese don’t want them to. You’re going to end up with ... There are fights right now between Palestinians and Israelis because people are calling each other’s speech hate speech. There are fights about Ukrainian independence activists, and the Russians will come and tell you that those people are Nazis; some of them are, most of them aren’t.
MP: In 2013, under the guidance of the EFF, we put out a transparency report that described how we interact with law enforcement. We put some things in there, some bullet points of things we have never done. One of those was that we’ve never taken down content due to political pressure, and we get political pressure all the time. We protect LGBT groups in the Middle East, we protect African journalists who are reporting on government corruption, we protect human rights workers in China. We’re constantly getting requests saying, “Take that down.” It’s been very powerful for us to be able to say, “We have never done that.”
We can’t say that with the same force and conviction that we could just a week ago. I think that is really risky. Whatever you think of Nazis, if you believe that the regulatory power of technology companies is going to be abused by oppressive governments, by people who disagree with transparency, then you have to hold the line very clearly on what can and cannot be done.
LG: Very quickly, how concerned are each of you about the dark web, the movement of some of this information to areas of the web where people maybe can’t see it or find it as easily? A couple of weeks ago I talked with Jack Dorsey in a conversation that was largely about Square and the future of finance, but I did ask a couple of questions about Twitter and its unique struggles on that platform: How much are you supposed to censor? How much are you supposed to regulate? His answer, and I’m paraphrasing (you should go watch the interview if you’re interested), was that it’s better that things are out in the open.
MP: Yeah.
LG: It’s better that things are transparent, that we can see what’s happening. How do you feel about that? Because there is a lot going on in the dark web, too, and there’s movement now into the dark web of things that were previously visible to all of us.
MP: Again, I think that the best way to fight horrible speech is with good speech. There was a series of articles about a week ago about neo-Nazis marching in Germany. For a long time, the question was, how do you control the march? Towns flipped that on its head and created a system where they said, for every mile or every step that the Nazis take, we’ll donate money to anti-Nazi causes. I think that’s the right way to do it.
Internally, when we see content or things on our network that we find repugnant, if they pay us anything — and again, most of the time they don’t — but every once in a while someone will pay us something. What we do is, we very quietly take that money, work with people in our organization who feel strongly about it, and donate it to causes that are directly opposite whatever the repugnant point of view is. That seems like a much better way to ... I don’t think you can censor thought away.
President Obama used to say something I think was so wise: People learn to hate. You’re not born hating, you learn it.
LG: That was Nelson Mandela.
MP: He was quoting Mandela, I heard it when Obama said it. If that’s the case, then we can teach them how not to, but again, that’s not by driving it underground, that’s by exposing it, pointing at it, laughing at it, and parodying it or finding kind of counter-speech to ...
KS: So, let me get this straight. You regret what you did?
MP: I am so happy that we’re here today having this conversation.
KS: All right.
MP: Had we not done what we did, we would not be having this conversation, and at some point we had to have it. I think it’s the wrong policy, which is why we said very clearly, this does not set a precedent. But we do want to have a conversation.
LG: Will you sell your services to the Daily Stormer at some point in the future?
MP: You know, the Daily Stormer, there’s a group internally that said we didn’t kick them off because of the speech, we kicked them off because they were jerks and they were talking about how we supported them, and they were abusing our abuse processes, and they were harassing some of our staff. I totally reserve the right to not do business with jerks. I will say, though, I think that that’s ... I don’t totally believe that argument.
LG: Yeah.
MP: Had this been a blog about cute kittens, we would have cut them more slack. So, it’s really hard in our position to divorce ourselves from what ...
KS: Oh no, I think you cut them off because they’re Nazis. Do you regret, Cindy, that he did it? Or not?
CC: Well, I think that he did the wrong thing for the right reason. I also think that he did the wrong thing in order to start this conversation. Had they just continued, we wouldn’t be here; you wouldn’t have invited Matthew and me on this show to talk about free speech on the internet.
KS: Yeah. Well, true, but there were a lot of people, it wasn’t just Matthew, it was GoDaddy, Google ...
CC: Yeah. I think ...
MP: They didn’t have the conversation, though.
CC: Yeah.
MP: They just said they violated our terms of service, no comment.
CC: Right. I think that the way Matthew did this was to make sure that we didn’t just slide past this moment and pretend we were making the Bush v. Gore of censorship decisions, never to be cited ever again, but actually had a conversation. And it brings in stuff that we’ve been doing for a while. As I said, we put together these Manila Principles (“Manila” because they were created in the Philippines) in a conversation about when companies should be held liable for content, but also, what do the processes have to look like? If a government’s going to come to you and say, “Take down this information,” what does that process have to look like? What do you, as the content provider, have to do? We’ve thought through some of this stuff. It may not be done, but this conversation is one that we’ve needed to have for a while.
It’s all well and good for the little oil activists who can call me, and I can go talk to their ISP and get them restored, but that doesn’t scale. I hear you that CEOs are starting to talk about this, but for most people who get disappeared off the internet, who get censored, who get their websites taken away, those things happen completely in the dark. They’re done by low-paid employees all around the world. Facebook has the equivalent of call centers all around the world where they give people a little handbook and say, “Go to it.” That’s how most people get silenced. We shouldn’t let this moment distract us from that problem ...
LG: Oh, well, it’s Nazis, so why not?
CC: Yeah, the Nazi exception. There’s a huge part of me that might be okay with that, but I think that if we don’t stand by our principles once, it’s very hard to stand by them the next time.
KS: Okay. All right. So, in just a minute we’re going to take some questions from our readers and listeners, and Matthew and Cindy are going to answer them. First, we’re going to take a quick break for a word from our sponsor.
LG: Ka-ching.
KS: Thank you.
[ad]
We’re back with Matthew Prince, the CEO of CloudFlare, and Cindy Cohn, the executive director of the Electronic Frontier Foundation. We’re going to take some questions from our readers and listeners. Lauren, would you like to read the first question?
LG: I’d be happy to. The first question is from Anshul Kapoor, @IamAnshul on Twitter, who asks, “Does CloudFlare agree with its reputation of censor for internet after Daily Stormer shutdown? Where does it draw the line? #TooEmbarassed.” Well, Anshul, I encourage you to go back and listen to the first 30 minutes of this program. However, let’s address this again. Do you agree with the reputation that you are a company that now censors people?
MP: We took one site offline. We haven’t taken a single other site offline. We’re having this conversation and I think out of this conversation hopefully comes a policy, which has more legitimacy because it’s been discussed not just inside the walls of CloudFlare, but publicly. At the end of the day, that transparency is really one of the key core principles of a system of due process.
We think it’s really important that we have this and we debate it and we think about it. I won’t be surprised if when we come out the other side of this the right answer is still that for an infrastructure company like CloudFlare, being neutral to the content while still complying with the law is the right decision and the right policy for us.
KS: But people with cat pictures get a little more slack than Nazis, right?
MP: They did in this case, but I hope that in the future, again, it doesn’t matter if I’m a cat person or a dog person; we should be neutral toward the content flowing through our network.
KS: That’s only sticking to regular rules, right? Right, okay. All right. Next one, bit ... I can’t pronounce this.
LG: Bitcionary, it’s like a Bitcoin or a Bictionary.
KS: “Does @eastdakota support regulating companies like his as utilities —” but let’s have Cindy answer this first — “to stop people like himself from actions such as his own?” So, utilities, Cindy? Everyone on the internet loves government regulation. Not.
CC: Yeah. Well, I think with the current government, you have to be very careful what you might ask for, right?
KS: Yes, please.
CC: We certainly agree on a whole different area around network neutrality. We think that the FCC’s rules for basically enforcing non-discrimination online were good ones and we’re sad that they’re under siege and going away right now. In general, I don’t think that government regulation — except on the very outer boundaries, things like the sanctions list and stuff like that — is really the right way to go because I don’t think that the government, certainly right now, I wouldn’t be comfortable with the decisions that the government would make.
MP: The thing I would add is, what government?
KS: Which one?
MP: We operate in 70 different countries around the world; we have infrastructure in each of them, and each of those countries has different rules. It’s probably one of the biggest realizations that I’ve had as CloudFlare has grown: There really is a patchwork of different government regulation around the world. We need to comply with those laws, but if you say, “This is a utility” in the United States, well, is it in Germany, and do you want to set that precedent? How exactly does that work? I think the better answer is that companies should be transparent about the policies they follow, they should comply with the laws in the jurisdictions they operate in, and governments should be very mindful of the places they put in content controls, such that they respect the geographic boundaries of where they have sovereignty.
The Chinese government has started to make a point that it has a sovereign right to regulate networks inside China. Now, you may agree with that or not, but what I think is really important is that inherent in that argument has to be that their regulations stop at their borders. Because if they go any further, then China’s laws spilling over into Thailand or Canada or Brazil is actually violating the sovereign rights of those various countries. So, if you do something like say someone can’t register a domain, that’s all or nothing.
LG: Or download a VPN from the app store.
MP: Right.
KS: Right.
MP: That’s all or nothing. Whereas ... What we try very hard to do is, when we do have to comply with some regulation or some law, make sure that it stays within the borders and it doesn’t ...
KS: You only do it there. Many companies struggle with this, many.
MP: It’s a hard problem.
KS: Yeah. Absolutely.
CC: Well, there’s some bad law. There’s a recent Canadian Supreme Court decision called Equustek, where the court said Canada had a right to demand that Google take down a website (I think it was a trademark infringement or something) all over the world. Then there’s the European right to be forgotten, and the question of where those things are going. The right to be forgotten was famously created on the back of some guy who got his picture taken in a Nazi uniform and wanted it scrubbed off the internet.
So, a lot of these powers you have to worry about, it’s not just the U.S. government, which we might be nervous about right now, but it’s a race to the bottom with the governments around the world. If we start putting these infrastructure providers in the place where they’re going to operate as choke points, you’re going to very quickly find out that women who aren’t covered are not going to be okay on the internet because there’s a country in the Middle East that doesn’t think that ...
KS: Well, I think in that case, Google would just block in those countries, right?
CC: They do, but there’s a question about now whether that’s going to be okay or not, right? This is what the Equustek decision says.
KS: Right.
CC: The Canadian Supreme Court has said, “We have the right in Canada to tell Google what they do everywhere.”
KS: But they don’t.
CC: Well ...
MP: Maybe.
CC: That’s a fight I’ve got to win first, right?
KS: How can they compel Google to do that?
MP: Fine them.
KS: And they can’t operate in ...
CC: Yeah, the fines ...
MP: They can’t operate in Canada.
CC: The fines in Europe under the right to be forgotten are tremendous, they’re a percentage of your annual profits.
MP: GDPR. If your listeners don’t know what that acronym stands for and you’re running any kind of technology company, be afraid, because that’s coming for you. Again, there was a thought for some time that the U.S. regulated the internet. The deal that the U.S. had with the rest of the world, with Europe in particular, was: Don’t mess with our tech companies, we’ll keep buying your cars.
One of the consequences of the current administration pulling back from a lot of those negotiations and not engaging has been that Europe is now quite emboldened to say, “Well, actually, this First Amendment thing that you’ve got in the United States, that really doesn’t seem like it makes sense to us. Maybe we should be the ones that are regulating the internet and maybe we should take over that mantle.” That is a significant risk.
What’s interesting is that I think there’s a part of this that Europe is doing because they think it’s going to help them clamp down on the Facebooks or the Googles. I think the likely outcome is actually the opposite. Facebook and Google and CloudFlare have the resources to figure out all of the different regulatory frameworks. Some new startup that comes out of Europe, that all of a sudden is like, “Well, now I have to figure this out, and this out, and this out,” it’s going to be impossible for them ever to get the scale and the momentum to be competitive. So, the laws that are intended to sort of take down that ...
KS: They’ll do the opposite. Well, that’s never happened, Matthew.
MP: Yeah.
KS: That’s never happened. Also, we invented the internet, they can’t do that. But I agree with you.
LG: Next question is from Ian Gertler, who asks, “How are things going with CloudFlare’s Project Galileo efforts? Upping the marketing ante soon? #TooEmbarrassed #CyberSecurity #NGO.” By the way, happy 10th birthday to the hashtag today, as we tape this. Project Galileo, talk briefly about that.
MP: Yeah. So, Project Galileo came out in 2013, when we started to see that, generally, in CloudFlare’s business, small companies would get small attacks, so we would charge them a little bit of money, and big companies would get big attacks, and we’d charge them a lot of money. But there was this group of small organizations, largely nonprofits or small commercial entities, that would get massive cyber attacks launched against them.
It really came to my attention when ... I’d get a list of basically free customers that we’d kicked off our network the night before for violating some policy, usually because they’d come under some major attack. I was looking down the list and I was like, “Uh-oh.” One of them, I looked it up (the site wasn’t online anymore, but it was on Wikipedia), and it turned out to be the largest independent newspaper in Ukraine, covering the conflict in Crimea. It almost certainly came under attack by either Russian sympathizers or the Russian government itself, and it got knocked offline. We’ve got some 20-year-old SRE on our team who has to make a decision of, is this politically important or not, and that’s just a responsibility ... We can’t require everyone on our technical operations team to have a political science degree.
So, we struggled with it. We don’t like making determinations about what bad content is, and so we didn’t want to make determinations about what good content (or, to put it better, politically or artistically important content) is, either. We went out and worked with over 100 different civil society organizations: the EFF is one, the ACLU, the Center for Democracy and Technology, the Committee to Protect Journalists ...
CC: All of them.
MP: Yeah. We tried to span not only geographic boundaries, but political boundaries as well, even arguing to conservative groups, “No, seriously, we really do want you to be on this committee to pick who is either politically or artistically important.” So now, what Project Galileo does is, if somebody submits something that says, “Listen, I’m under attack and I need help,” we send it out to this committee of over 100 NGOs, and if any one of them says, “This meets the criteria of being a small commercial entity or a nonprofit, and we think it’s worth keeping online,” we extend our full enterprise-class support to protect them.
Again, the organizations that we protect as a result of that are the things I am the most proud of at CloudFlare, whether it’s LGBT groups in the Middle East ... We had these three African journalists come into our office about a year and a half ago. One was from Angola, one was from Ethiopia, and they wouldn’t tell us where the third one was from because he was currently being hunted by death squads. They were reporting on corruption in the countries they work in, and they all came up to me, and one guy, with tears in his eyes, hugged me and said, “We couldn’t do what we do without you being there to protect us from the attacks that are there.”
KS: These are online attacks?
MP: Yeah.
LG: These are DDoS attacks that you’re responding to?
MP: Or different cyber hacking attacks. We’re really good at stopping these things and we have the resources to be able to do it.
LG: So, in this case, you’re responding to people who have been knocked off by some other third party and you have to figure out whether to reinstate them.
MP: Yeah.
LG: So, you’re going to this group of experts, essentially, that you’ve assembled, and asking them to sort of vet the process. With what you did with the Daily Stormer, you sort of took the initiative in this case.
MP: Yep.
LG: Did you go to that group and vet them through that same process?
MP: No. One of the questions that we’ve wondered about is, do you assemble kind of a committee of elders that can come in and say, “This content should be kicked offline,” or, “It shouldn’t”? Again, I come from the perspective that more speech is good speech, and so in this case we said, if any one of these organizations says that this is something worth protecting, we’ll extend our full protection to it. If you flip that around and say, “When will we withdraw protection?” the question is, what standard do you create? Is it if, I don’t know, the Heritage Foundation says so, then you kick it off? Or does it have to be unanimous? There’s a part of me that says, why should we recreate a political institution when we already have governments and regulations and laws, and they’re going to have more political legitimacy globally, in each jurisdiction, than any committee of experts we could assemble?
KS: Right. You would have said, “Keep them on,” if they had brought Daily Stormer to you, correct?
CC: I think we would have said it was really dangerous for them to kick them off, yeah.
KS: Right. Okay.
CC: Yeah, I think that’s right. We have a process in the real world for trying to get an injunction to stop somebody from speaking; it’s called a prior restraint, and there’s a process in the law by which you can get one. They are very, very hard to get, and they’re hard to get because we have 217 years of experience in this country that says that trying to stop somebody from speaking before they speak is one of the most dangerous things you can do. We wouldn’t have a country if people couldn’t voice radical ideas without having to go through a committee of experts first. A committee of experts or a committee of technos, it doesn’t matter.
LG: Yeah.
CC: If you have to go on bended knee in front of somebody before you get to speak, you’re going to reduce the universe of ideas. Maybe you don’t get some heinous ideas, but you might not get the Nelson Mandelas able to speak, either.
KS: Absolutely. Absolutely. That’s why we have to listen to Donald Trump’s tweets every day.
MP: Just to be clear about Project Galileo ...
KS: And it’s the right thing to do.
MP: With Project Galileo, all of these sites could have signed up anyway. What the committee, what anyone there is saying is: This is so important that you shouldn’t just put them on the free version of your service; give them the version of your service that people would pay millions of dollars a year for if they were a big financial institution or something else. So, for us, it’s not a question of whether they should be on us or not; it’s a question of whether they get what is hundreds of thousands or millions of dollars’ worth of service for free, because they deserve it.
KS: Right. They deserve it.
Okay. Next question, I’ll have both of you answer this, it’s from JK @bismarchiavelli: “How do you feel about founders making unilateral decisions? How much input should investors and employees have in the decisions?” Why don’t you start, Cindy, and then Matthew.
CC: Well, I think we’re starting to see employees in Silicon Valley have a much bigger voice in these kinds of things, and I think that’s good.
KS: Oh yeah.
CC: I think that you’re spending your life at this organization, you’re trying to make it successful, you ought to also be able to have at least some version of a voice in some of these decisions. I think it’s good. Again, my problem is, I’m not sure anybody should have this kind of power because it’s too big a power.
KS: Investors, you know how they go.
CC: Yeah. Whether it’s investors, or the mob, or the government, many of these decisions are ones that really, except on the very outer edges, should not be made by anybody. The answer to this ... Louis Brandeis said the answer to speech is more speech, not silencing people. We don’t do that because we like the speech. There’s all sorts of awful speech that I would like to see gone; in my heart of hearts, the Daily Stormers included.
We do it because, if you set up that mechanism (again, we have 217 years of experience in this country), it will not only be used in the ways that you agree with, it will be used in ways that you don’t. Our political discourse and our conversation as a country will shrink to just what those in power think is right. We already have enough power imbalances in this country; powerful people have a bigger voice than less powerful people. I would like to keep the internet as one of the few places where people without power get to have a voice, too.
KS: Can I just flip something on it and say, what if ... I am certain Twitter has a contingency plan if Donald Trump crosses their line. There’s a line they have, the TOS, whatever it is. Would you defend that if he did ... He knows just where that line is, or whoever tweets for him, what if that ... Like if a big thing happened like that, which could, one night it could be he takes it in the wrong way ... You know.
CC: I think it would be the wrong decision for Twitter. I think it’s within ...
KS: What if he violates what is clearly in their TOS?
CC: Then I think it’s within Twitter’s right to do it, but, just as I feel about Matthew’s decision, I don’t think that it was outside of CloudFlare’s right as a company. Actually, EFF spends a lot of time defending the idea that intermediaries should not be held responsible for the speech that people make; there’s a law that we love a lot, Section 230 of the Communications Decency Act.
MP: Which is at real risk right now.
CC: Which has an attack going on in Congress right now. I believe that Twitter has the right to kick off Donald Trump. I think it would be the wrong decision.
KS: If he violated ...
CC: I think it would be the wrong decision.
MP: So, we’ve thought about these issues for a long time, and they’re really important. I think there’s been a history in technology companies of sometimes wanting to hide from public policy issues. We really embraced them from the very beginning, and we chose to publicly come out and talk about these issues, to try to be as transparent as possible. Employees choose to come to work at CloudFlare thinking about these issues, and they really like it. Internally, again, there wasn’t a whole mob saying, “We have to kick them off.” There was a really careful, thoughtful conversation about the pluses and the minuses.
I’ll give you a really specific example about investors. We, at one point, had a very big deal that was all ready to get signed; it would have been worth tens of millions of dollars to us. The organization that we were going to do the deal with came back and said, “Okay, but there’s one other thing: You need to fire this one customer.” It was actually a Project Galileo customer, someone that the EFF had referred to us. It was a big deal for us because this was 2013, and tens of millions of dollars was a lot of money to us back then. It’s still a lot of money to us. So, we went to our board and we said, “Here’s the situation, how should we think about it?” I’m really proud that our board was extremely thoughtful about it, and at the end of the day said, “We’re playing this for the long term. We can’t put ourselves in a position where any one organization can dictate who can and cannot use our service. We gave our word to the EFF, to Mozilla, to Project Galileo as a concept, that we would stand behind those people. Even if that means we’re going to walk away from this deal, that’s the right thing to do.”
CC: You can’t tell us who this was?
MP: I can’t tell you who it was.
CC: Can you tell me the organization they wanted off? We can guess ...
MP: I can’t. I can’t. I’d love ... It’ll be great for the book someday.
CC: All right. Okay.
KS: Last question.
CC: Sorry.
LG: Last question is from Cody Wilson. Cody, by the way — and Cindy’s laughing, chuckling I should say — is a well-known techno-anarchist who founded Defense Distributed, a platform for open-sourcing 3-D printed guns. We’ve written about him before, and he wanted to know, “Will Matthew promise never to act unilaterally again?”
MP: No. I won’t promise that, but I hope I won’t ever have to do it again. I don’t think that’s the right way for an infrastructure company like us to operate. Again, I think we’ve tried to show how dangerous those decisions are when they’re made by me, or Mark, or Jeff, or Larry, or anyone else out there. I’ve learned enough through this to know you’d be amazed at the situations you find yourself in. What I hope is that, with the policy we come up with, we create a social contract around what infrastructure companies like CloudFlare should do, so that when situations like this come up in the future, the conversation isn’t, “Why do you support Nazis?” The conversation is, “It’s really awful that Nazis use your platform, but let’s use real legal process to deal with this.” As opposed to just pressuring me or the company, or, if it’s not us, Google or Facebook or GoDaddy or whoever ...
KS: So, you reserve the right to act unilaterally?
MP: Sure.
KS: Yeah.
MP: That is why the board of CloudFlare appoints me.
KS: Right.
MP: That’s what the CEO ultimately does. There is a point at which my job is chief coin flipper. Where really hard decisions come up to us and the team says, “We don’t know if the answer is A or B,” and somebody’s got to make that call, I will make those calls and I think that people like Cindy should hold me accountable for the calls that I make.
KS: So, Cindy, what are you going to do to stop him from flipping those coins in a way that you don’t like?
CC: Well, hopefully he’ll call me and we can talk it through, and we can be part of that. I should just point out, the reason I chuckled is because EFF has filed amicus briefs in support of Cody in his case, and we have a relationship with him and the cases that he’s doing. Then, of course, there’s Matthew who’s also a client, so I was kind of chuckling that I really can’t throw a rock without hitting somebody that EFF ... We can represent anybody, right?
KS: Yeah.
CC: I didn’t want anybody to misunderstand my chuckling, my chuckling was just the idea that these conversations end up with lots of people who EFF has helped over the years.
KS: You must have a fun holiday party.
CC: We have a good time.
MP: EFF is such an important organization ...
CC: No Nazis over there.
LG: We know what we’re doing next December.
KS: I know, we’re going to your party.
CC: You’re going to come to our holiday party?
MP: They do have a fun holiday party, but it’s ...
CC: God, I’ll bring my 3-D printed weapon.
MP: It’s so important what they do, and if you do believe in free speech and that the internet should be open and transparent, supporting EFF in any ways you can is something that makes a ton of sense.
CC: Often, you know, free speech (almost more than any other part of the Constitution, although I would say the Fourth Amendment as well) is like this: Sometimes you have to support people doing things that you vehemently disagree with in order to uphold the principles. A huge amount of the First Amendment law that you rely on right now was created by Larry Flynt.
KS: Yeah.
CC: So, we end up in the American legal system especially, having situations that are the hard ones coming up in the court. Organizations like EFF will try to come in and talk about the principles at stake. Not regardless of, but often despite some of the positions that are taken by the people in the case.
KS: You really just don’t like what you do, but you love it, right?
CC: I love what I do. I get up happy every single day that I get to try to make the world better.
KS: Can I ask you very briefly, just answer, what do you think about the Google firing of James Damore? Would you have done that? Quickly, very short.
MP: I don’t know. I honestly am not up on the individual circumstances of that enough. What was hard in this case is that things can become such distractions internally that you can’t get work done.
KS: Right.
MP: So, sometimes you make decisions that are like, at some point, we have to get work done again.
KS: Last word?
CC: Yeah. I think that it was a very, very hard decision. I do think that simply getting rid of somebody and firing somebody because of their political beliefs is wrong.
MP: Dangerous.
CC: And it’s actually illegal in the state of California. But what he was doing was actually trying to talk about Google and how Google should be different, so it was a slightly different situation, and there, I think, as an employer, Google has to think about what kind of environment it’s creating.
KS: Yeah. It’s a great case.
CC: Yeah, I think it’s a really hard one, because the issues overlap and intertwine a lot. But in general, I think you should be able to work at a place, espouse political positions that are unpopular, and not have that result in you getting fired.
KS: And yet, you can be.
CC: Yet, you can be. And every employer will tell you about the need to create a safe work environment, a place that doesn’t have ... Especially with sexual harassment issues, a work environment that feels open and free is of tremendous value, as well. So, that’s why I say it’s a hard case.
KS: Hard case. Okay. Great. This has been another fantastic episode ...
LG: A really good episode.
KS: ... of Too Embarrassed to Ask. Matthew and Cindy, thank you so much for joining us.
CC: Thank you.
MP: Thanks.