On a recent episode of Recode Decode, Cindy Cohn, the executive director of the Electronic Frontier Foundation, and Claire Boyle, the managing editor of McSweeney’s Quarterly Concern, spoke with Recode’s Kara Swisher about their special team-up issue, “The End of Trust.”
On the new podcast, they discussed the state of privacy, why Facebook’s users are more like “hostages” than customers, and whether Congress will make a move to regulate tech giants when the government reopens. Plus: Why we can’t rely on algorithms to make important decisions for us.
“When you’ve got a system with a bunch of set rules and those rules are not widely interpretable and you’ve got a huge set of data that you can tell whether it’s right or it’s wrong, you can get a machine to automate in ways that seem really miraculous,” Cohn said, referring to advancements in chess-playing artificial intelligence. “But when you get beyond that relatively narrow scope of things where you’re trying to do things where you really need context, where you really need things like that, when you don’t have good data, right? If you use policing data to try to predict who’s a criminal, you’re not predicting who’s a criminal, you’re predicting who the police think is a criminal, which we all know is a very different set of things.”
Below, we’ve shared a lightly edited full transcript of Kara’s conversation with Claire and Cindy.
Kara Swisher: Hi, I’m Kara Swisher, editor at large of Recode. You may know me as the new spokesperson for Facebook Portal, but in my spare time I talk tech, and you’re listening to Recode Decode, from the Vox Media Podcast Network.
Today, I’m excited to be in the studio with Cindy Cohn, the executive director of the Electronic Frontier Foundation, and Claire Boyle, the managing editor of “Timothy McSweeney’s Quarterly Concern.” EFF and McSweeney’s teamed up late last year to put out a special issue of the literary journal about privacy, called “The End of Trust.” Cindy and Claire, welcome to Recode Decode.
Claire Boyle: Thank you.
Cindy Cohn: Thanks.
So, this is a great conversation. I’ve been having debates with my kids on this issue. I think it’s sort of the debate of the next year, of what can be said and what should be said online. I just recently wrote a column about Donald Trump’s use of Twitter and things like that. So I want to get into it, but first I want to get your backgrounds really quickly. Why don’t we start with you, Cindy? And then Claire. Tell me how you got to where you got.
Cindy Cohn: Well, I’m the executive director of the EFF. I was the legal director for EFF from about 2000 to 2015, so through all the copyright wars and things like that. I started doing digital rights in the ’90s, though. I was counsel in one of the cases that freed up encryption technology, which is how you have security online from government control. The work that I and others did to free up encryption is why you can have a private conversation or buy something online. So, I’ve been involved.
What got you interested? What was the impetus?
Cindy Cohn: I actually have a background in international human rights. That’s what I went to school for and I worked at the United Nations. But when I got to San Francisco in the early ’90s, I met some of the early internet folks, some folks who were working for the Free Software Foundation. And it just became really clear to me that this digital world that they were already living in and that we were going to be joining was going to have a whole lot of really interesting questions about people’s rights and our Constitutional values. And I got lucky enough to get asked to join in one of the earliest fights. The crypto case was really a First Amendment case.
Right. And the Electronic Frontier Foundation, for those who don’t know, is a group here in San Francisco that ...?
Cindy Cohn: We work to protect your rights when you go online. We were founded in 1990, so we predate the World Wide Web. And we were founded by some early internet folks who really recognized that we were gonna have to think through a lot of our rights and the balances of power between us and the government and us and the corporations as we moved into the digital age. And they wanted to make sure there was an organization out there advocating for users and for freedom.
So, as you’d call it, it’s sort of like the ACLU for the internet in a lot of ways. I mean, I think of it that way.
Cindy Cohn: Yeah. We certainly do. Although, my friends at the ACLU often point out that they are the ACLU for the internet. We work in the same set of issues around how the Constitution interacts with people.
Right. And it’s never been more important than right now. It’s astonishing, after 20 years, this is sort of the time where now, some of these issues have really been coming for 20, 30 years.
Cindy Cohn: Yeah. We’re 28 years for EFF. I think, you know, when we started talking about this, it was kind of a niche thing and we were often talking about the future and we were like, “Don’t. You may not care about this now, but you’re gonna care about it in the future.” And now, we’re at a time when, if you want to get a job, if you want to find a place to live, having access to the digital world is crucial for the vast majority of people certainly in the United States and increasingly around the world. So these issues become much more real to people.
Absolutely. And Claire, tell me about your background, how you got to McSweeney’s. Of course, explain what that is. It’s a wonderful publication.
Claire Boyle: Yeah. McSweeney’s is a publishing house based in the Mission District in San Francisco. We’re entering our 20th year, actually. So it was started 20 years ago by the author Dave Eggers. And it started just as a literary journal that he made in his kitchen. And he actually said, when he started the journal, that there would only ever be 56 issues of it, which is a little frightening because we’re working on our 56th issue right now.
Claire Boyle: But it kind of became a much bigger project than he expected. And we have an online element that has humor pieces and we started a books division. So we do literary fiction, we do nonfiction, we do essays, kids’ books. So we do a little bit of everything.
And you have a facility on Valencia Street.
Claire Boyle: Yeah. So we have a nice little abode on Valencia Street with a couple other nonprofits that Dave started as a ...
Right, helping students, writing and different … So, your background, how did you get to McSweeney’s?
Claire Boyle: I studied writing with a couple of McSweeney’s authors, and they turned me onto this weird and wonderful project and kind of shooed me along to them.
And so how did this come about, this idea? And why don’t you start, Claire, talking about, how did you decide to do this issue? Because you have topical issues, right? You’ve had topical issues before.
Claire Boyle: Yeah. We have themed issues. Themed issues is definitely a big part of the quarterly. They’re usually themed around fiction stories. So this is our first-ever entirely nonfiction issue. So it was definitely out of our wheelhouse.
Dave Eggers, who in his other work has been concerned with these issues — he wrote the novel The Circle and is just very invested in these ideas — came to us and he said, “I want to do this issue. I want us to do an issue called ‘End of Trust,’ and I want it to be about surveillance and privacy. Take it away.” So it was a really interesting way for us to start this project because we, by definition of how it came to us, were entirely novices, weren’t even necessarily engaged because it came from Dave. It was his interest.
And The Circle was about Google, essentially. It was sort of a Google stand-in. Better book than a movie.
Claire Boyle: Some may say.
It was not a good movie. I just recently saw it. Not a good movie. Could’ve been. But the concept was that these giant platforms are ruling our lives and sometimes in frightening ways, essentially. And so that was the concept. So Cindy, how did you guys get involved? So they’re novices, they don’t know a lot about it beyond Dave’s books.
Cindy Cohn: Well, they reached out to us, and one of my designers, Soraya Okuda, had worked with Dave in the past. And so, I don’t know, I think McSweeney’s reached out to Soraya and said, “Would EFF be interested in doing this?” We have long recognized that if we’re gonna build a really good digital future, it can’t just be the lawyers and the technologists who are thinking about it.
You’ve put out reports from time to time.
Cindy Cohn: We’ve put out reports, we actually did a whole volume of speculative fiction a year or so ago where we had friends of ours who were fiction writers write some pieces for us. But we’d never done something as professional as this. So McSweeney’s reached out and said, “Would you help us?” And we were delighted because this endeavor of building a digital world that we all wanna live in, it’s a big one and it requires people with all sorts of skill sets, not just, we’re primarily lawyers and technologists in EFF.
Right. But you’re trying to get through speculative, what is that, like, a “Black Mirror” episode you were trying to do there?
Cindy Cohn: Yeah. It’s a whole bunch of ... speculative fiction is a kind of science fiction. Science fiction purists will probably cringe that I said that. And I think you can’t build a better world unless you can envision it. And you also can’t be clear-eyed about our world unless you describe it really clearly. And so this volume is really about the clear-eyed view of the world we’re in right now, and then the fiction work is really kind of, how do we take the next step?
I recently was telling a bunch of people who make products at these companies what they should do is, every time they think of a product, “What is the Black Mirror episode of it? What would be the bad story that would make us very upset? What’s the worst case scenario?”
Cindy Cohn: And some of these companies are trying to reach out to science fiction writers and have them do some of that thinking for them. And I think that’s useful. For too long, I think, folks in Silicon Valley really couldn’t step outside of their own experience of things and look at how somebody might misuse stuff.
We’ll start with the Terminator, maybe. One, two, and three.
Cindy Cohn: It’s funny because science fiction is a whole genre right now, especially overloaded with dystopian futures, and I think that’s a reaction. And I’m, at this point, trying to pivot towards the other side. I keep joking that, sexual politics aside, I’d like to get back to The Jetsons and away from Black Mirror.
Interestingly, on Black Mirror, “San Junipero” is the most popular episode, which is a hopeful look at the future, which is interesting.
So you guys are figuring out what to do. Talk about how you brought in writers and what you were thinking about. Did you solicit writers, or what?
Claire Boyle: Sure, yeah. So when we started thinking about this project, we sat down and we did some research and we started educating ourselves — which was a big chunk of this project — and we just kept hitting upon this realization that we needed some smart people on board. And that’s when we reached out to EFF. And thank god they were interested in being a part of it, because this issue would not have existed and would not have been the thing that it is without them. So I think, like Cindy was saying, it’s this really awesome marriage between everything that EFF has been working on and their razor-sharp reviews.
And how did you decide who you wanted and what was it ...
Claire Boyle: So then we started ... I think the first person I reached out to in the community was actually Cory Doctorow, who generously threw me a list of 30 to 50 people to just start reading their work. So that’s where I started. And I just read a bit of everyone that he gave to us and we started putting a list together and we talked to EFF about it. We had a meeting. And we kind of went from there. And then we solicited. All the pieces are solicited except for, I think, one. So we had an open call for submissions as well and read a bunch of them and one of the pieces in there was pulled from that.
Was pulled from that. And what were you looking for, Cindy? Because there are people who don’t agree with you. So did you want to create that, the idea of a debate, or just, here are the 10 things we need to be worried about?
Cindy Cohn: We tried to actually have people with different viewpoints. Not necessarily a debate, there’s not really a debate inside the book, very much. But we did have people ...
Well, it’s called “The End of Trust,” so that kind of sets the tone.
Cindy Cohn: A little bit. So we had some people who think differently. Ethan Zuckerman, who is a wildly intelligent technologist. We have people who track things. We have people who do other things. Different pieces of where trust is ending, really, whether you’re looking at Mr. Snowden’s piece about the blockchain, which is kind of the end of trust of money. Other people were talking about copyright. Other people were talking about future FOIA requests.
I thought it was really important that we include voices from people of color, because when we talk about surveillance, the surveillance society we’ve built for ourselves now really is more familiar to people of color. They have been targeted by surveillance disproportionately for a long time, and a lot of the things that we’re now seeing being deployed, say, in our streets, were first deployed against people of color. And so we have a Georgetown professor, Alvaro Bedoya, who runs the Color of Surveillance conference and is a dear friend of mine, and there are others as well.
But it was really important for me that we not talk about just how surveillance affects the kind of people who we think of as the technical digerati, but actually where this stuff really comes from and hits the hardest is not our community.
Right. So Claire, go through some of the writers and what they wrote about.
Claire Boyle: Sure, yeah. So the first piece is Sara Wachter-Boettcher, who wrote a piece called ...
Cindy Cohn: “Everything Happens So Much.”
Claire Boyle: Yes, thank you. Nailed it. And so she wrote a piece about her experience going and giving a talk at Google about her recent book that talks about the biases within algorithms and how that is created through who’s making these products. And she’s talking to Google about this. And she actually ...
Who’s making these products.
Claire Boyle: Right, exactly. She actually opens her talk with a Google Maps hiccup they had, where they were showing how many calories would be burned if you walked a certain distance and relating that to how many cupcakes that would be. And so it was a pretty risky, in my mind, thing that she did. It was a bold move.
And she just immediately got, once she got that link, she looked at the comments and they were so disrespectful and scathing and focused on her appearance and on her being a woman and all these things. And so it’s kind of that, she talks about being so well versed in this industry and actually so aware of the ways that it is problematic, and how even all that knowledge didn’t prepare her, didn’t inoculate her from being affected by it herself.
All right. Another one? What did Jenna Wortham write about?
Claire Boyle: Jenna Wortham wrote a really beautiful letter.
She’s a New York Times writer.
Claire Boyle: Yeah. She wrote a really beautiful letter about, you know when you walk into a convenience store and there are those cameras that you can see yourself on the screen?
Claire Boyle: There’s this kind of trend of taking a picture of yourself in that camera on the screen.
Oh, okay. Jenna would know.
Claire Boyle: Yeah. And she’s talking about the idea of recapturing, as a black woman — and this happens largely in the black community — this idea of recapturing your image and how it’s this statement about technology and about taking it back and framing yourself. Yeah. That was a really beautiful letter she wrote for us.
Some other pieces ... Cory Doctorow wrote a really wonderful piece about peak indifference and privacy nihilism, which is a really important topic. I think Cindy also wrote about it in her foreword, this idea of people feeling like the issue has gone so far that we can’t pull ourselves back from it. You become nihilistic, immediately, as soon as you realize that there’s a problem.
Talk about that, Cindy.
Cindy Cohn: Yeah. In fact, there was just a study that came out over the last couple of days around this that confirmed what I think a lot of our intuition was, which is that there’s this sense that the advertisers push and the big platforms push, which is that people don’t really care about their privacy. What we’re discovering is that that’s actually not the case. People just feel unempowered. They feel like there’s nothing they can do, that they can’t really function, that they don’t really have a choice or a voice.
You see that with contracts that are supposed to be contracts, but really aren’t. They’re just a thing you have to agree to in order to do something. And so I think one of the goals that we had for this, and it’s called “The End of Trust,” was to not make people feel like at the end of it that they should just unplug all their devices, because we’re not gonna do that, or just give up and go away.
We want people to feel empowered, not disempowered, by the truth. And that’s actually tricky. And Cory, who works with EFF as well, is right now fighting a really bad copyright proposal in Europe. But he’s one of the best people to fire people up.
With the idea that you’re not disempowered. Because you do feel that way. We’ll talk about this in a little bit: with Facebook, you don’t know what to do. Do you need to just go off it? Or do something to make them change, or at least demand changes? And one of the many arguments, when I did a podcast with Mark Zuckerberg, was, “Well, we’re free.” And I was like, “So people without money have to put up with shit?” It was kind of interesting, but that’s what he thinks. He’s giving you all this great stuff, so put up or shut up, kind of thing.
Cindy Cohn: I think so too. Increasingly, there’s certainly a lot of work to be done to put pressure on Facebook to be better. Most of the pressure that we put on them is to empower users, right? And one of the best ways to empower users is to let them go. I have been saying that Facebook doesn’t really have users or customers, they have hostages, at this point. And letting people go, giving them the tools so that they can take their data and leave and go somewhere else, or use open APIs.
I mean, I think the other thing that we all really need to resist is the idea that there is only one network or only two networks. We need to develop — this is some of the work that my friend Brewster Kahle is doing at the Internet Archive — a decentralized web, so that we don’t have to put so much pressure on Facebook. Because people can leave and there are other good options.
This idea that there’s five giant platforms and everything is there...
Well, it depends on what you’re platforming for, right? If you talk about Amazon, which is the back end of so many other things, that the problem that I think is most what we’re talking about is, well, first of all, how did we get here? Because the internet was designed to be the great decentralized place. And then, how do we get out?
We’re here with Cindy Cohn, the executive director of the Electronic Frontier Foundation, and Claire Boyle, the managing editor of McSweeney’s Quarterly Concern. We’re talking about a new issue of McSweeney’s that EFF worked on with them called “The End of Trust.” And it’s about all kinds of issues with different writers discussing. We talked about some of it, including tech nihilism and other things.
I want to talk a little bit more about the issue and some more stuff. Claire, talk about some more of the stuff and then I want to get into these bigger issues about what you just brought up, Cindy, about five platforms, or whatever, how many. I think there’s two. I agree with you that possibly there’s three, and where we go from here and how it evolved into this.
Claire Boyle: Sure. So we got a great piece with Snowden.
Claire Boyle: Yeah, Edward Snowden. His attorney at the ACLU, Ben Wizner, interviews him, and it’s this really endearing conversation. So the concept is that Snowden is explaining blockchain to his lawyer. But they also have this really great dynamic, because they have a closeness that you can tell through the interview. So that’s a favorite for that reason.
There’s also this really great line in the intro that Ben wrote about them getting together over vodka for Ben and milkshakes for Snowden. Just love that image of them. And I think it’s a really good representation of the issue as a whole because you’ve got this expert who has such a wealth of information trying to explain it to a larger audience. And I think that’s how we see this issue as a whole as ... what McSweeney’s was trying to do was take EFF’s knowledge and introduce it to a larger audience, make it palatable, make it understandable.
And another interview, actually, that does that really well is the Trevor Paglen/Julia Angwin conversation that they did moderated by Reyhan Harmanci, who is also incredibly smart and knowledgeable about these issues, but kind of plays the part of the reader asking questions and clarifying and digging into the issues.
And the issues they talk about. Julia’s been big on privacy issues, absolutely, and started a new company.
Claire Boyle: Right, mm-hmm.
And this is about just privacy in general?
Claire Boyle: Yeah. That one is kind of a capstone interview. They talk about mass surveillance primarily. But it’s kind of a great ... It comes early in the issue for this reason. It kind of lays out a primer for what you’re about to ...
What about copyright? Copyright is covered in this, Cindy?
Cindy Cohn: I don’t think there’s too much copyright in it because it really was aimed more at privacy and surveillance and stuff. This is not reflected in the volume, but you know a lot of the surveillance techniques that we’re now seeing deployed in other contexts were developed under copyright, right? Where the industry insisted that a lot of technologies be built so that you could watch what users do because you can’t trust users.
Cindy Cohn: The story was that you can’t trust users because of copyright, but now ...
Which they were trying to figure out: the copyrighted material that was posted on all these platforms, which was the fight of the last era, right?
Cindy Cohn: That’s right.
Of media companies getting their stuff hijacked by mostly the Googles and others of the world.
Cindy Cohn: Right. But when you implement a system like the one they’re trying to do in Europe right now, or the ...
Explain that. Explain.
Cindy Cohn: Yeah, so there is a part of the European copyright directive that is being negotiated right now. It just got kicked over for a little while, but the pressure will be on anybody who hosts other people’s speech, which is all the platforms, to essentially put in filters. Anything that you might upload to the internet has to go through some filter that checks to make sure it hasn’t been claimed as a copyright infringement by somebody else before it gets posted.
Right. A lot of this does have to do with how lazy they were in doing this in the first place, by just opening platforms up, you know what I mean? They didn’t put systems in place, so now you think any system is a problem.
Cindy Cohn: Well, we haven’t yet seen one and I think there are really reasons why we won’t ever see one that’s not massively over-broad or under-broad and just wrong.
Cindy Cohn: Filters don’t work, as we’ve seen most recently with Tumblr trying to get rid of adult content.
They over-get rid of.
Cindy Cohn: They over-get rid of and they under-get rid of. So it doesn’t work either way. Filters really cannot substitute for human judgment.
There was a story in the New York Times showing that some of the stuff they wanted to keep off they didn’t and the stuff they didn’t mean to keep off they did.
Cindy Cohn: Yeah, on BoingBoing, Cory did this hilarious kind of recursive thing where he did a story about the things that were supposed to be able to be on Tumblr based on Tumblr’s own blog post and then Tumblr took it down and then he wrote about it again and then Tumblr took it down. They had this little circle.
Cindy Cohn: Where he kept writing and this was all based on the stuff that Tumblr itself had told the world was okay.
Cindy Cohn: So filters don’t work. They have never worked. We’ve seen this all the way back from the breastfeeding mothers ...
Right, and the picture from Vietnam.
Cindy Cohn: The picture from Vietnam. So this idea that you can technology your way out of these kinds of problems, it never works.
Well, don’t you know AI is gonna save us from this?
Cindy Cohn: You see, AI is not gonna save us. And there’s a great piece in here by Virginia Eubanks, who has been tracking the problems with algorithmic decision making, especially their disproportionate effect on the poor. Once again, the poor and people of color have this stuff deployed against them before it comes to the rest of us.
Oh, it’s utterly gonna be about criminal activity. I just don’t, you know, everyone says it’s gonna be about health. It’s gonna be about this. It’s gonna be about ... It’s gonna be about watching everybody.
Cindy Cohn: It is about watching everybody. And when you build systems with this idea that you can technologically watch everybody and magically tell who’s good and who’s bad through any kind of technology, they’re gonna fail.
Which is why, frankly, I think you don’t have to be digital. The reason the idea that Congress shall make no law abridging freedom of speech is in our Constitution isn’t because everybody always thought all speech was great. It’s because of this fundamental problem: you give too much power to somebody when you give them the power to decide who gets to speak and who doesn’t get to speak.
Cindy Cohn: And adding a machine to that mix doesn’t make it better, it makes it worse.
Yeah. Yeah. Humans are pretty bad, though.
Cindy Cohn: Well, at least it’s humans, right? At least we’ve got this idea of judges and people who ...
Right, and we don’t question machines. We don’t question machines when we think they’re right and often they are. They often, with diagnostics and different things like the ...
Cindy Cohn: Computers can do lots of things.
Cindy Cohn: They cannot look into the heart of man and decide whether you’re good or bad, though.
Right. There was a great story in the Times actually also, besides the one we’re gonna talk about in a minute, about chess and how much better they’ve gotten at chess. There’s a new Google chess player that now understands the beauty of chess versus just the brutal ... Their old solutions were just brutal attacks and now it’s being subtle, which is fascinating.
Cindy Cohn: When you’ve got a system with a bunch of set rules and those rules are not widely interpretable and you’ve got a huge set of data that you can tell whether it’s right or it’s wrong, you can get a machine to automate in ways that seem really miraculous, but when you get beyond that relatively narrow scope of things where you’re trying to do things where you really need context, where you really need things like that, when you don’t have good data, right? If you use policing data to try to predict who’s a criminal, you’re not predicting who’s a criminal, you’re predicting who the police think is a criminal, which we all know is a very different set of things.
Cindy Cohn: So these problems permeate it and they’re gonna permeate any system. I’m getting back to the copyright ...
I have a formula for that. It’s called crap in, crap out.
Cindy Cohn: Totally. Totally.
So, Claire, what did you learn from this, from going through it? You were a neophyte, a novice, in this area. What did you take away from …?
Claire Boyle: Sure. Yeah, so the biggest thing that I learned from working on this issue with EFF: I went into it and encountered this attitude all around me as I was working on it, this idea of “I have nothing to hide, so what’s the problem with this kind of surveillance?” And this attitude is an incredibly privileged attitude to have.
Because you’re not under siege.
Claire Boyle: Right. You’re not under siege. You have nothing to hide. It’s a very white perspective. It’s an upper-middle-class perspective. And there are communities that we need to protect. We have our journalists. We have our activists. We have our muckrakers and we have communities that are disproportionately watched regardless and are vulnerable to this kind of surveillance.
Thinking about these issues as individual rights is, I think, the biggest problem from the get-go instead of thinking of this as a collective right. This privacy protection against surveillance is a collective right and we have to work all of us together to provide it for ...
Well, it also didn’t start that way, Cindy. I mean, it creeps in a lot of ways, and you can do these things. It’s this sort of boiling-the-frog kind of thing. You get used to it, and then you let Nest into your home, then you let this into your home, and it’s easier for Amazon to have a key to your house, and on and on and on.
And it’s met with ... What’s interesting, I was talking about this with an antitrust lawyer the other day and they’re like, “The problem is we can’t prove consumer harm because people like it.” You know what I mean? You have to change the idea of what antitrust is, for example, because it’s more and more convenient, it’s more and more stuff that you want. Can you talk a little bit about that? Because surveillance can be not sort of malevolent-seeming in the beginning.
Cindy Cohn: Yeah, I think that it’s one of the things that humans have a hard time with, right? They have a really hard time with a small thing that happens here that could lead to a big problem down the line, and they have a really hard time if you can’t see direct causation between one and the other. And surveillance is one of those issues where both of those things exist and so it’s hard for people to think.
It’s hard for people in our age and our generation right now. But people who lived through World War II — there’s not a Holocaust survivor who doesn’t understand the risk of having big databases, in the hands of anybody, that will allow you to be tracked. And certainly people around the world. When I go to Cambodia, when I go to any place where there’s been a massive human rights violation, almost all of them have been fed by the data that is collected on the people, because that’s how you figure out the people you want from the people you don’t, whether it’s Hutus and Tutsis or whatever.
Yup. I had an argument with one of the Google founders about this, and they’re like, “We’re nice people.” I go, “Today.” And I said, “Who knows who’s gonna run it? Your information in the hands of a bad person, I don’t wanna think about it.”
Cindy Cohn: Yeah, and even a good person who doesn’t understand the broader context can cause trouble, but yes. And you know, if you build it, they will come. This data doesn’t stay in the hands of angels. It certainly doesn’t all around the world and it’s one of the things that, for instance, that Apple said to the FBI around the opening of the iPhone, they said, “Well, even if we do it for you, what do we know when the next government comes walking …?” Now that’s not Apple holding the data, that’s Apple holding the keys that lets them into your data, but it’s the same problem.
It is the case, I think, that the tiny little cuts on people’s privacy have been visible, but they’ve all felt tiny. What you haven’t seen is this gigantic infrastructure back behind things that is collecting all that data, organizing that data, analyzing all that data, and increasingly running it through algorithms.
Which only a few companies have the power to do, right? You and I were talking about ... but the two companies I think would be Facebook and Google, or Google particularly, who have ...
Cindy Cohn: For sure in terms of the consumer collection of data. There’s a lot of AI being sold, though, to back-end companies. Your data doesn’t just stay put. I mean, Facebook, Amazon, they have a tremendous amount of data on all of us, and they’re pretty jealous of it. Facebook will say they don’t share it, but they de facto share all the information you would need to get there.
They say they don’t sell it. They share it.
Cindy Cohn: Yeah, and the way that they share it is also just a little disingenuous, but there’s a lot of companies that are scooping up a lot of this data.
Like Acxiom and the others, yeah.
Cindy Cohn: You know, Google has DoubleClick. So Google itself is collecting a bunch of data, but DoubleClick, which is the ad network, is collecting a huge amount of data and from lots and lots of places.
We have a tool that EFF built called Privacy Badger where we help protect people against third-party cookies and other trackers, and my team there does a lot of analysis of who’s doing the most tracking. Google is by far — head and shoulders by far — the biggest, because about 80 percent of what you do online is trackable by Google at this point.
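The first-party versus third-party distinction Cindy mentions can be sketched in a few lines. To be clear, this is only an illustration of the concept, not Privacy Badger’s actual logic (which tracks behavior across many sites and compares registrable domains using the Public Suffix List); the function name and URLs are made up:

```python
from urllib.parse import urlparse

def is_third_party(page_url: str, request_url: str) -> bool:
    """Treat a request as third-party when it goes to a host outside
    the site of the page that triggered it. (A simplification: real
    blockers compare registrable domains, not raw hostnames.)"""
    page_host = urlparse(page_url).hostname or ""
    req_host = urlparse(request_url).hostname or ""
    # Same host, or a subdomain of the page's host, counts as first-party.
    return req_host != page_host and not req_host.endswith("." + page_host)

print(is_third_party("https://example.com/article",
                     "https://ads.doubleclick.net/pixel"))   # True: third-party
print(is_third_party("https://example.com/article",
                     "https://static.example.com/app.js"))   # False: same site
```

A blocker then watches how often a given third-party host shows up across unrelated sites before deciding to block it, which is the behavioral part this sketch leaves out.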
Right. Through your maps now and everything else.
Cindy Cohn: Everything. So they are big, but there are a lot of other companies that have access to this data and do stuff with it. So it’s important to talk about Google, but it’s also important not to get too myopic, because even if Google didn’t exist anymore or wasn’t doing anything bad with the data, there are a lot more companies.
We have to think a little more, I think, holistically about what’s our approach to our data and what are the models we want to think about for our data that we want to apply to the big guys that we see, but also a lot of the big guys that we don’t see.
Where we don’t see. Now Claire, one of the things that people think about is sensors. Did you have stuff about that, sensors and surveillance like in China, actual cameras and things like that?
Claire Boyle: Yeah, we had a lot of people talking about this citizen... I forget what it’s called, the kind of citizen rating system that they’re starting to put in place.
Put in China.
Cindy Cohn: Social scoring.
Claire Boyle: Social scoring.
Social scoring, which was an episode of “Black Mirror,” but go ahead. Go ahead.
Claire Boyle: And is now happening in the real world.
Right. Why wouldn’t it?
Claire Boyle: Yeah, it’s bonkers. I really didn’t ... It sounds like science fiction. I actually didn’t believe it the first time I heard it. Which is essentially that everything you do, if you show up late to work, if you jaywalk, all of that is being watched, being recorded, and being folded into your score, and that affects whether you can buy movie tickets, whether you can get visas, whatever it may be. That will affect how you interact with the world. And it’s already being shown to change behavior dramatically. I think a lot of people touched on the fact that it’s not inconceivable that that could happen here.
Yeah. Is there a sense of why they’re allowing it in China? I think it’s gonna come here. I think there are already things happening here, with a lot of companies that do this in order to grant credit. Now, credit’s always been that thing they judge. There’s a whole problem with credit scores, of course. People are unfairly ... and so they’re like, “We’ll make it better. We’ll make it so we can look at your social feed, we can look at this.”
And it probably does make for a better judgment of whether you’re gonna default on a loan or whatever they’re trying to give you, but at the same time, it adds more and more data that may be misconstrued, of whether you’re ... to other things. There’s one issue of getting credit. There’s another issue of how you behave as a citizen.
Claire Boyle: Right, right. And it’s like, how far do you want to take that and once it becomes ... You know, you get deducted for ...
Claire Boyle: Jaywalking. Or criticizing the government.
I would totally be a bad citizen. I jaywalk all the time.
Claire Boyle: You know, speaking ill of the people in power, then that becomes really dangerous and frightening.
And who decides those things?
Claire Boyle: Right.
Cindy, what do you think of China right now in terms of ... Because one of my things I talk about a lot is that one of the ... Again, when I interviewed Mark Zuckerberg, he’s like, “Well you may not like what I’m doing, but what they’re doing in China is worse.”
So I called it the “Xi or me” argument. And I’m like, “I don’t like either of you.” Like, “I don’t like him more, but I definitely don’t like you either.” And it was a really interesting question. It’s like, should we let China run the next internet age? And this is a country that absolutely does surveillance probably better than any other country at this point.
Cindy Cohn: Yeah I mean I think that they’re ...
Cindy Cohn: There are real concerns here and they can point the way to some things that are especially troubling. I think our ability to control what happens in China is pretty limited. It’s important that we pay attention to it.
Cindy Cohn: But you know, for instance, one of the problems that they had in China with their facial recognition system was that, just like the facial recognition systems here, it does a really poor job of recognizing people of color.
What the government of China did, at least according to the news report I read, is they went to the government of Zimbabwe and bought all of the driver’s license and identity card data from the government of Zimbabwe so that they could train up their systems on black faces. So you know, one of the things that we hear a lot about artificial intelligence, and about machine learning decision-making as well, is that the answer to a problem with the system is just more data. But that oughta be really frightening to people.
Yes it is, and in fact, I had Kai-Fu Lee talking about that. He said China’s gonna prevail because it just has more data. One of the reasons Google might want to go to China — Google denies this, but I don’t believe them — is that they need more data, because the Chinese companies are beating them on that front.
Cindy Cohn: Well maybe, beating in some weird thing.
Cindy Cohn: I mean like this weird race to the bottom that everybody’s in.
Yes. But more data is all they want.
Cindy Cohn: Yeah and I just think that it’s tremendously troubling. I think we do have to start getting some handles around what are we doing with artificial intelligence. What’s okay to do with it? What’s not okay to do with it? What are the rules?
In California they did away with money bail and now they’re trying to use these risk assessments that are based on flawed police data and the answer isn’t more data, right? The answer is you just can’t use machine learning for certain kinds of decisions. For China, the other issue for I think Americans to take seriously is the Great Firewall of China was built by Cisco. It was built by an American company.
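The point about flawed police data, that a model trained on arrest records predicts police attention rather than crime, can be shown with a toy simulation. Everything here is invented for illustration: two neighborhoods with identical true offense rates, differing only in how heavily they are patrolled:

```python
import random

random.seed(0)

# Two neighborhoods with the SAME underlying offense rate; police
# patrol neighborhood "A" twice as heavily, so an offense there is
# twice as likely to turn into an arrest record.
TRUE_RATE = 0.05                      # identical everywhere (assumed)
DETECTION = {"A": 0.8, "B": 0.4}      # P(arrest | offense), set by patrols
POPULATION = 10_000

arrests = {}
for hood, p_detect in DETECTION.items():
    count = 0
    for _ in range(POPULATION):
        offended = random.random() < TRUE_RATE
        if offended and random.random() < p_detect:
            count += 1
    arrests[hood] = count

# A naive "risk score" trained on these records ranks A roughly twice
# as risky as B, even though the true offense rates are identical.
print(arrests)
```

Feeding more of the same data into the model only sharpens the measurement of where police look, not of where crime happens, which is the feedback loop the risk assessments inherit.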
It sure was.
Cindy Cohn: What is the responsibility of American companies for building repressive tools that other governments use?
Which is the debate around Google going in there.
Cindy Cohn: There’s another big debate around Project Dragonfly, which is a debate worth having. EFF is older than Google, and we were in the mix when they decided to go into China the first time and in the mix when they decided to come out. And now, at least temporarily, it looks like they’re pulling the plug. But companies and people here in America are building a lot of the technologies that are being used, or training them. We have to have a serious conversation about what it means to be repression’s little helper, and whether that’s what we think is okay.
Right. Although you know their argument is, we’ll make them better.
Cindy Cohn: Yeah.
And I’m like, “They’ll make you worse.” Right?
Cindy Cohn: Let’s just say, as people who are technologists and scientists, I believe in evidence-based decision making. There is no evidence whatsoever that engagement with the Chinese government has improved human rights.
Cindy Cohn: That was the story about most-favored nation status, that’s been the story over and over again. The data is clear that engagement does not bring better human rights.
Cindy Cohn: So if that’s your tool, you need a different tool.
Right. But we have no leverage. Yeah.
Cindy Cohn: Well, we might have leverage if we wanted to pull them, but I think that the idea that merely engaging with any country, but certainly China ...
Cindy Cohn: Would lead to better human rights has been thoroughly disproven.
Right. Claire, was there a favorite article that you liked?
Claire Boyle: Don’t you dare ask me that.
I just did. Is there a favorite topic?
Claire Boyle: They were all wonderful. So I think that one that I really dove into and was really interesting for me to work on was a piece called “The Digital Blues” by Jennifer Kabat.
Who is Jennifer...?
Claire Boyle: Jennifer Kabat is a writer, nonfiction essayist, who took this year-long quest to figure out ... She was just aggravated by this question of why everything in the digital space is blue, all the icons, all the links, all the everything is blue. Why is blue everywhere?
Why is it?
Claire Boyle: And it’s this really awesome piece that chronicles her being driven mad by this hunt, jumping from person to person trying to figure it out. She talks to people who study the eye and how it works, she talks to people who put news together, she talks to everyone across the board, including people who designed the internet from the get-go. And she came to this conclusion: everyone she talked to, person after person, came up with the same idea, that blue is trust and blue is safety and blue is health, and linked it back to these early attitudes from when the internet was just a little inkling of an idea.
Claire Boyle: Blue links. Apparently the first link was green, I think, but it quickly turned blue.
I was there, I don’t recall. It was blue for sure right at the beginning.
Claire Boyle: This was a very utopian idea at the beginning of the internet with all the rainbows and the globes and all these icons and that the blue is this holdover for this utopian idea of the internet, but it’s kind of been used in this nefarious way to mask what has changed and mask these big tech giants who are doing these ...
Right. Well, we’re gonna talk about that, where we came from, when we get back. We’re talking with Cindy Cohn of the Electronic Frontier Foundation and Claire Boyle, the managing editor of McSweeney’s Quarterly Concern. They just put out their first nonfiction issue of the literary journal. It’s about privacy and the loss of trust, called “The End of Trust.”
Claire was just talking about this idea of utopia, that this was gonna be a good thing. You and I were around, Cindy, when that was the idea. I started covering the internet in 1992, ’91, so pretty early, very early on, when it first went commercial, essentially, and that was the concept. That’s why I got it.
It’s like the idea of a democratic ... away from the big gatekeepers, away from the decision makers, largely white men who were running all the big newspapers, the networks, everything else, at least in this country and all around the world. It was sort of a freeing idea. Where are we now? Where do you imagine we are now? Because it seems the chickens are coming home to roost quite a bit, with all these issues.
Cindy Cohn: Well, you know, I think that the reality is that more people have access to a microphone and a voice that reaches everyone in the world.
Than ever before.
Cindy Cohn: Than ever before, so to a certain extent, the internet did succeed on one of the things that made it really exciting for us.
So everybody gets to speak.
Cindy Cohn: Everybody gets to speak. The physical distance between you and your loved ones now makes no difference in how close you can be to them. I think that that’s a piece of a miracle that the digital networks brought that sometimes we all take for granted, now that I can just pick up my phone and I can be talking to somebody on the whole other side of the world, and we can have a real conversation and I don’t have to wait for the mail or hope that it all goes through. These things are all still with us.
What’s happened is that we have slid away from the idea of a decentralized system to one that is increasingly centralized, and there are a lot of bad effects from that. I think the other early thing, what I call one of the original sins of the internet, is that we didn’t build a secure system. We built a tremendously insecure system, and so we get the data breaches, the data leaks, the kind of governmental surveillance that I’ve spent the last dozen years trying to stop.
And just the way it’s been built, the way it’s been built. I mean, I always say the Russians were customers of Facebook. They didn’t hack. They didn’t have to hack.
Cindy Cohn: No, they didn’t have to. I mean, the surveillance business model, which really wasn’t the first business model of the internet, it really came out in the 2000s, this idea that the way ... I always say, you know, Moses didn’t come down off of a mountain saying that surveillance advertising is the business model of the internet, it wasn’t even the first one, but this business model has resulted in a lot of things that are problematic for people right now. It’s resulted in these gigantic databases, it’s resulted in this gigantic gathering of data about us that’s now being mined and used. It’s resulted in the tremendous power of the tech companies.
There’s interesting research coming out now about who gets that advertising money: is it really the case that consumers benefit from the marginal difference in targeting, compared to the amount of extra money that advertisers are spending for that marginal bit, while the platforms just walk away with the money in the middle?
Cindy Cohn: My friend, Cory Doctorow, points out that the growth of the digital platforms really coincided with the abandonment of traditional antitrust principles in the United States, and this idea that the only thing that matters is whether consumers have a financial harm is not the original antitrust idea either. It came out of the late 1990s, and has become, I think, a real problem for talking seriously about the problems of the big platforms right now.
Well, there’s so many. We didn’t even get into free speech. You’re coming back to talk about free speech because that’s a ...
Cindy Cohn: I would be delighted to come back and talk ...
Because that’s another one.
Cindy Cohn: Yeah, I know.
It’s really problematic. There was a story in the New York Times about who ... I don’t think they can manage it. My feeling is that it’s unmanageable the way they built it, and it’s grown to such proportions that it’s unfixable by anybody.
Cindy Cohn: Well, I think that’s right, except I think one of the things we have to do is get away from the idea that there’s just one. We have to really start pushing on a decentralized web. There need to be multiple places for people to speak, there need to be multiple ways for people to do it, and the idea that one big company ... the New York Times today has a story that is very consistent with the work that we’ve done over the last 10 years, tracking how badly the big platforms screw ...
No, they’re incompetent and immoral at the same time.
Cindy Cohn: It’s a hard thing to do.
Not immoral, amoral. They’re amoral and incompetent, which is such a delightful combination.
Cindy Cohn: It’s a very hard thing to do. You know, the United States State Department puts out a report about the human rights status of all the countries in the world. They have a dedicated staff that is on the ground in countries all around the world, and they come out with their country reports every year. They get it wrong too, but that’s the ...
I think the government in Sweden does this too; governments do this kind of analysis of each other that is sophisticated and careful, and it takes them all year and a huge amount of time on the ground to understand what’s going on. Facebook’s trying to do it, frankly, with ... I don’t think they could ever have the resources to do it well, and even if they did, they’d be behind the times.
This is why we need a model other than censorship as a way to respond to the bad things that happen online. The censorship model will always fail, and it will fail in ways that hurt marginalized voices. One of the other pieces of the censorship thing, and I know we’re not talking about it right now, is that we have been tracking this for a very long time, and if you don’t pay attention to power and how power works between the big companies and the people who are speaking, then you’ll miss how censorship really works. Censorship always disempowers the people who are marginalized.
Cindy Cohn: That’s why you go to Twitter, and if you’re a movie star, you get satisfaction if somebody does something to you, and if you’re a nobody, you get kicked off the platform.
I have another expression, kiss up and kick down.
Cindy Cohn: Yep.
People who kiss up and kick down are my least-favorite people.
Cindy Cohn: But the model has to be ... just pounding on Facebook to do this better is not going to work.
They can’t. They can’t.
Cindy Cohn: It’s not going to work, and so we have to have other models.
Right. So Claire, when you look at this, was there a solutions-based thing to do here, because you don’t want to leave everybody like, “Ah, Jesus Christ, this is the worst.”
Claire Boyle: Sure, yeah, and I think that was something that was really important for us when we were putting this together is that it felt like we weren’t just ...
Not like, “Ay yi yi.”
Claire Boyle: Throwing a bunch of ... I talked to my mom the other day, and she just said, “Your quarterly is making me very anxious and unhappy.”
Cindy Cohn: Ooh.
Claire Boyle: But she hasn’t finished it yet, so maybe she’ll feel empowered by the end, but yeah, so we wanted to kind of weave in these solutions and this feeling that we can pull this back up together. So, Soraya Okuda does a really wonderful piece that’s about encryption, and it’s kind of a how-to with a bunch of different steps of things that you can do to protect yourself online, really basic easy things, but also kind of talking about the importance of encryption and the importance of ...
So not just tape on your computer over the camera. I have that.
Claire Boyle: But that was my first step.
Yeah. I think I have an EFF one that covers the camera.
Claire Boyle: It’s a very easy first step.
Cindy Cohn: Yeah, we have those camera covers.
Yeah, I know. I think I have yours.
Claire Boyle: I put one of those on my phone the other day, on my phone camera.
Oh, on the back, yeah.
Claire Boyle: And then the first phone conversation I had, I realized it was over the speaker too, so careful with that.
Yeah. There’s got to be a way you could do it so ... I think Apple should do it. Apple should do something to make it easy, so that you have one you can close, but right now you can’t. My thing is always, when I went to see Mark Zuckerberg, he had everything covered, and I’m like, “Okay, if he has everything covered …”
Claire Boyle: Right, totally.
“I really need to have everything covered.” So go ahead, so more solutions. This is to deal with encryption and do these things.
Claire Boyle: Mm-hmm, encryption and just generally keeping yourself protected online. Passwords, being careful what you plug in, using all the awesome tools like Privacy Badger and HTTPS Everywhere and these things that EFF has created.
Some of it is so hard to use, I even find.
Claire Boyle: These are very easy to use. You just click a download button.
No, they are, but I’m saying a lot of the stuff that you have is hard, the communication tools, although it’s gotten easier, like Signal and the others. Although WhatsApp is owned by Facebook, again, even though it’s encrypted, allegedly.
Cindy Cohn: Yeah, and WhatsApp still has really good encryption. The fact that the founders just left Facebook should keep us all keeping an eye on it, but they use the Signal protocol, which grew out of an earlier one called OTR, so they use a good protocol and it’s open source, and those are the things that you can look at. But yeah, it is still too hard. It’s been too hard since the ’90s.
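The property being described here, that only the endpoints hold the keys, can be illustrated with a toy cipher. To be clear, this is not the Signal protocol, which layers key agreement and the double ratchet on top of real ciphers; it is just a one-time-pad sketch showing that whoever relays the message but lacks the key sees only noise:

```python
import os

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # Toy one-time pad: XOR with a random key as long as the message.
    # Secure ONLY if the key is truly random and never reused.
    assert len(key) >= len(data)
    return bytes(b ^ k for b, k in zip(data, key))

message = b"meet me at the usual place"
key = os.urandom(len(message))           # held only by the two endpoints

ciphertext = xor_cipher(message, key)    # what the relaying server sees
recovered = xor_cipher(ciphertext, key)  # XOR with the same key undoes it

assert recovered == message
```

Real messaging apps never use a raw one-time pad, since distributing such keys is impractical; the point is only that the server carrying the ciphertext learns nothing without the key, which is why end-to-end designs like Signal’s matter.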
It’s too hard. The other day, I was struggling with, I forget which one it was. I was like, “Why am I struggling with this? I shouldn’t ...”
Cindy Cohn: Right. I mean, these companies have outsourced the idea of keeping yourself secure to you, and that’s why everywhere I go, people are like, “What can I do to keep myself secure?” And we have all sorts of things for them, but the truth is, you need to write your congressman and tell them that it’s time to start talking about making these companies information fiduciaries. It’s time to start paying attention to building secure systems. The police in this country should not be blocking people from having really strong encryption on their devices and on their systems.
Well, that’s good. We’re like five minutes away from one of those fights again.
Cindy Cohn: Yeah. They’re going to happen again, and so why is the government working against our security? They should be supporting our security. They should be investing in it.
But interesting in that fight, some people in the government, like Ash Carter was for Apple and James Comey was against Apple.
Cindy Cohn: Yeah, and it’s been 20 years in the making, but I’m very happy to have allies in the government now who understand the power of strong encryption, and former officials too. We have Michael Chertoff, a former DHS head, on our side. A lot of the intelligence agency people are ...
Well, they know.
Cindy Cohn: They know how important it is, and they’re not lying to people anymore about how important it is, which is great, but we need to have all hands on deck to try to build a more secure network, and we can do that. The technology exists. The technology isn’t the hard part. It’s getting it deployed and getting it easy for people to use.
Well, it’s more of an afterthought to them. I don’t even think it’s even malevolence. They don’t even think about it. So what else is hopeful?
Claire Boyle: Yeah, hopeful.
Hopeful. What did you feel like after editing this thing?
Claire Boyle: I definitely felt overwhelmed by all of this, awakening to all of it, and then I went on a tear where I was just trying to educate everyone around me, get everyone on Signal. I did a little work on that over the holidays with my family. But yeah, I think right now I’m just so much more aware of it all around me.
Once we started, right when we started editing this, I realized that it was everywhere, like everyone is talking about this. This is not going away, so I’m in an education phase right now. I’m just trying to read everything I can, and I think that’s the first step to this.
Did it make you think of going off a lot of these devices?
Claire Boyle: I think that there’s a lot of realism in this, like we’re not going back to flip phones. I’ve got to have my Google Maps. I think that it was really important to us to have that realism that we are moving forward. I’ve deleted my Facebook. That’s a start.
Mm-hmm. Anything else?
Claire Boyle: What have I changed? I added all of the Privacy Badger and all that stuff. Yeah, I’m learning.
Yeah. How did you feel after it? Well, you were already ensconced in this.
Cindy Cohn: I mean, I think that the thing that was most important to me, and I talk about it in the foreword, is that we don’t just leave people in the valley of despair, and that we talk about the way out, and I think ...
That’s tech nihilism.
Cindy Cohn: Yeah, and I think that that’s really important, and I think that we can do it. I mean, we have grabbed ahold of ourselves as a country and we’ve solved harder problems than digital security.
Cindy Cohn: But we have to. We have to think of it that way. All the individual things that people can do are really important, but if the prevailing idea is that it’s your responsibility alone ...
Yeah, defend your house.
Cindy Cohn: Yeah, you know.
It is. It’s like that, defend your house.
Cindy Cohn: It feels a little like that.
Milk your cow.
Cindy Cohn: Or here’s your car. Why don’t you go out and research brakes and then you can have them.
That’s a very good analogy.
Cindy Cohn: We don’t do that as a society, and we shouldn’t do it here. That’s going to require political action, it’s going to require legal action, it’s going to require regular old activism. In the work that Soraya and others of my colleagues do, teaching encryption techniques to people who need them, we talk about privacy as a team sport.
Right, and so I want to finish up, Cindy. Talk about where we are politically right now with the Democratic Congress, once they open the government again, if they open the government again. What do you see happening from a regulation standpoint? Over in Europe, they’re overreaching in a lot of ways.
At the same time, privacy is critical, it’s a much bigger deal there, so part of the things they’re doing there I assume you support, part of them you don’t. California has passed a privacy bill which some people think was enough, but it’s certainly the furthest along of any of them, and there’s no national privacy bill. Where do you see it going, very briefly? Where do you imagine we’re going to go in the next year?
Cindy Cohn: Well, I have some dreams. I’m a really poor predictor of the future. I do think we’re going to start to see the GDPR in Europe begin to have real effects, and the first couple of cases that the regulators pick to take are going to be really important, and we’ve been in communication with them about some of the things coming down the pike. Those are going to be interesting.
We’ve got to stop this copyright directive in Europe. That’s a disaster. It’s a filter machine. Here in the US, our California privacy bill needs a lot of work, and I think it’s going to spur a very big push for a federal privacy bill, and the thing to watch for is whether it includes preemption or not.
Cindy Cohn: Preemption is when a federal bill comes in and it makes all the state bills not valid anymore, and there’s a lot of efforts by the big platforms to try to use a federal bill to squash any stronger privacy protections that we might get from state law. A federal privacy approach is a really good idea, but not if it squelches the stronger state ones.
And what would ... so in California, it’s a flawed bill, but it’s the first massive, big privacy bill. How could that change here under ...
Cindy Cohn: Well, here I think there are a lot of efforts to try to dumb it down, and then there are our efforts, and the ACLU’s, to try to bring it back up again. The biggest thing for us is the private right of action. That is, you can have all the privacy rules on the books you want, but if nobody’s actually empowered to protect themselves, they’re going to end up being paper, or only as good as the California attorney general wants them to be, which can be good, but it could be really bad.
We really think that people should be empowered to protect their own privacy, and that includes taking somebody to court when they’ve violated your privacy, and that’s critical to us to get added to this bill. It was something that was in an earlier version that got stripped out in the end.
Mm-hmm, all those lawsuits...
Cindy Cohn: Yeah. Well, I mean, a few good lawsuits will change a lot of behavior.
I agree with you, but that’s what the big companies didn’t want ...
Cindy Cohn: Well, that’s why they didn’t, but that’s why we need to have it, right? I think it’s pretty easy to track what would be best for you right now by looking at some of the things that have the big companies worried because they’re the big privacy violators. Not all of them. Apple’s been pretty good on this. I don’t want to tar across the board.
Right. Neither do they.
Cindy Cohn: Yeah.
They’ve been trying to point out the differences quite a bit.
Cindy Cohn: Well, you know, look ...
Cindy Cohn: I like a race to the top. If you think it’s in your business interest to protect ...
Well, it’s funny. The Facebook move is like, “Oh he’s doing it because of his business.” I’m like, “Yeah, good. I don’t care.”
Cindy Cohn: Yeah. No, when competition works for you, that’s the best kind of competition, right?
Cindy Cohn: You as the user. So we’ll see that. There’s an idea floating in Congress right now, and there’s a bill that Senator Schatz just introduced, to try to change the people who hold our data into data fiduciaries. This is like your lawyer: “fiduciary” is a legal word for somebody who has a special duty of loyalty to you, different kinds of duties. And I think shifting the people who hold all our data from companies who can do whatever they want, as long as they get us to click on something, to companies who have an independent duty to us, to be loyal to us and not do things that are against our interest, shifting that role to something more like your accountant or your lawyer, can really put a very different frame on our relationship with these companies.
And what about Section 230 of the Communications Decency Act?
Cindy Cohn: Well, I’m a big fan of Section 230, and I was unhappy with the SESTA-FOSTA Bill which was a law that got passed last year that’s chip ...
It’s chipping away at it. Some people think they will act better if they don’t have the immunity they’ve had.
Cindy Cohn: Well, it depends on who the “they” is, but I think that right now what we’re seeing is Tumblr getting rid of all adult content, under the pressure that they might be responsible for it.
This is a law that gives broad immunity to platforms, just for people who don’t know.
Cindy Cohn: Yeah, CDA Section 230 says that if you host somebody’s speech, you’re not civilly responsible for what they say. You’re still criminally responsible, but you’re not civilly responsible, and there’s some state law and federal law stuff that gets into the lawyer weeds. And there are a lot of people who are mad at the big platforms who want to chip away at this.
Cindy Cohn: And the thing that we are very worried about is that what they don’t see is that it’s the users who suffer. If the platforms are worried about liability ...
They’ll become more censorious.
Cindy Cohn: They become more censorious, they limit what you can do, and again, Tumblr has just decided that ...
Again, I think it ...
Cindy Cohn: Female-presenting nipples are a problem, like this is the kind of stuff you get.
Well, they are, really, Cindy. Female-presenting nipples have been a problem since the beginning of time. Sorry.
Cindy Cohn: It takes me all the way back. One of the first things I did at EFF was try to help these breastfeeding mothers who were sharing pictures of latching.
Those malevolent breastfeeding mothers.
Cindy Cohn: And getting kicked off of ... and here we are. It’s 20 years later and we’re having the same stupid fight.
Nobody likes boobs, Cindy. It’s a big issue.
Cindy Cohn: Ah. Such a problem.
So just lastly, Claire, what would be your next issue? End of trust? Is it going to be First Amendment? Are you going to do another one of these?
Claire Boyle: Yeah, right, while you guys were talking about that speculative science fiction, I think we found our next issue.
Okay, all right, but what ...
Claire Boyle: We’re sticking with fiction for the next couple of issues.
Fiction issues, but I think you should bring up the First Amendment. I think it’s the biggest ... how we speak to each other on these platforms.
Claire Boyle: Yeah. We’ll talk, Cindy.
They become amplified. I’ve written that they’ve become weaponized in a way that we never understood, and what that does is pull away from the real debates, which are ... I mean, it really creates a problem for everybody, especially when these companies are in charge of deciding what speech is.
Cindy Cohn: I think that’s right, and again, there’s a whole lot of people out there who are trying to get the rest of us to decide that free speech is a bad thing, and we should be really clear-eyed about it. This is an autoimmune disorder: a lot of these people really don’t like speech themselves, and none of them care about speech that they disagree with. They’re just trying to use these tools to get the rest of us to decide that free speech is a bad thing, and if we fall for it, we’re going to lose something much bigger. This is a concern that I and lots of other people have been raising.
Well, the big debate of the next year I think, is going to be fascinating. Thank you so much. We’ve had Cindy Cohn from the EFF and Claire Boyle from McSweeney’s. She’s the managing editor. They are talking about a new issue of McSweeney’s called “The End of Trust.” I urge you to read it. Where can you get it, Claire?
Claire Boyle: We’re actually going to a second printing of it, so you can buy it on our store site, and we also have a downloadable version on EFF’s website.
Great. Okay, fantastic. It was great talking to you, and you’re coming back on to talk about First Amendment, Cindy. That’s next.
Cindy Cohn: Oh, I’d be delighted.
We’re going to have a whole long show about it. Thank you two for coming on the show, and thanks to you all for listening.
This article originally appeared on Recode.net.