No one was willing to fess up when the Guardian journalist Carole Cadwalladr first reported that Cambridge Analytica had weaponized Facebook for political ends — so she went looking for ex-employees. She found Christopher Wylie.
“Basically, it was his idea to get this Facebook data,” Cadwalladr recalled on the latest episode of Recode Decode with Kara Swisher. “... He was this sort of secret weapon because having a person who could speak to it personally about what he had seen and witnessed and somebody who was so articulate and could frame what this meant and the dangers of it, I could see how that would just catapult this story into a different level of attention and so that was why I essentially spent the entire next year on [Cambridge Analytica].”
Two years later, Facebook is still dealing with the fallout of the scandal, and the broader tech industry is mired in criticism and backlash from both consumers and employees. And on the new podcast, Cadwalladr wondered how many other Christopher Wylies are out there.
“There are people all across Silicon Valley who know stuff who are not speaking about it,” she said. “It’s like, ‘Are you really okay with this stuff? Are you really okay with your company’s leadership? Are you really sure that they are doing everything that they can? Because you should be troubled and you are part of it. You are part of this. You can’t pretend that you don’t know anymore.’
“I think ‘techno-fascism’ is one of the ways that I kind of started thinking about it, really, which is that this technology favors populist authoritarians,” Cadwalladr added. “And that is what we’re seeing all across the world. And every day we see more of it, and we see the way that they’re communicating with each other, and they are growing in strength and numbers, and they are being facilitated by these technology companies, and you, employees, are part of that.”
Below, we’ve shared a lightly edited full transcript of Kara’s conversation with Carole.
Kara Swisher: Hi, I’m Kara Swisher, editor-at-large of Recode. You may know me as a self-appointed chief justice of Facebook’s new Supreme Court, but in my spare time I talk tech and you’re listening to Recode Decode from the Vox Media Podcast Network.
Today in the red chair is someone I’m very excited to have here. Carole Cadwalladr, the reporter and writer for the Guardian [and] the Observer, who really blew the lid off Facebook’s Cambridge Analytica scandal last year. Earlier this year, she also gave an amazing TED talk about Silicon Valley and fake news, and she says that she has recently been sued by one of the “bad boys of Brexit.” We’ll talk about that and more. Carole, welcome to Recode Decode.
Carole Cadwalladr: Hello. Thank you so much for inviting me, Kara. It’s very exciting.
It’s no problem. We are in London, England. Correct?
We are. We’re in London, England.
That’s where I’m doing this. I had to come all the way here to get you on my podcast. I’m actually here for a few days and doing some different podcasts with people here and also one of my Pivot episodes, but I really wanted to talk to you. You’re the one person I really wanted to talk to.
Where should we start? I don’t quite know where to start. Why don’t we talk about your background? Let’s talk about how you got to where you got and why you started writing about Facebook. Then we’ll get into Cambridge Analytica and where it’s going, and your TED talk, which was this worldwide sensation, I think. It’s stuff we’ve been talking about in Silicon Valley a long time, but it really set off a lot of talking.
I want to talk about your time here because, as I understand it, people are trying to say you’re a conspiracy theorist and, as it turns out, your conspiracies are true, so you’re not a theorist in many ways. But let’s start with that. Talk about how you got to where you got and why you started writing about tech.
I actually started writing about tech a long time ago. A year ago, I used to say, “Well, I was just a feature writer and I stumbled across this story,” but actually, the genesis goes back a long time in that I went to a TED Conference in 2005, I think it was, when it was still a secret conference for millionaires. I heard talk after talk about how technology was going to save the world and the crowd was going to come together and we were going to cure cancer and it was going to be amazing.
Yeah. That one’s been going on for a while.
Yeah. I was kind of exposed to it then, and I thought, “This is ...”
What did you think when you were sitting there, at these ... I’ve been to zillions of these talks, it’s crazy. They went on and on and on. It was all so self-congratulatory, like, “Aren’t we the ...”
But it was genuinely illuminating for me at the time.
Right. Because why? You’re coming from somewhere else, so you’re not steeped in it.
Exactly. Exactly. So, I heard ... Wikipedia was still fresh and new, it felt at the time. Jimmy Wales stood up there and explained how this magical thing operated, beyond something which just hadn’t been seen before and didn’t seem like it would be possible. And so it really did just sort of blow my mind. I just started getting interested in it and starting to write ...
It blew your mind positively or negatively?
Oh, very positively. I was very squarely in the tech utopian camp.
Yes, yes, yes!
I was utterly tech utopian.
Better living through apps!
Yeah. It was just going to be amazing, this bright new future. So I started covering the topic, but in the very much ... I’m a feature writer, I write for a lay audience, I’ve never been inside the technology section. It wasn’t for people who were interested in tech, it was about the ...
... the cultural and the political implications of this kind of stuff. And at the same time, I carried on with my normal job which was, I wrote across the newspaper and interviews and reportage and op-eds and things. But we’ve all been on that journey of seeing the downsides of this kind of technology and getting concerned about the monopolies of it. And so I also started covering that. For example, I went undercover in an Amazon warehouse in Wales for a week. Maybe six, seven years ago now?
You wanted to see what conditions were like? There had been some reporting on conditions?
Yeah, it was just fascinating and sort of barbaric, really ... Not barbaric, but brutal. We were working 12-, 13-, 14-hour days, walking 16 miles a shift. And these people, this was really jobs of last resort and they didn’t ... This was in South Wales, which is very much a crucible of the labor movement. But unions weren’t inside the warehouse. Nobody really knew what was going on inside. This thing of walking half a mile to your break to have five minutes to sit down, to walk back to your workstation. And these were really jobs of last resort. That was just sort of one aspect of it. I went out to Silicon Valley several times to interview people, different stories. I’d had the experience of Google being very cross with me for ... I did an interview with Ray Kurzweil they were very cross about.
Oh, Ray, yeah. Explain who Ray is so that people who are ...
Ray Kurzweil is the ...
He’s never going to die.
He’s never going to die. He’s going to make certain.
He takes a lot of vitamins.
He’s a futurist.
Do you know what I said to him once? I go, “No matter what ... “ There’s a line from Moonstruck that Olympia Dukakis says to her husband when he was cheating on her. She goes, “No matter what you do, Victor, you’re going to die.” It was great.
Not Ray. He’s discovered the secret.
He works for Google.
He now works ... I got an interview with him before he started working for Google and this is what they got very upset about, because by the time the article came out, he was working for them. And there he was saying that machine intelligence is going to overtake human intelligence by the year, I think it’s 2028?
Anyway, actually, one of the things which was very pivotal in how I started thinking about this particular issue of technology and democracy was I went to a TechCrunch Disrupt conference in San Francisco.
Oh, man, Carole.
And just met these thousands of entrepreneurs who are all out to disrupt some industry or another. I remember meeting somebody who wants to disrupt socks. I never quite understood that. But seeing how we’ve ... I’ve experienced myself the way that journalism has been disrupted. Technology destroyed the business model for newspapers. We saw how it disrupted the music industry. It was just a few weeks before the US presidential election that I was noticing this constellation of different news stories and thinking, “Well, you can’t really have an election like you used to.” Hillary Clinton’s emails had been leaked.
Right. Quite strategically.
Well, now we now know incredibly strategically, and there were the first reports coming out of Macedonia about these teenagers writing fake news articles for profit. And so I just thought, “Well, this is technology disrupting politics.” And I thought, “Well, somebody must’ve written that. Somebody must have written this piece.” I Googled it and I was like, “Oh, nobody has written this piece.” So I just did it as a sort of short op-ed comment piece. And then we had the US presidential election and Trump was elected. There was that moment of shock and the first suggestions about the use of technology and the platforms and Mark Zuckerberg saying it was ludicrous, the idea that Facebook had played any role.
No. Zero. “Zero,” is I think the word he ...
He doesn’t know how to use the word ludicrous. But go ahead.
My editor said, “Do a long feature piece on ... Let’s do a thing on this phenomenon, on fake news or whatever.” That was two weeks after the presidential election, November 2016.
Yeah, and this was the time ... What Carole is referring to is, Mark was interviewed at an event and he said there was no, there is zero chance there was any influence by ... Zero chance.
And then they walked that back to 1.1 percent, or 1 percent, or 0.11 percent.
Whatever the lobbyist is saying at the time. Anyway, basically, ever since that day I have literally been ...
What got you to do that? Because a lot of people didn’t. The Americans certainly weren’t. And they were sort of following along. They just accepted the emails. I’d been writing about the social impact of technology on people’s partisanship and stuff like that. But what ... Is it being an outsider or what do you think? It was just dead obvious or because of Brexit here?
It was part ... I think as well, I’d also covered … one of the stories I had done a big feature on was Anonymous and LulzSec and that whole world. There was one story in particular, it just kind of captured my attention, although it turned out to be a red herring in some senses. Just days before the US presidential election, there was a DDoS attack on a nation state, on a country. Hackers took down the entire internet for ... I think it was Sierra Leone?
I can’t remember now. And I just thought, “Isn’t that kind of ... That’s astonishing, isn’t it?” There was a suggestion that this was a trial run so that you could actually try and do this to a state in America, for example, or the idea of trying to do this to an entire country. It was just the way that sort of technology sits so much at the heart of these democratic systems. As I say, it was a constellation of different stories. Just this, having just written about the whole philosophy around the idea of disruption. There’s this professor of business, I think Clay Christensen, and he called it the “mudslide hypothesis,” which is a kind of small change, a small technological advantage ...
Yeah. Can become this tsunami and has overturned established industries.
I think his example was IBM. So it was really the idea of ... Well, actually we’ve seen it happen. We’ve seen it happen across all these industries. It really does look like politics is next.
Is now. Right.
That is not politics ...
It’s the entire ...
It is democracy.
Talk about getting to Cambridge Analytica, because that really shifted things. Facebook for the most part have been saying, “Well, maybe there was a little influence. Maybe there was some Russian purchasing,” but they really weren’t discussing it. I remember bothering them at the time and there wasn’t a lot of idea that there might have been a problem on the platform.
And I had been at that 2008 event where he talked about third-party information. I remember that very vividly and thinking, “Is that a good thing for this idiot company to be getting the information off of Facebook?” Facebook wasn’t very big at the time. It wasn’t as big. It was big, but it wasn’t 2.6 billion people. Talk about that. How did you move to that?
The focus of the first big piece I did, which was just a couple of weeks after the election, was Google, actually. Every talk I do, I always talk about it because it just still ...
Oh, Google is just a big important part of it.
But it was this thing about this ... When I started looking at the topic and I got really ... I just started going, “So how does Google Search actually work?” And so this playing around with it and putting in terms. And this was this thing, I put in, “Jews,” into the search bar and you know with Google you make it into a question. So I put, “Are Jews.” The suggestion I got was, “Are Jews evil?” That was the top one.
And you don’t even have to press return.
No. It fills you in. Yeah.
It just fills your results.
And the entire page of results ... I was like, “I didn’t ask that question. Google has suggested a question and now it’s answering it.” Every single one of those answers was, “Yes, Jews are evil.” And then at the bottom it said, suggestions, “What do you want to search for next?” And the suggestion was, “Did the Holocaust happen?”
So I was like, “Well, okay, let’s click that link.” And I clicked that link. Every single result was saying, “No, the Holocaust didn’t happen.” The top result went to Stormfront, which is a Nazi website.
So it was this kind of like ... Trump had just been elected. It was this dark and stormy November night. I was like, “What on Earth am I looking at here?” And I was like, “Is it just me?” I was trying on different browsers, I was trying different search terms. I got exactly the same thing with a whole range of searching, including women. When I did, “women,” and I put, “Are women,” again I got, “Are women evil?” And there’s this thing when Google is really certain of the answer, like 100 percent certain, it puts the answer in a box. And so for, “Are women evil?” it puts the answer in a box and it said, “Yes, women are evil because every woman contains a little bit of prostitute inside them.”
That was from where? Where was the link?
It was from some crazy site.
Crazy site. That was the one they suggested.
Anyhow, I was screenshotting things and thinking, “What on Earth am I seeing?” Then my really lucky break was a day later, it was only a day later, I stumbled across this academic who was then at a small school in the United States, Jonathan Albright. He had just started mapping, trying to map this sort of ecosystem of these far-right websites and linking. What he was discovering was he took a list, a list had been published of these websites, which were publishing fake news articles. He used a tool which looked at all the links going in and out and then he mapped them.
What he saw — and he’d just done that when I got him on the telephone — he was freaked out. I was freaked out. He was freaked out and he was saying, “It’s like a cancer. You can see it strangling all the mainstream sources of news and information.” And it sort of worked out, there’s something systemic at work here whereby it’s eclipsing what should be coming up.
Because they’re using it well.
Because they’re using it brilliantly.
They’re using the tools as they were built, as they were architected. I always say there’s an architecture you can ... In the beginning, Google was more context, accuracy, and speed. And then it turned into virality, speed, and engagement. Especially Facebook, same thing. When you change the parameters of the architecture, you get different things.
I think a lot of right-wing entities, which had been sort of zeroed out of mainstream media, sort of sidelined, found ... Everyone was like, “Oh, dictators don’t like the internet.” I’m like, “No, they like it. Of course they like it. It’s a huge opportunity for fringe groups. It’s a huge opportunity for different views because they become on equal footing.” Which I think ...
That’s right. That’s right. And I think one of the first things, again, it was at TED actually when I heard Evgeny Morozov talk about the way that Lukashenko in Belarus had discovered the internet was this great friend. So it was brilliant.
Erdogan. Everywhere, Turkey, everywhere.
You got interested in the how this was built, which most Silicon Valley people say, “Hey, it’s just a benign platform.” That was their excuse. “Hey, we’re just putting ... This is what people are searching for. We’re just giving ...”
Yeah. And there was this idea that Google would try to ... in their responses to me they refused to acknowledge. First of all, they refused to acknowledge there was anything wrong here, that there was any problem at all. And then they were like, “Well, it just reflects what people are searching for.”
Right. Yep. Paths are made by walking.
Which was insane. They’re supposed to be organizing the world’s information and the idea of delivering quality results. How could that be? I thought, “How long have these results been out there? Who is out ...” I always thought of this sort of teenage kid in their bedroom, just interested in who the Nazis are. You did like, “Is Hitler evil?” and you’ve got a whole page of results saying, “No, Hitler was actually a really good guy.”
Anyway, what was so interesting, that first story, that first response from Google, this has been what I’ve spent the last two-and-a-half years dealing with, which was this total lack of accountability from the platforms, this denial of responsibility and then the counterattack. The way they went actually on the attack against us.
Normally what happened is, this is what I did, I wrote these longform features, I got into a topic, I covered it, and then I moved on. When I published this piece, I was like, “Surely, the world is going to say, ‘This is outrageous. We must do something. We’ve got to fix this. The net has been poisoned. What is ... There’s this sort of shadow internet almost out there.’” And that didn’t happen. And Google, just by refusing to engage, they started hand-changing some of the results. So I thought, “Okay, well I’ve just got to keep writing about this.” So I just kept for the next five or six weeks, I just carried on writing about it.
I did things that — this made them really incensed — I took out an advert. I said, “Ah-ha. How do I change the results to say that yes, the Holocaust did happen?” So I thought, “I know. I’ll take out a Google ad.” So I took out a Google ad to get that to the top of the search results. That made them very furious.
Anyway, and then Christmas Eve. Christmas Day is the one day of the year we don’t publish. And that was the day Google sent in this massive legal complaint to the Guardian. So I was dealing with that with our lawyer, our poor, poor put-upon head of legal at the Guardian, Gill Phillips, who has been very instrumental in this whole story. I do find it chilling.
And they were alleging ...
This thing of trying to shut the story down. Trying to shut the story down by going after the reporter, by going after the newspaper.
To say ...
To say they were claiming problems with my reporting and claiming something was inaccurate. They seize upon a tiny detail to distract from ...
The bigger story.
This, you know ...
But you persisted.
Yeah, we persisted. But then what was funny was that I actually got waylaid because that was when Cambridge Analytica got on my case.
And I was desperate. I’ve always wanted to go back to Google and I was like, “YouTube’s such a sewer!” I’m desperate to go and do something on them. But in the meantime, I started getting these crazy letters from this company, Cambridge Analytica. They’d had one mention in the first piece. I had said, “They work for the Trump campaign and they worked in Brexit,” because that’s what their website said and that’s what the articles had said.
And they started writing to me and saying, “No, no, no, no, no, no. That’s not true. We never worked in Brexit.” So we started writing back to them and saying, “Well, here’s where your CEO said that you worked for the Leave campaign and here’s where the Leave campaign said that you hired them.” And they would be like, “Yes, yes, yes, but that’s not true. And you need to take that out and correct that.” And so this went on about three times and I was like, “What on Earth is going on here?”
Actually, then what happened is our readers’ editor got in touch. He was sort of like, “Well, let’s actually just find out what happened.” He got in touch with this guy called Andy Wigmore who worked on the Leave campaign, worked on Nigel Farage’s Leave campaign, and asked him and he said, “Yes, we did use Cambridge Analytica. We just didn’t pay them.” I was like, “Oh, that’s kind of interesting because that’s kind of like a gift, isn’t it?” And I was like, “Don’t gifts have to be declared?”
I trotted off to go for a coffee with Andy and that conversation became the basis for my first big piece on Cambridge Analytica. Cambridge Analytica, and also Robert Mercer and Steve Bannon was this very ... It was just so interesting. Again, it was before there’d been any, it was before Jane Mayer had written her big piece on Robert Mercer for the New Yorker.
Robert Mercer’s a wealthy donor to Trump.
Robert Mercer’s this key character. He’s this hedge fund billionaire and he had been the biggest donor to Trump and he had funded Breitbart.
Right. And his daughter.
Which is the far-right news network of which Steve Bannon was the editor-in-chief. There were various things that he funded. One of the other things he’d funded had been Cambridge Analytica. And Cambridge Analytica had been very instrumental in the Trump campaign. Here were these guys saying, “Yes,” and, “We used them in Brexit.”
So there was a nexus of people.
There was this nexus and he said that, “Well, of course they wanted to help us because Brexit was the petri dish for Trump.” And that was very explicit and it’s because, of course, we know we’re the same family. We’re using the same techniques. Steve Bannon and Nigel Farage are good friends. Trump and Brexit are all part of the same thing. Anyway, that was the first big story I did on Cambridge Analytica. And even then, it was out there, this information about how Cambridge Analytica had somehow got hold of all this Facebook data.
When I looked at the cuts, there was this piece in the Guardian, in December 2015, by a journalist called Harry Davies. That was his scoop. He’d found that out. But, at the time, Cambridge Analytica wasn’t working for Trump. It was working for Ted Cruz. Facebook just did what the tech companies do, which is it just denied it and refused to comment. And that was it.
So you started to see the links between the Facebook data and Cambridge Analytica and the Russians.
Yes, yes. And this concerted right-wing attempt to disrupt the mainstream media. That was what was so fascinating about it. Bannon and Mercer had these various different strategies, which was all about disrupting the mainstream media system and worked. And there was another one which was they also funded this thing, the Government Accountability Institute. That, for example, with that they did really deep research into Hillary Clinton and then they fed those stories into the New York Times, amongst other places. It was ...
So here you are with these pieces, which we’re going to get back to. You have these pieces of different things that they were doing that you were slowly working on. But I think what you had developed was a distrust of these companies ...
... in terms of what they were saying, because they were like, “We are just benign. We just have information. Anybody can use it.” I think that’s pretty much their ...
Yeah, I think so. Also it was just this thing, which was in particular, which was that nothing had any traction anymore. So, it was publishing this stuff about Cambridge Analytica on Facebook and everybody was like, “This is terrible!” And then the next outrage hit the news cycle.
Right. Which is the point. It’s meant to keep you exhausted, just so you don’t ...
Which is the point. And this was what I’m kind of like, “Well, I’ve got to keep going. I keep needing to do the story again in a different way.” And really, and that was when ... The break I had was when I found Christopher Wylie.
Basically I was just kind of just getting these denials. Cambridge Analytica was just denying stuff. Facebook was just denying stuff. The Leave campaign was just denying stuff. But what had happened, the first article I did on Cambridge Analytica, I mean, it was this great success in some ways. It kicked off two big, official investigations in Britain. It kicked off one investigation by the Electoral Commission into whether they had declared all the campaign spending.
That they had done for Brexit.
That investigation is now with the Metropolitan Police. We’re still waiting on ... The Electoral Commission said, “No, they didn’t,” and there’s now a police investigation into that. And then the other investigation which was kicked off was by our information commissioner’s office. That investigation became this massive inquiry into data and politics, which is now the biggest data investigation of all time. They’ve had 70 full-time people working on it. It’s the one which has now fined Facebook the maximum amount, etc. So it precipitated that. But at the same time there is just so much more out there.
Anyway, so that was my thought, right? I’ve got to find some ex-employees. I need to find somebody to talk to me. And so there was this kind of laborious process of approaching people and being rebuffed, blah, blah, and eventually I found somebody and he said ... Well, as soon as I started talking about Facebook data, he said, “Well, you need to find Christopher Wylie.” I was like, “Who’s Christopher Wylie?” He said, “Well, he’s this Canadian guy and he’s the guy who got the Facebook data,” and then I looked on the internet. Chris, very smart, nothing to connect him to Cambridge Analytica or the parent company, SCL. But I tracked him down and he’d been sort of waiting, I think.
For somebody to find him.
Somebody to find him, and he was sort of surprised that nobody had found him until then and it kicked off this ... The first telephone conversation I had with him, he was in Canada at the time, it was about seven hours long. I mean, it was absolutely mind-blowing.
He was one of the partners there, correct?
He had been the research director of ...
Okay, he was a high-ranking exec.
Yeah, yeah, that’s right, he was. And basically, it was his idea to get this Facebook data and to do this personality testing.
And to misuse the data, which is what Facebook said is that they took data and misused it and Facebook had no idea they would not follow the Facebook rules. That was Facebook’s excuse.
Yes. But we can now go stronger than that because what they did was … the information commissioner in Britain has made the ruling that what Facebook did was illegal, in that it allowed Cambridge Analytica to break the law.
I mean, it broke the law. It’s not just data abuse. It’s illegal data abuse. I think that is a kind of important distinction and Chris is this amazing character and it was all very well to have somebody saying, “Oh yes, X, Y, and Z.” But I was like, “Well, can you prove any of this?” He was like, “Well, yeah, I’ve got the receipts.” He literally had the receipts for the Facebook data and he had the contracts. He had the founding contracts for Cambridge Analytica, the company, and then we started going through stuff. More and more things came out. He went back and looked at emails and then found emails of Aleksandr Kogan, who was the psychologist at Cambridge University who’d harvested the data, talking about ...
Which Cambridge Analytica used.
Yes, sorry. Yes. Talking about his trips to Russia at the time. And then Chris pulled out this pitch and discovered that he’d done this research, which Cambridge Analytica were pitching to Lukoil. So they were pitching …
… how to target American voters to the biggest Russian oil company. I mean, just stuff didn’t make any sense. It was just super weird.
Anyway, the next article was in May 2017. It was called “The Great Brexit Robbery,” and it was really about this scheme of links between Brexit and Trump and Russia and how you could see the connections between the individuals and these companies and the data and the money. And at that point, we thought, well Chris was kind of interested in ... We’d already started talking about him coming forward as a named whistleblower. But he had to break a nondisclosure agreement. So it was difficult and it was legally complicated and immediately I published that article, then Cambridge Analytica started threatening the Guardian.
They were trying to sue us for defamation, for special damages, to sue us in Britain and in America. And this was … it’s owned by Robert Mercer, who has the deepest pockets, and it was really serious and really quite scary, almost at an existential level.
The thing I think we were all chilled by was the way that Peter Thiel took down Gawker. You have an ideologically motivated billionaire who took down a media organization, and here I was writing this story about an ideologically motivated billionaire who backed Trump. The vice president of the company was Steve Bannon.
Working in the White House.
At the time, he was in the White House. It was really honestly kind of chilling and quite scary thinking about what you were up against at the time, and now so much of this has been normalized and Steve Bannon’s not in the ... But at the time ...
He’s running all over Europe now, making trouble.
But we were looking at how Cambridge Analytica had just won contracts with the State Department. It had been reported that it was going to get a contract with the Pentagon. So it had all of this information on 230 million American voters.
Using Facebook data.
Using these incredibly detailed profiles about people.
Facebook had pushed back on you several times on these stories, including right before you published them.
So on that one, they followed the same strategy. So this was exactly the same strategy. They refused to comment and they pretended it didn’t happen and this is why I realized that sort of Chris had this ... He was this sort of secret weapon because having a person who could speak to it personally about what he had seen and witnessed and somebody who was so articulate and could frame what this meant and the dangers of it, I could see how that would just catapult this story into a different level of attention and so that was why I essentially spent the entire next year on ... I mean, it was a year, solidly, full-time, all day every day working on that story to get it out.
One of the things that ... First, they pushed back on you and threatened legal action. So they did not take it, right? In a lot of ways, Cambridge Analytica is the story. That’s the company you’re going after. But Facebook is ...
I have no idea. I mean, absolutely. We always knew this story was going to be devastating to Cambridge Analytica. I had no idea of the scale and impact this would have on Facebook, really didn’t see that coming at all. And in many ways it was the way that Facebook reacted and responded which was the sort of killer. It kind of blew itself up on our doorstep.
Well, their argument initially was aggressive and then it was, “We didn’t know.” I think “we didn’t know” is their basic argument.
They spent sort of three days figuring that out. So what was extraordinary was that, all through this period I’ve been writing story after story about ... My story is about Cambridge Analytica, even whilst I’m working with Chris. They’ve carried on and whilst we’ve had this threat of being sued, we wrote a 35-page legal letter back to Cambridge. It took our lawyers a week. I mean, it was just a huge effort from the Guardian news organization to keep going with this story.
And at no point did Facebook ever say, “Oh well actually, okay, this is what happened. We know this. We’re sorry.” We just never got a comment and then we go through ... Because our libel laws here are so much stricter, it’s so much more difficult to publish this stuff. So we go through a very strict protocol before publication, everything in writing to them. So we put in our questions on the Monday, and we’d had an agreement, New York Times, who I also did the story with, and with Channel Four News that we would put in our questions on the same day.
Facebook, they didn’t respond for three, four days, and then I get a call from them and they said, “We’re going to be sending you a response, a written response tomorrow. But just to let you know that we’re very, very categoric about this. This is not a data breach, just so you know.”
They kept saying that. No one says it was.
They kept on saying, “It’s not a data breach.”
I remember they called. I’m like, “No one said it was a breach.”
We were still figuring out what the headline was at that point. So I got off the phone and my colleague Emma Graham-Harrison, who I wrote the news stories with, she said, “What did they say?” I said, “Oh, it was a bit weird. They sort of said, ‘Yeah, we’re going to write to you.’ But they’re very, very clear, it’s not a data breach.” And she was like, “Hm, data breach. Hm, yes.”
Why are they mentioning that?
So that became the headline. But then the next day, I couldn’t believe it. It’d been so hard to get through the hurdles with Cambridge Analytica to get into this position to publish. The day before publication, Facebook, having had this information, now remember this, for more than two years, sends us ... They hired these fancy lawyers in London to write us a letter. They didn’t do this to the New York Times, obviously. It said, “Actually, this is highly defamatory and we will take legal action if you persist in publishing these falsehoods,” and we went into another panic. The day before publication, we’re in these intense legal meetings. We’re ringing the ICO. Well, actually, they haven’t got a leg to stand on. This is ridiculous.
So we’re like, okay, we’re plunging forward. Then what happens is it is 1:00 in the morning the night before publication and then we discover Facebook have put out a press release in the middle of the night, British time, saying, “Oh, we’re kicking Cambridge Analytica off our platform.” So they tried to run a spoiler story.
That sounds like them. Honestly, that’s what I’d do if I were them. But go ahead. It wouldn’t do the first thing, but go ahead.
So it’s kind of exhausting because the New York Times are saying, “Well we need to publish now,” and we’re like, “No, we need to hold off. We need to hold the line. We need to publish together. We need to stick to the plan.” Anyway, so we all do. We just bring it forward a bit. We publish on the Saturday and then we have that three-day silence. I mean, it was sort of phenomenal in that Facebook just went into this internal tailspin and just didn’t know how to respond. And then eventually, I think it was day four, Mark Zuckerberg came out and sort of said, “Oh I’m so sorry.”
Yeah, “I’m so sorry.” We have heard those before in the United States of America. That was your first “I’m sorry,” because it’s my 50th.
So, the impact. What has been the impact? And then it went on, and as more came out, it wasn’t just ... It was Cambridge Analytica, but a lot more.
It goes off into so many branches. It goes into “fake news.” It goes into disinformation. It goes into just ugliness that people ... real feelings, real racist and other things. Today, there was the story of Customs and Border Protection [agents] having a Facebook group, which is just appalling. Now, I’m not sure that’s Facebook’s fault, but I do know it wouldn’t exist without Facebook. Do you know what I mean? Can you blame them or can you say ... I’ve had an ongoing argument with them. They’re like, “If you had a robbery, would you blame the car?” I said, “You’re the gun. You’re the gun!”
“That’s what you are! You’re not a car!”
Yes, that’s right. It’s that thing, isn’t it? Yeah. Facebook doesn’t kill people. People kill people.
Right. Exactly the kind of thing. And so it’s gone off into so many ... and it’s gone into data hacking too, which they had an issue with, not just Facebook, but everyone on the internet. It’s data hacking. It’s privacy. It goes off into so many different avenues off this one idea that maybe things aren’t quite so kosher with how these things are run.
So where are we now? Let me fast-forward you.
Well, I mean, for a year I’d had the tech bros saying to me …
“You’re a bummer.”
… that all companies do this. There’s nothing special about Cambridge Analytica.
You’ve over-estimated it.
It’s snake oil.
Right. It wasn’t as bad as ...
They’d seen article after article about this and I just want to go back into the special features of this, which was Cambridge Analytica came ... It came out of SCL, the parent group, which was a military contractor. Okay? It was a propaganda firm which had worked in Afghanistan and Iraq. This was no ordinary data firm.
Also, I think that the different angle we had on it as well is that the sort of technology reporters in San Francisco, reporting on technology firms just came at the story from a different angle, and coming at it from Europe where we had laws, where it was, “Is this legal?” We now know, no, actually it wasn’t legal. But that is still unwinding. So just in terms of what Facebook did with Cambridge, just this bit of the story, there are so many investigations going on. So the FBI is investigating. The Department of Justice is investigating. The Federal Trade Commission is investigating. The Securities and Exchange Commission is investigating, and that’s just in America. We know that the FTC is looking. I mean, it’s been said that it’s looking at a fine into the billions.
Yeah, I wrote a column saying they need to be $50 billion.
Just this week, but ...
Since there’s no laws, we need a big old fine.
Yeah, there are also these investigations. Just this week it was announced in Italy they imposed ... Was it a billion euros?
I don’t know that one. But go on.
Or was it a million euros? I’ve kind of lost track of the scale of this thing. It’s so crazy. I can’t even keep up with them. So that is still very much unwinding. I find the SEC investigation into what Facebook did with Cambridge Analytica fascinating, because I think that’s kind of the scary one in many ways for them. It’s one where directors get held responsible, and what we’ve seen with Facebook ... I think when they didn’t know how to respond to this, they’ve refused to say who knew what, when about Cambridge Analytica.
That question, who knew what when, has not been answered and I think that’s one of the reasons why Mark Zuckerberg, for example, won’t come to Britain and answer the questions of our legislators. That’s one of the things he’s ...
To be clear about that, he has been invited by different MPs from different commissions.
But it’s just this thing I find totally scandalous, which is that Mark Zuckerberg has been asked multiple times by our parliament to come and answer questions about Facebook and particularly about Cambridge Analytica. And then what happened is, because our parliament could get no traction, no answers from Facebook on these things, it then banded together with, I think, 12 other countries, countries like Canada and Australia and Argentina, and they formed this grand international committee, and Mark Zuckerberg has refused to go and answer questions to them. And I mean, this is more than half a billion people being represented here. So it’s this incredible disdain for the rest of the world, essentially. We’re just colonial subjects of Facebook.
Where do you imagine it going, then? Because the business has never been better. People are leaving. There’s all kinds of different things going on at the same time. The FTC has shown a little more teeth, probably not enough.
What do you think?
The government ... we’ll get into that in the next section. But I mean, what direction are you going then from this?
Well, I was treated like a wild conspiracy theorist during this whole time.
Including here in Britain.
Absolutely here in Britain and then it turned out, oh, it’s all true and it’s actually much worse even than I reported. Everything we find out about it is actually much worse than we thought. And here in Britain, my investigation has very much been also around these far-right figures and their links into Russia and to the American far right. And again, I get treated like a crazy conspiracy theorist, and it’s absolutely strategic because a way of attacking the story is to attack me and it goes on on a daily basis and it’s a really tough publishing landscape here because so much ...
We did a good talk with Maria Ressa. It’s the same thing except in her case it’s quite dangerous. It’s actually, this is a brutal dictator.
Here, it’s to ruin your reputation and to make [people] think, “Okay, she was right about that but now she’s gone too far.”
Yeah, yeah, and it’s relentless and it’s coordinated and most of the press is very right-wing here and we have our national broadcaster and they’ve been very scared to cover the story in so many ways, and so there’s been this sort of absence on so much of this and we just don’t have the resources. Journalism just doesn’t have the resources here. So this isn’t a big team. The Guardian has been this amazing organization, which has sort of backed this story. But at the same time, it just hasn’t had the money to put any extra reporting resources into this.
So you seem frustrated and sort of ... Here you are, you’ve broken this thing and as you said, it doesn’t get traction. It doesn’t matter. I don’t believe that. I do think it has traction. I just don’t know what kind of traction it has. Right? Because what happens in the United States is there’s one horrible thing Trump says after another and you forgot the last one he said and that’s the whole point of it. Exhaustion is the whole point of social media, so that you give up at some point or you become tired and exhausted and overwhelmed, or you get impugned.
In your case, with crazy conspiracy theories. In my case, “She’s such a bummer. She’s so mean and I don’t know how you can be mean to billionaires.” I don’t even get it. It’s like there’s no amount of mean that they shouldn’t be able to take on some levels, and so it’s in personal terms, “mean,” “bummer,” “negative,” “overly negative,” “don’t you like tech?” That’s the kind of things I get, which is interesting.
But they still talk to you, which I find really fascinating. I loved your interview with Mark Zuckerberg.
That was a disaster for him. He’s never going to do one again. Every interview he does with me is a disaster. It’s really fascinating. I think they’re great, actually. I don’t think they’re a disaster. I think they’re great for Mark Zuckerberg because you start to see the mentality. Right? You know what I mean? I don’t think that’s a bad thing for him.
I mean, I just thought it was such a sort of amazingly telling moment when you tried to press him on ... You asked him about Myanmar and you’re trying to break through to him. “How did that make you feel?”
And how does that make ... and you kept on asking it and he kept on being unable to answer it.
Right, right. I don’t think he was, by any means, being disingenuous. I think he couldn’t answer. It was really interesting. I’ve been around people who are disingenuous. I understand liars. It’s a very different level of ... He has no ability to take responsibility, even though I think he’s not the kind of ... Again, people I’ve covered have been really unctuous, awful people. It’s not the same thing. So it’s really an odd thing to be sort of pressing someone who just can’t even compute. I don’t know how else to put it.
Initially I was kind of like, is it just that it’s the lawyer’s answer? Then you were like, “Actually this is dissociation,” and that’s the thing which I find the most chilling, actually, of it, which is that, you have these people who can’t respond on a human level. I mean, I just can’t. I mean, how do you compute that? You’re the head of this company.
The United Nations have said that you have helped foment the mass killing of people.
To not have a reckoning with yourself, to not want to make amends ...
Well, what happens is ... He doesn’t do it as much, but a lot of them get into this victim mentality, like, “Hey, you’re a ...” and become super aggressive. The people on the part of Facebook have gotten super aggressive, who have no power, by the way. So I don’t really care what they think in lots of ways. But it’s a really interesting reaction. It’s victimization. It’s, if you don’t fail, you don’t do things. All of a sudden Silicon Valley has been shooting off tweets like, “You have to fail,” and, “Only those who criticize don’t create,” and I’m like, “I create and I criticize. Nice to meet you.” It’s really an interesting thing.
So talk about this TED talk that you did that really got a lot of traction, speaking of traction, and I think it really did. So you were invited there, where you had gotten your initial, “Wow, tech is fantastic.” You were back there.
Yes, because I’ve been a reporter at TED. So I knew kind of what a big deal it was and I was so terrified of public speaking and I’ve kind of forced myself, doing this story, to start doing panels and then ... But a TED talk was sort of at another level of sort of terrifying-ness. But I knew that it was this opportunity to talk directly to the people who are in these companies making these decisions and who are responsible, and so I really did want to ... I think that interview you did with Zuckerberg, I think that did inspire me in some ways. I wanted to break through to them as people, and as people who are responsible for creating this world that we have now found ourselves in, and we know they didn’t set out to do so. Nobody did.
Maybe Cambridge Analytica did.
Yeah, maybe Cambridge Analytica.
No, they absolutely ... Come on.
Yeah. So they’re an extraordinary company. There’s still so much about Cambridge Analytica. But they worked in, I think, 154 elections around the world. We’ve only been scratching the surface. We’ve actually got no idea what Cambridge Analytica did for Trump.
I think that’s the point. You’re never going to find out. That’s the whole point of the Russia ...
Don’t count on it.
Some of the Russian stuff you’ll never be able to quantify and if you can’t quantify it, people say, “Well you didn’t switch the election because Hillary Clinton was a bitch.” That’s why. I’m like, maybe, but it’s also this. You can’t quantify.
I know. But you know what? This is where the power of kind of conscience and of people having a moral conscience is so important and that’s the thing I think I was trying to appeal to, because we saw with Christopher Wylie, you have one person who decides to speak out and it has this incredible power. There are people all across Silicon Valley who know stuff who are not speaking about it and like I say, the thing of Cambridge Analytica, there were these employees who worked on the Trump election, young Europeans who’ve not spoken out. I mean, maybe this is one thing to say on your podcast. It’s listened to by people. But it’s listened to by people in Silicon Valley and it’s like, “Are you really okay with this stuff? Are you really okay with your company’s leadership? Are you really sure that they are doing everything that they can? Because you should be troubled and you are part of it. You are part of this. You can’t pretend that you don’t know anymore.”
I mean, before, we didn’t know and I don’t think employees knew the full story. But it’s becoming clearer and clearer and it is deeply, deeply troubling and the way ... I mean, I think techno-fascism is one of the ways that I kind of started thinking about it, really, which is that this technology favors populist authoritarians. And that is what we’re seeing all across the world. And every day we see more of it, and we see the way that they’re communicating with each other, and they are growing in strength and numbers, and they are being facilitated by these technology companies, and you, employees, are part of that.
I just did a really interesting interview with the head of AWS about facial recognition and it was, I don’t know if you saw what he said: “We’re not responsible for how people use our technology.” And I was like, “Yes you are.” And then no ... It was an interesting back and forth. Again, terrifically successful executive, lovely guy. And it was really interesting, the mentality of, “Hey, we just make this stuff, we can’t be ...”
What solutions do you find? Your speech was so impassioned: You’ve got to have a conscience, you’ve got to have ethics. And it’s something I’ve written about; they need ethics, they need to take humanities courses, they need to do this. What do you think prevents it, and what do you think would solve some of this, besides just constantly reminding them of that? What do you imagine the solution to be? Great reporting.
I think shame.
That’s my job for years now, it’s not working. What do you imagine it to be? Besides shame, the solutions. How do you educate a whole new group of technology people that this is not right? They have to think really hard.
But I think it’s this. I think one of the things is that the technology industry has existed within this bubble. Technologists creating technology for ... And it’s this: We need all sorts of other types of people. We need philosophers and ethicists and ...
We need the people who are being harmed by this technology represented. And I find it incredibly troubling that this ... One of the most troubling things I find is that ... So, one of the reasons that we’re very grateful to partner with the New York Times, for example, on this story is that it’s only the United States that can legislate against these companies, and only the US press that they pay any attention to. And we did this big workshop with these journalists from Bangladesh and India and Sri Lanka and Pakistan who were telling me about the ways that the technology is being used in their countries.
They’ve got no chance of holding Facebook and Google to account. So in Britain, where we speak the same language and we have this shared culture and much more news media is shared ... We had no chance. So that I feel there is this real responsibility in the United States to also press the case for the rest of the world and for what is going on with social media and the rest of the world.
Right. Agreed. And what do you imagine should happen? What would you, if you could ... Facebook has proposed this sort of, it is true, a Supreme Court to look at their stuff, which I think is just ridiculous. You know, “We want you on ...” They didn’t ask me, but I’d be like, “No, thank you. It’s not my company and I didn’t cause this mess and I’m not going to clean it up for you.” But what do you imagine would work?
I mean, I think there’s an awful lot that governments can do. One of the things which I think became very clear during the Christchurch massacre was that that video was going viral across the world.
The two platforms, especially.
And they refuse to take responsibility for it. Just, I say this again: Turn off the uploads if you cannot control what content is being uploaded. And I think forcing the platforms to take responsibility as publishers, which Britain is making some moves towards now, is a vital sort of first step. Don’t let them get away with it. I mean, this is incitement to violence. It’s incitement and we have laws against that, and so we need to enforce the laws we have and we need new laws. So it is on us as citizens as well. This is one of the things I think is on us as citizens: To put pressure on our lawmakers and to get laws changed. And this is on us.
What’s interesting is that Trump and others in the administration are going after Facebook, have antipathy towards technology, which is ... I was like, “They’re your best friends, my friend. Between Twitter and Facebook, you should throw them a party in the White House, complete with McDonald’s hamburgers. It will be great.” And by the way, they like that food, so good luck. But so what do you make of that?
I mean, it’s just very interesting, isn’t it, that way that it’s crossed the aisle now? And I mean, I think, a year and a half ago the biggest threat Facebook saw coming was from Europe. I don’t know if that’s true anymore. I think the United States ...
Do you think?
I think that ... We’re as superficial, a mile wide and a foot deep, as we ever were on everything. They have hired all kinds of lobbyists. The stuff I’m seeing now is really interesting. I’m spending a lot of time in Washington going, “No,” every time someone ... I hear all their arguments. I’m like, “That’s not true.” I just sit there and tell legislators it’s not true. “That’s not true.” And they need to look at this and they need to do their job. Which is hard, because as a journalist you’re like, “Why am I advocating?” But it’s not advocating. It’s like, don’t have people lie to you about what they’re doing. And what’s interesting is, at the heart of it, I think a lot of these things can be wonderful, can be great. Like a lot of it. I love Twitter, I’d love full communication.
And what’s fascinating is the pushback on, “We can say anything we want.” They tie themselves to the First Amendment when this has nothing to do with the First Amendment. It doesn’t. I mean, it’s free speech. It’s not free speech. It’s not. You don’t have free speech and you’re not a public square and you’re a private company, and to try to tell citizens ... These people are billionaires off of your data. They’re not helping you, they’re helping themselves and hurting you. And so that’s the message I think that’s hard to get through.
But I mean, one of the ones ... He came here last week. One of the things which we find most painful here in Britain is our ex-deputy prime minister, Nick Clegg, [Facebook’s VP of communications] ...
Clegg. Tell me about Nick Clegg. He hasn’t called me. I’m waiting by the phone, Nick. Call me.
He’s here? Have you asked him for an interview?
I am waiting. No. I should, I should.
Funny, I asked him for an interview last week because I found out he’s giving one to the New Statesman, okay, which is a niche left-wing magazine here. There’s a friend there who’d done an interview with this guy before, a friendly interviewer. He’s giving them an interview.
He’s not going to give me one. The ladies aren’t getting the interview. Nick! Do you not like the ladies? We’re real sweet, we’re real docile.
Nick took my TED talk very personally.
And it’s interesting ...
And this is the guy who attacked Facebook before, correct? Or had been?
And he was a Liberal Democrat. He sort of gave speeches against monopolies and ... But the backstory to Nick Clegg is that he was in this centrist party, the Liberal Democrats, and he helped David Cameron, who hadn’t won enough of a majority to form a government, and Nick Clegg’s party propped him up. And he would say, “Well, he moderated some of the harsher policies,” and others would say, “Well, actually he enabled them to carry out this austerity program and he also reneged upon all of his campaign [promises],” and it devastated the Lib Dems. They hugely lost their membership base. Anyway, he then loses his seat in Parliament and he trots off to Silicon Valley to take this job. And it’s just, he ... Oh, sorry.
That’s all right. Please go on.
The cynicism of it. The cynicism of it. He’s somebody ... And the reason he took it so personally, he took my TED talk so personally, is because ...
Did you get a plaintive text or something?
My colleague got a very plaintive text from him, but he took it very personally because, and he’s hit back, I find this... He went on the BBC last week to spread actual misinformation. Because he turned up at this panel event two weeks ago and I sat in the audience and asked him a question. And what we know is that during the referendum, Facebook was the site of multiple illegal acts, okay, that took place. So these campaigns used Facebook to break our electoral laws ... So we control money in our elections. That’s one of the basics of our electoral laws.
They discovered you can just spend any amount of money on Facebook and nobody will know. And so we’ve got these investigations going on at the moment, but our electoral commission has concluded that two campaigns broke the law, and broke the law by a massive amount. And this is massive electoral fraud. But in terms of the big picture, of what exactly happened during the referendum, of how much money was overspent, of who was targeted and with what ads, etc., Facebook has all of that information. And that’s the thing it’s refusing to tell Parliament.
And Nick Clegg was passionately anti-Brexit. So there’s the idea that the company he’s now shilling for and covering for is somehow implicated, and he’s just gone on the defensive. And when I started asking him about where the data is, he said to me, very crossly, he’s like, “The information commissioner has the data!” And I said, “No, this isn’t a question about Cambridge Analytica. I’m not accusing Cambridge Analytica of hijacking Brexit. There’s no suggestion that it was Cambridge Analytica. So you’re just failing to ...” But he’s using that. He’s deliberately answering a question which hasn’t been asked by saying, “Well, there are some conspiracy theorists who say that Cambridge Analytica caused Brexit. Let me tell you, this is absolute nonsense.” Nobody is saying that!
But last week he was here in Britain and he said that and the BBC reported it and, bingo. You know, there he is earning his money for his boss.
He wrote an essay, too, that was pretty appalling. Sorry, Nick. Nick, I’m open minded. Let’s have lunch. I’m always charmed by a British accent, in any case.
Oh Kara, I can’t believe you’ve said that.
I’m — listen. You know what? Always good to talk. Anyway. I get good interviews out of them, which is important. See, you liked that interview with Mark Zuckerberg. I don’t think there’ll be another though, in that case.
What do you imagine will happen now? Which one of these companies do you think needs the most legislation — and then we’ll finish up — in the United States? Because you’re right, it has to be done in the United States. It can’t be done ...
It has to be done in the United States. I don’t know. It’s really hard to ...
You can impose all kinds of things where Mark Zuckerberg will never be able to come to Britain or Jack Dorsey or the Google guys or whatever.
It’s just all so terrifying. I mean, I saw Jack Dorsey at an event a few weeks ago, so I asked him, “How do you feel about the fact that the president of the United States looks like he might start a nuclear war on your platform? How do you feel about that?”
And how [do you feel] about the fact that he breaks your terms and conditions every single day and you don’t do anything about it?
He’s a newsmaker, right? Is that their argument?
I don’t think they even have an argument. They just say it’s really difficult and they kind of, I mean, that is difficult, in fairness that ...
So what would you ... Okay, I’m going to ask about my final question. Carole, if you’re running Facebook/Twitter/Google, what would you do?
I mean, some of it is just money. You know what, the thing I find is that it’s like, “Oh, these big problems are so big and we’re training our AI and la la la.”
So one of the most compelling arguments I find with Facebook is that one in seven people who work in Facebook moderation work on German content and that’s because in Germany there are laws against hate speech and so they’ve got to enforce it. Now, a colleague worked out that if you were to employ German levels of content moderation globally, it would ... I mean, it’s something like half a percent of Facebook turnover or something. I mean, just employ more people and pay them properly.
Right. You did see Casey Newton’s piece on the content moderators?
So they don’t pay very well.
And they give them no ...
But you can pay people properly. You can train them properly.
And give them proper psychological help. Yeah.
And you can just have more of them.
They should be working for Facebook.
I mean, there is no argument. There is no excuse. That is just bottom line. That is just failure to take responsibility.
Employing more people.
So employing people. Thing is, I think they ... we just don’t trust them to do the right thing, and that’s the sort of imperative of ...
That’s where legislators are the ones who really have to step up, I think. And again, that’s where I think it comes to us as people, as citizens, to be paying attention to this.
So Carole, tell me about being sued by one of the “bad boys of Brexit,” if that’s the recent thing, which has gotten your ...
Yeah, so this was one day, within 24 hours of me reporting about Steve Bannon’s connections to the man who might be our future prime minister — looks like he will be our future prime minister — Boris Johnson. I have a torrent of articles about me unleashed on these right-wing blogs and I get a legal letter from Arron Banks. He is Nigel Farage’s funder. He’s one of the guys who call themselves the bad boys of Brexit, and it’s incredible. He’s suing me for defamation, threatening to sue me for my TED talk.
So he’s citing two talks I gave in public. One was a TED talk. The other one was this event called The Convention, and, I mean, it’s this really chilling way in Britain that millionaires ...
And his allegation is ...
But I said ... Politically motivated millionaires can try and silence journalists through litigation. We had it with Cambridge Analytica, we had it with Facebook, and I’ve got Arron Banks. And Arron Banks is not going after the Guardian. He’s going after ... Or TED. Because he can’t, because it’s an American organization. He’s going after me, as an individual. And yeah, so he sued me because I said in my TED talk, I said, “I’m not even ...” Here we go. Let’s just say it again, Arron. You can add this one to the bill. I said ... so Arron Banks used ... It was his connection to Cambridge Analytica that sort of set me off on this whole story. And I subsequently, after I broke the big Cambridge Analytica story last spring, in the summer I did another big story about... I got hold of a stash of emails, and it was about how Arron Banks had been making these covert trips to the Russian embassy in London in the lead-up to the referendum.
And he was offered gold and diamond deals by the Russian ambassador. And this is the Russian ambassador who’s named in Robert Mueller’s indictments as being a conduit — essentially a communication channel — between the Trump campaign and the Kremlin. There’s something I just want to go into very briefly.
After the referendum, Arron Banks and Nigel Farage continued traveling between London and the US. They were on the Trump campaign and they were still going into the Russian embassy here. So these connections between these Brexiteers, between the Trump campaign and between the Kremlin, are there in black and white. For two years after they had been making those visits to the Russian embassy, Arron Banks lied about this. He said he’d had one lunch with the Russian ambassador. So we published this stuff and I called him out, and parliament, in an official report, also published all of this stuff. So it’s just intimidation.
It’s just trying to get you to stand down.
It’s intimidation, but it’s still something I have to take seriously. I have to have a lawyer. I mean, it’s insane and it’s just bullying. And as I say, it’s in coordination with all these other elements who go after me with these nasty attack stories. Even Boris Johnson. He was finally asked about this relationship with Steve Bannon, and on this radio show he said, “Oh, it’s complete codswallop. It’s complete codswallop,” and that’s what they call me. This is the nickname that I get from these people.
Well, you’ve got a nickname.
If we believe in the freedom of the press and we believe that it’s important to have a free and functioning press, we should all be horrified at this. And it’s just another example. It’s exactly what Trump is doing in calling the press the enemy of the people, etc.
Well, I was just gonna say, “Hello, I’m the enemy of the people”.
Got to be the enemy of some people.
Exactly. And as I say, litigation is an extra weapon you can use in Britain.
And, I’ve just ... I mean, I’ve got multiple ones, but this one in particular. And the fact that they’re going after me as an individual is just particularly nasty.
So Carole, are you still a tech ... Do you still love tech? Because it started off that way.
I mean, I have so many attacks and I’ve had a really, really tough couple of weeks, but I get this amazing support on social media and that has been ... Using the resource of social media to sort of, to communicate this story has been really vital. So I can see the utility of this stuff, but it just scares me. I mean, I think I go back to that sort of, the alarm I felt when I found those Nazi results and the alarm I feel about what is happening day to day in Britain and in America and these other countries. I mean, we should all be really, really chilled at what is happening in the world.
And I care about that. I just kind of, I feel that I’m in a position to try and do something about it. So I feel this compulsion to try and do something, but it is difficult and it has a personal toll. And having all of these different people and individuals and companies coming after me on this very personal, vicious level is ...
I get it.
It’s just hard.
It’s hard. And I’m ...
Many of them are bots, you do know that?
Yeah, many of them, but here particularly, this sort of pernicious right-wing media, which I have to deal with is really tough, but the ...
We have none of that in the United States, you know?
But there are...
Rupert’s an American citizen now.
We made him one. Good God.
I know, and that’s something that we just ...
He’s an immigrant, you know? Rupert Murdoch, in case you’re interested.
And that again, I mean, the thing we haven’t even talked about is the connections. The transatlantic connections here are so strong and so multitudinous, and the money which flows between the states, between America and Britain, and the way that we’re a kind of bridgehead between the far right in America and the far right in Europe — and Russia, which supports the far right in Europe — is a really key aspect of this. And we can just see the way that Brexit is weakening our ties to Europe, making us more prey to Trump’s America, your America.
Is there any glimmer of hope? Do you think it’s through legislation, through smart legislators?
I think there is. I think that is ... I mean, I think there have been ... There’s certain ... I saw Knock Down the House, that Netflix documentary, and that was the first thing which has sort of cheered me up for ages, actually.
Yeah. She’s tough.
Yeah, and it’s that we can overcome some of this stuff, but it really does take ...
Well, you see the attacks on her. Look at those CBP people, those appalling, let me just say.
But we need people to step up and we need people to take on the fight. And I get why people don’t want to do that. But actually, this is where, as I say, people in Silicon Valley really need to look inside.
So what’s your next focus?
Well, it’s still, it’s this, I mean ...
The links between the far right, alt ...
There’s still so much of the Cambridge Analytica story which hasn’t been reported out, and in Britain, it just mutates and changes form, so that Nigel Farage’s new party, the Brexit Party, is now using PayPal to try and circumvent the electoral finance laws.
And just one week ago, I did a story about Boris Johnson’s links to Steve Bannon. I mean, this stuff isn’t stopping. It’s absolutely going on right now in real time and it is incredibly underreported here in Britain. So I haven’t felt able to sort of just step back and go, “Okay, I’ve done my bit, take it on, team.”
So you will persist!
So I have been, yeah. Until, I don’t know. Yeah. Until I finally crack up, I suppose.
You’re not going to crack up, Carole.
Thank you Kara. That’s nice to have your confidence.
And you are a great journalist and anybody who says different will have to go through me. As they say.
They don’t want to go through me.
No, that’s true.
Me and Megan Rapinoe have your back. You have to join the Militia Etheridge, do you know about that?
It’s lesbians. We get real mean. We’re real mean. We call it the Militia Etheridge. We’re not taking any shit.
That’s great. Can you just like come over here and sort of kick some shit?
Oh, you got plenty over here.
Okay, that ...
We will. Well, anyway, I appreciate you for coming on the show. You are a wonderful journalist. You have done amazing work and for those who are critical of her, you’d better take a second look because what she’s saying is 100 percent true and we really appreciate that speech you gave. It was really amazing.
It hit a lot of people and it did not ... It had a lot of traction, in case you’re interested. Anyway, thank you for coming on this show.
Recode and Vox have joined forces to uncover and explain how our digital world is changing — and changing us. Subscribe to Recode podcasts to hear Kara Swisher and Peter Kafka lead the tough conversations the technology industry needs today.