Here’s the transcript of Recode’s interview with Facebook CEO Mark Zuckerberg about the Cambridge Analytica controversy and more

Sharing is caring, except this time.

Facebook CEO Mark Zuckerberg gave interviews yesterday to several news organizations, including Recode, in an attempt to stem the fast-growing controversy about misuse of user data by a third-party developer, Cambridge Analytica.

In a wide-ranging interview, he admitted the social networking giant may have made mistakes in opening up its network so much a decade ago and that it led to the recent problems. Zuckerberg said that fixing those issues will now cost the company “many millions” of dollars.

As Facebook’s stock continued to get hammered because of Wall Street worries about the impact on its business, Zuckerberg also said he was “open” to testifying to Congress, even as legislators ever more loudly call for his appearance in hearings.

And that is not all Silicon Valley's most famous mogul said, which is why we are posting the transcript of the 20-minute interview, which was conducted by Kara Swisher and Kurt Wagner of Recode.

A short amount of cross-talk about setting up the taping of the interview at the start was removed, but here is the interview (with some small adjustments to explain references made).

Kara Swisher: As you know from us emailing, I’m very interested in tough substantive discussions and questions about this, so that’s why I’ve been so adamant. So let’s just get started. Talk a little bit about the things you announced today. Let’s have you explain each of them very briefly.

Mark Zuckerberg: Sure. At a high level, this is a major breach of trust issue, and our high-level responsibility is to make sure that this doesn’t happen again. So, if you look at the problem, it kind of breaks down into a couple of areas. One is making sure that going forward, developers can’t get access to more data than they should. The good news there is that actually the most important changes to the platform were made in 2014, three or four years ago, to restrict apps like [researcher Aleksandr Kogan’s] from being able to access a person’s “friends” data in addition to their own.

So that was the most important thing, but then what we did on our platform is we also are closing down a number of other policies. Like, for example, if you haven’t used an app in three months, the app will lose the ability to access your data without you reconfirming it, and a number of things like that. So, that’s kind of category 1 going forward. And again, the good news there is that as of three or four years ago, new apps weren’t able to do what happened here. So this is largely ... this issue is resolved going forward for a while.

Then there’s going backwards, which is before 2014, what are all the apps that got access to more data than people would be comfortable with? And which of them were good actors, like legitimate companies, good-intent developers, and which of them were scams, right? Like, what Aleksandr Kogan was doing, basically using the platform to gather a bunch of information, sell it or share it in some sketchy way. So what we announced there is, we’re going to do a full investigation of every single app that had access to a large amount of people’s data, before 2014 when we locked down the platform, and if we detect anything suspicious, we’re basically going to send in a team to do a full forensic audit, to confirm that no Facebook data is being used in an improper way.

And of course, any developer that isn’t comfortable with that, then we’ll just ban them from the platform. If we find anything that is bad, then we’ll of course also ban the developer, but we will then also notify and tell people, everyone whose data has been affected. Which we’re also going to do here.

KS: So that begs the question ... this started off in 2007, 2008 when you were [launching] Facebook Connect, a lot of this stuff started very early, and I remember being at that event where you talked about this. Open and sharing, and it was helpful to growing your platform, obviously. Why wasn’t this done before? What’s in the mentality of your engineers at Facebook where you didn’t suspect this could be a problem?

Well, I don’t think it’s engineers.

KS: Well, whatever. People [at Facebook].

So, in 2007 we launched the platform.

KS: Yep.

The vision, if you remember, was to help make apps social.

KS: Right.

So, the examples we had were, you know, your calendar should have your friend’s birthday. Your address book should have your friend’s picture. In order to do that, you basically need to make it so a person can log into an app and not just port their own data over, but also be able to bring some data from their friends as well. That was the vision, and a bunch of good stuff got created. There were a bunch of games that people liked. Music experiences, things like Spotify. Travel, you know, things like Airbnb, they were using it. But there was also a lot of scammy stuff.

There’s this values tension playing out between the value of data portability, right? Being able to take your data and some social data ... To be able to create new experiences on the one hand, and privacy on the other hand, and just making sure that everything is as locked down as possible.

You know, frankly, I just got that wrong. I was maybe too idealistic on the side of data portability, that it would create more good experiences. And it created some, but I think what the clear feedback was from our community was that people value privacy a lot more. And they would rather have their data locked down and be sure that nothing bad will ever happen to it than be able to easily take it and have social experiences in other places. So, over time, we have been just kind of narrowing it down. And 2014 was a really big ...

KS: I get that. 2014 you absolutely did that. But I’m talking about the ... You know — and I’ve argued with [Facebook executives] about this — this anticipation of problems, of possible bad actors on this platform. Do you all have enough mentality, or do you not see ... I want to understand what happens within Facebook that you don’t see that this is so subject to abuse. How do you think about that, and what is your responsibility?

Yeah. Well, I hope we’re getting there. I think we remain idealistic, but I think we also understand what our responsibility is to protect people now. And I think the reality is that in the past we didn’t have a good enough appreciation of some of this stuff. And some of it was that we were a smaller company, so some of these issues and some of these bad actors just targeted us less, because we were smaller. But we certainly weren’t a target of nation-states trying to influence elections back when we only had 100 million people in the community.

But I do think part of this comes from these idealistic values of openness and data portability and things that I think the tech community holds really dear, but are in some conflict with some of these other values, like protecting people’s privacy, right? And a lot of the most sensitive issues that we face today are conflicts between real values, right? Freedom of speech and hate speech and offensive content. Where is the line, right? And the reality is that different people draw the line in different places. We serve people in a lot of countries around the world, and there are a lot of different opinions on that.

KS: Right, so where’s your opinion right now? Sorry to interrupt.

On that one specifically?

KS: Yeah.

You know, what I would really like to do is find a way to get our policies set in the way that reflects the values of the community so I’m not the one making those decisions. Right? I feel fundamentally uncomfortable sitting here in California at an office, making content policy decisions for people around the world. So there are going to be things that we never allow, right, like terrorist recruitment and ... We do, I think, in terms of the different issues that come up, a relatively very good job on making sure that terrorist content is off the platform. But things like where is the line on hate speech? I mean, who chose me to be the person that ...

KS: Well. Okay ...

I have to, because [I lead Facebook], but I’d rather not.

KS: I’m going to push back on that, because values are what we argue about. And companies have values, and they have, you know, the New York Times has a set of values that they won’t cross and they make decisions. Why are you so uncomfortable making those value decisions? You run the platform. It is more than just a benign platform that is neutral. It just isn’t. I don’t know, we can disagree on that, we obviously disagree on this. But why are you uncomfortable doing that?

Well, I just want to make the decisions as well as possible, and I think that there is likely a better process, which I haven’t figured out yet. So, for now, it’s my job, right? And I am responsible for it. But I just wish that there were a way ... a process where we could more accurately reflect the values of the community in different places. And then in the community standards, have that be more dynamic in different places. But I haven’t figured it out yet. So I’m just giving this as an example of a tension that we debate internally, but clearly until we come up with a reasonable way to do that, that is our job, and I have to do it as well as I can.

Kurt Wagner: Hey, Mark, this is Kurt. I’m curious, you talked about going back and trying to figure out if there were other developers that had used your API before 2014, and checking were there any other bad actors that maybe you guys missed at the time. I’m curious how you actually go about doing that, and if it’s actually possible at this point to go out and detect, you know, if someone collected data in 2012, if that data still exists.

Well, the short answer is the data isn’t on our servers, so it would require us sending out forensic auditors to different apps. The basic process that we’ve worked out — and this is a lot of what we were trying to figure out over the last couple of days and why it took a little while to get this post out — is we do know all the apps that registered with Facebook and all the people on Facebook who registered for those apps, and we have a log of the different data requests that the developers have made.

So we can get a sense of what are reputable companies, what are companies that were doing unusual things ... Like, that either requested data in spurts, or requested more data than it seemed like they needed to have. And anyone who either has a ton of data or something unusual, we’re going to take the next step of having them go through an audit. And that is not a process that we can control; they will have to sign up for it. But we’ll send in teams, who will go through their servers and just see how they’re handling data. If they still have access to data that they’re not supposed to have, then we’ll shut them down and notify ... and tell everyone whose data was affected.

This is a complex process. It’s not going to be overnight. It’s going to be expensive for us to run, and it’s going to take a while. But look, given the situation here, that we had a developer that signed a legal certification saying that they deleted the data, and now two years later we’re back here and it seems like they didn’t, what choice do we have? It is our responsibility to our community to make sure that we go out and do this. So, even though it’s going to be hard and not something that our engineers can just do sitting in their offices here, I still think we have to go do this.

KW: Did you ever think of doing these kinds of audits before 2014? Or even when you got that signed contract from ... or, excuse me, signed statement I guess, from Cambridge Analytica, did you think, “Well, we need to actually go out and check to make sure that they’re telling us the truth.” Why didn’t you do this kind of stuff earlier, or did you think about doing this earlier?

In retrospect, it was clearly a mistake. Right? The basic chronology here is in 2015, a journalist from the Guardian pointed out to us that it seemed like the developer Aleksandr Kogan had sold or shared data with Cambridge Analytica and a few other firms. So as soon as we learned that, we took down the app, and we demanded that Kogan, Cambridge Analytica and all the other folks give us a formal, legal certification that they didn’t have any other data. And, at the time, Cambridge Analytica told us that not only do we not have the data and it’s deleted, but we actually never got access to raw Facebook data. Right? What they said was, this app that Kogan built, it was a personality quiz app, and instead of raw data they got access to some derived data, some personality scores for people. And they said that they used it in some models and it ended up not being useful so they just got rid of it.

So, given that — that they said they never had the data and deleted the derivative data that they had — at the time it didn’t seem like we needed to go further on that. But look, in retrospect it was clearly a mistake. I’m explaining to you the situation at the time and the actions that we took, but I’m not trying to say it was the right thing to do. I think given what we know now, we clearly should have followed up, and we’re never going to make that mistake again.

I think we let the community down, and I feel really bad and I’m sorry about that. So that’s why we’re going to go and do these broad audits.

KS: All right, when you think about that idea of ... it’s not exactly a “mistakes were made” kind of argument, but you are kind of making that. That idea. I want to understand, what systems are going to be in place, but it’s sort of, you know, the horses are out of the barn door. Can you actually go get that data from them? Are you ... It’s everywhere, I would assume. I’ve been told by many, many people that have access to your data, I was thinking of companies like RockYou and all kinds of things from a million years ago that have a lot of your data ... Can you actually get it back? I don’t think you can. I can’t imagine you can.

Not always. But the goal isn’t to get the data back from RockYou. You know, people gave their data to RockYou. So RockYou has the right to have the data. What RockYou does not have the right to do is share the data or sell it to someone without people’s consent. And part of the audits and what we’re going to do is see whether those business practices were in place, and if so we can kind of follow that trail and make sure that developers who might be downstream of that comply or they’re going to get banned from our platform overall.

It isn’t perfect. But I do think that this is going to be a major deterrent going backwards. I think it will clean up a lot of data, and going forward the more important thing is just preventing this from happening in the first place, and that’s going to be solved by restricting the amount of data that developers can have access to. So I feel more confident that that’s going to work, starting in 2014 and going forward. Again, for the last few years already it hasn’t been possible for developers to get access to that much.

KS: Let me ask just two more quick questions.

[Here, there is logistical cross-talk with a person on his staff, since Zuckerberg had to head to an employee meeting.]

All right, I’m talking to you while walking over there for Q&A.

KS: All right, the cost of this? And are you going to testify in front of Congress? And if so, when?

You know, I’m open to doing that. I think that the way that we look at testifying in front of Congress is that ... We actually do this fairly regularly, right? There are high-profile ones like the Russian investigation, but there are lots of different topics that Congress needs and wants to know about. And the way that we approach it is that our responsibility is to make sure that they have access to all the information that they need to have. So I’m open to doing it.

KS: What is “open”? Is that a “yes” or a “no”?

Well.

KS: They want you, Mark.

Well look, I am not 100 percent sure that’s right. But the point of congressional testimony is to make sure that Congress gets the information and context that it needs. Typically, there is someone at Facebook whose full-time job is going to be focused on whatever the area is. Whether it’s legal compliance, or security. So I think most of the time if what they’re really focused on is getting access to the person who is going to be most knowledgeable on that thing, there will be someone better. But I’m sure that someday, there will be a topic that I am the person who has the most knowledge on, and I would be happy to do it then.

KW: Mark, can you give us a sense of the timing and cost for this? Like, the audits that you’re talking about. Is there any sense of how quickly you could do it and what kind of cost it would be to the company?

I think it depends on what we find. But we’re going to be investigating and reviewing tens of thousands of apps from before 2014, and assuming that there’s some suspicious activity we’re probably going to be doing a number of formal audits, so I think this is going to be pretty expensive. You know, the conversations we have been having internally on this is, “Are there enough people who are trained auditors in the world to do the number of audits that we’re going to need quickly?” But I think this is going to cost many millions of dollars and take a number of months and hopefully not longer than that in order to get this fully complete.

KS: Okay, last question Mark, and then you can go. How badly do you think Facebook has been hurt by this, and you yourself, the reputation of Facebook?

I think it’s been a pretty big deal. The No. 1 thing that people care about is privacy and the handling of their data. You know, if you think about it, the most fundamental thing that our services are, whether it’s Facebook or WhatsApp or Instagram, is this question of, “Can I put content into it?” Right? Whether it’s a photo or a video or a text message. And will that go to the people I want to send it to and only those people? And whenever there is a breach of that, that undermines the fundamental point of these services. So I think it’s a pretty big deal, and that’s why we’re trying to make sure we fully understand what’s going on, and make sure that this doesn’t happen again. I’m sure there will be different mistakes in the future, but let’s not make this one again.

KS: Yes, let’s not. Okay, Mark, I really appreciate you talking to us.

KW: Okay, Mark.

KS: Thank you so much, I know you have to talk to your employees ...

I’m walking into my Q&A now. All right, see ya.

This article originally appeared on Recode.net.
