Kara Swisher: I don’t think we have to say much. Facebook has been in the news. You may have heard them, seen them, everywhere.
Peter Kafka: Let’s bring ‘em up.
Kara Swisher: Let’s bring ‘em up. Sheryl Sandberg, COO of Facebook, and Mike Schroepfer, CTO of Facebook.
Sheryl Sandberg: We are between all of you and a late dinner.
Kara Swisher: No, no, no. We have lots to say.
Peter Kafka: There’s time.
Kara Swisher: We have lots to talk about. Thank you for coming, first of all.
Sheryl Sandberg: Thank you for having us.
Kara Swisher: This has been an obviously news-filled year for you all. I told Sheryl this was going to be tougher than usual.
Sheryl Sandberg: Bring it on.
Kara Swisher: All right, excellent.
Sheryl Sandberg: We’re ready.
Kara Swisher: So why wasn’t anybody fired at Facebook over the situation with Cambridge Analytica?
Sheryl Sandberg: You should start with easy questions.
Kara Swisher: No, no, I think I’ll start there, and in three parts. Why wasn’t anyone fired, who should’ve been fired ... and that’s enough.
Sheryl Sandberg: Okay, well we’ll do the third part. So, Mark has said very clearly on Cambridge Analytica that he designed the platform and he designed the policies, and he holds himself responsible.
The controls in the company on this are under me; I hold myself responsible for the ones we didn’t have. And look, Schrep and I are here, we run the company.
We do fire people at Facebook. We don’t trot them out and make examples of them, that’s not how we are, because we want a culture of responsibility up top, and we take it.
And the thing for us, and I think what underlies your question is, “Do we know that we were late?” Not just on the data for Cambridge Analytica, but on fake news, on misinformation, on elections, and what are we doing about it. And we definitely know we’re late, we have said we’re sorry, but sorry’s not the point.
Kara Swisher: Yeah.
Sheryl Sandberg: What’s the point? The point is the action ...
Kara Swisher: Your ads are lovely, but go ahead.
Sheryl Sandberg: Well, thank you. But the point is the action we’re taking, and on all of these fronts, we’re really thinking about the responsibility we take in a very different way.
When you think about the history of Facebook — and you’ve been following us and part of it for a long time — you know, for the last 10-12 years, we’ve been really focused on social experiences, on building those and enabling those: what the world would look like if people knew it was your birthday. I was just in Houston; people found people and saved people through Harvey because they were posting publicly on Facebook — those good use cases. But I don’t think we were focused enough on the bad, and when you have humanity on a platform, you get the beauty and you get the ugliness.
And where we are now, is really understanding the responsibility we have to more proactively see the problems and prevent them.
Kara Swisher: So ... go ahead.
Peter Kafka: I’m gonna play kind of good cop.
Sheryl Sandberg: That’s hard with Kara. She’s such a good cop.
Peter Kafka: Isn’t the problem not that someone screwed up, but that you built this architecture that’s fundamentally open to this — whether it’s Cambridge Analytica or the election stuff or any of the problems that have been surfacing the last couple of years? It’s built for scale, it’s software built for sort of minimal oversight, and you want the humans to populate it with content and use it. Automated ad systems — James Murdoch referred to it as this giant attack surface. You built this thing that’s actually working the way you initially thought it was going to work; you just didn’t realize what you built.
Sheryl Sandberg: I don’t know that ...
Kara Swisher: One of you can start with the ...
Mike Schroepfer: I was gonna jump in ... I do think that what you raised is a real fundamental tension between giving people tools that are easy to use for free expression, and keeping people safe. Because yes, if you really want to lock everything down, you censor everything and have human reviewers read every single post someone puts on there, but I don’t think that’s what people actually want.
And what we’re trying to balance is easy tools for you to be able to post and share photos, links, whatever you want with anyone you want, but to make sure the really bad stuff — the abuses, hate speech, bullying, economic abuses, or this election interference — comes off the platform, while still not stifling free speech ...
Peter Kafka: Is that doable with something that’s fundamentally going to be built from software and automation that reaches two billion people around the world?
Mike Schroepfer: We do think it ...
Sheryl Sandberg: Not perfectly.
Mike Schroepfer: This is where we are right now: trying to do this really well. I think it’s a combination of humans and technology to make this work, and to figure out in each society, in each culture, where’s the line between political speech and hate speech? How do we make sure that we get things that work for everyone, all across the world, at sort of an unprecedented scale?
Kara Swisher: Let’s go back a little bit further. You just said that we didn’t see the negative parts, that people ... I don’t know if you meant human beings, but I know quite a few bad ones. But the idea is, what is it in your culture that didn’t see that? Like you’re saying — I know Mark apologized, the ads, it seems like everyone’s doing an apology ad. What is it in the culture? Because I can remember a meeting when Facebook Live happened, where, when I was shown it, I said, “What are you going to do about when someone murders someone or commits suicide or beats someone up?” And I think the product manager’s like, “Kara, you’re so negative,” and I’m like, “What?” Like, I’m sorry, humanity is really awful in many ways.
Sheryl Sandberg: I think that is part of the tension, and Live is a good example. When Live happened, there was a lot of good, really compelling sharing. People really enjoyed the experience, but there were things that were wrong, and on that one, I think, we actually moved very quickly. We got down to human review of anything live within minutes, which is actually hard to do operationally, but we got there. And that was really important, and there have been things that were taken down off Live right away. There have also been things that happened in Live where we were able to really intervene and help people.
So, with all of it, I think it goes to the point you were making which is a very good point. We built an open platform. It is a platform where so many people come on and share. They’re going to do the good, and they’re going to do the bad. It’s not that we’re ever going to prevent all of it. We will never say that, but we can get better. We can be more transparent. We can put a lot more resources and a lot more thought with technology, automation and people.
We’re also really working on being more transparent, which we think is a huge part of the answer. So, content policy. Free expression is fundamental to Facebook, it’s a very deep value for us, but so is a safe community. And as Schrep was saying, those values really rub against each other.
We’ve published our community standards, but now we went ahead and published the internal guidelines that people use to make judgment calls, because what is free expression for one person is hate for another. We worked with over 100 experts around the world. We published those, we had a lot of good feedback and we’re going to keep iterating. We also published our results, so it’s out there now how much we’ve taken down: 1.3 billion fake accounts taken down in six months. ISIS and Al-Qaeda content, we’re getting 99 percent of it before it’s reported. Sexual content, we’re getting 96 percent. Hate speech, 38 percent before it’s reported.
So we can see the areas where we need to invest, and by being open about that, we think we can get people to help us because ...
Peter Kafka: Your overall plan, right, is in the near term we’re going to hire a lot of people — Mark said we’re going to hire so many people to audit our political ads that we’re going to lose money on political ads this cycle — but X number of years out from now, the software will be good enough that we’re going to solve most of this. Do you guys believe, sort of top down, that eventually you can solve this software problem with more software?
Mike Schroepfer: I mean, you can see this in the numbers that Sheryl just talked about. We published numbers for, for example, objectionable content — nudity and pornography — identified by people first and reported to us vs. identified by automated systems. Several years ago, it would’ve been 100 percent and 0 percent: all generally reported by people. Now 96 percent is identified first by AI systems, and you’re seeing this again — with violence the number is 86 percent, with hate speech it’s 38 percent, because it’s harder, it’s more nuanced, it’s more at the frontier of development.
Peter Kafka: But it’s solvable?
Mike Schroepfer: But we’ve seen years, and this goes back many, many years of development to make these systems work, and we see quarter over quarter steady progress.
I’m, as a technologist, I was very worried about some of these harder problems. We’ve made more progress in the last six months than I thought was actually possible. So that gives me a lot of optimism to do this from a technology perspective over time.
Sheryl Sandberg: I think one of the things people worry about is, do we think we can automate everything? Do we think we can be neutral? No.
Mike Schroepfer: No.
Sheryl Sandberg: No. We want to get things when they’re uploaded, so the difference between something being uploaded and technology pulling it down before anyone sees it — that’s much better. That’s what we’re able to do with ISIS and Al-Qaeda content. We’re able to do that more with photos and adult sexual content. Hate speech is language, it’s more nuanced. So the automation, which helps us get it down before it’s seen — that’s great. But there are humans building the technology, and we understand that.
And there are humans making the decisions on the rules, and I think at every stage there’s gonna have to be some technology and some human review.
Kara Swisher: Let’s go back to how it happened, because I don’t think that was the answer in the congressional hearing. They seemed riveted by your terms of service and what your actual business model was. That was an impressive display of intelligence by our politicians. And it’s so funny ...
Sheryl Sandberg: You should run, you could be in Congress, and then you could ...
Kara Swisher: We’ll get into that in a second. When they were asking these questions, one of the things they didn’t ask is exactly what happened. Like, exactly how did this occur? And I’m really interested in — and one of the things Mark said a lot was — we take a broader responsibility now.
Why didn’t you take a broader responsibility before — what is it in the culture that created that? And I don’t think it’s malevolence. I’m not accusing you of that. I think — what is it that’s part of the Facebook culture that didn’t see this coming? Walk through Cambridge Analytica first.
Mike Schroepfer: Do you want to go through the timeline, do you want to talk about ...
Kara Swisher: Timeline, yes. So let’s talk about the actual timeline, because I have heard some versions of it, but I remember being at the 2008 event where you opened up the platform. You needed to bring people in, you needed subscribers, you created this open thing, you handed out the data. What did you think was gonna happen there? So walk through that.
Mike Schroepfer: Go all the way back to 2007, 2008, when the platform first launched. The idea was, people are using the service Facebook, they want to take their data with them to a third-party app, to make it social, to enhance it. Over the years, there was a lot of pressure to say, don’t be a walled garden, let people take their data and easily bring it to another application.
Kara Swisher: You also needed to give it to them to come on the platform, you needed some sort of candy to attract all these app developers in, correct? Or something to get them to use ...
Mike Schroepfer: Well, they wanted to build these great apps. I mean, many of these are name-brand companies now, and they, like many people, thought that apps are better with your friends and with the social data. In the early years, there was a lot of that idea — and I think this gets back to your optimism versus pessimism on these things. I think the entrepreneurs in the audience will know that as an entrepreneur, you get told “no” and “your idea’s stupid” nine times out of 10 during the day. You have to, to some degree, take an optimistic attitude to bring something new into the world, whether it’s a new product, a new company or a new feature in a product.
So I think you start there when you’re building these things, and for the platform, what we spent a lot of time on was, “Look, people are smart. They’re ultimately using a third-party app, so whatever Facebook data they take, they’re also putting new data into that app.” So they have to trust that app and understand what it is. Our job is to give them notice on what’s happening. We built these ... I remember spending iteration after iteration on exactly how we design this dialogue to make it super clear what data you’re getting from Facebook and bringing to the third-party app, because if the customer knows what’s happening, they can make informed decisions.
And that was really the focus of the platform in the early years. As the platform got bigger and things scaled, this is when in 2014 we said, “Look, we want to kind of restrict access to these things. We want to do more proactive review of applications.” So all new apps had to get reviewed by our staff, and to get to your specific question about the timeline that happened here, this app was built around that time frame, and then we heard in December of 2015 via media reports that an app developer had basically gotten Facebook data once people installed it, and then resold it to a third party.
Kara Swisher: Why via media reports? You’re super smart people, I’m pretty certain. So where does it break down there, that you didn’t know what was going on?
Mike Schroepfer: The problem is we can’t observe the actual data transfer that happens there. I don’t actually even know physically how the data went from one to the other. There isn’t a channel that we have some sort of control over. Again, as a consumer you’re ultimately trusting a third party with your data — whatever data you brought from Facebook, plus the new data you’re inputting when you take these personality quizzes. That’s a relationship with that developer: you have to trust that they’ll be responsible with the data they’re using.
Whether it’s on Facebook or some app you downloaded from an app store. So we didn’t observe that until we heard about it through third-party reports. That’s when the events went into motion where we ...
Kara Swisher: That was 2015.
Mike Schroepfer: Yeah, and the first thing, we immediately disabled the app from the platform so it couldn’t have further access. The goal was to figure out who had this data, how do we make sure that that data is deleted and is safe, and that’s what kind of happened in that time frame.
And the reason this has come up again this year, in 2018, is subsequent reports that despite having certified that they had deleted the data, they may not have. That’s when we resurfaced and looked through all of these things. We had made many platform changes, as I said, in 2014, which made a lot of this no longer possible, because you could no longer pull friends’ data.
That’s when we really took this much sharper, more pessimistic view on everything in the company. It’s the biggest cultural shift I’ve ever seen in the 10 years I’ve been there — just top to bottom. Not just what are all the great things that can happen, but what are all the ways people can abuse this? What are all the theoretical ways this can happen? How do we make product changes? How do we make policy changes? How do we invest our resources differently, both in security and content review, and in product development?
And that’s like a process that’s ongoing, right? We’ve been working on this since then.
Peter Kafka: You grew at an enormous rate, not by accident. You’ve created systems to help you grow. There’s growth hacking as a term. Looking back, do you wish that you had grown more slowly, or reined in growth, or been more thoughtful about the ways you were growing? And how much of that growth is implicated in the problems you have today?
Mike Schroepfer: I do, sorry, I don’t.
Sheryl Sandberg: No, go ahead.
Kara Swisher: Well, let’s do Sheryl first. Go ahead, Sheryl.
Sheryl Sandberg: Looking back, we definitely wish we had put more controls in place. We got legal certification that Cambridge Analytica didn’t have the data, but we didn’t audit them, and now we’re waiting for the government. We still want to audit, and we’re going to. We wish we had taken firmer steps.
Kara Swisher: You had those media reports, why didn’t you?
Sheryl Sandberg: We did go back and got ...
Kara Swisher: They said they didn’t, but then ...
Sheryl Sandberg: They legally certified they had deleted the data. We did not go and audit.
Kara Swisher: Why not?
Sheryl Sandberg: Which is what we’re doing now. It always looks obvious in hindsight; we absolutely should have. But to your question, I think we can grow, and we can continue to grow, but we can also have controls in place. Those two things can coexist, you know ...
Peter Kafka: Right, but now that you have two billion people, Mark says we’re going to slow down and we’re going to be more thoughtful about it. If you’re cynical, you might suggest it’s easier to say this now than it was five or 10 years ago.
Kara Swisher: Absolutely.
Sheryl Sandberg: Well, we always had some controls in place, but I don’t think they were enough. So let’s talk about data, because that’s what we were talking about. We always had ways for people to control their data. You always could go in and choose to share your data with apps, and you could always delete. Always there. What did we do now? We put it at the top of everyone’s News Feed, a very easy way: here are all the apps you’ve connected to, here’s how you easily delete them. So I think we’re really building on what we did before. The reason we were able to do that so quickly is all of those controls existed. We already had all of those controls, they were just harder for people to find, and we made them easier.
So we are building on some of the controls we had before as we address some of these, and in some of the areas, we’re going much further. It’s also the case that threats change. Let’s talk about the election. If you go back to 2016 and you think about what people were worried about in terms of nation-states or election security, it was largely spamming, phishing, hacking. That’s what people were worried about — the Sony emails, people hacking into systems. We had a really good tech team. We were very protected on that, and we didn’t have problems a lot of other platforms had. What we didn’t see coming — and I don’t think we were alone in this, but it’s on us — was a different kind of more insidious threat.
But once we saw it, we did publish a white paper. We found the ads, and now we look forward to the next elections and we understand that threat, and we’re taking very strong steps.
Kara Swisher: But you all did know that you were gonna make a lot of money. Well, we’ll get to all that election spending in a minute.
Sheryl Sandberg: Yeah. You want to ...
Kara Swisher: Go ahead.
Sheryl Sandberg: On elections, this is important. So, we realized we didn’t see the new threat coming. We were focused on the old threat, and now we understand that this is the kind of threat we have. We’ve taken very aggressive steps. There are elections going on all over the world as we speak, and more coming up in 2018.
So fake accounts — we’ve publicly reported this — we’ve pulled down 1.3 billion in the last six months. Importantly, we pulled down fake accounts in Alabama, a Macedonian troll farm that we found trying to spread fake information in that election. We worked with the German government, pulled down fake accounts there, and 30,000 in France that could’ve affected theirs, so we are showing that we are able to meet those threats. Probably not perfectly.
You know, we can talk as much as you want about fake news, the really aggressive steps we’re taking there. We’re now set up with third-party fact-checkers with the AP in 50 states looking at local and state news coming into our midterm election, and ads transparency, which — I know Senator Warner’s here. In that area, we’re not waiting for legislation. He has a bill out there, we’re supportive of that bill, but we built the tool that that bill requires, and it’s live, so anyone can see any of the political and issue ads.
So the issue for us is we were slow. We are learning from our mistakes and taking action. We’re also pretty humble about this. We understand that now we’re protecting against this threat, but we have to have a different mindset of trying to see around the corner and the next threat.
Peter Kafka: Were you surprised when you read the Mueller indictment and saw how few people they committed to that effort, how little money they had to spend to sort of fill Facebook with spam?
Kara Swisher: It wasn’t hard to do.
Peter Kafka: It was not a giant team.
Sheryl Sandberg: I mean, even the IRA ads, it was a small amount of money that went far, and that is why we have to take such strong steps. You know, going back, a lot of the source of these things, if you think about fake news or elections, is fake accounts. So fake accounts are a big part of this.
The other thing is really disrupting the economic incentives. A lot of fake news is politically motivated, but it’s also economically motivated. People want to write outlandish headlines so that they can get clicks so that they can make money, so we’ve taken very aggressive steps to go after the economic incentives, kick people out of ad networks, make sure they can’t make money. In all of this, we’re going to have to solve today’s problems but also see ahead to the new ones. I think one of the most important things we’re doing is around transparency.
Kara Swisher: For users?
Sheryl Sandberg: Transparency for people. Right now, it’s actually pretty amazing if you go in and look at it. You can go in and see any ad running that has political or issue content, directed at anyone. The problem before is that if you weren’t in the targeting group — you know, if I were targeting Peter — you couldn’t see those ads, but now it’s open and transparent for everyone. I think that is going to help people surface problems. I think people are going to find more things, and that will help us learn, pull them down and build the tools to prevent those in a more automated way.
Kara Swisher: So one of the things you were talking about before — it’s like there’s this before and after. Like, “Oh, no. We got woke over here at Facebook” or something.
Sheryl Sandberg: I think there is a before and after ...
Kara Swisher: Yeah, all right. It does.
Sheryl Sandberg: ... and I think that’s appropriate.
Kara Swisher: Right. But many people now have hostility toward Facebook and toward the tech industry because of it. Obviously people, fair or not, blame Facebook for the election, or its part in it — I think its part in it, and other things. From your perspective, what’s the overall impact on the tech industry and the country at large? What responsibility do you actually feel for what happened?
Sheryl Sandberg: For what happened in the election or just in general? In the election?
Kara Swisher: Both.
Sheryl Sandberg: Look, the story of this election is going to be studied for a long time, and I don’t think any of us have perfect answers. We’re committed to helping to find those answers. I think we’re unique: we’ve set up an election commission, and we’re giving third parties — researchers — access to data. They are going to report publicly on what they find, and we’re cooperating deeply with that. I don’t think we know ...
Kara Swisher: What about ...
Mike Schroepfer: I mean, I think at the heart of this is you’re asking a question about responsibility. People have responsibility for the impact of the tools they build, not just the existence of those tools.
Kara Swisher: Yes. That’s the drum I like to bang.
Mike Schroepfer: Right, and I think that the days in tech of just, “Hey, I built these tools. I’m not responsible for what happens with them,” are sort of over. You really need to have a deep responsibility to think about not just the good that these tools can have, but the bad, and what are all the things we can do to guard against it, and is the weight of these tools more in the positive than the negative. I think that’s, again, the big cultural shift that I think a lot of people in tech have to make to really think about this in advance, not just after the thing was created.
Sheryl Sandberg: And I think tech has long been, as an industry, pretty insular, and I think that’s changing, too.
Kara Swisher: Really? I know.
Sheryl Sandberg: Yeah.
Kara Swisher: No, I think that it’s true.
Sheryl Sandberg: Yes, so that’s changing, too. With what’s going on right now with elections, we’re working much more closely with governments. Like some of the stuff we found in Germany, we found with the German government. We worked with the French government. We’re working with local authorities around the world. I think, again, opening up our community standards, being more transparent — that enables people to find things, and everyone to work together. Because with a lot of these threats — we definitely take responsibility, but bad actors will go from platform to platform, so the more we can cooperate, the better, and our industry is doing a better and better job at that. When we find a bad actor, we cooperate on that, and some of the legal changes have made it so that we can pull them down and so can everyone else. I think that’s important.
Peter Kafka: Kara talked about the hearings, and how comically inept some of those questions seemed to be.
Sheryl Sandberg: She said that ...
Peter Kafka: Yeah. I know.
Sheryl Sandberg: But she’ll run, and then she’ll be up there, and then she’ll understand.
Peter Kafka: But even people who were knowledgeable — I mean, if you ask two different people what’s wrong with Facebook, they’re going to give you different answers. Some are upset about the Russian ads and some are upset about Diamond and Silk. Are those the names? Did I get it right?
Sheryl Sandberg: Uh-huh.
Kara Swisher: Yeah.
Peter Kafka: Do you feel like the U.S. government and governments in general are really ready to engage in a technical discussion about how to fix specific problems with Facebook or other parts of the tech business?
Kara Swisher: Yeah, because, I mean, a lot of people are like, “Oh, Mark did well.” But largely because he didn’t sweat, apparently. It was a ridiculously low bar. Everyone was, “Mark did well.” I was like, “No, they did badly. Sure, he did better than they did” — which was not very hard. But are they able to understand and legislate well? Because some of the calls are for breaking you up, for example.
Sheryl Sandberg: So the question of regulation is a real one and a deep one. It’s not a question of, as people say, “Should Facebook be regulated? Should other companies, our industry, be regulated?” We are regulated. We’re regulated on privacy. We’re all regulated under GDPR, which Evan talked about.
Kara Swisher: Not much. Compared to other industries. I think most ...
Sheryl Sandberg: And it’s not really a question of if there’s more regulation, the question is what regulation. We’re working closely with regulators around the world. Evan made a really important point that we feel deeply, too, which is regulation often, actually, entrenches big companies.
Kara Swisher: Yeah.
Sheryl Sandberg: So GDPR, I think we’ve done a very good job complying with. We’ve put up expensive-to-build systems and tools and controls for people.
Kara Swisher: You have a lot of lawyers, too. Yeah.
Sheryl Sandberg: Yeah, but if you look at what we built, if we were a startup 10 years ago, we wouldn’t be able to build all those settings and get them out, and we’re making those available to the world. So, we’re supportive of the right regulation that supports innovation that is based on an understanding of the technology, and that is good for people, and there are some of those examples.
Peter Kafka: But we spent 20 minutes talking about what a complex system you’ve built and how difficult it was for you to figure out the problems and how you’re going through it now. I mean, do you really imagine this is something where a bureaucrat or legislators are going to be able to sort of keep up to date with what’s going on with your various platforms?
Sheryl Sandberg: I mean, look. It’s hard. There are funny examples from history. In the United Kingdom, when the car was invented, they passed a law saying that in order to operate a motor vehicle, you needed two people: one behind the wheel and one walking in front of the car with a red flag.
Peter Kafka: That’s pretty good.
Sheryl Sandberg: That will absolutely save lives, but you don’t get the car. I mean, I’ll ask the audience a question. Who here answers a call if there’s no caller ID? If you don’t see the number? Raise your hand if you will answer that call.
A couple of you. I’m going to call you. But most people won’t. When caller ID first came out, the state of California tried to pass a law against it because it was considered a violation of the caller’s privacy that you would know who was calling. So, there are laws that are clearly either contemplated or passed that are bad ideas. There are also laws that are good ideas. I think people feel pretty good about GDPR and the controls it’s given people, and so it’s our job to work closely with regulators and legislators all over the world so that if there’s more regulation — and when there’s more regulation — it’s the right regulation.
Kara Swisher: So, let me ask that, Schrep first. Should Facebook be broken up? Google and Facebook, should they be broken up?
Mike Schroepfer: Look, I think there’s two things. One, is there competition in the market? And if you look at many of the products we build — if you want to share a video, YouTube’s a better place to do it. If you want to have a public conversation, Twitter’s a great place to do it. If you want to send a message, there’s Snapchat, there’s WeChat, there’s LINE, there’s iMessage, there’s any number of things out there that you can use to send those messages. So consumers are smart. They use the products that they want. We’re a very small part of the overall ads business, so I think we’re honest when we say we feel competition all the time.
The other thing is what we’re able to do in tackling a lot of these issues, which are the same across the platforms. We’re able to take the same technology we’re using on Facebook to deal with objectionable content, hate speech and bullying and immediately apply it to Instagram at a massive scale, and I think that’s a really big benefit of where we are today.
Kara Swisher: So your answer is no?
Mike Schroepfer: No.
Kara Swisher: Okay. What about you?
Sheryl Sandberg: No. For all the same reasons.
Kara Swisher: Yeah.
Sheryl Sandberg: You want me to say more?
Kara Swisher: Yes, please.
Sheryl Sandberg: Yeah, I mean, look. This is a question fundamentally about competition and what are the benefits to consumers of being together. I think it’s what Schrep said. I’ll share a specific example. If you were doing child-exploitative content, WhatsApp’s encrypted, but we know who you are from Facebook. We can take your account down on WhatsApp, too. So there are real benefits, and I think the real question is do consumers have a choice? I think along every product we have, there is a lot of choice out there.
Peter Kafka: Do you think you’ll be allowed to buy another WhatsApp or another Oculus, or do major acquisitions like that now, in the way you’ve been able to in the past? Microsoft essentially was really restricted in terms of what they could buy. Google, there are a lot more eyes on them because of their size. It seems like you guys are going to be there now.
Sheryl Sandberg: Well, certainly as you get bigger, there’s more scrutiny of acquisitions, and there should be. So, we’ll see. It really depends what it is. If it was in something that wasn’t core to what we were doing in a new area, like Oculus was, I think it would probably be allowed.
Peter Kafka: How are you getting along with your fellow tech giants? You’ve been competing with Google for a long time. Tim Cook recently told Kara some things that in the real world would be considered very mild, sort of dinner conversation, but in Silicon Valley apparently it was considered a rough attack. How are you getting along with Apple and Google?
Kara Swisher: Yeah. What did you think of what he said?
Peter Kafka: Or that one?
Sheryl Sandberg: Yeah, I mean, look, the conversation that you had with Tim and the stuff Apple’s saying is important to them. Right? They have a product they feel strongly about. Won’t shock you to know that Mark and I strongly disagree with their characterization of our product. We’re proud of the business model we built. We have an ad-supported business that allows people all around the world to use a product for free, and if you’re trying to connect the whole world, that’s pretty important. So we respectfully disagree.
Kara Swisher: Okay. What about you?
Mike Schroepfer: Same. I mean, I think that the thing that I wish we could spend more time on is the substance of these issues, because there’s times when you can get nice clippy soundbites and sort of kick someone when it’s popular and they’re down. That’s us right now. I get it. We, in many ways, deserve it.
Kara Swisher: He did go on and have a very cogent discussion about it, but go ahead.
Mike Schroepfer: But I think there’s lots of questions on trade-offs, so how do you build a product that the whole world can use, like what are the different business models that work? Can every consumer afford a $10-a-month subscription or a $700 device?
Kara Swisher: Right.
Mike Schroepfer: And for billions of people around the world, like, no, not yet. So I think that there are trade-offs there. And in all of these things ... I think, as an engineer, what frustrates me is there are deep issues in a lot of these things, and mistakes that we’ve made, and things I really wish we had done differently, but in many cases you face these really hard trade-offs: you can have more of something, and then you’re going to have less of something else. I can make you more safe, but then we’re going to make some mistakes and take down some things that we shouldn’t have taken down. That is a balance in all of this.
Peter Kafka: Are you guys thinking about an alternate Facebook that’s ad free and/or paid? Is that a product that you’re working on?
Mike Schroepfer: I’m sorry?
Peter Kafka: Are you working on a paid product?
Sheryl Sandberg: I mean, we’ve looked at subscriptions, and we’ll continue to look at them, but we’re committed to continuing to provide a free service, because it’s core to the mission of what we do.
Kara Swisher: But how far are you along on a paid service?
Sheryl Sandberg: We’re looking. We’ve always looked. But really the heart of the product is a free service, and again, we think that’s really important.
Kara Swisher: I’ll try you. How far are you along on a paid service? I don’t like that answer.
Mike Schroepfer: I’m not trying to be one of the people that’s fired over all of this tonight.
Kara Swisher: Well, no one’s getting fired, apparently.
Peter Kafka: I want to ask about ...
Sheryl Sandberg: That is not ... That is not what we said.
Peter Kafka: You guys do sell some stuff. Everyone in the audience has an Oculus Go. It’s 200 bucks, right?
Sheryl Sandberg: Yeah.
Mike Schroepfer: Yeah.
Peter Kafka: Explain again why you’re in the hardware business, and that’s not your first hardware product.
Kara Swisher: It’s because they’re really ...
Sheryl Sandberg: Yeah, you really like it.
Kara Swisher: They’re super sorry about the Russians, so everybody gets an Oculus Go. No, I’m teasing. Thank you.
Audience member: [Yay!]
Sheryl Sandberg: Thank you.
Kara Swisher: Yes, let’s talk about VR.
Sheryl Sandberg: Did you hear what he said? Accessible VR.
Mike Schroepfer: Accessible VR. Can we bring him up onstage?
Peter Kafka: Yeah, yeah, yeah, yeah.
Mike Schroepfer: He’s doing a better job on it.
Sheryl Sandberg: He likes it.
Kara Swisher: But talk about VR. This is like because it’s ...
Peter Kafka: I want to ask two questions. One, why are you making your own hardware and selling it? And two, why are you in VR? Again, Tim Cook says you should be in AR, not VR.
Mike Schroepfer: Okay, great questions. We’ll do the first one first, which is: if you see the Oculus Go, it’s a $199 product that you pull out of your bag and you put on your head and you’re in VR. No headphones, it’s got built-in speakers. It’s at a price point that many people can actually afford. It doesn’t require a PC, or docking your phone, or anything else like that. There’s a tremendous amount of engineering that goes into making that product sellable at that price. I know when we set that target, the team was like, “You can’t do it.” Right? So the only way we know how to do it is by doing all the work ourselves, so that we can make the right trade-offs needed to sell this product and build the ecosystem around it.
Toward the bigger question of, hey, why VR at all: it’s the only technology I can think of that’s going to build the closest thing we have to a teleporter or transporter from Star Trek, which is, “I want to be somewhere else or with someone else very far away, and I can’t afford or don’t have the time to take the long flight to get there.” And VR ... There’s an app that’s coming out that will take you to the Natural History Museum in London. You can actually pick up specimens from the drawer that are so sensitive the scientists themselves can’t touch them, because they’ll break them. They’re fossilized. You can make them different sizes. You can see what it’s like to see a pterodactyl in flight, right? And that’s an experience that we can bring to hundreds of millions or billions of people and children all over the world through VR. I don’t know of another technology that can do that.
Peter Kafka: And you think this is a mass product — because it has not taken off yet — and you think that’s just a function of expense and difficulty in getting the stuff up and running?
Mike Schroepfer: I think this is an early, early market and an early product, and so we’re pushing the market forward here. I think, as someone in the audience here said, this device is the first one that didn’t involve a bunch of asterisks on the end — “then do this, then do this,” and by then you’re like, “I’m done. I got something else to do.” This is just: put it on, you are in Jurassic World looking at dinosaurs. You’re in the Natural History Museum. You’re seeing an NBA game courtside. Right? These are things that you can’t scale out via any other technology in this way.
Peter Kafka: It’s a different experience for you guys, right, to get in early on a market — to buy in, spend a lot of money on a company where the market doesn’t exist yet — as opposed to WhatsApp, Instagram. Those were widely used by lots and lots of people when you bought them. You made a big early bet ...
Mike Schroepfer: I mean, remember, I helped with onboarding Instagram when it had 10 million users, when we bought it. It’s grown quite a bit since then.
Sheryl Sandberg: It was a good buy.
Mike Schroepfer: But you’re right. I think it’s a new market, there’s a lot of new technology. As amazing as the Go is, we have multiple years of R&D in the labs that we’re ready to bring to subsequent products that can take this even further. So, when you look at a space and you say, “If I can build the product, I know people will love it,” then you ask the question, “Well, can we build it?” I’m pretty sure we can in VR.
When you go to AR, everyone’s like, “Yeah, it’d be amazing to have these super awesome glasses that give me this full 3-D world,” and then you actually go and you look at the physics of it and battery life and all of the rest of it and say, “There’s a bunch of stuff that doesn’t exist in the world yet that we need to go invent to make that happen.” So we’re working on that too, but I don’t think that’s coming to market in 2019.
Kara Swisher: And Sheryl, was that a reaction? I mean, you all tried the phone and others succeeded, at Google and Apple particularly. Is it a reaction to having to be in the hardware market or do you ever imagine Facebook going back into phones?
Sheryl Sandberg: I don’t think we’re talking about going back into phones. I think this is an exciting new area, it’s possibly a new platform, it can be a very social experience, and we’re excited about it.
Kara Swisher: All right, last question and then we’ll get to ... How do you think this past year has affected your business for the long term? And Schrep, you’re not as public as Sheryl is. Sheryl, how has it affected your image?
Sheryl Sandberg: I don’t think any of our individual images are the point. The point is the responsibility we bear for the platform and protecting people going forward.
You know, in terms of the business, we don’t make decisions for the short run. We don’t have to and we shouldn’t; I don’t think any company should have to. We have founder control and protections in place, and we’re very clear that we’re gonna make the investments we need to make. I don’t think there’s a trade-off between doing the right thing and the business over any reasonable timeframe.
Kara Swisher: Has it affected ... You’re the one that interfaces with advertisers the most.
Sheryl Sandberg: Yeah, and we’ve had a handful of advertisers pull out; some have already come back. I don’t think it’s affected our short-term business. But it affects ...
Peter Kafka: What about user behavior or engagement? You measure how they feel about it.
Sheryl Sandberg: Yeah. I mean, we’ve looked, there’s certainly an impact, but I don’t think it’s detrimental right now to the current business. But it matters, and we’re investing because we want to do the right thing. We’ve always wanted to do the right thing. I think we were very taken with the social experiences and now we’re very taken with the need to provide safety, security, integrity on our platform.
We also — again — approach this understanding that this is gonna be an arms race. This is gonna be an arms race; we’re gonna do some things, someone else is gonna do something, we’re gonna have to do better. And there are risks ahead of us we have not yet seen, and so we want to make sure we’re working closely with other companies, working closely with government, closely with civil society around the world so that we deeply understand what’s happening on our platform.
We also really want to protect the good. I mentioned this. I was in Houston, I met this guy, he owned a taco store. When Hurricane Harvey happened, he had lots of food but no ability to bring it to anyone. He met a guy on Facebook who owned a taco truck; they were competitors, they didn’t know each other. He put his food in the truck and they used Facebook to see where people were checking in and drove around feeding them. Something like that happens on Facebook every day — I don’t mean that to be Pollyannaish, but it matters. And we care. We care about protecting that. Those people were all able to be fed because they had shared publicly on Facebook where they were, and so people have to trust us — that they can share not just in an emergency but in a daily way, during an election, during a difficult time for them personally or a difficult time for a country. People have to trust us.
And so the responsibility we take to earn people’s trust and take real action to prevent the harm while protecting the good, we’re about as serious as we know how to be about it.
Kara Swisher: I’m gonna ask this one more time: How has it affected you? I know it’s not about you personally, but you know, Evan just talked about the difficulty of doing the Wall Street stuff. I want you each to say, if you can, if you have human emotions ... No, I’m teasing you.
Mike Schroepfer: I am programmed with human emotions.
Sheryl Sandberg: Kara, Kara.
Kara Swisher: No, I’m trying to get you to say it. Like what ...
Sheryl Sandberg: Have you read my books?
Kara Swisher: Yes, I have. Yes, yep. But how has it affected you?
Sheryl Sandberg: You don’t get to say that to me.
Kara Swisher: What has it changed ... I know that. I know. I mean, what is ...
Mike Schroepfer: She can say it to me.
Kara Swisher: How has it affected you as an executive? How about as an executive, not as a ...
Mike Schroepfer: I am programmed with human emotions. Very advanced sub-system, so ...
Kara Swisher: You worked together forever. Something broke here.
Mike Schroepfer: Look, it’s never fun to ... We all read the news every day and see everyone, you know, mad at us and upset at us and hating on us, and as an individual, that’s not fun. But I don’t think we deserve any sympathy, because our job is to build this platform in a way that makes sense. And you know, the fact that there are some real issues there makes it harder, because it’s not just BS you can wave away — terrible stuff does happen on the platform. And that’s the stuff that really gets you down: this awful thing happened. My gosh, what are all the things I wish we could’ve done to fix it, and what are all the things we’re gonna do now? It’s not fun to be in, but it’s really important work. So I don’t know if that helps ...
Kara Swisher: And you?
Sheryl Sandberg: It’s hard, but it should be. I mean, it’s hard because ...
Kara Swisher: So what did you learn as an executive?
Sheryl Sandberg: I learned that we needed to invest more in safety and security, I learned that we needed to try to find the new threat. And I sit here feeling pretty confident that we’re doing a much better job than we were before on the threats we know of today, and feeling a lot of, you know, need to figure out what the next threats are and knowing that we won’t do it alone and knowing that we need to work in a much more transparent and open way, because I think that’s the only way we’ll be able to find the next threat. But I take that really seriously.
Peter Kafka: Did you get the answer you want?
Kara Swisher: I think I did, yes.
Peter Kafka: Can we open up to the audience?
Sheryl Sandberg: You know what I’m saying.
Don Graham: So I’m Don Graham of Graham Holdings. I want to identify myself as, for a long time and always, a friend of Facebook, but for years not an insider; I don’t know what’s going on inside.
Kara Swisher: You were on the board.
Don Graham: I was, up to three years ago, Kara.
Kara Swisher: Still.
Don Graham: So I want to say, this is not the greatest compliment you’ll ever have, that the Kara/Peter questioning here is a much better version of the conversation that the senators and Mark had in Washington, D.C.
Kara Swisher: We’re gonna run.
Don Graham: You’ll get better compliments in your life. But there’s one thing that’s happened since that conversation, and that’s that Facebook has actually announced to us users a series of what sound like very difficult changes that you’ve made on the platform. And I wish that, since you’ve recapitulated the conversation with Mark, Sheryl would summarize the changes you’ve made and also tell this audience, some of whom I think are on Facebook, about product changes that you’re planning to make in the next few months to address the questions that Kara and Peter have asked.
Peter Kafka: Let’s do the news part. Tell us what’s coming.
Kara Swisher: And clear history.
Mike Schroepfer: New news, you’re saying?
Don Graham: Yeah.
Mike Schroepfer: Still trying to get me fired tonight.
Kara Swisher: You’re not getting fired.
Mike Schroepfer: Thank you, Don, it’s a great question. So there are initiatives in each of these categories, so I’ll try to just give a high level ... And I think the fundamental of it is, as Sheryl said, a whole heck of a lot more transparency on what’s happening and a whole heck of a lot more proactively taking these things down. So when you look at news, it’s down-ranking of clickbait-y articles, disrupting economic incentives. So if you’re being sent to a site that’s basically just an ad farm, we figure that out and down-rank it. It’s using third-party fact-checkers, as you mentioned, in all 50 states. You know, showing up for local elections. It’s “about this article,” so you can get more information on the provenance of the article. Lots of things to help consumers better understand exactly what’s happening in the news. That’s just, you know, a small set of things overall in news.
In the broader platform, there’s been a number of places where we just looked at every nook and cranny of the platform and figured out where we can either just completely deprecate APIs or require more review in all cases, so that we’re reviewing the applications not just for what they do but making sure that there’s minimal use of data in all regards. We put a notice in front of everyone on Facebook about what apps they’ve used, so you can go in and see what apps you’ve used and delete them if you don’t want them. If you haven’t used an app in 90 days, we’ll auto-disconnect it from your account, so it can’t ... You know, you use an app and then three months later it pulls more data — that’s just stopped, that’s broken.
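[To make the 90-day rule Schroepfer describes concrete, here is a minimal sketch of the idea: an app connection that hasn’t been used within the window loses its data access. The data structure and field names below are illustrative assumptions, not Facebook’s actual implementation.]

```python
# Minimal sketch of auto-disconnecting apps unused for 90 days.
# The in-memory structure is hypothetical; only the rule itself
# comes from the conversation above.
from datetime import datetime, timedelta

INACTIVITY_LIMIT = timedelta(days=90)

def expire_stale_connections(connections, now):
    """Disable any app connection not used within the inactivity limit."""
    for conn in connections:
        if now - conn["last_used"] > INACTIVITY_LIMIT:
            conn["active"] = False  # the app can no longer pull data
    return connections

# Example: an app last used four months earlier gets auto-disconnected.
apps = [{"app": "quiz_app", "last_used": datetime(2018, 1, 2), "active": True}]
expire_stale_connections(apps, now=datetime(2018, 5, 30))
print(apps[0]["active"])  # False
```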
Kara Swisher: All right, new. New.
Peter Kafka: What’s the new stuff?
Mike Schroepfer: That’s all stuff we launched in the last like two months.
Kara Swisher: All right, new ...
Sheryl Sandberg: Clear history.
Peter Kafka: We’re not gonna get new ...
Mike Schroepfer: Clear history is the thing that we announced but haven’t yet shipped, right? Which is, if I want to disconnect the data about the apps and websites I’ve used from my Facebook account — kind of like clearing my history or my cookies in my browser — you can do that.
Peter Kafka: Can I extract my information from Facebook if I’m not a Facebook user? Can I go to you and say, “What do you know about me, and by the way, can I have that data back?”
Mike Schroepfer: The challenge is we don’t have a personal profile for you on Facebook, so we don’t actually even know how to identify you in that data ...
Peter Kafka: Are you guys thinking about how to solve that?
Mike Schroepfer: Well again, in many cases you have cookie data from a device or from a browser, but I don’t know which person this is associated with, and so it’s pretty hard to get that data back for an individual.
But there’s, you know, we can ... I don’t know if that fully answers the question. Is there anything else that I ...
Sheryl Sandberg: Yeah, well, clear history. We’re taking the GDPR settings and controls, they’re out in Europe obviously, but they’ll be coming to the rest of the world in the next number of months.
So along the way, things around data, things around news, things around elections, things around fake accounts and content.
Peter Kafka: Sam, don’t ask them to disclose new product stuff.
Sam Schwartz: Hi, Sam Schwartz from Comcast. We talked a lot about transparency, especially around the source of ads, Russian bots, those kinds of things. But I’m really confused about the News Feed algorithm itself on Facebook and transparency around that. I see all kinds of stories on mine, I never know why they’re ranked in the order that they are, and studies show that the News Feed has the power to influence the moods of billions of people.
How do we grade you on that awesome responsibility to ... As a for-profit company, I would assume some of that’s done for my benefit and some of it’s done for yours. How do we grade your curation of the News Feed?
Mike Schroepfer: It’s a great question. I mean, first of all, the content of your feed is dominated by who you friend and what pages you like. So the easiest way to adjust the content of your News Feed is to adjust that. And then on a story basis, we’re introducing more and more controls to allow you to mute a particular person, or to unfollow them so you can still be friends but their stuff doesn’t show up in feed. And ads, for example — there’s a really useful control that says “why am I seeing this ad?” that gives you pretty great detailed information on exactly why this ad was shown to you.
So I think in every case what we’re trying to do is give you controls; when you’re looking at this, like, why did I get this story, we should help you answer that question on the spot, which is what we’ve found is most effective for these sorts of things.
Kara Swisher: And in terms of hiring people ... Sheryl, you talked about this, this crowdsourcing of what our news sources are. Where is that? The idea that you would have your community rank sources.
Sheryl Sandberg: So we’ve done a lot in news, right? Probably the most important thing we’ve done in news — and it has taken down distribution across the board for news partners and news publishers — is that we’ve really taken very strong action on clickbait and on sensationalism, and then we did meaningful social interactions: really getting back to the heart of what Facebook was, which was a place to connect with family and friends. We heard from people that they wanted more friends and family, less video, less public content, less news, so those signals got taken into account.
We also really care about psychological well-being. So we started looking at this — and we’re gonna continue to look at this, this research is ongoing — and we found that when people are interacting with content where they’re actively engaged — friends, family, they “Like,” they comment, they share — that’s very positive, but it’s not as positive when you’re a passive consumer. So that also meant the signals went toward more friends and family, less news.
Then within news, we want news that’s trusted, that’s real, accurate information, and we also really care about local. And this is hard. There’s no perfect way to do this. But what we did on trusted is we went out to the community at large and we asked people to identify news sources they were familiar with — not that they read, but that they were familiar with, because if you hadn’t heard of them, it’s not fair to rank them — and then, do you trust them? That was one signal that was used to increase distribution for some news sources and decrease distribution for others, and it really hit, I think, some of the more sensational sources.
We’re also prioritizing informative content, again working hard with third-party fact-checkers to mark fake news and really dramatically decrease its distribution, and we also prioritize local. We’ve announced that we’re gonna be supporting local news; we are gonna make sure that people see local news, and hopefully accurate local news, in their News Feed. We think it’s important.
Peter Kafka: Someone’s gotta make that local news and someone’s gotta figure out a business model that makes that local news possible, and that’s ... I mean, you guys are playing around the edges of that, but it seems like if you just wanted to cut people a check, that would help a lot of local newsrooms survive.
Sheryl Sandberg: Well, we’re thinking about what we do in local news and considering things.
Peter Kafka: Okay. Real quick.
Ina Fried: Ina with Axios. On Cambridge Analytica, I get it — in 2015 they certified they deleted it, and you thought they deleted it. The question I haven’t heard a clear answer to is: when the Trump campaign, which you had people working within, suddenly had all this data on voters, how was no one — either higher up at Facebook or working with the campaign — suspicious of where they got that data?
Sheryl Sandberg: They didn’t have any data that we could’ve identified as ours. To this day, we still don’t actually know what data Cambridge Analytica had. We are trying to do an audit; the British government came in and put our audit on hold so they could do theirs. But we did not see data that we thought was from Facebook; otherwise we would have acted on it.
Ina Fried: Did you see a suspicious amount of data that they knew more about voters or that ... Not really?
Sheryl Sandberg: No, not really.
Audience Member: In Sri Lanka, the government recently had to shut down Facebook because of fake news that led to violence. And so I was just wondering practically, on the ground in international locations, what are you doing in order to combat those sorts of situations?
Mike Schroepfer: Yeah, there have been issues in Sri Lanka and Myanmar and others, and as I said earlier, this is the worst thing to see — when people weaponize this platform and it causes real-world harm. The challenge here is getting, as you say, people on the ground in the country who understand the landscape — the cultural landscape, the nuances of the languages, the NGOs to work with, the folks to work with there — to help understand where the issues are and where we need to intervene. And so that’s been our focus: to literally just get more people on the ground in each of these countries who can focus on that, and then have product teams in the company who, when they get feedback about changes we need to make there, can deal with that.
We’re also looking at technological solutions. A lot of the AI tools that we’ve built require large amounts of training data, for those familiar, and that training data’s readily available in the bigger languages. But in languages like Burmese, it’s just ... It’s not as good. And so it’s actually one of the core focuses of our lab to figure out how to take, you know, a classifier in one language like English and carry it over to a language with very little data, like Burmese, so we can immediately deploy some of the technology we’ve built for other languages there. We’re kind of doing all these paths in parallel because we want to solve this as quickly as we can.
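[A minimal sketch of the cross-lingual transfer idea Schroepfer describes: fit a classifier on labeled English text, then score posts in a low-resource language zero-shot through a shared multilingual embedding space. This is not Facebook’s system; the encoder model, example texts and threshold are illustrative assumptions, and in practice such scores would only prioritize posts for human review, in line with the humans-plus-technology point made earlier.]

```python
# Sketch: train on English labels, score another language zero-shot.
# Assumes a multilingual sentence encoder whose supported languages
# share one vector space; the model name and examples are illustrative.
from sentence_transformers import SentenceTransformer
from sklearn.linear_model import LogisticRegression

encoder = SentenceTransformer("paraphrase-multilingual-MiniLM-L12-v2")

# Labeled data exists only in English (1 = violating, 0 = benign).
train_texts = [
    "I will hurt you if you come here",        # hypothetical violating post
    "Happy birthday! Hope it's a great one",   # hypothetical benign post
]
train_labels = [1, 0]

# Fit a simple linear classifier on the English embeddings.
clf = LogisticRegression().fit(encoder.encode(train_texts), train_labels)

# Because the embedding space is shared across languages, the same
# classifier can score posts in a language with little labeled data.
new_posts = ["..."]  # e.g., Burmese text awaiting review
scores = clf.predict_proba(encoder.encode(new_posts))[:, 1]
flagged = [post for post, s in zip(new_posts, scores) if s > 0.9]
```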
Kara Swisher: All right, last question from me: Do you feel like your company does understand the responsibility you have now? Obviously Mark does, you two do — but does your whole management team feel that?
Sheryl Sandberg: I think we feel it really deeply. I think we’re making huge investments, really huge investments; they’ll hit our profitability. We think those are the right things to do and I think we know it’s an arms race, that we sit here knowing what today’s problems are, feeling more responsibility for the future, and knowing we need to protect people who are using our platform.
Mike Schroepfer: I mean, as I said earlier, it’s the biggest cultural shift I’ve seen in the company in the whole time I’ve been there by a pretty wide margin.
Kara Swisher: All right, thank you so much.
Mike Schroepfer: Thank you.
This article originally appeared on Recode.net.