On the latest episode of Recode Decode, hosted by Kara Swisher, Facebook CEO Mark Zuckerberg sat down with Kara to talk about Cambridge Analytica, why Infowars is still on Facebook and the danger of over-regulation, among many other topics.
You can listen to our entire conversation right now in the audio player below. If you prefer to listen on your phone, Recode Decode is available wherever you listen to podcasts — including Apple Podcasts, Spotify, Pocket Casts and Overcast.
Below is the full transcript of the conversation. You can read a more condensed, lightly edited version here.
Kara Swisher: Mark, thank you so much for talking to me.
Mark Zuckerberg: Happy to do it.
This is our first interview in years, right? We’ve seen each other.
Yeah. Happy to do it.
No problem. I’m gonna start off with the news of the day. You saw the Putin/Trump press conference, essentially.
I saw the news about it.
You saw the news about it. Tell me what you think about his idea that there is no evidence that the Russians used social media, and did different things during the election.
Well the evidence that we’ve seen is quite clear, that the Russians did try to interfere with the election. What we saw-
This is on Facebook?
Yes. All of what we saw is on Facebook. Then we’ve tried to cooperate with the government and the different investigations that are going on. They obviously have much more context than this. But what we saw, before the election, was this Russian hacking group, part of Russian military intelligence, that I guess our government calls APT28. They were trying to do more traditional methods of hacking: Phishing people’s accounts, just getting access to people’s accounts that way.
We identified this, actually, in the middle of 2015 and notified the FBI. When we saw similar activity through the campaign in 2016, that they were trying to phish people’s accounts in both the DNC and RNC, we notified some of the people over there as well, [who] we thought were at risk. Later we also identified that they had set up a fake account and fake pages under the banner of, connected to this thing, DCLeaks, in order to seed stolen information that they had gotten to journalists.
We, around the time of the election, had given this context to the FBI. They’ve clearly gone much further now, at this point, in terms of putting the whole story together. You could see that in the indictments that Mueller just issued over the last week or so. That’s the part that I actually think we got, and were on top of.
Now, there’s a whole other area of election interference that we were slower to identify. That’s around the coordinated information operations that they were trying to run, and that was a different group. Instead of APT28, that was this group, IRA, the Internet Research Agency, which basically was just setting up a network of fake accounts, in order to spread divisive information.
Yeah. Misinformation. Divisive information.
Using advertising content in a variety of ways.
Yeah. Well both advertising and organic — so setting up pages and using the free products.
Once we became aware of this, and we think we were too slow to be on top of it, we developed this whole roadmap and set of techniques to go and handle that type of security threat, in addition to the type of phishing and more traditional cyber attacks that we had seen before. That takes us through all the elections that we have seen since then: the French presidential election, the German election, the Alabama special election, the Mexican election recently, and elections all around the world.
Now the playbook is, we build AI tools to go find these fake accounts, find coordinated networks of inauthentic activity and take them down; we make it much harder for anyone to advertise in ways that they shouldn’t be. A lot of tools around ad transparency, to make it so that anyone who is advertising, especially around political issue ads, will have a lot of the information, a very high standard of transparency. Higher than what you have in TV or print, or other kinds of ads there. And in the U.S., we’re also even going so far as verifying the identity and location of every single advertiser-
Who wants to run a political or issue ad, which, for a lot of folks, legitimate folks, has slowed down the process of buying ads, which I think can have its own costs for discourse, but we just think is the right precaution to be taking on this.
And yeah, that’s probably a longer answer than what you were going for.
No, no. That’s all right. So, you believe it’s the Russian government, from where you’re sitting, was using or misusing Facebook? You believe it was the Russian government? Unlike Trump, you believe it was the Russian government?
The information that we have on who these groups are largely comes from the U.S. government and U.S. intelligence.
So you believe U.S. intelligence?
We have no reason not to. Certainly, we’ve seen the activity from APT28. That name comes from U.S. intelligence: Advanced Persistent Threat 28, from Russia. And the IRA. These are real things. These aren’t things that someone made up. We saw this activity. We went out, we traced IRA activity, not only through what they’ve tried to do in the U.S., but we’ve traced that activity back to trying to manipulate culture and news in Russia itself, including taking down pages there that are connected to sanctioned Russian news organizations, which the government, Russcom, has said are real news organizations there, but which we’ve detected through our systems are actually essentially the same thing as the IRA. All the people who are running them are the same.
These things are real, and we’ve been aggressively pursuing them for the last couple of years. This is now just part of the ongoing playbook that we have for preventing these kinds of disinformation campaigns.
What took you so long? I think, as you know, many people feel disappointed with Facebook’s behavior and the slowness, given the power that you have, or the power over the market you have. I don’t wanna say what’s your excuse, but that’s kind of the question. What was the problem?
We just weren’t looking for these kinds of information operations. We have a big security operation. We were focused on traditional types of hacking. We found that and notified both the government and the people who were at risk, but there’s no doubt we were too slow to identify this new kind of attack, which was a coordinated online information operation.
You can bet that that’s now a big focus of the security effort that we have here. We’re very focused on making sure that we get this right, not just broadly, but in all the elections that are coming up. 2018 is an incredibly important election year, not just with the important midterms here in the U.S., but you just had the Mexican elections. You have Brazil. You have India coming up at the beginning of next year. There’s an assortment of elections around the EU. We’re very serious about this. We know that we need to get this right. We take that responsibility very seriously.
I know you say that, but I do wanna get at, do you reflect on what it was within you, ’cause you’re the leader here, you’re the head of this, that you didn’t see it? That you don’t see that side of humanity? Or, that you don’t understand your responsibility?
I’m not sure. I think… In retrospect, I do think it’s fair to say that we were overly idealistic and focused on more of the good parts of what connecting people and giving people a voice can bring. I think now we understand that, given where we are, both the centrality of Facebook, but also, frankly, we’re a profitable enough company to have 20,000 people go work on reviewing content, so I think that means that we have a responsibility to go do that. That’s a different position than we were in five or six years ago, or even when we went public and were a meaningfully smaller company at that point.
I do think it’s fair to say that we were probably… we were too focused on just the positives and not focused enough on some of the negatives. That said, I don’t wanna leave the impression that we didn’t care about security or didn’t have thousands of people working on it before then.
No, I don’t think that’s the case.
It’s not like ... I think that these are different kinds of threats that people widely didn’t anticipate, and that isn’t an excuse. I think it’s our job to anticipate this stuff on our platform and to make sure that people can’t use it for negative... This was a new thing. I think we understand that we were slow to it and need to do a better job both on this specific type of threat, defending against nation-states, which is not really a top-line thing that was a major focus before, even though there were some parts of the program that were doing that. I also think that we know that there are gonna be new threats in the future that we haven’t seen yet and that security is an arms race. It’s our responsibility to be as ahead of that as possible.
Some people feel you are a nation-state in a lot of ways.
We’re not. We’re a company.
You know that. You know people think of you in a powerful manner, I guess.
I think we have a lot of responsibility. The community, more than two billion people use our products, and we get that with that, a lot of people are using that for a lot of good, but we also have a responsibility to mitigate the darker things that people are gonna try to do.
What does that responsibility feel like? Do you think you have understood it? There’s a lot of ways ... Someone was saying to me, you can’t just pass power along. You have an enormous amount of power. Do you understand that? Do you think about that? Or, you don’t think you have?
I think we have a big responsibility, but I’m not sure what you mean by “pass power along,” but I actually think one of the things that we should be trying to do is figure out how to empower and build other institutions around us that are important and can help figure out these new issues on the internet.
One example, recently, is probably fact-checking. I don’t think that we should be in the business of having people at Facebook who are deciding what is true and what isn’t.
We’re gonna get into that in a second.
But I think that ... But someone has to have the job of doing that. Society needs people who can be trusted to vet things fairly and say, “This is provably false.” I think that there’s a role to help build an ecosystem and support that ecosystem. News, I think, is another topic that I’m sure we’ll get into.
Yeah, we’re gonna get into it in just a second. Before we get to that, and we are going to, the idea of other institutions, that’s a really interesting idea.
When we first met, if you remember. Do you remember it, when we first met?
I remember we went on a walk.
We went on a walk.
I don’t know if that was the first time.
I think it was. Owen Van Natta introduced us. Two things you did: You said, “I hear you think I’m an asshole,” because I had just joked to Owen about it.
And I said, “I don’t know you well enough to know if you’re an asshole or not, yet, but I will soon.”
One of the things that you did tell me that was striking was you called Facebook a utility. Do you remember that?
Yeah, I called it that for a while.
For a while?
At the time, you meant it was a useful system, in contrast to other internet companies that were much more about entertainment, or various things like that which were in the ascendance at the time. What do you call Facebook now?
I think that that is still a good description. In general, we’re a social network. I prefer that because I think it is focused on the people part of it as opposed to some people call it social media, which I think focuses more on the content. For me, it’s always been about the people, and the reason why I called it a utility was because a lot of people used to think of it as a fad. What I was trying to communicate was, no, building a network and building relationships is one of the most core things that people do, and that is an enduring utility that people need, that is not a fad. The company shouldn’t be run to try to build something that is cool, it should be run to build something that is useful and enduring. And I still believe that.
I think that there’s this notion today that a lot of the main uses of social networks are for sharing content. That obviously has a big impact on giving people a voice, and there are safety and security implications. There are media implications of that. When I think about what social networking should be... now you’ve mapped out all of the people who a person cares about. What are all the useful things that you can do for people on top of that? So I think about things like Marketplace, that we’re doing, that now people can have trust through their network and can basically go and buy and sell things more easily than they would be able to on other services.
Your choices are basically, you can use Amazon, which is a central service. You can use eBay, which is a community, because they have their broad reputation system, or you can trust people through the network. A lot of people choose to do that because they know people in common, and that feels better. It’s a better experience for them, so that is a really important example.
Other examples are things like Safety Check. There are disasters that happen — Hurricane Harvey came up, and you had people self-organizing through the community and getting in boats and driving around rescuing people coordinated ad hoc through this network. That’s not a media function. That’s a social network of people coming together ad hoc to provide safety infrastructure that the world needs, so that’s kind of more how I think about what we’re doing. My hope would be-
You’re talking about a city then.
I’m not sure if it’s a city. It’s social infrastructure, to be sure, but my hope is that if you fast forward five or 10 years, more of what people think about social networking will not only be the aspects around people sharing content, but also people coming together in these different ways.
Let’s talk about news. Let’s talk about news.
This has been ... every day seems to bring a new thing of people asking you to make determinations about what news is. The power you have over distribution is very clear — to publishers, to citizens and everyone else. How do you look at your role, ’cause you’re kind of an accidental publisher, in a lot of ways? Content, there was all kinds of content, but right now you’re being asked ... right now, as we’re doing this interview, there’s a Congressional hearing going on. In that case, conservatives think that you don’t give a voice to conservatives. Yesterday, I wrote a story, which I think you read, about how other publications think you give too much voice to those: “You shouldn’t have InfoWars on here.” Let’s talk about InfoWars. Let’s use them as the example.
Make the case for keeping them, and make the case for not allowing them to be distributed by you.
There are really two core principles at play here. There’s giving people a voice, so that people can express their opinions. Then, there’s keeping the community safe, which I think is really important. We’re not gonna let people plan violence or attack each other or do bad things. Within this, those principles have real trade-offs and real tug on each other. In this case, we feel like our responsibility is to prevent hoaxes from going viral and being widely distributed.
The approach that we’ve taken to false news is not to say, you can’t say something wrong on the internet. I think that that would be too extreme. Everyone gets things wrong, and if we were taking down people’s accounts when they got a few things wrong, then that would be a hard world for giving people a voice and saying that you care about that. But at the same time, I think that we have a responsibility to, when you look at… if you look at the top hundred things that are going viral or getting distribution on Facebook within any given day, I do think we have a responsibility to make sure that those aren’t hoaxes and blatant misinformation.
That’s the approach that we’ve taken. We look at the things that are getting the most distribution. If people have flagged them as potential hoaxes, we send those to fact-checkers, who are all reputable and have followed standard principles for fact-checking, and if those fact-checkers say that it is provably false, then we will significantly reduce the distribution of that content, and if someone-
So, you move them down the line rather than get rid of them?
Yeah, in News Feed.
Why don’t you wanna just say “get off our platform?”
Look, as abhorrent as some of this content can be, I do think that it gets down to this principle of giving people a voice.
Even if it’s a hoax.
Yeah. I mean, at some level, it’s hard to always have a clear line between ... I’m not defending any specific content here. I think a lot of the content that’s at play is terrible. I think when you get into discussions around free speech, you’re often talking at the margins of content that is terrible and what should ... but defending people’s right to say things even if they can be bad. Sorry, I lost my train of thought here. Where-
There’s a difference between offensive and hoaxes.
Oh yeah. Yes.
InfoWars. I want you to make a case for taking InfoWars off. If you were on the other side of it.
I think if you were trying to argue on the side of basically the core principle of keeping the community safe, I think you would try to argue that the content is somehow attacking people or is creating an unsafe environment. Now, let me give you-
Let me give you an example of where we would take it down. In Myanmar or Sri Lanka, where there’s a history of sectarian violence, similar to the tradition in the U.S. where you can’t go into a movie theater and yell “Fire!” because that creates an imminent harm. There are definitely examples of people sharing images that are taken out of context, that are false, that are specifically used to induce people to violence in those areas where there’s-
And violence has resulted.
Yes. We are moving towards a policy where misinformation that is aimed at, or going to, induce violence, we are going to take down, because that’s basically ... The principles that we have on what we remove from the service are: If it’s going to result in real harm, real physical harm, or if you’re attacking individuals, then that content shouldn’t be on the platform. There are a lot of categories of that that we can get into, but then there’s broad debate.
Okay. “Sandy Hook didn’t happen” is not a debate. It is false. You can’t just take that down?
I agree that it is false.
I also think that going to someone who is a victim of Sandy Hook and telling them, “Hey, no, you’re a liar” — that is harassment, and we actually will take that down. But overall, let’s take this whole thing closer to home...
I’m Jewish, and there’s a set of people who deny that the Holocaust happened.
Yes, there’s a lot.
I find that deeply offensive. But at the end of the day, I don’t believe that our platform should take that down because I think there are things that different people get wrong. I don’t think that they’re intentionally getting it wrong, but I think-
In the case of the Holocaust deniers, they might be, but go ahead.
It’s hard to impugn intent and to understand the intent. I just think, as abhorrent as some of those examples are, I think the reality is also that I get things wrong when I speak publicly. I’m sure you do. I’m sure a lot of leaders and public figures we respect do too, and I just don’t think that it is the right thing to say, “We’re going to take someone off the platform if they get things wrong, even multiple times.” (Update: Mark has clarified these remarks here: “I personally find Holocaust denial deeply offensive, and I absolutely didn’t intend to defend the intent of people who deny that.”)
What we will do is we’ll say, “Okay, you have your page, and if you’re not trying to organize harm against someone, or attacking someone, then you can put up that content on your page, even if people might disagree with it or find it offensive.” But that doesn’t mean that we have a responsibility to make it widely distributed in News Feed. I think we, actually, to the contrary-
So you move them down? Versus, in Myanmar, where you remove it?
Can I ask you that, specifically about Myanmar? How did you feel about those killings and the blame that some people put on Facebook? Do you feel responsible for those deaths?
I think that we have a responsibility to be doing more there.
I wanna know how you felt.
Yes, I think that it’s a terrible situation where there’s underlying sectarian violence and tension. It is clearly the responsibility of all of the players who were involved there: the government, civil society, the different folks who were involved. And I think that we have an important role to play, given the platform, so we need to make sure that we do what we need to. We’ve significantly ramped up the investment in people who speak Burmese. It’s often hard, from where we sit, to identify who the figures are who are promoting hate, and what is going to ... which is the content that is going to incite violence? So it’s important that we build relationships with civil society and folks there who can help us identify that.
But it’s not just Myanmar. It’s also Sri Lanka. We have a whole effort, a product and business initiative, that is focused on these countries that have these crises on an ongoing basis.
Again, I wanna know how you feel. How did you feel when that started to happen? And the blame was shifted a little bit to Facebook and how Facebook was used as a tool by these people?
Look, I wanna make sure that our products are used for good. At the end of the day, other people blaming us or not is actually not the thing that matters to me. What matters to me is how are people using our services, and are we acting as the force for good that I know we can and have a responsibility to [be]. It’s not that every single thing that happens on Facebook is gonna be good. This is humanity. People use tools for good and bad, but I think that we have a clear responsibility to make sure that the good is amplified and to do everything we can to mitigate the bad. When you hear that new bad things are happening-
What kind of responsibility do you feel? I’m just really ... I’d feel sick to my stomach. I’ll tell you. That would be my first ... I feel sick. “People died, possibly because of something I invented.” You could do sort of the old, “Facebook doesn’t kill people, people kill people” kind of argument. What does that make you feel like? What do you do when you see that? What do you do yourself? What’s your emotion?
I mean, my emotion is feeling a deep sense of responsibility to try to fix the problem. I don’t know that’s a ... That’s the most productive stance.
To do something.
Look, you can either look at this and say, “We should have predicted all of these issues ahead of time,” and some people think that. I tend to think that it is very difficult to predict every single thing. Now, some of these things I think we could have done better on, but I think you’re building something from scratch. There are going to be —
— challenges that come up that are things that we did not foresee. If we foresaw this, I think we might have missed something else. Now, that doesn’t make it okay, but what I think it means is that our primary responsibility, I don’t believe, is to foresee every problem before it happens as much as it is to, when we become aware of something to do everything we can to address it.
Let me give you another example. When Live came up, one of the terrible use cases where people were using ... There were a small number of uses of this, but people were using it to-
Show self-harm, or there were even a few cases of suicide. We looked at this, and we’re like, “This is terrible. This is not what we want the product to be. This is terrible, and if this is happening and we can help prevent it, then we have a responsibility to.” So, what did we do? We took the time to build AI tools and to hire a team of 3,000 people to be able to respond to those live videos within 10 minutes. Most content on Facebook, we try to get to within hours or within a day, if it comes up, and obviously, if someone’s gonna harm themselves, you don’t have a day or hours. You have to get to that quickly. With all the millions of videos that are posted, we had to build this combination of an AI system that could flag content that our reviewers should look at, and then hire a specific team trained and dedicated to that, so that way they could review all the things very quickly and have a very low latency.
In the last six months, we’ve been able to help first responders get to more than a thousand people who needed help quickly because of that effort.
So, not anticipating it before, did you not see that that would be a thing? I got into a lively debate with your product managers about it before it was ... because you all showed the press before it [launched]. They seemed genuinely surprised when I said, “What about murder, what about bullying, what about suicide, what about self-harm, what about this?” They seemed less oriented to that than towards the positivity of what could happen on the platform.
I think some of the cases we were ready for, and some we weren’t. Bullying, I think-
— is something we’ve worked on for a while and have ongoing, good collaborations with law enforcement and community groups around the world. There’s always more to do there, but that’s an area where I’m generally proud of the work that we’ve done. Look, I think on any of these given things, someone will have thought of it in advance, but I think that we should be judged by when we become aware of an issue-
How quickly you respond.
How do we respond, and do we get it right, and is it a repeat thing? The thing that I… In running a company, if you wanna be innovative and advance things forward, I think you have to be willing to get some things wrong, but I don’t think it is acceptable to get the same things wrong over and over again.
Absolutely, but you’re coming from a different case. When you get things wrong, I don’t wanna say people die, but people suffer. People can suffer in a different way than if I get something wrong or other people do. I mean, the vast amount of responsibility that you have is, I think ...
Yeah, I would just say that on the flip side, that if we don’t move forward, a lot of good that should happen won’t happen, either. And it’s hard to know what the moral equivalence of those things is, because a lot of the good is diffuse and not things that get in the news, but I can’t tell you how many times I walk down the street in some city and people come up and say that they got married because of Facebook, they point to their kids and they’re like, “I have this kid because of Facebook.” People have stories about how the communities that they form on Facebook are the most meaningful thing in their life, that got them out of bad situations that they were in, and I think if you don’t move forward, you lose all that stuff, too.
So I mean, these are hard trade-offs. And certainly I don’t think that ... You know, we retired “Move fast and break things” many years ago because we didn’t think that it was serving the community as well as it had originally. But I do think that there is a benefit and virtue to continuing to make progress, and I think making progress means that you get some things wrong. And I think our responsibility is to accept when we get things wrong and not be in denial about it, which sometimes we can be too slow on. But in general, I think, if we mess something up, we better damn well make sure we don’t make that same mistake again if it’s a serious thing. So, across elections and all this different stuff, yeah, we need to make sure that we’re on top of these issues.
Do you regret the “move fast and break things” [motto]? Because a lot of ... I mean, I make the joke, “You broke enough things, now fix them,” kind of idea. Do you regret that motto?
I mean, I think it is certainly used today as a symbol ... It’s used in a way that isn’t what I meant. The notion up-front was not about social impact; it was about writing code, and the idea is that by moving faster, we can serve more people with something that a lot of people really wanted. And I kind of ...
Actually, you know, so at CZI, Chan Zuckerberg Initiative, we’ve kind of adapted that value. Instead of “move fast,” we call it “learn fast.” And that’s really the spirit of it more, I think, is that, the idea is, you can either try to get everything right up-front, which I think has a high cost to making progress and serving people, or you can believe that we’re not gonna get everything right up-front, but by moving forward we will learn more. And that will make it so that the second and third version of what we do is better.
And I really believe that that is the right way to run a company; I think companies need to be learning organisms. More than any specific product strategy that we have, our strategy as a company is to learn as quickly as we can how to serve our community, and I think you only do that by being out in the world, by talking to people, by running experiments, and by trying out things that you’re not sure are gonna be good to see how people use them. I think that that is our responsibility, is to learn as quickly as we can as an organization.
So if it’s “learn fast,” what’s the second thing? What do you do with things?
What do you mean?
”Learn fast” rather than “move fast.” What happens to “break things?”
Well the value was always move fast. It wasn’t-
What happens to the break part?
Well, the point there was, I think values are only worth what you’re willing to give up for them. So, a lot of companies have values like “Be nice.” It’s like, “Okay, well, fine, be nice.” That’s good, you should, but the real question is, what are you willing to give up?
So I don’t think you can just tell a company “move fast”; the question is, “What are you willing to tolerate?” And what we were willing to tolerate early on was more bugs in the product. Not having the product do something completely different, but like if there were a few errors in the code and it wasn’t fully polished, we generally thought that learning quicker and serving people who wanted the product was more important than having it be completely bug-free.
What we realized was that we were getting to a point where we were accumulating so many bugs that having to go back and fix them all after launch was actually slowing us down on net, to the point that it was no longer an effective way to move forward. So we changed the motto to what is now the much less sexy “Move fast with stable infrastructure,” where the current strategy for moving fast is to invest disproportionately in infrastructure and abstractions, so that any given engineer could either go work at a start-up, or their own company, or they can come here and, I think, be much more productive because they’re building on top of all these great systems that have been built.
But either way, I don’t think you can just say, “Move fast”; the question is, “What are you willing to give up?” And in our case now, what we’re willing to give up is a meaningful portion of our engineering team working on great infrastructure and abstractions to help everyone else move forward when those people could otherwise be working on serving people directly. I still think that that’s the right strategy because I think learning fast is the core of what we need to be doing.
So I want to finish up on news by talking about sort of what’s going on today with conservatives versus liberals. Why won’t you make choices there, or do you feel like you just don’t want to make any, in terms of media and what should be ... How do you respond when conservatives say, “You don’t have enough conservative stuff on the platform?” You guys have responded and some people think you over-responded. How do you think you’ve done?
Sorry, I didn’t really understand that.
How do you feel with the allegations from conservatives that there’s not enough conservative ... That conservative voices are ... I hear it all the time from conservatives.
They get shadowed either on Twitter or on Facebook, or that you’re out to not allow conservative voices to speak up. And on the other side, others think that you’re bending over backwards to serve a conservative constituency.
I don’t think you can win any way, but ...
Well, I think it gets back to the core principles here. So it’s actually the same core principles we discussed before: Giving people a voice on the one hand, and keeping the community and people safe on the other hand. And I think that there are ... Our bias tends to be to want to give people a voice and let people express a wide range of opinions. I don’t think that’s a liberal or conservative thing; those are the words in the U.S. It’s-
Silicon Valley issues...
Yeah, but I mean, we think that that is a virtue. Interestingly, for most of the history of the company, I think a lot of people agreed that that was a virtue. Recently, a lot of people may be more focused on some of the negatives that can come with people widely having a voice; giving people a voice has become a more unpopular belief in the last few years. But we still believe it. I think that you see a lot of good around the world come from that, and I think that we will eventually come around to that as well in the U.S. broadly.
What are your political leanings? Do you have them?
I care about specific issues very deeply and I’m not sure that aligns with any kind of specific thing. So I mean, I’m very outspoken on immigration reform. In 2013, I helped start FWD.us with a number of entrepreneurs; it’s a group working on immigration reform that I think recognizes that we need to secure the border and enforce laws, but that also understands that the benefits of immigration, both to the country and the economy and as a humane civil rights issue for the 11 million people who are undocumented here, are incredibly important. I mean, I’ve-
How did you feel about the border separations as a citizen?
It was terrible. Terrible.
What did you do? Did you do anything besides donate money or stuff like ...
Yeah, well I mean, the good news here is because we’ve been working on FWD for so long, it has established a lot of the infrastructure that now ... When a crisis comes up, you can’t just spin this stuff up immediately. So they’re in there and they’re able to help out.
But I mean, talking about social utility, one of the really proud moments recently of working at this company was the fact that a couple of people could-
The Willners. I had them on the podcast. Yeah, they’re great.
... start a fundraiser to raise $1,500, enough to bail one person out, and they ended up raising more than $20 million. And this thing just went viral, and I think it’s a great example of when you give people a voice what positive things can happen, both substantively in terms of the fundraiser and just the widespread show of support, I think, is also really meaningful. And I think a combination of that and a number of other things like that may have been what led the administration to backtrack on the policy there.
Yeah, possibly. Possibly.
So let’s get into the idea of privacy and data. How do you assess your performance in front of Congress? It was a low bar, Mark; they didn’t do a very good job. That’s my opinion.
You thought I didn’t do a very good job?
I thought you did, but I only thought it’s because they did such a bad job.
Well look, I think a lot of people-
You did fine.
— think about this from a gamesmanship perspective of like, someone’s winning and someone’s losing.
Right. Well, it is politics.
I try not to ... Yeah, okay, and maybe I’m too idealistic still.
But I tend to come at this from the perspective of, we have a duty to the country to provide as much context as we can about the set of issues that we see. And in general, I was impressed at how many of the people there both, I think, had a handle on a lot of the issues and I think genuinely were trying to understand them. And some of the questions I thought were really hard and pointed.
Which one? Which one?
On the second day, I thought Congressman Kennedy’s questions-
Second day was better, yeah.
I mean, he asked them respectfully, but they were very hard questions around who owns the data and how is it gonna be used? And to me, that just got to the heart of why processes like that are important.
So I didn’t feel like my responsibility there was to show up and “win.” I was trying to-
Oh, I wasn’t thinking it was a winning thing. I thought the questions were not very illuminating. That’s all.
I’m there as a witness who hopefully understands some relevant context on an issue of importance to the nation, and I view my responsibility as making sure that they can get as much information as they need to in order to inform what they need to go do. Because now, whether it’s the cooperation that we have with the Mueller investigation, or areas like this, where there’s hearings about election interference, or the data privacy issues, these are broader issues and we’re a player in them, but there’s a much bigger picture here as well, and we don’t have the full context of that.
So I think to the extent that we ... Our responsibility is to do everything we can to prevent these issues on our service and to make sure that the people whose job it is to have the full context across everything, have whatever information we can provide. Like you saw with the Mueller indictments recently, I think some of that context probably initially came from us, but then they had to go build on that for years in terms of putting together the whole story and do very significant work on top of that. But if we can help out in ways like that, then I feel good about our contribution.
But have you given them full activity of the Russians on your platform? Have you given the investigation full access to that data?
I’m not sure what that means. In general, I think the way this works is they ask a set of questions, and we go and do investigations, and turn up whatever we find.
Okay. Back to the hearings, one of the things I think ... I’m not saying it’s a win/lose thing; I think they did not press you very hard on certain issues. One of them that you kept saying ... Two areas: One is what you guys do with the data; one was the part related to Cambridge Analytica, which is what happened there, which I think is still ... You’re still investigating, it’s still being investigated by authorities in how it happened. And in that case, your defense was, “We didn’t see it, but once we saw it, we did something about it.” What I would’ve asked is, “Why didn’t you see it?” What’s the problem in that with this data that you did not see it being misused? Because I was at your 2009 or 2008 ... I remember when you were talking about this idea.
Yeah, so the principles at play here are, on the one hand, you want people to have control over their information and be able to-
... bring it out of Facebook-
Right. Data portability.
... to other different apps, because we’re not gonna build all of the social experiences and it should be easy for people to use their data anywhere. But on the other hand, if they have that information in Facebook and the developer has some relationship with us, then we also have a responsibility to protect people and keep people safe. And what happened here was a developer built a quiz app, and then they turned around and sold the data that people gave them to someone else. And that is clearly against all of the policies that we have. I mean, that’s terrible, right? We don’t sell data, we don’t allow anyone to sell data. Because it was on their servers, we don’t necessarily see that transaction or whatever they’re doing.
But you have, in the past, caught people doing this and been much more rigorous in that.
Well we find ... So we do a number of things. One is, we do ongoing audits and we have built technical systems to see if a developer is requesting information in weird ways. We do spot checks where we can audit developers’ servers. But a lot of the stuff comes from flags that either people in the community or law enforcement or different folks send us, and that was actually similar here too. I think it was The Guardian who initially pointed out to us, “Hey, we think that this developer, Alexander Kogan, has sold information.” And when we learned about that, we immediately shut down the app, took away his profile, and demanded certification that the data was deleted.
Now the thing that I think, in retrospect, that we really messed up here is that we believed the certification. Now normally, I don’t know about you, but when someone writes a legal certification, my inclination is to believe that. But in retrospect, I think it’s very clear ...
No, I don’t believe anybody.
All right, well that’s ...
There’s an expression in journalism, “If your mother says she loves you, check it.” But go ahead.
All right, that’s fair. I tend to have more faith in the rule of law, but-
And I think the links between Peter [Thiel] on your board and [Steve] Bannon and ... It creates a really bad situation for you all, or a suspect one. It at least leads to people wondering what was happening there. Easily.
All right. Well I don’t think that there’s any suggestion that that stuff was connected here, but I do think-
No, but I’m just saying. It just creates a, “What the heck was going on here?”
Yeah. I think in retrospect ... You know, we didn’t know what Cambridge Analytica was there, it didn’t strike us as a sketchy thing. We just had no history with them. Knowing what I know now, we obviously would not have just taken their certification at its word and would’ve gone in and done an audit then.
All right. Should still-
So now we’re basically doing this. Now our policy is, we are not just going to take developers at their word when they say that they aren’t misusing information; we’re going to go and audit every single developer who had a large amount of access to people’s information before we significantly locked down the amount of access that developers could get, starting back in 2014.
Should someone have been fired for this?
I asked Sheryl this, so I’m just curious what you think.
Well, I think it’s a big issue. But look, I designed the platform, so if someone’s going to get fired for this, it should be me. And I think that the important thing going forward is to make sure that we get this right. In this case, the most important steps to prevent this from happening again, we’d already taken in 2014, when we dramatically changed the way that the platform worked.
But overall, I mean, this is an important situation, and I think again it’s ... This to me is an example of, you get judged by how you deal with an issue when it comes up. And I think on this one, we’ve done the right things, and many of them I think we’d actually done years ago to prevent this kind of situation from happening again.
But to be clear, you’re not gonna fire yourself right now? Is that right?
Not on this podcast right now.
Okay, all right. Well that would be fantastic. I mean, I think you’ll do okay.
So let’s get to the privacy and data part of it. One of the things you kept saying in Congress, which really drove me crazy because you said it like ... I counted it.
Do you really want me to fire myself right now?
Sure. It’s fine.
Just for the news?
Yeah, why not? Whatever, Mark. Whatever works for you. No.
I think we should do what’s gonna be right for the community.
All right, okay. All right. Well I’ll get to regulation in a second, but two more sections and then you’ll be out of here. One is, you kept saying, “Senator, we don’t sell your data. Senator, we don’t sell your data.” You kind of sell people’s data in a different way by marrying it with other data, you sell insights into that data, you sell ... Your whole business is predicated on using data to make money. Why did you keep saying that? I mean technically, you’re correct, but ...
Well I think facts do matter.
Yes, I know, but you don’t technically sell people’s data, but you use their data to sell advertising. So you are in essence ... What are you doing with people’s data? How would you describe it?
Well look, it bothers me when reputable news outlets make claims like saying that we sell data because it is just-
Like to Procter & Gamble. You don’t-
It’s just not true.
We don’t sell data. Now, I understand what you’re saying, that the business model works basically in two ways; one is people have attention from being on the service, which is no different from the ads you’ll run during this podcast or traditional TV ads for the last 50 years. But there is an element of targeting, which is that, because we understand what you’re interested in, we can show you more relevant ads. And overall, people want to know that their information is secure, and that if they give it to you, they want you to use it to make their experience good, but they don’t want you to give it to other people.
So while it may seem like a small difference to you, this distinction on “selling data,” I actually think to people it’s like the whole game, right? So we don’t sell data, we don’t give the data to anyone else, but overwhelmingly people do tell us that if they’re going to see ads on Facebook, they want the ads to be relevant; they don’t want bad ads. So they want us to use what they’re browsing on Facebook, and what they’ve clicked on, and what they’ve told us that they like in order to show them more relevant ads.
Do they still know enough about what they’re opting into? Someone recently described you to me as a “greedy information hoarder”; essentially that you hoard this information and-
Well let me give you one example that I think is interesting.
... spit it back at people.
During the GDPR flows, and rolling that out, one of the specific things that we needed to do was get specific opt-in permission from people to use information from the websites that they used and apps in order to help target ads. And the vast, vast majority of people chose explicitly to opt into that, which goes in line with everything that I’ve seen on the research from what people want, which is that when faced with the decision of do you want more relevant ads or less relevant ads if you’re gonna see ads, people want better ads. Not everyone; I mean, some percent of people said no, but the overwhelming majority of people say yes. And I think that’s just an important thing to internalize on this.
Right. Do you think people understand how much information you have on them? It’s a different factor than ever before in history, how much information you know about people.
Maybe. Although I think most people actually, on a service like Facebook or Instagram, probably have a greater awareness of the information that’s there than on a lot of other services, because in our case, you actually put it there, right? You told us that you like that thing, or you posted that photo, or said that. So I actually think people generally have an awareness and feel like, “Wow, these networks have a lot of information.”
The areas that I would actually worry about more for consumers are places where they don’t realize that services are collecting a lot of information about them, but actually are. So that’s a whole different thing.
Like who? Who?
Well, a lot of other folks online.
I mean, there’s the whole industry of data brokers, for example, who we’ve recently-
Who you used to-
... made the decision that we-
But you used to.
... don’t want to be in business there.
I mean, we never were a data broker, but we used to let advertisers-
Yeah, marry them. Marry the data.
... use data brokers. And we decided no, we think that this is not a good thing, so we’re gonna cut that out.
Why did that happen? Why did you suddenly come to that realization?
Well I think around the time of the Cambridge Analytica issue, we realized that we needed to do a full audit, not just of that specific issue, but across all the platform, like everything that was going on. Where was data coming into the system, where might people’s data be going out, and every case of that, do people understand what is going on? And we just made a series of decisions that were like “No, we think that given how we view our responsibility and where the world is, this is no longer the right thing. We should not do this, we should not do this, this we should change or communicate differently.” And we did a series of actions around that, and this was one of them.
Okay. So let me finish up about you, but I do want to ask one more thing in this area: Regulation, how much do you think is coming from if the Democrats get back in power? They’ve gotten rather hostile towards you and Google, it seems.
Well, I think you’re too focused on the U.S.
Okay, across the world. Do you see regulation being … Obviously, Europe is a place where there’s much more regulation happening and more activity. Do you see it-
Yeah, so there’s lots of different areas for this. The area that I think is most likely is content. So the U.S. has a very rich tradition of free speech; it is written into the Constitution, free speech, so here, we have a very strong allergic reaction to trying to regulate that. But in almost every other country in the world, while people generally want as much expression as possible, there’s some notion that something else might be more important than speech; so preventing hate or-
In Germany or wherever.
... terrorism or just different things. So you’re already starting to see this; I mean, there was the hate speech law in Germany. I think that there will be additional laws creating responsibility for social networking, and social companies, and Internet companies overall to be more proactive in policing terrorism, or bullying, or hate speech, or different kinds of content.
And overall, I think that there are good and bad ways to do that, but my general take is that a lot of that stuff can be pretty reasonable. I mean, I think we’re not kids in a dorm room anymore, right?
No. That’s so long ago, Mark.
When we were ... No, I mean, but back then, if someone had said, you need to make sure that you’re gonna give people a voice, but you need to make sure that it’s not used to spread hate speech, the best you could do is get the community to flag things for you and hope to review them yourself. But now we’re a big company, AI technology has advanced significantly. We’re at a point now where we’ve built AI tools to detect when terrorists are trying to spread content, and 99 percent of the terrorist content that we take down, our systems flag before any human sees them or flags them for us. And we can afford, at this point, to have 20,000 people reviewing the content.
So I think the point where you have that kind of AI technology and you have the resources to be able to employ people to do that kind of content review, I kinda think you have a responsibility to do it.
Okay. So you can handle regulation. What about the call ... There’s been some calls to break up some companies like Facebook or Amazon that become too big. Are you in fear of that in any way?
You know, I think that there’s ... It’s a very interesting debate overall. If you actually get down to why we’re big, it’s not ... In the traditional sense, we’re not big because we’re so big in the United States, although we are and a lot of people use our products here. If we weren’t an international company, if you said, “Okay, you have to shut down all of your services outside of the U.S.,” we actually would not be very profitable at all; we actually would probably be unprofitable.
So the reason why we are a successful and large company is because we have built something here that can now serve billions of people around the world as well, which is actually where all the margin comes from, in terms of ... I mean, we have the cost structure that we have, and then that’s where the business comes from and ... Don’t get me wrong, there’s a lot of revenue in the United States as well, but that would barely cover the cost of the company.
So I think you have this question from a policy perspective, which is, do we want American companies to be exporting across the world? We grew up here, I think we share a lot of values that I think people hold very dear here, and I think it’s generally very good that we’re doing this, both for security reasons and from a values perspective. Because I think that the alternative, frankly, is going to be the Chinese companies. If we adopt a stance which is that, “Okay, we’re gonna, as a country, decide that we wanna clip the wings of these companies and make it so that it’s harder for them to operate in different places, where they have to be smaller,” then there are plenty of other companies out there that are willing and able to take the place of the work that we’re doing.
Specifically the Chinese companies.
Yeah. And they do not share the values that we have. I think you can bet that if the government hears word that it’s election interference or terrorism, I don’t think Chinese companies are going to wanna cooperate as much and try to aid the national interest there.
What is your situation in China now?
I mean, we’re blocked.
And are you working on moving Facebook products in there?
Over the long term. I think it’s hard to have a mission of wanting to bring the whole world closer together and leave out the biggest country.
What will that take?
I don’t know. I mean, I think that that’s ...
You went. You jogged in Tiananmen Square. What else could you do?
Actually, I thought that that was interesting that that got so much pickup.
Oh, come on. I’m right on that one. You can’t jog in Tiananmen Square, Mark! You can’t. It looks like you’re cooperating with the Chinese government. We’re gonna argue about that forever.
Fine. Well, that year, I posted photos of me running everywhere, including Delhi, which has worse air-
It’s Tiananmen Square.
Which has worse air quality.
I know you were 12, but there was a guy standing in front of a tank with a briefcase in that square when you were 12, and it was problematic.
Okay. Well, I’m not gonna defend that.
Where are you with China?
I mean, we’re, I think, a long time away from doing anything.
I mean, at some point, I think that we need to figure it out, but we need to figure out a solution that is in line with our principles and what we wanna do, and in line with the laws there, or else it’s not gonna happen. Right now, there isn’t an intersection.
All right. I wanna finish up just talking about you. We just have a few more minutes. One issue I’ve talked about a lot is Silicon Valley responsibility, and taking responsibility: taking responsibility for the dark things, not being quite as optimistic. A lot of people here have a problem with looking at that. How do you look at your responsibility, as a leader? As a leader of a massive company with enormous power? Do you think you grok that at this point? Sometimes I don’t think you do. I really don’t.
Well, I think we have a responsibility to build the things that give people a voice and help people connect and help people build community, which ultimately is the unique thing that we do in the world. That, I think is one important piece of it. But then on the other hand, I think we also have a responsibility to recognize that the tools won’t always be used for good things and we need to be there and be ready to mitigate all the negative uses, so whether that’s terrorism, or people thinking about self-harm or suicide who we need to go make sure they get help quickly, or bullying, or election interference, or fake news. The list goes on, and there’s a lot of these things. There are very specific pieces of work that we have to do on each. I mean, just take terrorism for example. We have a team of more than 200 people working on counterterrorism. I mean, that’s pretty intense. That’s not like what people think about what Facebook is.
No, I’m sure when you were an engineer you weren’t thinking this was your ...
Look, I do think that there will be things that we get wrong in the future, too, but I think to say that we don’t care about what’s going on, or mitigating any of the downsides of what people do, I don’t think is right. I think to say that that is the only thing that we should be focused on, I think also is not quite right because I think that what most people out there want is the ability to stay connected with the people that they love, and to be able to join communities because that’s an important part of people’s lives. If we’re not making progress on that and advancing the ball forward there too, then I also don’t think we’re doing our job.
What about the image of Facebook? It’s not great right now. Would you agree with that?
It’s not as good as it’s been.
Yeah. How does that feel personally?
I mean, personally, my take on this is that for the last 10 or 15 years, we have gotten mostly glowing and adoring attention from people, and if people wanna focus on some real issues for a couple of years, I’m fine with it. Frankly, I think that the news industry is critically important because it points out things and surfaces truths that can often be uncomfortable. I think that that’s working, and the spotlight has been pointed on things that we have a responsibility to do better, and I accept that. While it may not be the most fun period of running the company, I think we take the responsibility really seriously and get that in the grand scheme of things, I don’t think people are being unfair to us. I think people have been very positive and are focused on all the good that comes with the technology for a long period of time. To have a period where people focus on some of the negative uses, to make sure that we fully understand that, I think is completely reasonable.
What about you personally? How does that feel? Because it’s directed at you.
Yeah. I mean, I think that that is my personal take. I think it’s ...
”It’s okay. Mark is okay.” You accept the responsibility of the criticism is what you’re saying.
Yeah. I also just think you need to put all this in perspective. If you look at what people have said about Facebook and how much they love the brand and the products, over a 10-year period, I mean, most of the coverage and what people say is super-positive. If there’s gonna be a period of two years where we frankly didn’t handle a bunch of things as well as we should’ve and need to get back on top of it, then I mean, you’re not gonna cry about that. You’re gonna do what you need to make it good.
What do you do to not cry about it?
What do you mean?
Well, if I got this much criticism, I think I’d feel a lot of pressure. I’d feel a lot of pressure.
Well, feeling pressure is different from being sad.
I think that-
Oh, I don’t think you should be sad, necessarily. I think you’ve got a pretty good life.
We sit down and say, “All right. We have to go do this.”
Well, you might be more self-reflective. Most people in Silicon Valley aren’t self-reflective. Like you might go, “What did I do? What have I done? And what should I do better in the future?” I think that would be an adult response.
At an institutional level I think making sure that we put appropriate focus on these things is really important.
What’s your goal this year? You have these goals. This year is fix Facebook. You did the Visit Every Cow In America Tour last year. What is your personal goal this year? Away from fixing Facebook?
I mean, I think that the feeling this year is that ... I’ve done these personal challenges because I think running a company can be an all-consuming thing. I think in order to have a broader perspective, you wanna do things outside of that too. Whether that’s running, or learning Mandarin, or visiting different places, or coding an AI to run my home, I think that those are all good things. This year, I just think that we are so-
I missed the AI to run your home, but okay. All right.
What do you mean?
I missed that goal.
Jarvis, oh yeah.
Yeah, you got that.
Oh, yeah. I forgot.
That was fun. But this year, I think we have a number of issues that we need to deal with, and it didn’t feel right to me to focus on something else outside. Interestingly enough, I think this is an important challenge, and we need to dedicate every fiber of what we’re doing to making sure that we get this right.
How long does that last?
What? This focus?
I think it’ll take about three years to fully retool everything at Facebook to be on top of all the content issues and security issues. But the good news is we’re about a year and a half in. I do think that by the end of this year, we’ll have significantly turned the corner on a lot of these issues. I don’t think we’re gonna be as good as we would like to be next year, either, but I think it’ll be close. Then, my hope is that by the end of 2019, a lot of the systems will be much more operational and dialed in, which doesn’t mean that there aren’t gonna be errors. There are always gonna be errors where people say, “Hey, you enforced against this content incorrectly.” But I think part of that means having mature systems, like building an appeal process so that it’s not just some representative somewhere around the world at Facebook who makes a decision on your content, but you can appeal that, and maybe even appeal it independently over time to some other body. But by the end of 2019, I would hope to have all of that in place.
Then have some other goal. Do you have any political goals? I know people thought when you did your grand tour of the United States that was what you were doing with your team of videographers, et cetera. You had a lot of photographers, Mark, they were lovely photos. I went across the United States and I had no photographers with me.
Well, we have what? We have one photographer at Facebook?
They were nice photos!
You might imagine why people would wonder if you were doing that.
I understand. I mean, I care about helping to address these problems of social cohesion and understanding what economic problems people think exist. I tend to think that we all get support from three basic places: Our friends and family, the communities we’re a part of, and then, ultimately, the government with its safety net. I think as a society, we spend the vast majority of our time talking about what the government should do in the political debates.
I think we spend not enough time talking about how important community is. So going around, I sat with ministers in different places, and they talked about not just the religious role that they play as a religious organization, but as a community organization. One minister told me that he knew that when a factory closed down in town, he was gonna be seeing more couples for couples counseling a few weeks later because of the tension. All right. That’s a real piece of social infrastructure that needs to exist.
If people aren’t a part of those kind of organizations, then there’s a core need that people have that is not being fulfilled.
You go to military bases and you meet the spouses of people who get deployed in different places, and they told me that the core part of their social infrastructure are these Facebook groups where every time they get deployed, they go join the group of military families around that base and figure out what school they should send their kid to, what services locally they should use. That is like how they got rooted and how they get established in the community. It’s not just a group that’s online. It spans online and offline, meeting real people.
I sat down with kids in Chicago, a school where a lot of kids were in gangs. I mean, they told me the reason why people were in gangs is not because they wanted to be in a gang, they understood that it was dangerous, but because they needed a sense of community, and in a dangerous environment, they wanted to know that someone was looking after them.
My takeaway here is that there is a real issue, which is that people need community support, but if you look at the sociology and the history here, community membership has actually been on the decline, and it’s been fragmenting for 40 or 50 years, well predating the internet. Since the ‘70s, there are now 25% or 30% of people who are no longer members of groups, whether religious organizations, or local organizations, or volunteer organizations, that they once were. That strikes me as a real crisis. It’s not just American, I think it’s around the world as well. That’s why we changed our mission last year, to not just be about friends and family, which is always gonna be a core part of the Facebook experience.
And community, I get it.
But to now be about helping people connect and join those kinds of meaningful communities, like military spouses, or the group you join when you have a new kid, a group for new fathers or new mothers, that ends up being a really core part of your social support network. Or you, God forbid, come down with a rare disease and you need a support group of people who have it, but there are no other people around you who do. There are about 200 million people on Facebook who are a part of what they call these meaningful communities. For them, upon joining, that becomes the defining aspect of their Facebook and internet experience and one of the most important parts of their real-world support structure.
Some people might say the use of the internet and mobile phones is the reason people are feeling this way too.
Well, hold on, hold on, because-
The solution to Facebook might not be more Facebook, but go ahead.
Look, I think that this problem clearly predates the internet, let alone Facebook.
Loneliness, yes. The human condition.
Well, and the decline of community. I think that putting that all on the internet seems unlikely.
I get that. I get that.
I think it’s clear that not all the uses are gonna be good. I mean, I’m not trying to say that it’s all good, but I think that it can largely be good.
Let me hurry up here. Community is needed. Does that mean you don’t wanna run or do you wanna be Oprah? What is your goal?
Oh, sorry. I didn’t realize we were still on that.
Yes. You didn’t answer it, but well done.
No, look. I mean, we have a five year goal of helping a billion people join communities that are meaningful like that.
Okay. At Facebook.
If we do that, then I think that we will have played a role in reversing this many decades-long trend of people not being parts of communities.
Of these three groups.
I happen to think that we as a society do not spend enough time thinking about communities and the importance of them.
Governments are extremely important, there are things that only government can do. Safety net is extremely important, but I mean, that’s not me. That’s not the thing that I’m here to do. I can help build communities and connect people. I do think that that’s an area where I have unique insights and abilities that I can help people do. That’s what I care about, but I think that that’s really important.
Two more quick questions before I know I have to go. One is, who do you look up to? Do you look up to other internet people? Elon Musk, Jeff Bezos, or is it there’s like an ultra male competition between and among you? Who was your mentor, would you say?
Well, I think that there are a couple. Bill Gates has always been a mentor and inspiration for me even before I knew him. Just growing up, I admired how Microsoft was mission-focused. It was a company that had a clear social goal, or that it wanted to make ... They thought that computers were gonna be valuable, and having that become ubiquitous. It was like an Apollo-like goal to me that always struck me as really nice. Then, I think his second act of going and-
Doing the philanthropy.
— being one of the world’s best philanthropists has absolutely influenced me. Not only to try to follow in his footsteps and do something hopefully one day that will be as impactful as what he has done, but his lesson there that you have to start early to practice. Like anything that you wanna get good at, you don’t just show up and effectively and efficiently give money away. The notion that if I wanna be really good at this 10 or 15 years from now, then Priscilla and I really need to start working on this now. He and Melinda, and Melinda has increasingly been a role model for us as well, have really deeply influenced the way that I think about both work and philanthropy.
But one of the things that I’d say I’m really lucky about is that a lot of the people who I look up to the most, I get to work with every day. I mean, I think Sheryl is amazing. A lot of what I know about business and building organizations and leadership comes from working with her. A lot of the other folks who I get to work with every day, Chris Cox is just an amazing person. I always tell people that you should only hire people to be on your team if you would work for them. It’s not that I’m looking to swap my role, but I think that in an-
Well, you fired yourself earlier, but go ahead.
But in an alternate universe, I would be honored to work for any of these people. I think that that is, I don’t know, that is a greater gift than having some external mentors who I get to talk to once a quarter.
Right, right. Okay. My very last question. We didn’t get to talk about products. What is your-
We did. We talked about community.
A little bit. Yeah, community. Community.
I got that in there to your-
I know you did. But of the products, of the many products that- And we didn’t get to talk about diversity, we didn’t talk tech addiction, there’s all kinds of stuff we could talk about. But of the-
Do you want me to go longer?
No. Yeah, sure if you want! I’m good! Rachael says no. But very briefly, what do you think the most exciting product area is right now? Let’s finish up on that.
Well, I think the time frame matters. I’m very excited about this social mission of helping a billion people-
... be a part of meaningful communities, that is a very important social need. I think we’re well set up to do it, and I’m very excited about the team that’s doing that.
Longer term, as a technologist, one of the things that just excites me is there are always new computing platforms. Every 10 or 15 years a new one comes along. They’re always more native; they capture the human experience more immersively, and you share more naturally what you’re experiencing. I just think that VR and AR are going to be a really big deal. You can just see this trajectory from the early internet, when the technology and connections were slow, and most of the internet was text. Text is great, but it can sometimes be hard to capture what’s going on. Then, we all got phones with cameras on them and the internet got good enough to be primarily images. Now the networks are getting good enough that it’s primarily video. At each step along the way, we’re able to capture the human experience with greater fidelity and richness, and I think that that’s great.
Now, I do think that we’re gonna move towards this world where eventually you’ll be able to capture a whole experience that you’re in and be able to send that to someone. I think that that’s just gonna be an amazing technology for perspective-taking and putting yourself in other people’s shoes, for being able to feel like you’re really physically there with someone even when you’re not. One of the criticisms of technology today is you’re sitting and looking at your phone, and we could be sitting together but we’re actually fragmented.
No, I agree with you on VR. I’ve just been doing some recent VR stuff that’s really promising.
Yeah. I mean, there’s a few technology leaps that still need to be made, but the initial use is amazing. I just think that’s a really important technology.
I’m not sure you can give people empathy, though. You can see the world through people’s eyes, but you can’t understand their experience, necessarily.
Yes. Although, I think there’s also an economic ... We’ve talked a lot about the social aspects of all of this, but I think one of the biggest issues economically today is that opportunity isn’t evenly distributed. You get all these people who have to move to cities, and then the cities get to be way too expensive, and if you have a technology like VR where you can be present anywhere but live where you choose to, then I think that that can be really profound.
There are really only a few solutions to this. Historically, cities have grown to be bigger by building better physical infrastructure. There’ll be some amount of that. I mean, I think things like hyperloops, which could extend the suburbs, could be quite interesting, but I have to believe that, we’re here in 2018, it’s much cheaper and easier to move bits around than it is atoms. It strikes me that something like VR or AR, or even video conferencing on the path to that, has to be a more likely part of the solution.
I would agree.
Than just building a ton of physical infrastructure. We’ll do both, but that ends up being just critically important. We didn’t touch on that so much today. But one of the areas that we’re really focused on is economic empowerment. One of the things that we’re most proud of is that there are 80 million small businesses who use our tools. When you poll them, the majority of them say that they’re creating jobs and growing faster because of using our tools. The vast majority of them aren’t even paying. We’ve got 6 million advertisers and 80 million small businesses. That’s really the foundation of the economy. If we can help grow that, then that will also make communities stronger and will end up being just a really important part of how I think the country and the world holds together and moves forward over the next 10 or 15 or 20 years.
All right. Now you have one chance. What would you like to say to your giant nation state of Facebook right now? What is the one thing? Like, “I’m sorry for this-”
Well, we’ve been talking about this for a while.
I know. But what’s the one thing they’re getting wrong about you right now? I’m gonna give you a nice out.
That’s tough. It’s always hard to say what is the one thing. I don’t know. I think that the main thing that I’ve tried to internalize this year is we get that there’s a big responsibility and a lot of things that we need to do better than we are. We are working on it, and I think a lot of them, we’re doing better already, and for the rest, we’re committed to getting to where we need to be for the community. At the same time, we also feel a responsibility to keep on moving forward on giving people tools to share their experience and connect and come together in new ways. Ultimately, that’s the unique thing that Facebook was put on this Earth to do. I think if we don’t push forward on that, we will be missing our responsibility for advancing the ball there. That’s what I care about, and we’re just very serious about making sure that we do both of those things.
All right, Mark. I really appreciate it, we talked about a lot of things. We didn’t get through everything, but I do appreciate it.
Well, save it, we’ll do it again.
We’ll do it, yeah. Next time.
We’ll do it again soon.
Oh, we got a lot of stuff. Thank you so much, and we will talk again soon.
This article originally appeared on Recode.net.