On the latest episode of Recode Decode with Kara Swisher, Harvard University’s Ash Carter joined Kara onstage for a live conversation about AI ethics, government surveillance, how to regulate Google and Facebook, and more.
Carter, who was Secretary of Defense under President Obama from 2015 to 2017, said he’s concerned by the lack of transparency around algorithms that are being developed and marketed for the world, including the government.
“If I were a customer in the Defense Department or any company that I am associated with, I’m not going to be a customer for something [for which] you can’t come in and explain to me how the hell this works, because I’m going to turn around and get sued or have to explain in my case to a mother whose child has been killed or something,” Carter said.
“I can’t buy black boxes for national defense,” he added. “You can’t buy black boxes for policing. You can’t buy black boxes for picking which people have an opportunity to be employed. This isn’t games or something. These are serious human things.”
He also argued that, even as AI becomes more prevalent in all our lives, the military and the defense industry should maintain their “extant guidance”: No AI should be able to take a human life on its own.
“I don’t believe that human beings can cede their responsibility because I would certainly feel responsible, and I felt responsible every time we used lethal force,” Carter said. “I certainly felt responsible in the very fullest sense as a human being by it, and everybody in my chain of command did and the president did ... But how do you locate human responsibility in an AI system?”
You can listen to Recode Decode wherever you get your podcasts, including Apple Podcasts, Spotify, Google Podcasts, Pocket Casts, and Overcast.
Below, we’ve shared a lightly edited full transcript of Kara’s conversation with Ash.
Kara Swisher: I dared Ash to hug the flag, but ...
Ash Carter: Don’t tempt me.
Everyone gets that reference, right?
Don’t tempt me, Kara. Listen, everyone, welcome. Thank you for being here tonight. I’m very, very pleased to have tonight as our guest, and my guest here in the forum, Kara Swisher, the editor-at-large of Recode, the host of the Recode Decode and Pivot podcasts, which are my morning gym listen. I’m not making this up. I listen to Kara every day. I don’t know how you manage to do as much, be as prolific as you are. And contributing opinion writer to the New York Times.
But the long and the short of it is, in a long career, she has become the most sophisticated, entrenched tech journalist on the scene today. And so in view of the importance of those issues to the Kennedy School and everything that we’re doing here and everything that I personally am very committed to, you can’t ask for a better observer and commentator than Kara Swisher.
Now, this is going to be an odd forum because, by rights, as her host here in the Kennedy School, I should be questioning her.
On the other hand, she has a podcast which is recording her, and it is normal for her to therefore be interviewing her guests. And, so, we decided that we were going to just have a conversation, and that’s why it’s billed as a conversation. I am going to suggest to you, Kara, as I did earlier, about some topics, though.
Yeah, just turn a little bit so I can talk to you.
Yeah, let’s turn so we can see each other here.
I will stick to the digital area, but if you want to stray elsewhere, we are also challenged as societies in the biotech area. We were talking to them at breakfast about that this morning, and also in the huge issue of jobs and training, and how to have a cohesive society in a world in which too many people feel that there’s a fast lane of technology and a slow lane there.
Yep, I write and talk about these things.
Right. So, okay, so all of those things are fair. I thought I’d ...
If you want to talk about Game of Thrones or Avengers: Endgame, I’m willing to do that.
I will go wherever you ... you may find me a little dry on things to say about those topics, but if you have something juicy, have at it.
I can put out the premise.
It’s not my style.
I can put out the premise that the Night King and Thanos are misunderstood. But we’ll go from there.
Just went, phew.
Back to where we were.
Okay. All right.
So let’s start with you, Kara, if you would. Social media curation. You’ve written a lot about that, and the repeated failures of social media to reflect good values, to act in a way, or to be governed in a way that you can feel comfortable letting your children participate in them.
If you go back to the Zuckerberg hearings, which you’ve covered and have written quite a bit about — which were a huge bust, sadly, because he had nothing to say and they had nothing to ask — and if you could re-roll that tape and try to do better on both sides, let’s play that game a little bit, if we could.
What they were all saying there is that the answer is some mixture of regulation by government and self-regulation by companies. So can we try to design that mix? Is it 80/20, 20/80? And what would you say each of those pieces ought to be?
Let’s just say, when the internet started, there was a lot of premise around tolerance, openness, the ability for people to communicate across great distances, the ability for lots of different people to get access to information they couldn’t. All premises that were great.
And that’s what they sold you on, the idea of that. What happened was a coalescing of power around just a few companies, an obscene amount of wealth creation for a very small number of people who have enormous sway over these platforms, platforms that they don’t really govern even though they make the money off of them, that they have no intention of governing, and that could possibly be ungovernable because of the way they built them, which makes a lot of money.
And so we’ve got a really bad situation: the public square — you know, the proverbial public square — is now owned by private billionaires, and they have no ... I’m going to say ability or compunction to fix the situation.
They’ve just let loose these cities — and I’ve used this comparison a million times — these absolutely unregulated cities to be built and are not understanding the consequences. And the consequences pop up again and again and again on very simple things such as, you know, just the way we talk to each other on Twitter or the way that it’s being used for propaganda by various political figures. Or whether it’s being used in, say, New Zealand, in a very tragic situation where the shooter was using it, and using it as part of the entire scheme of murder, that it was a critical part of the murder scheme to be radicalized online, then use it to say what he was going to do and then broadcast it. It’s just a perfect, horrible storm.
And so, the thing that I was troubled by, by those hearings, was that there’s ample evidence that these, in many, many ways, not just one, but dozens of ways that these companies have hurt society rather significantly and they seem to have no intention of doing anything about it. They just want to move on.
And so what I’d like to have is citizens — the public and politicians — have some measure of discussion with them about how they should be regulated and what the regulations should be that’s smart, not regulation just for regulation’s sake.
Okay, so let’s go down that path. That sounds a little bit to me like 80/20 rather than 20/80.
No, they’ve tried self-regulation. They tried no regulation. They seem to like no regulation, and they have done a terrible ...
We’ve all lived that.
Their self-regulation is zero.
Okay, so how do you get zero?
They have some rules they change every now and then to suit themselves. In general, they have rules, and they’re there and they point to them, but they make no sense to anyone who can read.
So let’s pull it a little bit on those threads. I can think of a few. One would be the Communications Decency Act 230. For those who don’t know, CDA 230 is the thing that allows them to be a ...
Grants them broad immunity for what happens.
Nobody can ... yeah. Unlike the New York Times or something, when you publish in the New York Times, you lay yourself open to the rules that govern speech in the press.
In this country.
Yes, in this country.
It’s different in Britain, and it’s different everywhere else.
True, but there are rules, anyway.
In general, there’s rules.
Not so under CDA 230. Now, another thing is various forms of regulation based on antitrust.
Sure. That’s not regulation.
Many people have told me that ...
That’s not regulation, that’s legal action.
Well, okay, but ... Well, I don’t know. Antitrust has been used lots of different ways in the course of our history. Sometimes it has been to break up companies. I personally don’t recommend that in this case. But more often, as in the case of AT&T and the Western Union Telegraph System and so forth, they were not broken up. They were recognized to be natural monopolies. And maybe that’s what Facebook is, a natural monopoly. But they were only allowed to exist provided they followed public rules. So that’s an avenue.
And another avenue, you talked about the money, Kara. Another avenue’s the money. If you follow the money and you say, why is this free? What is the business of social media? And there’s someone, and I forget who she is, but she’s also a very good writer, who has observed that in social media, the user is not the customer, the user is the product.
Product. That’s something that they hate, when you say [to] Facebook. They’re not the product.
I’ve heard that they hate that.
Let me just say, I actually would correct it. You’re not the product, you’re the fuel. You’re the fuel that creates their ability to sell things. And so you’re not the product. The product is advertising and the ability to target you. You are the fuel to get them there. Your data.
The third string would be pulling on the business models and having an alternative business model to that. So what do you think? Where should we go? If you were talking to a presidential candidate or presidential candidates?
If I was a presidential candidate, there’s a lot of them.
You could be one, too. There’s ...
I know. Anyone can be a presidential candidate, apparently.
Yes, I’ve noticed.
It’s every week, there’s a whole new one. The new flavor of the week. Look, they have current laws in place that would cover a lot of this. Some of it in terms of privacy and things like that are ... We do not have a national privacy law. We have one that’s about to come online in California in 2020 that’s going to be relatively strong.
There are 10 more in states across this country that are happening. There’s GDPR in Europe, in the European Union, which is the de facto rules right now, because everybody has to follow them, so they might as well follow them here.
So, for privacy.
For privacy. All kinds of stuff. There’s all kinds of stuff in GDPR. And then there’s stuff that’s being worked on in New Zealand and Australia around different things, like fines. And, of course, the European Union is also imposing fines on companies like Google and Apple and others. Apple for not paying taxes, Google for monopoly. There are other billion-dollar fines that they basically have in their drawer.
These fines don’t matter to them at the levels they’re being imposed. And then there are threats of fines in this country, like the FTC over an agreement that Facebook had many years ago not to do precisely what it’s doing. And they’re denying that they’re doing what they said they wouldn’t do, but they’re doing it. And so there’s a possibility of a fine coming up for Mark Zuckerberg in that regard.
So there are existing laws and existing things like the FTC here in this country. And there are also a number of privacy laws, and all kinds of disclosure laws, but they don’t really have teeth that matter to these companies.
Then there’s self-regulation, which they could do themselves, which is to put out a series of rules that they would follow and they would work with each other, which they never do, on lots of issues. But what happens is they end up doing different announcements. Apple will do a different announcement than Facebook, than Microsoft and others. And so that doesn’t seem to work properly.
And then it’s just rules, whatever Mark Zuckerberg wants to do. And so, that’s a real problematic thing because here is someone, unlike most people in this country, who cannot be fired, who runs the biggest communications platform in the history of the human race. Absolutely cannot be fired by anybody except himself. He can fire himself.
And actually when we did a podcast, I said, “Who should be fired for all these problems at Facebook?” And he goes, “Well, since I made it, I guess me. I mean I would be the one. I guess I’m responsible.” And then I said, “Uh-huh.” And he said, “You want me to fire myself?” I said, “Sounds good to me.”
You know, it’s just, he can’t be fired except by himself. So that’s a problem. That’s, okay, a problem.
Well, okay. But we’re getting back to where we were, which is they’re not going to regulate themselves, you say.
What should really happen is there should be a national privacy bill, for one. There should be one national privacy bill that deals with issues around privacy, that doesn’t override the California law, that is stronger than the California law, and that covers a lot of issues.
There should be bills about disclosure of when you’re hacked, that kind of stuff, where they have to tell you right away. There could be bills around nondiscrimination online. There could be bills, rules, about hate speech, and what’s allowed and what’s not allowed. That’s something government doesn’t love to wade into, because that’s a problem. Those are problems under the First Amendment.
But there are things about harm and what creates it ... They do it already with terrorism. They do it with ... Recently, the companies have been focused on child pedophiles and stuff like that. But there could be stronger speech laws.
Well, one is to say what you can and cannot do. Another one is to say, I’m not going to tell you what you can and cannot do, but you’re going to have to defend in court what you did and did not do. That’s a different kind of thing. And for each of those things you named, perhaps ... is some ...
And then there’s antitrust. The question is: Are these companies too big, and should they be broken apart in some way and not allowed to be in certain businesses? Should Google be allowed to be in the search business and the advertising business? And also, by the way, they’re in Yelp’s business. And by the way, they’re also in this business. It’s very Microsoft-ian, what’s going on over at Google.
Old Microsoft. New Microsoft is good.
New Microsoft is ...
Which is shocking to say. Sort of like “Voldemort sure is nice.”
You need to say to the audience a little bit about old Microsoft. Back in the bad old days of the original Gates deposition and so forth, that was covered par excellence by Kara Swisher.
No, well not me, principally, I was pretty young then. That was a remedy, and it worked really well. We have a really good corporate citizen now, which doesn’t overstep its bounds. Like, that worked really well. And by the way, because Microsoft was pushed down, others could come up.
Okay. So there’s plenty of possibilities out there, of which breakup is only one. I was about to say that you work for PBS, which is another idea which is ...
Okay, well, either way ... channelizing is another possibility for Facebook, where you have subscription channels where you pay rather than advertising paying. You pay. You could imagine one like PBS, where the government pays, or at least subsidizes it, or it is philanthropically backed. So this idea that you get it for free and they take all of your data without any rights, property rights, and give it to anybody they want to, is a particular model of funding that doesn’t have to be brought to an end.
But it could be diversified. What do you think of that idea?
Well, it’s fine. It’s just, nothing’s come up. You know? It’s a good business. It’s a real ... Facebook has a really good business that works really well, that targets beautifully, so I don’t know why they would get out of the business, because there’s no incentive to make it less profitable for them or their shareholders.
The direction I’m headed is an application of antitrust that doesn’t require a breakup, but a channelization or a diversification within one channel. Recognizing that as a natural monopoly. I’m just trying to put on the table, what the options are.
I’m not sure. I don’t know. You could be very creative. See, our antitrust laws are so antiquated compared to what’s happening today. They envisioned trains and telephones and stuff like that. This is something much different.
And so one of the issues around antitrust, and I’m not a lawyer, is the idea that it causes public harm, right? There’s some level of public harm. Everybody loves Amazon. Who doesn’t love Amazon? It delivers to your house in 14 minutes, like, a bag of gummy bears. Yay. You know what I mean?
So I think the question is ... And you like Facebook because you get free things. You like Google, you get free maps. And so it doesn’t cause public harm, but it does cause possible competitive harm and then societal harm, and that’s really hard to figure out.
That is a latter-day interpretation of the Sherman Antitrust Act. I just want to say, it’s not necessarily wrong, but it is something that started at the Economics Department in Chicago, then in our business schools. And it was the idea that antitrust in the United States didn’t apply unless there was consumer harm. That is not something that Senator Sherman would have recognized.
I guess not.
I’m making a serious point. It is a reductio ad calculundum, you know, you can decide whether there’s customer harm or not, but that is not the ... so, I wouldn’t be deterred from antitrust and I’m not trying to make the case for antitrust, but it does not ... The fact that there is no consumer harm done is not ...
And it’s been supported by a lot of people, from Elizabeth Warren to Donald Trump to Ted Cruz.
And she’s doing breakup. I don’t know ...
She’s doing breakup and antitrust. A lot of people are talking about that.
What do you think? What do you ... How do you ...
I am for a mixture of regulation and self-regulation. I do not despair of some measure of self-regulation. I think that there’s a commonsense amount there, and all of us who run or have run institutions, you have a commonsense rule.
You draw a gray stripe down the middle of the chart and you say, “everything on that side is no, and everything on that side is yes, and everything in the middle, I’m prepared to discuss, and that’s it.” And we all live with those kinds of rules in our institutions, and it’s babyish to suggest that one can’t craft them for almost anything.
You can at least make a try. So I think something can be made out of the 20 percent, and maybe you and I don’t see it exactly the same way, but there are leaders out there, and there are businesses out there, which I think are trying to find the way and, with a little bit of encouragement, will do it. With respect to regulation, there are so many different opportunities. Antitrust is just one. I remember, Kara, when I was a kid, that when a man and a woman went to bed at night on TV ...
Okay, I can’t wait to see where this is going.
... they climbed into twin beds, separated by a night table with a lamp on it. When we look back on that now, it’s kind of idiotic, but at the time, that was regarded as appropriate to protect children. That’s okay with me. Now, that may be on the other side of the line from where you would draw the line. But that kind of thing’s okay with me. I don’t want children afflicted with all of this.
And I think those of us who are adults have some responsibility, just like you have a responsibility, and I did in the Defense Department. You asked about firing people. I fired people who misbehaved, and my subordinates fired people for far less, because you couldn’t have that kind of thing in that kind of institution. And was it a little fuzzy every once in a while? Well, sure. But some rules are better than no rules, almost invariably. So I think there’s a lot ...
There are rules around children on these sites. They do have their own self-imposed rules, some of which are great. But I think the question is how much ... I think what they want to do, and maybe you could comment on this, is they say if you start to regulate them too much, we’ll ruin innovation for them, and it will ruin the golden goose of innovation in this country.
Which is, you know, these are American companies who have led the way in each and every one of these areas so far. It doesn’t mean that’s going to happen in the future, but their argument is that innovation will be harmed by any kind of meddling by regulators who don’t know what they’re doing.
That’s not my experience in almost 40 years of managing technology. My experience is that challenges are what fuel progress. This is a challenge. If you turn it around, and if you’re a can-do person, then instead of talking about all the complications and how you tried for a few minutes and couldn’t take it any further because you got stumped by something, you take it on. You have the same attitude toward it that I would have toward solving a physics problem or something in defense, which is: I’m going to keep working, and keep thinking, and keep trying until I get it through.
So why is this so hard? Why is it that the most inventive people can’t take on a challenge which is right in front of their face and my face? I’m more optimistic about technology than that. I think we can invent our way out of this. That doesn’t slow technology down. It speeds it up. And I’ve seen that again and again in my career, when you tell somebody to do something, go to the moon, and they say it’s too hard.
That’s what they told President Kennedy; we’re celebrating the anniversary of that. His scientific advisers, I happen to know, ’cause I knew them when I was younger at MIT, uniformly told him not to make the speech, because they didn’t know how to do it. And he said, “Well no, I’m going to give this speech anyway.” And darned if they weren’t galvanized and they did it. So that is not my experience. I’ve heard people in the tech sector say to me, “You’re going to stifle innovation.” And I say, “I don’t believe it.”
I’ve also lived a life of engineers telling me that they couldn’t do something they didn’t want to do. And I’ve seen that all the time, program after program. Not all of them. Most engineers are can-do people. So, here’s a challenge, get out there and can-do.
Well, that’s great. But let me just say, this is a group of people who are coddled, who grow up in a very, very deep bubble. Like, everyone talks about the “deep state”; try the deep bubble of Silicon Valley. It’s a place ... you can’t believe the way they live. And they live in a way that they think they’re victims. Like, I get a lot of victim-y stuff right now from Facebook and Twitter people, like, “You’re so mean to us,” and that kind of stuff.
And it’s astonishing that billionaires will tell you they’re victims. That’s always my favorite. I’m like, wow, really, you could have me killed and dismembered in two seconds like, or whatever you want to do. The fact that they don’t understand their power ... and then a lot of the communications that they are putting out, just like Mark Zuckerberg’s op-ed recently where he begs for regulation. And to me, everyone was like, “Oh look, he’s saying he wants to be regulated.”
Guess what? He doesn’t have an answer, is what he doesn’t have. And so he’s saying, “I can’t deal with these difficult problems my company has created. You all figure it out.” And I think that’s an abrogation of his responsibility. I think these companies have a responsibility to not just help us understand what the problem is...
And believe me, I don’t want them in charge of it, because who knows what will happen with that kind of legislation? But one of the things they do is they create a situation, and then when it becomes very thorny, such as with free speech or the hate speech or who determines what, they just don’t want to get involved, when they create the platforms that allow this problem to exist. And I think that’s one of the issues that I find really vexing, dealing with...
Okay, so here is ... Let’s go with that. Let’s take your characterization of that particular executive at that particular business. Nevertheless, I would say that that’s not everybody. You’ve talked about Microsoft.
That’s a different kind of leadership. There are places that it’s easier to start. I also think that in ... today, we need to try to, wherever we can, build bridges, and however restricted someone else’s perspective may seem to be ...
And I had difficulties with people who, for example, thought Snowden was a hero, which I absolutely did not. We don’t need to get into that. But I didn’t just dismiss them. I tried to reach out a little bit, see if I could get some ... We can’t afford to fail on this. And this guy, if you’re talking about Zuckerberg, is the king of this mountain. Should it be that way? Well, I don’t know. Is he an ideal individual? I know him. I don’t know him as well as you do, but we got to get on with it.
Well, yes. But I just said there’s no leverage points for this. Now Microsoft, great, but Microsoft doesn’t have a social network and doesn’t play in this game. The only things that matter at this moment, on these topics, at least, are Facebook and Google. That’s it. That’s the entire ... And what happens ...
Twitter, but you know, whatever. What Twitter? It’s not the ... Twitter, yes, but it’s a bit of an echo chamber of itself, you know? I mean, I think not, it doesn’t have as much impact. It’s just because Donald Trump is screaming loudly on it that we pay attention.
Why I think Microsoft is relevant is because it is a technological company with a different ethos. That’s weird. It’s not a social media company.
Yes. Absolutely. But I’m saying it doesn’t have a player in this game.
Of course not.
And that’s the problem. And so what happens with a lot of these tech companies, I can tell you, people at Apple and Microsoft are just holding their heads in their hands because this is not their ... you know? They would prefer more privacy. They would prefer more.
It does a disservice to their field ...
And then they don’t want legislation. They don’t want ... It’s a really difficult issue and I think when you get to the really ... these are just the issues around speech. When you get to the issues around AI and how to regulate that, and how to think about the development of that, or drone technology, or self-driving cars or health care, when you start to get into these other things that are quite serious and where there’s global repercussions for what we’re doing, and climate change. You could just move on to all these things, what’s the technological solutions there? They will make this issue, which is already tearing apart our society, even more difficult.
Let’s take driverless cars or autonomous weapons. I was asked about that all the time when I was Secretary of Defense, what about autonomous weapons, or a health care or sentencing or parole or any other system where you have machine-aided decision for something of grave consequence for human beings.
There’s another area where we’ve got to get some left and right limits here. We’ve got to get this thing in a frame or we’re going to be living in a world of hurt down the road on that, the way we’re living now in a world of hurt on social.
How do you begin to get inside that frame? I believe there’s ... tell you what I think. This is the same thing I said about weapons: I don’t think that in matters of gravity, like the application of lethal force, that you can have true autonomy. I don’t believe that human beings can cede their responsibility because I would certainly feel responsible, and I felt responsible every time we used lethal force. I thought that was necessary, we needed that to protect our people. But I certainly felt responsible in the very fullest sense as a human being by it, and everybody in my chain of command did and the president did. And every president I’ve ever worked for has felt that way too. You don’t want them feeling any other way. But how do you locate human responsibility in an AI system?
Well, it’s difficult, I’d imagine, in that situation.
Okay, but let’s work that out because there’s another one where I think it’s said that you can’t do it, and you can do it.
Well, there’s certain things like ... Look, with policing data right now, there shouldn’t be any AI in policing right now because the data is so dirty, it is so dirty, and we’re going to get the same outcome. “These people are criminals because these people were criminals and so therefore they are criminals.” The data is so impossible to put in the system. There’s a famous engineering expression, “crap in crap out,” right? What you’re having is that these decisions are being made. There’s very specific AI like that for policing. Policing and AI to me sounds like we’re making what’s already awful just terrible. That’s one part.
It’s like policing and war ... I mean, AI and lethal warfare. I said — and it’s still the extant guidance in the Department of Defense — that we will not have that. There will be a human being responsible for the use of lethal force. I think that’s right, and that can be engineered in a way that’s not the same as a man in the loop, to use that expression.
You talked about algorithms, you talked about data, and there’s one other thing. Algorithms: yes, the way they’re designed, they’re layered, and it’s very hard sometimes to know how the AI made this inference. It’s hard to go the other way; it’s hard to go down the layers, and not just up the layers, and figure out how it happened. What is that? It’s a problem for AI to figure out how AI works. I believe that can be done in algorithms, and it can be designed into algorithms.
Datasets. What are datasets that are so useless that you’re never going to get anything good out of them? You could massage until the cows come home and it’s still crap, as you’re saying, dirty datasets. Well, how do we understand complex datasets? That’s a problem for AI. If AI is going to be used to fake people out, how do you devise countermeasures to faking people out with AI? All these things ...
Some of the things are quite serious and some of them are very useful. If you’re using them for radiology, there’s tens of thousands of pictures and you would really prefer the computer which can figure out those things pretty well compared to a person, that makes perfect sense and saves lives.
Then you have other things that are useful around climate: all kinds of data where we can start to see patterns, and it can start to learn and do things. That specific AI certainly has a lot of uses, when applied to specific problems.
When you get to something like ... There’s one that I just heard of that, when you apply for a job, it takes pictures of you while you’re doing your job interview and then it compares you to the top performers. Like, what? Huh? Of course people used to do that by sight, like, “That guy doesn’t look funny.”
By the way, I don’t want that guy to pick someone, because you get discrimination, you get lack of diversity when that guy picks that guy. People don’t do it very well either. But it’s a really disturbing question of how that should be done. It might be fairer — who knows? Who’s to say? — but I think one of the issues is what goes into it: “Who is the top performer? Who determines the top performer?” You go down some very ugly rabbit holes in this thing.
The second thing is who’s making this AI, and right now we have a lack of diversity in the making of the AI. We also have only a few companies at the top of this, and that would pretty much be Google. That Google has sort of the ...
From Chinese companies.
The Chinese companies and Google, pretty much. Then, I guess, OpenAI does too. But that’s another issue around AI. We have to start to create guardrails around these. This morning, it was funny, we were talking about norms, about creating norms around AI. All kinds of people have come out with statements on AI, principles of AI, whether it’s transparency or fairness or ... There are all kinds of things that you would want to require, and these are the guardrails you’re thinking of. They’re called norms. You’re going to have these norms just like you do around biotech or around medicine or anything else, fields which have long been used to this stuff. We have to start to create them for AI. France is doing some, Germany is doing some, some companies are doing ... Google’s doing some.
IBM is doing some.
There’s all kinds of different ... AI Now Institute is doing some, all kinds of groups of people are doing it. But they kept talking about norms so I was sort of laughing, because it was all these academics saying, “We have these norms, those norms, and these norms,” and I’m like, “You can’t have norms when these people aren’t normal.” They aren’t used to norms, and they aren’t used to rules. What do you do with a group of people — again, going back to regulation — who have never been regulated to start to put guardrails in place? How do you get them to stick to those or think of them as a good thing? That would be my ...
If I can try a few answers on that. We’re ahead of the wave a little bit in AI, compared to social, so I have a little bit more hope of getting things in at the entry stage. I think it is as simple as knowing how this stuff works, which is algorithms plus data, and scrutiny of both. You can ask about any application you’re going to buy, “Well, wait a minute, that sounds too good to be true.” Things that sound too good to be true usually are too good to be true. You start digging down: How the hell can this be? Every once in a while there’s a pony in here, to quote a book title of yours, but a lot of times it’s just nonsense. It’s just nonsense.
It is dressing up something that has always been a matter of dirty human judgment, or the accumulated wisdom of judges or something like that, which is far from infallible. But dressing that up, or alternatively disguising it with a maze of technical stuff that most people, who aren’t like you and me, can’t see through. But I can see through it, and I know most of the engineers can see through it, and I expect them to, and I wouldn’t ...
Let me put it this way. Yes, I can’t vouch for all the companies that are working on AI, but I’m telling you this, if I were a customer in the Defense Department or any company that I am associated with, I’m not going to be a customer for something [for which] you can’t come in and explain to me how the hell this works, because I’m going to turn around and get sued or have to explain in my case to a mother whose child has been killed or something. I need to know that you have designed it, otherwise I ain’t buying your chili, because I can’t buy black boxes for national defense. You can’t buy black boxes for policing. You can’t buy black boxes for picking which people have an opportunity to be employed. This isn’t games or something. These are serious human things.
How do you look at a lot of the tech people not wanting to work on Defense Department stuff? There’s been a lot of that at Microsoft, at Google, and some other places.
Fair question. To be accurate, it’s not a lot, but it’s some. It isn’t in all cases. We were talking about Mark Zuckerberg earlier. My approach, odd as this may seem in our day when everybody screams at each other, would be to have a conversation with him. And I offered to have a conversation with him, and I’ll tell you how it would go, in my case. I’d say, “I like one thing you just said, which is you believe that you should behave ethically in the application of your knowledge. Congratulations. I’m with you. Hold that thought and do it with everything Google does.”
Now at that point, you and I are going to go down different paths, but let’s start with where we have a common view. Here’s what I would say to you. I would say to you that, “The government, it’s just us. It’s not a separate thing. It is the way we do things that must be done, collectively. If you know the most about this technology, how are we ever going to get it right if you won’t participate? We participate ...”
That’s canny. That’s a canny argument. Go ahead.
Well, participation doesn’t mean acquiescence, all the time. But there are 300 million of us in this country, you don’t get your way all the time anyway. But if you’re not in the game, some dummy, or whatever you fear the government is like, is going to make the decision you don’t want to make. You are the best, get in the game. By the way, you have a responsibility to do so. This isn’t a playpen, this is ...
Well, it is a playpen, but OK.
You have knowledge, you have responsibility. Next, I would say to them, Kara, “You ever do any work in China?”
No, I’m saying ...
Yeah, you said that. Okay. Yeah.
I would say to the Google employee, because in China they don’t tell you whether you’re working for the PLA or not. The last thing I’d say is that, “You live here in whatever campus Google happens to be in and there are roads that you can drive on, and there are police who protect you on a good day, which is most days. Police actually do. There are all these utilities and things that are ... this idea that you can stand apart from the environment ... And by the way, that the laws that protect and enable your business and your freedom and your ability to choose, where the hell do you think all this stuff comes from?”
Those are the arguments I would make and I would hope that I would be able to win over some people and at least I would have tried. But I don’t know that I’ll win over everybody, but those are the winning arguments, Kara. That’s why I might, on that issue ...
Right. Those are sort of the patriotic ... You got these lovely roads and stuff like that. They can say, “All right, but that didn’t mean I have to build drones to kill children and blank, or by accident kill people or attack people I didn’t want to attack.”
Well, that doesn’t mean you don’t get to choose within Google what you work on.
But you were saying no one in Google, and that Google ...
... as a total enterprise. Well, no, no. They were saying that they not only personally didn’t wish to work on them, but they didn’t think the company ought to work on them. All of us have an individual choice. All of us have an individual choice. None of us needs to do anything that’s against our conscience. I’m okay with that. If you’ve had an argument with me and you’ve failed to convince me, then I will follow my conscience even if it’s ... I’m okay with them doing that.
It depends on the company. Microsoft’s gone forward with it. Salesforce has gone forward with the stuff they’re doing, Google didn’t, around Maven and stuff like that.
Right. But I think leadership has to lead, and they could have said to these people, “We’re going to do this ...” Of course Google, and I don’t want to pick on Google, has subsequently clarified that it will do other work for the US government and the Defense Department. I think that’s the appropriate leadership decision to make.
But I would also say to a kid who doesn’t want to work on it — by the way, these were not kids, mostly. These were people who were quite advanced in age, were part of the original tech generation who started all this. But I would say to them, “I’m not going to fire you as long as you can provide something, find something useful in the company to do. We’re doing lots of other things and you don’t have to work on something that I can’t bring you to regard as conscionable.” I’m okay with that. I’m okay with that. You don’t have to do just anything.
But for the company’s leadership to basically shy away from a small subset of employees on any issue, I don’t think is the right leadership decision to make. I made leadership decisions. We all have. We have run enterprises, and are they universally popular? Of course not. You get nasty letters. You get people who say, “How dare you condone ...” all the time. If you’re not willing to take that, you should ...
One of the arguments you were making was one that we discussed earlier today when I was doing a podcast with Mark Zuckerberg, one of his arguments about not reining tech in, especially him, is that your only alternative is China. “Well, do you want to be like China?” That’s always the ... Well, what if China wins? What if we have a Chinese internet? I called it the “Xi or me” argument, and I’m like ...
Good for you.
”Neither! Who’s number three?” Because I don’t like either of these choices. I don’t like Xi more than you, but that doesn’t mean I want you. It’s often pushed now, recently, I’ve noticed, by a lot of people in Silicon Valley: “If we don’t get this, we’re not going to keep up.” At the same time, I do recognize the challenge of having the Chinese government own the next internet age, which is disturbing on every level, given how they behave and their willingness to use surveillance tools quite bald-facedly there. Some people say there’s just as much surveillance in this country, but there it really is a national policy to be creating these surveillance economies. How do you look at China as a challenge in the digital age, especially since a lot of this will translate into cyberwarfare abilities?
It is a challenge. It is a challenge. We’re still very good at it, No. 1. No. 2, AI isn’t one big blob. Will they be the best repressors and jailers of minorities in the world? Yes. If that’s the race you are racing, they’re going to win that race. I’m confident of that. But that’s not the race I’m in. In the application of AI, the things that I think are consistent with values that are part of our ... at our best, are enlightenment values. Will we apply them best? Will we show the most prudence in avoiding abuse? Yeah. I’m pretty confident the United States of America and our European allies, they will be better at that. Then there are lots of commercial things that will segment. It’s not going to be one thing. I’m pretty confident in us and I’m not in a repression race with them.
I also don’t buy the argument at all, and I hear this all the time. It’s a qualitative argument that doesn’t work by the numbers, which is, “The Chinese have a big population. They’re going to have all this data that we don’t have.” Show me the theorem that says a factor of four in the size of a database makes a material difference. I defy anybody to show me that theorem in AI. It’s just baloney. But it’s said. Then it’s said again and again, and it’s adduced as some sort of proof that the Chinese are inevitably going to be better at it. They will have a larger volume of intimate ...
Facial recognition, things like ...
Things that, it’s not first prize for my government to have, in my judgment. I ain’t ...
We don’t want to get into a facial recognition war with them.
I ain’t in that competition, but the fundamental science and applying it to good things that are good for human life, or are we going to ... Am I pessimistic about that? No. I’m not at all.
What do you worry about, though? Come on, there ... I get we’re the can-do Americans but ...
No, no. I worry about the same thing you do, I believe, which is applications that are not transparent and where therefore no one is held accountable. It is advertised as a Wizard of Oz kind of black box and it’s just baloney inside there. They are going to tell you that you should buy this, or that this idea or that idea is supported by more people and therefore is more supportable, which violates the laws of, sort of, ethical logic: that you decide what’s right and wrong by how many people agree with it. That’s what I worry about. There’s a lot of potential for that, and I did not allow that in my precincts at the Defense Department, because I thought I had a responsibility of great gravity and that I could not fall into that trap. That’s what I’m worried about.
You worry ... You wrote this book, Inside the Five-Sided Box, by which I assume you were referring to the Pentagon.
Teasing. Lessons from a Lifetime of Leadership in the Pentagon. What is the biggest worry from technology, from a Defense Department view? What are the concerns?
Being unhooked from the global commercial technology base. We aren’t what we were in the ’50s and ’60s and ’70s when I started. In those days, all technology of consequence for protecting our people, and all technology of any consequence at all, came from the United States and came from within the walls of government. Those days are irrevocably lost.
And therefore, the only way I could discharge my duties as Secretary of Defense — which are not going to go away anytime soon, which are to protect the many against the few — the only way I could discharge them competently was to have access to the best technology. The only way I could do that was to have a reasonable relationship with the technology community, where we could be reasonably sympathetic to one another’s problems and values and challenges, and work together. Letting that go is my biggest fear, because then we’ll be a ghetto and I won’t be able to do my job. I’d say that, of any important functionary ...
Meaning you have to go elsewhere for technology, whether it’s Israel or China, even.
Yeah. But I’ve got to go outside the Pentagon no matter what, and it is a ...
It’s outside the government. Outside of the government, like the defense, the advanced research, DARPA.
Yes. Being unhooked ... Well, DARPA does a great job of that. I founded outposts. I tried to build all those bridges, the Defense Innovation Board that Eric Schmidt was on and Jeff Bezos and Reid Hoffman. That was an effort to get a path of dialogue there so that when we ran into an issue where they were seeing it one way or their employees were or the tradition was, and I was seeing it a different way, we had our way to get out of that. Because we all got to live here.
If you think ISIS is in this conversation with us, you’re dreaming. We’ve got to be able to protect ourselves and to be excellent at that. I need your help and that means I need to be reasonable about your values, too.
Do you think the relationship was damaged by the Snowden revelations? I do. I think it led directly to some of this stuff.
It definitely was.
The relations between tech and government.
Yes. No, it was. When I would go to the tech community as Secretary of Defense, sometimes I’d just say to people, “Look, we’re going to have to agree to disagree about Edward Snowden because I do not think he is a whistleblower. I know what a whistleblower is. I support whistleblowers. I’m required by law to support whistleblowers. You’re allowed to blow whistles. You’re not allowed to be a traitor.”
A whistleblower is supposed to follow some rules. One rule is that you at least make an effort to get your concern redressed. No record of such, in his case. Second, that you limit the damage or exposure you do to that which can reasonably call attention to your problem. He put everything out there. I sometimes liken it, Kara, to — we have thousands of nuclear weapons custodians in the United States. Suppose you decided that you didn’t like all these nuclear weapons, which are easy to hate because they’re ugly, terribly destructive things, and you decided that you were going to demonstrate how bad they were by stealing one and setting one off. Well, I mean, that’s overdoing it. That’s more than you need to make your point.
He didn’t follow what we teach here. We teach here at the Kennedy School, and I didn’t make this stuff up, the rules of whistleblowing, being a good whistleblower, and tolerating whistleblowing. None of that was followed, and he did damage to our security. He did damage to our international reputation and he did damage to our companies.
And created a rift between the tech industry, the discussions that were going on between the tech industry and government. Because they didn’t love being spied upon.
True. Yes. I think that’s absolutely true. Was some explaining required on the basis of what we’re doing? I’m okay with that part. By the way, you may not like it, but there is a constitutional process for that, which is an executive branch that follows the laws passed by the legislative branch, subject to judicial review.
That is the way it was done. If you think there were excesses, I am prepared to concede that, and ...
I just read the Mueller report so I don’t know about that anymore.
It wasn’t ... Well, again, I’m not trying to apologize for it per se, for every little excess that was in there, but we had a process at least for doing that. Parenthetically, I have had a security clearance for communications intelligence since 1980, Kara, and I don’t believe (I’ve Googled this to death) I ever witnessed one case in my lifetime of an American being jailed, harassed, outed, humiliated, extorted, or anything by the government on the basis of its surveillance.
There’s one case that came close to it, which I recall, which I can’t really name, but I never saw it, which is a reasonably good record of conduct. That’s a long time of observing. That suggests to me that it is possible to have a pretty good track record of having a constitutionally controlled system where the laws can be enforced with a reasonable amount of surveillance without everybody being spied on all the time, which I don’t favor, either.
Last question, ‘cause I’m going to get some questions from the audience. You and I talked when you were the defense secretary, and one of the things we talked about was right after Apple had declined to cooperate with Jim Comey on encryption. At the time, what was really interesting to me is that parts of the government did not agree with Jim Comey on this issue, and parts of the government did. I know President Obama did; obviously Comey thought he was right, but you had not been on that side. How do you look at encryption right now? Because it’s going to come up again.
It will. It will. What I expressed at the time was my abject dependence upon encryption, and quality encryption, in order to run the Defense Department. None of our stuff works without the assistance of computers, and none of those computers work if they get hacked. I am totally dependent upon really good encryption. I was making that a point. Encryption is not a bad thing. I don’t want everything open.
Secondly, and I wasn’t saying this as loudly, publicly, at the time, but I’m not — and I hope you’ve gotten this idea — for tangling … for having the private sector of our country tangle with its public sector. We’re all the same thing. The government is you, and so if we’re having a disagreement, the right instinct of the public servant is to say, “Let’s not Mau Mau each other. Let me try to get on the right ... Let me try to get together with ...”
Tim Cook is not an unreasonable person, so I would’ve taken a quieter ... I don’t know where you would’ve gotten in that particular case, but a quieter ... You go high like that, and you go public, and the people you stimulate are not the people you want in the conversation. It’s all those people who have nothing else to do than to get on Twitter, and you know what that is. You get the tails of the distribution weighing in against the center of the distribution. That’s not healthy. Both stylistically and substantively, I was in a somewhat different place.
Now you can argue that some measure of law enforcement access is appropriate. The tail doesn’t have to wag the whole technology dog, but it is necessary that the country cooperate with the government when it is lawfully trying to carry out the functions that are really important to us. From that point of view, I could see Jim Comey’s side as well. The thing is, to have two big leaders in our society out tangling in public, that’s not my style. I don’t think it’s effective in the long run. So, that’s kind of where I was coming from.
Right, at the time. What if something else pops up? Is there an instance where you would see Apple have to turn it over? To me, they should have other ways to find these things out. If our National Security comes down to one iPhone, we’re in a lot of trouble. We really are. Come on. It’s a plot of a movie, it sounds like.
I think you say you’re going to count on us to be clever enough ...
There’s other ways to do intelligence.
There are other ways to do intelligence.
There’s tons of other ways.
We’re frequently entirely satisfied with those other ways of doing intelligence. We can’t have the moon. We can’t have everything so that we can flawlessly protect people against ... It’s going to be a little harder than that. People are going to need to tolerate a little bit of failure, and we’re going to need to tolerate the fact that people want privacy and independence, and we’re going to have to work a little harder to get the result that they deserve. I’m fine with all of that.
Technology is inevitably going to trend in the direction of encryption that is increasingly strong. So, you’re going to have to use other methods for collecting information. By the way, when quantum comes along ...
Quantum computing. There’s going to be an interesting moment when all of the corpus of information that has been encrypted in a way that is secure today will no longer be secure. You might say to yourself, “Well, it doesn’t matter because what I say then will be secure then.” But are you willing to have uncovered everything you’ve ever said? Most of us aren’t prepared to say that.
So all of your records ... In the Defense Department, I don’t want our thinking over time about war plans or the formulation of things to become suddenly opened up. Even though they might not be our then-war plans, they may be two versions back. I can’t afford to do that. There’s going to be a big trove of accumulated digital information, done with shorter keys, which will suddenly become breakable one day.
Somebody’s going to make a lot of money on a ... It’ll make the year 2000, Y2K thing, look like a small business. There are going to be people who will promise, at the eleventh hour, the eleventh day before quantum makes all your files open, to re-encrypt them using quantum with longer key lengths. You’ll be lining up with your hard drives to have your stuff re-encrypted. Watch that space ...
All right. Why don’t you get in that business?
That’s one of these great business ideas that you and I can ...
There you go. You just gave it away.
Damn it. You know what the other one is?
All right. So we have questions from the audience.
Selling Coke outside of Whole Foods.
That’s true. Questions from the audience? Why don’t we start right here. Go ahead.
Audience member: I’ll start. Thank you so much, Secretary Carter, Kara. My question is, at the beginning you talked about what sounded like an internet Bill of Rights; privacy, nondiscrimination. But how much faith do the two of you have in a Congress that couldn’t really understand how Facebook or Google worked to actually implement that regulation?
May 1st. I just spoke in front of the House Democratic Caucus, and I did a podcast with Nancy Pelosi, Speaker Pelosi. That’s a crazy group of people, I’ve got to say. I was sort of like, “Whoa, whoa, wait a minute.” What was really interesting is I brought my two sons, and my youngest, who was 13 (he’s 14 now), said, “Mom, that’s like looking at all of America in one ballroom.” I was like, “Yeah, unfortunately.”
I think they’re perfectly capable of it. Like I said, I think they’re not ... Some of them are up to speed. There are people in Congress right now who were technologists. Several of the Congressmen had worked at Microsoft and other places. There are several physicists. I forget, there were a bunch of technologists there. Certainly, you have Congressmen like Will Hurd from Texas or Congressman Ro Khanna, who certainly understand this, and their staffs do. I do rely on the fact that they regulate banks. They regulate cars. They regulate ... They don’t have to be tech experts to do this.
What I do worry about is sort of these loud-mouth other people. In the Senate, there’s a ton of them. There’s a ton in the Senate. There’s ... Senator Bennet is very smart. Senator Klobuchar, Senator Warner, Senator Burr, there’s a whole bunch. There’s governors, too, so there’s a lot of very tech-savvy people throughout government.
The issue is, can you keep ... This is a congressional issue. It’s obviously not going to be from the White House, at this point. There’s almost nobody in those roles there now, nor does there seem to be ... They’re going to have to figure out a way to do it. I do have confidence they’ll figure it out, and they do think it’s important. It’s just when you get on TV with that idiot who doesn’t know the difference between an iPhone and Google, you sort of just want to go like this. That’s just that guy, though, and he won’t be part of it, presumably.
I agree with everything she said. Don’t despair. These are generalists, and they’ve always had to specialize. There’s somebody in there who knows something about Syria, and somebody who knows something about tech, and somebody who knows about this area and that area of the law. You couldn’t, if you were a member of Congress, possibly vote on all the issues. You’d be cramming all day and all night, every single day. You have to depend on your ability to walk down the hall to somebody and say, “What on Earth do I do with this? I have no clue.” And have somebody you trust. It’s not as bad.
When you don’t get anything out of them, it’s usually a sign that they don’t think anybody in the country really wants them to act. I’m not convinced that the Congress is truly convinced yet that we’re concerned enough about tech, compared to all the other things that Americans are concerned about.
Yeah. It’s not a really big selling item in the election.
To really make them act, so it’s on everybody’s B list, and their A list is something else.
Yeah, health care. Overwhelmingly, health care. Tech is somewhere down here.
Yeah, but in other countries, it’s not. Look, they’re going to act in New Zealand, in France, and everywhere else. There’s going to be pressure from states. All those states are going to all act, and I think that’s what’s going to be the important ... It’s going to bubble up. It really isn’t a campaign slogan. “I’m going to get Facebook.” I don’t think that’s really ...
Then, on the Republican side, unfortunately with a lot of people, they’re obsessed with this idea of being “shadow-[banned].” The president did it today, and it’s just a lie. It’s a flat-out lie that they’re being discriminated against on these platforms. They just aren’t. It’s just not happening. So that obsession, it’s like a fever. It’s like a weird fever. They’ve never ... I’ve always thought the people that complain about not being able to speak never shut up. That’s what I’ve noticed. They have plenty of places to have outlets. That is occupying that side of the ... They need to stop. They won’t stop, but they need to stop.
Audience member: Thank you.
David Carey: Thank you. My name is David Carey. I’m a fellow on the Advanced Leadership Initiative. The one undisputed portion of the special counsel’s report was how much effort Russia made to influence our election. It was interesting to watch, following the New Zealand livestream terrorist attack, that Sri Lanka, first thing they did, they turned off all social media. Perhaps a template for the future. We are 18 months away from a presidential election and I’d like to ask both of you, how much confidence do you have in Facebook in particular, but all the social media, and Google, I guess with YouTube, to stop what happened last time, to really get out ahead of foreign parties attempting to weaponize our social media to influence the national election? If the election was held three months from now, are you confident they’re on top of it, or do you feel we’ve made little progress since 2016?
I think they are working on it. Aside from the fact they had to do their dumb fake war rooms that they invited reporters in to ... Did you see that? That was a ridiculous circus by them. I do think they have been working on a lot of stuff. It’s really complicated to do that. Some people think they should stop doing any political advertising a couple months ahead. They do that in France. A certain time before the election, have a blackout. There’s all kinds of questions of what they should do, but I do think they have been identifying and working on these problems to solve them.
I think probably, yes, I do have confidence that they do understand, this next election, the 2020... It worked pretty well in the ’18 elections. The 2020 elections have to be really clean. I do think they will continue to use these tactics like the Trump campaign did in the last one. I do think the Russians or whoever will continue to try to manipulate these elections. The issue is, we’re never going to know how they did it or whether it worked.
Just today, you don’t have to insult him, but Jared Kushner’s an idiot, just a complete idiot. He said that the Mueller report was more damaging to democracy than Russian intervention in our elections. I don’t even know what to say. I would’ve rushed the stage at that point. Are you kidding me? This is ridiculous. Any foreign government using our tools to ruin our election is a problem.
My concern, too, is voting machines and things like that. Senator Wyden is very ... That’s his big topic, the abuse of voting machines and the ability to manipulate them. I know that’s a plot of Scandal, but it’s actually also a problem. I’d worry about that issue, and I think Senator Wyden, I did a great podcast with him on that. I think people are smarter about these manipulations, too, so I do feel more confident.
We have an effort here at the Kennedy School, which I don’t run, but is run out of the same center, the Belfer Center that I’m the director of, which just to give you a few ... It ran a training program for the election operators in all of the states. Help me out here, some of the folks who worked on this. I think 46 of the states participated. Now, we have been asked by both the Republican and the Democratic sides, in the Iowa caucuses, to help them. This is not magic. This is basic hygiene, and tradecraft, but it’s good. My concern for next time is we’re getting more resilient, and people are wise to a lot of the tricks that are played on them on social media, but there are new things coming.
Like deepfakes, things like that.
I worry about a new playbook, but the Russians are shrewd, about a playbook we haven’t seen yet, for which we haven’t prepared, and which people are not inured to.
There’s no punishment for doing it. You’ve got major administration figures applauding it. You had Giuliani the other day, you had Kellyanne Conway and the president, so there’s not going to be any repercussions. To say those things is so irresponsible. It’s just beyond irresponsible, as far as I’m concerned. But, I think the companies do care. Thank goodness, they do. I think they’ve done a pretty good job around the world. So, yes?
Heidi Legg: Hi, I’m Heidi Legg. I’m the Director of Special Projects at the Shorenstein Center. I’m an enormous fan of yours, as a fellow journalist who does profiles. So, thanks for being here. After journalism school, I went up to California in the year 1997 for the promise of this thing called the internet. It was an amazing time. We had a surplus. Clinton was president. I don’t think he’d met Monica yet. We thought that this was an amazing thing for a free press, an independent press. We could create magazines. I can’t believe I’m actually thinking of regulation, but I look at it and I hear there are so many topics that we can touch upon with the tech companies, but you’re so close to them, and you’re talking to them. What’s their view on what they’ve done to journalism? One of the reasons that I would be very pro looking at regulation is that they have all of the power now to gain all the ads, because of their digital capacity and the fact that there are no privacy laws.
Heidi Legg: What we’re seeing at the Shorenstein Center is the only legacy players that can even come head-to-head with the tech companies in surviving and being journalism outlets, are ones with billionaire backers or large media conglomerates. So like a CNN, and Fox News, or the Henrys buying the Globe, or Bezos buying the Post, Patrick Soon-Shiong buying the LA Times, but otherwise, it’s an obliteration of journalism. I’d like to know, when you’re talking to these big leaders, since you’re with them, if they believe that regulation is going to curb their innovation, what about the fact that they’ve killed innovation for journalism? When I went out to ... As a journalist in San Francisco in 1996, they’ve killed it. They seem to just have a complete disregard for the fact that they are now the platform, and they refuse to be journalism ...
Take the responsibility? Yeah.
Heidi Legg: And take responsibility. I mean, that’s the one reason ... I’m so shocked that I’m thinking regulation, knowing my personality, my background, and the space I’m in, but I just don’t see any other way but to regulate them into looking at local news, which would solve a lot of these other problems of our society being misinformed.
Yeah, they don’t care. I don’t know what else to tell you.
Heidi Legg: How do we get them to care?
Heidi Legg: I’m so shocked by that.
You aren’t going to. How do we get them to dress better? I don’t know.
Heidi Legg: It’s destroying democracy to not have a journalist in your child’s classroom, or at your Thanksgiving table, or in your neighborhood, and giving you the straight facts and the context.
They don’t care. I promise you, they don’t care. They have a lot of news.
Heidi Legg: I’m coming on a road trip with you. We are going to go talk to them, because this is crazytown!
I’ve talked to them. First of all, Google and Facebook own all of digital advertising, really. It’s a duopoly. That’s pretty much it.
Heidi Legg: Are they more than 95 percent now?
Whatever. They own everything. They own the whole thing. Then they tried to reach out and do these Instant Articles and other things, which have been worse. We bought into it, and then put up with ... It’s like we’re painting their fences. We’re giving them our stuff, and that’s ridiculous. Pull out of all that stuff. It doesn’t help you. You don’t get any money from it.
One time, Facebook came to me once and said, “You should do Facebook Live.” I said, “Why?” They said, “Oh, you’ll be better known.” I go, “Where’s the money? Where am I going to make money at it?” “Well, it’d just be good for you.” I’m like, “No, I don’t think so.” They’re like, “Why not?” I go, “Why should I do something for no money that will help you? There’s nothing in it that helps me.” They’re like, “Well, that’s one way to look at it.” I said, “That’s the only way to look at it.” You know what I mean? “There’s nothing about you that’s vaguely interesting to me as a journalist.”
So yeah, we’re on there. I use Twitter. I like some things, for marketing. Twitter’s great. I just did a Twitter Live, and stuff like that, but they do not care. Let me just ... If you looked to them for any kind of help. Mark Zuckerberg or whoever is giving money to local news. It’s just, whatever.
Rich people have always owned newspapers. Guess what? There used to be the Bancrofts or whoever owned the Times. Who owned the Los Angeles Times? There was a rich person who owned the Los Angeles Times, whether it was the Grahams or whoever. Sometimes, you get great owners, like the Grahams and the Sulzbergers, and sometimes you get shitty owners, right? You’re going to have a billionaire owner for a lot of these things, going forward. They’re just not going to be as valuable, some of these things.
Some of them are going to be very valuable. The New York Times has done a really good job about figuring out different ways to make money. Now it’s not an enormous business, but it’s doing rather well.
Heidi Legg: I think 3 million out of 300 million Americans.
Right, but they’re good. They’re doing well. They’re moving in the right direction that way.
Heidi Legg: Small.
It’s small. Of course it is, but you have Laurene Powell Jobs, a very interesting owner of lots... I think she’ll do more. Marc Benioff ... Look, I would rather have Marc Benioff owning it than a lot of people.
Heidi Legg: Absolutely. We’re tracking these, and we’re very excited about these at the Shorenstein Center, but it’s still ... If their argument is that they shouldn’t be regulated because it’s going to stifle innovation, they have completely stifled innovation in the journalism space.
Yes, they don’t care.
Heidi Legg: Thank you.
Ash Carter: One more?
They don’t care, unfortunately. I’m sorry to tell you that. Some of them do, but I think it’s interesting a lot of them are buying this stuff. Jeff Bezos, I think he’s done a nice job at the Washington Post. I don’t think they’ll stop investigating Amazon just because he owns it. I don’t think they will.
Ash Carter: One more?
There’s one up there, too.
Amy: Hi. Thank you for taking my question and being here today. My name’s Amy. I’m a media and technology attorney in that space, and had some experience with navigating the Communications Decency Act and other online speech issues. I wanted to follow up on your conversation about the Communications Decency Act, and regulation versus self-regulation of these social media companies. I’m interested to hear, what entity or system do you think is best equipped to adjudicate those online speech disputes? Do you think that’s Congress, through legislation, or AI through filters, or the social media companies themselves, or the judicial system? As part of that process, how much weight do you think it makes sense to give to the privacy rights of the people who are posting that content that could be problematic in the first place?
Mark’s creating councils, okay? Again, let’s do his work for him, sure. Meanwhile, he’s one of the world’s richest people. That drives me crazy when they’re always asking for help, rich people asking for help. It’s always a pleasure. I think that there’s a lot of ways. You could just make them liable for what’s on their platforms. That’s terrifying to them. They think it’ll ruin their businesses. Lawyers scare them. Being held liable for that. It works for a lot of people. It works for chemical companies. It works for gas companies. It works for everybody. So, I think that’s one way.
The other way is to modify that act, I think, so that smaller companies do get the immunity and the larger companies don’t. There’s all kinds of creative ways. Just the idea of removing it has terrified Silicon Valley. I don’t know if there can be outside councils. Would you be on a council at Facebook to adjudicate things, or do you have other things to do?
I have other things to do. Also, I’d need to be convinced that this was real. Look, there’s a lot of history here. We did a pretty good job in the matter of decency with ... There were commissions that set rules. Remember, how many four-letter words were there? I don’t remember.
I don’t remember.
Maybe today that seems quaint and stupid, but something like that is fair when it comes to children and decency. We had people that said “go think of that and tell us what the rules ought to be.” Kara’s absolutely right. Nothing like a lawsuit sobers up the morality, I think. Some relaxation of CDA 230’s total immunity, I think, is appropriate.
When it comes to money and property, something we haven’t talked about tonight, but when ... How are we going to deal with the fact that people are trading in what you could argue, at least, is your property, wrongfully surrendered, which is your data. That is the big thing that people don’t want to talk about. Talking about freedom of the press and freedom of speech, that’s all very important, also. I don’t mean to belittle it at all, but you start talking money, and people really clam up.
At some point, we have to face the economics of what is going on here, which is that there is a transaction between a tech company and an advertiser, in which we are batted back and forth like a tennis ball, and our most sensitive information ... That isn’t the Chinese government that’s getting it, okay? That’s okay. But it’s anybody. They’re subject to leaks, data breaches, increasingly onerous, precise, or wrongheadedly targeted marketing, and so forth. These are different problems, but property, privacy, money ... We have models in the past for all of these things. I’m more optimistic.
Maybe these businesses aren’t quite as good as they think they are, because they’re a little more expensive to maintain properly. They’ve been allowed to be... Guess what? It’s better for chemical manufacturers to dump stuff into the river. That’s a better business for them. When they have to put filters on and they have to put suits on people and they have to pay off lawsuits, it costs a little more.
Guess what? That’s exactly what’s been happening here. They’ve allowed toxic waste to dump into the river of society, and they don’t care. You have to think of it that way because they don’t want you to think of it that way. Their businesses might not be quite as sweet. That’s all. Nobody wants to hear that because Wall Street doesn’t want to hear ... None of the power structure wants to hear that.
We would not even know as much as we do about tech without Kara.
A career, a lifetime of illuminating these issues for us. Thank you so much for being here.
Well, thank you for not getting us into a giant war. I appreciate that. Thank you.
Recode and Vox have joined forces to uncover and explain how our digital world is changing — and changing us. Subscribe to Recode podcasts to hear Kara Swisher and Peter Kafka lead the tough conversations the technology industry needs today.