Full transcript: Anti-Defamation League CEO Jonathan Greenblatt on Recode Decode

“If we’re not vigilant about the rights that we have and the privilege we enjoy, we shouldn’t expect to keep them.”


A protester holds up a sign that reads, “No hate.” Spencer Platt / Getty

On this episode of Recode Decode, hosted by Kara Swisher, Jonathan Greenblatt, the CEO of the Anti-Defamation League, talks about how the century-old nonprofit is evolving to fight antisemitism and other forms of extremism in the digital age.

You can read some of the highlights here, or listen to the entire interview in the audio player below. We’ve also provided a lightly edited complete transcript of their conversation.

If you like this, be sure to subscribe to Recode Decode on Apple Podcasts, Spotify, Pocket Casts, Overcast or wherever you listen to podcasts.

Kara Swisher: Recode Radio presents Recode Decode, coming to you from the Vox Media podcast network.

Hi, I’m Kara Swisher, executive editor of Recode. You’re listening to Recode Decode, a podcast about tech and media’s key players, big ideas and how they’re changing the world we live in. You can find more episodes of Recode Decode on Apple Podcasts, Spotify, Google Play Music, or wherever you listen to your podcast. Or just visit for more.

Today in the red chair is Jonathan Greenblatt, the CEO of the Anti-Defamation League. He’s also a social entrepreneur and previously worked in the Clinton and Obama White Houses. Jonathan, welcome to Recode Decode.

Jonathan Greenblatt: Thank you for having me.

No problem. In the beginning, I want to get your background, because you’re also a techy, which is critically important, I think, in your job today. So why don’t we start by talking about that, your background a little bit and how you got to the Anti-Defamation League, and then tell us sort of what your charge is right now.

Sure. So, by way of background, I did my undergrad at Tufts and I got my MBA at the Kellogg School of Management. And in between, I worked actually in government. I worked at the Commerce Department in the early ’90s and then I worked at the White House at the National Economic Council.

And why did you do that?

I joined the Clinton campaign when he was running for president.

First Clinton one.

Yeah. I was a work study student at Tufts and he had this idea of young people serving in their community to pay their loans and I thought that was a much better idea than mopping floors and busing tables that I was doing. And so I moved down to Arkansas after I graduated and worked for Governor Clinton.

Wow. Just did that, just moved.

I believed I wanted to fight the good fight.

Right. And so you did that and then it took you to ...

He won and I came up to D.C., and I did international economics for five years at the Commerce Department. It was at the time when NAFTA was getting passed, the GATT became the WTO, Hong Kong transitioned over to China, APEC, so it was a time when international trade was really popping. Great time to be working on those issues. And then we tried to understand where the economy was going. And I worked for a guy named Ira Magaziner at the White House.

Mr. Healthcare.

Exactly. Who then looked at trade issues. And what I saw was, tech was really growing. So we would go out to the Valley to try to understand these little new companies like eBay, Yahoo, Netscape.

Pierre had gone to Tufts.

Exactly. And the long and short of this is, I saw those people were changing the world and I wanted to be a part of that, but I didn’t ...

So this was what year?

This was ’95, ’96.

You sure? You were there when I was ... there’s not too many people that early.

And I remember when Amazon went out and I remember when Netscape launched. I was on the team at the Commerce Department that piloted Mosaic, the Mosaic browser in ’93, ’94. So I wanted to get into tech. Didn’t know anything about it really other than ...

So when you went out there, you saw how like, “Wow, this is cool.” You understood.

Yeah. I remember reading like Peter Schwartz’s “The Long Boom” article in Wired way back when. And I just thought this was the future.

But here you are in Washington.

It’s funny how I came back. So I went and got an MBA, and then I went out and I wanted to go to a pre-IPO company that would change the world, and I found this little business in Los Angeles called Realtor.com.

How did you settle on Realtor.com?

I was looking for a business that was in A) a really big market: Real estate’s a trillion dollars. B) Had a great management team, and they came from a bunch of really good companies. C) Had great venture, because I figured that was a proxy for figuring out what company would succeed and John Doerr was on the board. And Mary Meeker was also on the board. She was very involved with it. And then last, it had competitive advantage. I didn’t know anything about real estate, I didn’t own a thing, but they had a deal with the National Association of Realtors and I knew coming from D.C. that alliance would probably be very powerful.

Which, of course, is interesting: you picked all the safe choices in tech, over a company like Airbnb, which is now just as big and had none of these advantages.

Right. It’s really internet 1.0 where you’re taking linear business models and sort of just putting it on the web.

Right. Absolutely. So you worked in Los Angeles.

Did that, and they hired me as an assistant product manager. The lowest you could be.

What’s that mean? What’d you do?

I was responsible basically for display ads.


So I was responsible for figuring out ...

Which was important on that particular ...

Huge. Their business model was, aggregate all of the MLS listings on the web. So you aggregate them, you normalize them so a consumer could search for real estate from anywhere in the country.

Which was a big deal.

Huge. It wasn’t possible before. And they monetized it by selling ads to realtors. And so that’s what I did.

Or home loans or whatever.

Yeah. Well, I basically focused on realtors, but, yeah. Then as it grew we did apartments, we did the financing side, etc.


And it was a great gig and I did it for a few years. The company went public, and grew big, and then had some issues, and it went a little south. I learned I was pretty good at shipping software. I learned I was pretty good at leading a team.

Were you technical at all? Did you have any technical ...

No, but you learn. You sit with engineers and you write like a technical specifications document and you learn how to, again, do software.

Right. And so why did you leave our internet cabal?

What happened was, again I learned I was pretty good at driving product, but I was not ... I missed public service. Now we were working for Wall Street shareholders — who were anonymous — and I still wanted to change the world.

And you didn’t want to go to another internet company. Google had just gotten started then, that might have been a good choice for you.

I remember when it got started.

Yeah, in ’99.

What happened was my roommate from business school came to me with an idea for a business, the thing that became Ethos Water. So basically he had this idea of, could we take bottled water — which is a $15 billion category in the U.S. — and use part of the profits to help children around the world get clean water? A billion people lack clean drinking water. And he came from McKinsey, he knew a lot about strategy. A very smart, good person. I was very operational. At that point, I was running all consumer products for Realtor. So I wanted to do something that was still operational, but more socially responsible.

So water it is.

I left Realtor and we started the business together. So we started Ethos Water out of my house here in LA — or we’re in D.C., I suppose — and we bootstrapped it because no one wanted to [invest]. This is now 2002. Bubble had burst, things were being re-sorted out, and no one wanted to invest in a bottled water company.

Right. Right.

So we bootstrapped it and we ... eventually a few of my friends gave us money. It was like, I think they didn’t want to see my wife and I get a divorce. It was like therapy money.

But Ethos got a lot of traction, correct?

It certainly did. And eventually we got that young entrepreneur who started eBay. I met Pierre at TED, in 2003 or 2004, and he invested. He was our first big investor. We eventually got it to a really nice size, and then we sold it to Starbucks. And then I went to work for Howard Schultz as the vice president for global consumer products.

So you’ll be in the Schultz administration? We’ll talk about that in a little bit.

Not funny.

I love how he pretends.


“Oh, no, Kara. I’m good.”

More about that later.

Good at lying.

Yeah, so I went to work for Howard. Integrated the business, launched our product on their much larger retail platform. Howard asked me to serve on the board of the Starbucks Foundation, because now he had millions in free cash flow to distribute to projects all over the world. Great gig. Enjoyed it. My wife and I had two kids at the time and were back in LA. I was in Seattle, which was not easy.

You moved to Seattle? Oh, so you ... I was just in Seattle the other day.

It’s a great town.

Yeah, it is. It’s gotten even better.

It’s really remarkable what’s happened to it. So I went back to LA. I got recruited to run a little magazine business called Good Magazine.

Another interesting entrepreneurial effort.

Exactly. Socially responsible. That was sort of going south. I mean, the print business is not a great business. It wasn’t back then, either. And I invested heavily in digital, I invested heavily in online video.

So you’re the publisher?

Essentially CEO. Yeah, like publisher. And that was a great run. And then we had an idea that sort of came out of that, which is all these young people wanted to ... They said, “Well, I read your magazine,” or, “I drank your water. Now what?” We had this idea of, could we aggregate volunteer listings? Because it turns out volunteer databases are a lot like MLS. Offline, not standardized, fragmented.

So I pitched an idea at the Googleplex. This was in late 2008, I believe. And I said, “Why don’t we do for volunteer listings what Realtor did for real estate?” And they got excited about it. And so it was a 20 percent project. A bunch of engineers helped me to do it. They invested, and some big companies ... P&G put money in, Gap put money in, and we built this thing called All For Good, which was the largest aggregation of volunteer listings on the internet.

And there had been volunteer listing sites.

Sure, there’s VolunteerMatch, Idealist, there were a few others, but they were all, again, not standardized.


So you have to go to multiple places. What we did, we used data feeds and then we reached out and sort of scraped and brought all the listings in one place. And that essentially became this really big technology architecture.

The other innovation that we had at the time was we used APIs. So this was like late ’08, early ’09, social was really beginning to take off, and so the innovation was, why would you go to the All for Good site? You could use APIs and integrate the listings right into your Facebook feed. Right into whatever kind of site you were using.
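The aggregation step described here, pulling listings from many feeds and normalizing them into one shared schema so any site can consume them, can be sketched roughly as follows. This is a hypothetical illustration, not All for Good’s actual code; every feed name and field name below is invented:

```python
# Hypothetical sketch of feed normalization: each source uses its own field
# names, so we map every raw listing onto one shared schema before serving
# it. None of these feed or field names are All for Good's real ones.

# Per-feed mapping from the source's field names to the shared schema.
FIELD_MAPS = {
    "feed_a": {"opp_title": "title", "org": "organization", "zip": "postal_code"},
    "feed_b": {"name": "title", "sponsor": "organization", "postcode": "postal_code"},
}

def normalize(listing, source):
    """Map one raw listing onto the shared schema, dropping unknown fields."""
    mapping = FIELD_MAPS[source]
    return {shared: listing[raw] for raw, shared in mapping.items() if raw in listing}

def aggregate(feeds):
    """Merge listings from every feed into one normalized list."""
    return [normalize(item, source) for source, items in feeds.items() for item in items]

feeds = {
    "feed_a": [{"opp_title": "Park cleanup", "org": "Green LA", "zip": "90012"}],
    "feed_b": [{"name": "Tutoring", "sponsor": "Read SF", "postcode": "94110"}],
}
merged = aggregate(feeds)
# Every listing now carries the same keys: title, organization, postal_code.
```

Once listings share a schema, exposing them over an API, so a partner site or a Facebook app can embed them, is just a matter of serving that normalized list.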

Right. Right. Which you all did. And so you were working on that ...

And then eventually that grew to a nice size and then that got acquired by Points of Light. You know, the group that President Bush started. Because they have thousands of volunteer centers across the country, they didn’t have a technology architecture. So now it’s being used by all of the managed listings.

Managed listings?

So I had a few of those and they all ...

So you did all these different entrepreneurial things. So here you are wandering around from one ...

And then, sometime in 2011, I got a call from the West Wing. So President Obama had created this Office of ...

The show or the person?

Exactly. President Bartlet called me.

I wish there was a President Bartlet right now.

How we miss him.

How we miss President Bartlet. Let’s take a moment. Especially CJ. All right. So you get a call and you were ...

He created this Office of Social Innovation. This really talented woman, Sonal Shah, an economist who started it, had left. And this office was supposed to be focused on using innovation to accelerate economic recovery and boost job creation. He wanted someone who’d created jobs and contributed to economic recovery to run it.

So, look, I mean, honestly, I wasn’t an Obama person, but you get a call from the president, you take it. And I believe in ... It’s a call to service. So we came out, my wife and my three kids, and we decided I would do this. So I spent three-and-a-half years working for President Obama and running that office.

So how many people did you have in it? Because I’m assuming it’s not staffed right now at all.

I think it has become ... I think it’s become the Office of American Innovation.

Oh, Chris Liddell?

No, that’s Jared.

Okay, that’s Jared. Okay.

It’s the best kind of innovation. It’s American innovation. As opposed to all the other kinds. Yeah, so I probably had half a dozen to 10 people; it would wax and wane with fellows and detailees ...

Sure. And so what were your initiatives that you worked on?

We did three big things. So No. 1, we tried to find new ways to put people to work. So I was responsible for the national service agenda. Service as a strategy to put people back to work. So like AmeriCorps, Peace Corps, I was responsible for all those programs and expanding them, because the budget was frozen. So we created new programs like FEMA Corps to help with disaster relief and Justice Corps to help with issues on the immigration front, actually.

Secondly, I did all the public-private partnerships for the president. So I worked on Joining Forces, helped set up My Brother’s Keeper, all these different initiatives that tried to find ways you can bring philanthropy and business together to achieve the public interest.

And then thirdly, I worked on the impact investing, or Social Entrepreneurship Agenda. So I worked on boring things like tax policy, ERISA reform, trying to move big swaths of capital from offline — like, passively handled by fund managers in New York — into the economy. And so how we get foundations that have $800 billion ...

And they do. They only use VCs or whatever.

Exactly. How to make it easier for foundations to put money into jobs that we’re creating ... into companies creating jobs. How do we make it easier for pension funds to put their money into firms that are doing like renewables or socially responsible ...

So they can find them and invest in them. And there’s funds like that on Wall Street.

More and more. More and more. And so I helped launch the first social impact bonds in the government, launched all kinds of new programs to create novel financial instruments that used the capital markets more effectively.

So that we could invest in social good, presumably.

That’s right. So social good you’ve got to measure. It’s got to drive a financial return. But you could also achieve kind of a broader public benefit.

Which is attractive to millennials. In one of the many polls they do — one of the endless polls on millennials — that is one thing that sticks out.

It’s unbelievable. So millennials vote with their wallets. And now big firms like BlackRock and Goldman and Capital Group and all these other large-scale investment houses are building funds and firms specifically to take advantage of how millennials want to deploy their dollars.

Right, and companies that reflect those values. Interesting. It’s interesting how Amazon’s going around trying to figure out where they’re going to be. I suspect they will not be somewhere that is less than ... you know, it’ll be interesting ...

They’re figuring it out. They historically haven’t been great at it, but they hired a really effective executive from Business for Social Responsibility, BSR, and they’re now doing interesting stuff on the sustainability front.

Yeah. Absolutely. But I’m thinking of where they’re locating even their facilities that they’re going to pick.

They have to think about all of these issues.

It’ll matter how a state behaves, I think, in a lot of ways. It’ll be interesting.

It’ll matter deeply.

That’s where economic growth will happen.

Economic growth will happen there. They’ll be able to commit to things like public transportation or better infrastructure. Lots of really interesting things.

Yeah, so how did you get to the ADL? Because this is ... and what a time to get there.

It’s a funny story. So I was giving a speech in Massachusetts to a room full of university chief investment officers, which is sort of my crowd. How do we get them, again, to deploy their dollars?

This was when you were in the Obama administration?

And I got a voicemail from a headhunter. It was a headhunter who said, “Hi, my name is so-and-so. I’m from this firm. The ADL, their longtime chief executive, Abe Foxman, is retiring. We got your name. Would you be interested? Please give me a call back or if you would know someone, please call back.”

So, I get a lot of calls from headhunters, like, I think, a lot of people in these public-facing jobs at the White House do. I don’t respond to most of them, but I responded. I called my wife, actually, when I got this call, because of two things. No. 1, I didn’t explain this, but when I was a senior at Tufts, I interned at the ADL office in Boston. My grandfather was a Holocaust survivor from Germany. The year before, while studying abroad, I’d visited the town where he was from. No Jews there anymore.

I came back to Tufts and said, “I want to do something.” I heard about this organization, the Anti-Defamation League. Talked my way into an internship. And then 10 years later, when I moved out to LA, I didn’t really know anyone. I learned that a woman I had worked for at that Boston office had moved to the Los Angeles ADL office. So I called her and ... She’s a Jewish mother, basically. So you call a Jewish mother, you say, “I just moved to town.” She wants to ...

Yeah, help you.

Yeah, help you. So she wants to feed you because she’s certain I’m emaciated because I’m living alone. And then she wants to set me up on a date. And so she did that. She set me up on a blind date, this woman, with one of the people who worked at the ADL office. And 17, 18 years later, my wife and I are still married.


So my wife worked there for seven years.

You got your wife through the ADL, you did an internship, and now your job.

So I got that call and I called my wife. I’m like, “Can you believe this? Abe Foxman’s retiring, they call me.” My wife said, “Oh, that would be a great job.” I said, “Oh, I think that would be a terrible job. You have to fight Nazis, and anti-Semites, and racists.” No. 2, I told her, “We’re going home to LA.” The plan was to do this then go back to California. It’d be a waste of my time. And thirdly, I also thought, Kara, I’m not qualified for the job. Look, I’ve never ... I’m not a lawyer, I’m an MBA.

Right, which is a critical part of ADL.

Crucial. I don’t know anything about the civil rights agenda. I’ve never run a nonprofit organization. I’ve never worked in the Jewish community. Like I’m certainly not ...

You’re perfect.

Yeah, exactly. So I thought, “She certainly is not really interested in me.” But you know what? What I thought was, “You know what, I’ll go talk to this woman — not because I want the job, because I don’t. I’m not even qualified. But the next CEO of ADL” — because I knew how important the organization was — “should be thinking about search, and social, and tech, and innovation, and income, and business.”

Because that’s what the Nazis are using.

That’s where the world was going. That’s right. So I took the interview on a lark, thinking, “Well, I can help shape the search and that will be my contribution.” And one thing led to another, and I’m here.

But it’s interesting, because the ADL is such a storied organization. It feels like, even if this isn’t where I thought I would be, it’s a privilege to be here every day. And the issues matter more now.

Yeah. You sort of hit the timing here. Your timing is perfection, in a horrible way. So you took this job and you ... Explain what the ADL does, for those that don’t know. There are a number of organizations like it, but it’s a unique and important organization.

It is unique. So the ADL was founded in 1913, around the time that Leo Frank was lynched outside of Atlanta. It’s a famous story: a Jewish man falsely accused of a crime, found guilty, sentenced to death; the governor commutes his sentence — because it clearly was a sham trial — to life imprisonment, and the mob is so enraged they hang him from a tree. And the ADL was founded at that time, when anti-Semitism was prevalent, along with racism, etc. And the founders create this organization and, in their own words, they write a mission statement that the organization will “work to stop the defamation of the Jewish people and secure justice and fair treatment to all.”

So that’s a very interesting mission statement, because 100 years ago, the Jews — again, not only was there pervasive anti-Semitism, quotas kept them out of many universities, customs kept them out of many professions ...

They had to hide away.

Covenants didn’t let them live in many places, so they didn’t really have any of the political power, economic resources the community has today. They don’t really have a leg to stand on. So it was a bold proposition that they would be out for themselves, but also justice and fair treatment to all. Like again, based on ... Their future was very uncertain. They were very weak. But they had this audacious — frankly it’s a very Jewish — idea, “We’ll fight for ourselves, but also fight for others.”

So over the next 100 years, they tore down a lot of those quotas, exposed a lot of those practices, they made America a better place for the Jewish community. And in the early ’50s, like in ’52, they filed an amicus brief in Brown v. Board of Education. Which was a bold, controversial thing to do. And they literally put people on those buses, the Freedom Rider buses, and they marched with Dr. King. And they stood up for the LGBTQ community in the ’80s. I’ve heard these stories from when people were afraid of gay men, when people thought you could catch AIDS from someone sneezing on you. The ADL stood up for them. And they stood up for immigrants in the ’50s. I could go on and on. They have a remarkable history.

Today, basically, the work continues to be inspired by that mission: Fighting for the Jewish community and for others. The ADL does three things: Advocacy, education and law enforcement. Advocacy is working to change laws through the courts or through Congress. Lobbying, filing amicus briefs, litigating to a degree.

So there’s strategic issues around that where you place your ...

Exactly. Around protecting minorities, preserving the First Amendment.

No. 2, education. Long ago, they realized you can’t litigate or lobby your way out of hate. You have to change hearts and minds. Today, the ADL is one of the largest providers in the United States of anti-bias, anti-bullying, anti-hate content in schools. Our materials reach more than a million-and-a-half children every year. We literally cannot keep up with the demand.

No. 3, we work with law enforcement. We both help them investigate hate crimes ...

Right, and focus on who needs to be focused on.

Focus. We have a whole research apparatus, our Center on Extremism focuses on researching the bad guys and we train police now to deal with hate and how to deal with extremism. We train 15,000 officers every year. More than any other NGO in the country. The FBI has made our training mandatory. The NYPD has made our training mandatory. So basically advocacy, education and law enforcement, those are the three things we do. We have a network of 26 offices across the country, field offices, that are like our channel, that sort of go to market and implement those programs locally in Seattle, or ...

And presumably, you work with others like the Southern Poverty Law Center and others to try to chronicle what’s going on.

We work with the SPLC, for example, and the U.S. Holocaust Museum on some of that training for law enforcement and researching the bad guys. I was with Anthony Romero last week in the Bay Area. We work with ACLU a lot on First Amendment cases.

On the education front, we’re constantly partnering with groups like Facing History and working on the ground in school districts.

All right. We’re going to talk about what that means now, then. Here we are. You got here. It’s a really bad time now, all of a sudden. And so we’re going to talk about that and more, including the impact of tech on all of these problematic issues for the American public and the political scene right now, which is making it even worse.

We’re here with Jonathan Greenblatt. He is the CEO of the Anti-Defamation League here in D.C. That’s the headquarters there, correct?

No, we’re headquartered in Manhattan.


We have a big office here in D.C.

Excellent. We’re here with Jonathan Greenblatt. He’s the head of the Anti-Defamation League. It is an organization that fights for the rights of those that do not have them.


We’re here with Jonathan Greenblatt. He is the CEO of the Anti-Defamation League. We’re talking about his background and how he got to this organization. And it’s very entrepreneurial. And it’s very tech-oriented, which is interesting because it’s a critical skill going forward.

Before we get to that, I want to talk about sort of the state of play right now. In the Trump administration, everything seems jacked up in the most horrible way at this point. And I want you to talk about why that is or what’s happening. What has happened in this country where it just seems like you have a lot to do?

Well, I’ll tell you. I mean, as a 501(c)(3), we’re nonpolitical, but I don’t think there’s anything partisan about fighting prejudice. And what we saw in the 2016 campaign was, you saw one particular candidate really stoke up ...

Around immigrants.

Around Muslims, Mexicans and immigrants of all varieties, issues, if you will, of tolerance and extremism. And we saw a mainstreaming of sort of white nationalists into the room in a way we had not seen since George Wallace in the ’60s. Of course, George Wallace didn’t win the White House. And indeed, after election day, in the last two months of 2016, you saw a massive spike in hate crimes and bias incidents directed at Jews, again Muslims, Mexicans and immigrants in general. And it was really very alarming. And this is the data. Again, there’s nothing political in pointing out the fact that that spike happened, and it continued in the first half of this year.

We saw in the first half of 2017 a 76 percent increase in bias incidents against Jews compared to the first half of last year. Nearly 1,000 incidents of harassment, vandalism and violence. Just against Jews. When you add in the spike we’ve seen against Muslims and Mexicans, it’s really extremely alarming. So when we talk about, well, why are things jacked up?

It is difficult to explain why the president would choose to focus his Twitter feed more on NFL players demonstrating their First Amendment rights versus white supremacists who literally have murdered several people over the course of this year: An Indian immigrant in Kansas City; two innocent bystanders in Portland, Oregon; an African-American ROTC student right here in the D.C. area. It’s hard to understand how you can equivocate on the unequivocal.

All right. So let’s talk about why that has happened. Obviously it’s the permission, I guess, to do that. Or is it social media or what’s the ... Let’s talk about sort of the ... You don’t want to blame everything on Twitter, but in a lot of ways it’s created an atmosphere of hatred, really.

Well, let’s be honest. I think, first and foremost, leaders lead. And what gets said at the top trickles down. So if we try to understand the causality: being ambiguous about calling out what seems to me pretty unambiguous creates the conditions in which extremists can really ... they can feel emboldened.

Now, social media, Twitter in particular, helps to accelerate and amplify that. And so you see it as a bit of an echo chamber. And whether we want to talk about trolls, or we want to talk about sort of bots and cyborgs, or whatever the causality there, social media has become really this echo chamber where the things we hear from the top really reverberate and they resonate with parts of the community that, again, white supremacists, it isn’t that they haven’t been around, they’ve always been around. They have always been bigots, but they’ve had to literally convene in corn fields in the dead of night like in rural Iowa. Today, they’re out in the open, hiding behind the anonymity of a Reddit or a 4Chan and then using the social media ecosystem to push their memes out into Twitter and to the public.

So talk about how they do this because it’s something ... and then I want to talk about how to fight that. How do you fight that or if it’s possible to even fight it? They finally get a voice, is what you’re talking about. The internet was started with the idea that everybody gets a voice now, isn’t this great for democracy? Isn’t this great for all people because there’s been gatekeepers, you know the whole ... So talk about their success in using these and what that means.

I think one of the things that’s happened is these platforms, like Facebook, like Twitter and many others, have emerged without the kind of filters and checks in the system that you have in broader parts of media, like newspapers, as we were talking about before we started taping, or broadcast. The fact of the matter is, journalism as an industry has an ethos, and people go to school for it. They get trained in it. There is no ethos on social media, right? And that creates the conditions in which you can get your message out very directly to people. And it plays into, again, we all have these cellphones in our hands, which connect us directly into, like, the matrix.

So there are no more brakes. There are no more filters. That’s a big part of the problem. And they’ve learned. They’ve learned how to exploit that effectively. So we watched this during the 2016 campaign. When I say we watched it, our Center on Extremism tracks the right-wing extremists, the left-wing radicals, we track all of them, and we could see things started in 8Chan or 4Chan or Reddit, where a lot of these memes actually get developed. And we watched them send them out to particular voices on Twitter, or DM them or send them privately, and then those voices consistently would start to propagate this stuff. And then people like the Trump campaign would pick it up and retweet it.

So you could see there was a through line between certain white supremacists and extremist accounts and how things ended up in the public domain. There’s nothing accidental about it, Kara. It was very intentional. It was very deliberate. And so part of the challenge becomes when, again, Twitter and Facebook, let’s be frank, they themselves can’t keep up with the technology. So one of the things we did last year with Google was we exposed the parentheses meme. Do you remember that?

Yes. Explain it for me.

So basically, white supremacists wanted to identify Jews, because they think the Jews are behind all the evils of the world. So they created this meme where they would put parentheses around the names of Jews to demonstrate how we “echo through history.” By the way, they would put it on Jews or people who they thought were Jewish. And they built kind of a plugin for the Chrome browser — I guess it worked on Firefox, too — so you could pull up a website and, if you had the plugin, the plugin would search for names on the webpage. And if a Jewish name showed up, from a database of names they had previously identified and entered, it would put the parentheses around it for you, so you could easily identify the Jews in a news story, for example. So it would say, “By (((Jake Tapper))).” “By (((Walt Mossberg))).”

So we identified that and we went to Google and we actually also went to Apple and got them to take it out of their stores. The plugin, basically. But I say this because these things get created and Google and Facebook and Twitter, they themselves can’t keep up with it.

They just put it up.

I mean, if you can imagine Facebook has a billion, I think it’s 1.7 billion.

It’s over two.

It’s over two billion members. So the last data I heard was more than 4.5 billion messages across the platform every day. And if you include WhatsApp, I mean the numbers are astounding. There is no way in God’s green Earth, no matter how many customer service reps Mark Zuckerberg hires, he could ever keep up with the torrent of information.

That it’s being perpetrated around. Especially the negative information. Okay. I still think it’s their fault. You know what I mean?

But it is, though, because ...

Because I think one of the problems is, one, they built systems where they didn’t anticipate this.

That’s right.

And two, they act like it’s a benign platform. I’ve been saying this a lot. They act like, “Oh, it’s a benign ... it’s only for good.” And they don’t ... I’ll never forget some Facebook executives talking about Facebook Live and I said, “When’s the first hate crime on it?” And they were like, “What are you talking about?” And I was like, “You haven’t thought about this? Like maybe you have, maybe you haven’t. Why haven’t you done enough?” You know what I mean? They just ...

You and I both know that the Valley — I spent a lot of time in the Valley, so have you — there’s a Libertarian ethos there. A Libertarian ethos just like ...

It’s a faux Libertarian. It’s not a really good one.

Well, it may be. Like a Thielian Libertarian ethos, right, where it’s like, “Anything goes, and it’ll be good, and just keep government away, and we’ll innovate our way to utopia.” And we both know that human nature doesn’t exactly necessarily work that way.

And we shouldn’t be surprised that extremists exploit new media. The Nazis did it with “Triumph of the Will” and using film as ways to propagandize. The Soviets did it with Pravda and using print media to kind of influence people. So we shouldn’t be surprised that extremists today try to terrorize and spread their own form of tyranny, to use that term again, through new media. Now with that said ...

We shouldn’t be surprised, but they shouldn’t be either.

That’s the point. So the point is that, look, for example, white supremacists could, if they chose to, ask for a room in the Grand Hyatt in downtown D.C. But guess what, the manager of the Grand Hyatt is going to say, “You know what, I don’t think it’s a great idea for me to rent our space to you, because I don’t think it’ll send the right message to the rest of my patrons if five people with swastika armbands go goose-stepping through the lobby.” So by the same token, it’s fair to say that Facebook and Twitter and Google could do a better job of ensuring that they preserve freedom of speech but also protect the safety of all of their users.

They are trying to make inroads now. They realize they have a problem. And I’ll tell you a story. Last year with Twitter, I heard from people ... you know, journalists, broadcast and print, who would interview me, and afterwards they would say, “I’m worried about the anti-Semitic abuse being launched at me.” I said, “What do you mean?” So we organized a task force to look at this last year, and we pulled some sample Twitter data. We found millions and millions and millions of anti-Semitic messages, tens of thousands of them directed specifically at Jewish journalists. And when this story broke, Twitter initially wasn’t willing to listen to us. But you remember how their M&A talks got derailed last year, when Disney pulled back and Salesforce pulled back.

Derailed is a kind way of putting it.


Nobody wanted to buy them.

And part of the reason, they said, was concern about the liability on the platform. I think in part that was because of the report that we released. And so here’s what happened there. Twitter realized this is no longer a stakeholder issue, it’s actually a shareholder issue. And this is what my own experience in business ...

Explain the difference between them.

A stakeholder issue is where a small group of activists expresses a concern and it’s a marginal issue and you deal with it out of the CSR office. It’s kind of nice to have. A shareholder issue is when you deal with it out of the investor relations office and it’s an absolute must because if your share price is going down, that suddenly gets the board’s attention and gets your shareholders’ attention in a different way.

So how do you — when you go out there since you do speak their language, you’ve been working with them — get their attention on this? Because I think this is a really critical issue, that they’re very slow to want to do anything about this. What is the reason for it, from your perspective? Because they see themselves as, again, benign and good people, which they are, not benign but good for sure. I mean, I don’t think they’re sitting there and thinking, “Ah, we’ll just let anything go that’s on my platform.” They’re definitely worried and concerned about it.

Yeah, I think ... Look, at an individual level, I’ve been blessed to meet lots of executives. They’re absolutely good people. I think what happens is that when these things grow at a degree of scale, the individual’s desires kind of get overtaken by the board. And so what I think is that these companies now realize they’ve sort of crossed a Rubicon. They realize their size and their penetration into all segments of society now has the attention not only of journalists. Is it Farhad who’s been writing about, what did he call it, the “Frightful Five” or something like that? He’s used that phrase for a while. Now regulators are looking at it. Aspiring politicians are looking at it. So suddenly it reminds me of the ’90s when they went after Microsoft.

Except in that case it was over monopoly. This is really tarnishing society.

Totally right. At some point, it reaches a critical mass and size that you get attention.

100 percent.

And I think, by the way, that Microsoft, it’s almost like a parable, what happened to them, and I think everyone in the Valley learned from it — it’s why Google and Facebook have such huge offices out here. But I think they suddenly have tuned into the fact that they can no longer ignore this problem. I believe they ignored it before because of the Libertarian ethos and because they all want user growth. It’s one of the core metrics.

It goes against every user growth ...


I’ve talked about this with Twitter: If they turn off the bullying perfectly, their user growth drops. But it’s already dropping, because they create an atmosphere that is so vile and poisonous that user growth dies anyway. So it’s really kind of fascinating.

You’re probably right, but if you think about that investor relations deck, every quarter you want to have user growth going in the right direction. So anything that might put the brakes on that concerns an analyst, who raises his or her hand and says, “Whoa. What’s going on here?”

So flash forward to today. We’ve now found they’re much more willing to work with us. Actually, we’ve mentioned him twice already today, Pierre Omidyar. I was at South By this past year and announced that we’re opening a new center on technology and society in Silicon Valley. We’re actually rolling it out next month, in November, in Palo Alto. Pierre gave us the seed capital to fund this thing. An Iranian American, never involved with ADL, but he cares deeply about free speech. He’s worried about fake news, he’s worried about kind of the cyber hate, so he gave us the seed capital. And the companies realize they’ve got to figure out ways to convene and work together.

I would liken it to sort of child pornography. Even copyright infringement. Where they’ve developed shared strategies ...

Or spam. They were fast on that, right? They’re very fast on child pornography. So what is your office going to do out there? What is your goal?

In our first two years of working ... When I came on board two years ago, I immediately cranked this up. We created a cyber hate working group. Many of the big companies work with us on it. And we’ve worked on things like terms of use, how to develop terms of use or terms of service that will keep out ... you have to allow for some degree of hate speech. Hate speech is free speech, like it or not. You can say mean things. But hateful speech is different from harmful speech. It’s one thing to say, “I don’t like Jews,” and another to say, “I want to kill them all.” And it can be a bit of a gray line, I’ll acknowledge, but the First Amendment isn’t supposed to allow for harmful speech.

Or violent speech.

Or violent speech. Yeah. So with that said, just last week we announced we’re creating a ... in fact, I’ll tell you a story. You saw this a few weeks ago, there were accounts, I think ProPublica broke this, about how folks were using the ad platforms to target Jews or target blacks.

I was going to ask you about that. Yeah.

So that broke on Thursday. On Friday, the head of Facebook’s policy group was in my office in New York and said, “How do we work together on this? We know we have a problem. What do we do?” Last week, we announced a new initiative under the rubric of this new center, we’re calling it the Problem Solving Lab. Here’s why it’s important. It’s not lawyers, it’s engineers. It’s not policy people, it’s product people.

So I think the way that we will really start to solve this problem is figuring out, again, shared strategies, technical approaches. And we’ve got Microsoft, Google, Twitter and Facebook all convening to participate in this with product people. Because you’ve got to build these solutions. You’re not, again, I think, going to lobby your way out of the problem.

I get that. But again, I want to get to ... you’re being nice because you’re working with them. I like that you’re working with them, but why didn’t they ... again, Libertarian doesn’t cover it for me. It’s something else that’s at work within the group. Either that or they see themselves as not impactful. I get exhausted by Google execs saying, “We’re such a small company.” You know what I mean?

I know. I know.

You know what I mean? I’m like, “Are you kidding?” Or Facebook news distribution. Everybody gets their news from Facebook and/or Twitter and/or ... their impact, they don’t seem to want to acknowledge their impact.

A Libertarian ethos layered on top of an evolving business model, but let’s be honest, naivety. And whether that’s an intentional naivety like, “I’m going to cover my eyes,” or it’s an unintentional naivety, they don’t realize. But I think this thing has grown to a scale. It’s a bit like a Frankenstein’s monster. They don’t even realize what they’ve created.

And it dawns on them when the president of the company, a Jewish woman who publicly mourned the loss of her husband just a year or two ago suddenly sees literally like anti-Semites using her platform to find other like-minded people who want to kill Jews. I think that was a wake-up call for Sheryl Sandberg. I think it was a wake-up call for Mark Zuckerberg who, a week before, or maybe two weeks before, talked about Rosh Hashanah in a personal post on Facebook that he doesn’t do very often. And again, suddenly their platform’s been hijacked by haters.

So I think they realized that a Libertarian ethos and an uncertain business model are no longer excuses when extremists are running amok. So we’ve been working with Google through their Jigsaw division on their initiative called Perspective. Have you heard about this? So we’ve got the best data sets out there on anti-Semitism and bigotry because we’ve been tracking this stuff literally for a hundred years.

So I think AI and machine learning are important parts of how we tackle this problem. And I’ll give you an example. So I often get, if you look at my Twitter feed, it’s crazy. I’ve got horrible white supremacists tweeting at me, and anti-Israel people tweeting at me, and all kinds of stuff. It’s really great. So people will tweet a ... So if I’m walking into Best Buy on a Sunday afternoon and I get tweeted a picture to me of an oven, that might be okay because maybe there’s a sale on Whirlpool ovens in aisle 12 or whatever. But when I’m sitting here in your studio and I get tweeted a picture of two ovens, double ovens, and it says, “Jewish bunk bed” on it, that’s probably not such a nice thing to send to me.

No. I wouldn’t even look at my Twitter if I were you.

Yeah, I don’t look at it very often for this reason. So if you used AI and you saw, “Ah, the person tweeting at @Jgreenblatt, his name (or whatever, its name) is @WhiteGenocide, their Twitter bio says, ‘I want to kill all the Jews,’” and you see that they’ve been flagged for messages before, and you see that they have no friends in common with me and other followers of mine. There are lots of triggers. If we were using AI to monitor these kinds of things effectively, in nanoseconds, milliseconds, you could, if not instantly solve the problem, mitigate it dramatically.
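[Editor’s note: The heuristic Greenblatt sketches — combining account-level signals like the handle, the bio, prior flags, and shared followers into a single risk score — could be illustrated like this. All names, keywords, weights and thresholds here are hypothetical, not any platform’s real moderation system.]

```python
from dataclasses import dataclass

@dataclass
class Account:
    handle: str
    bio: str
    prior_flags: int       # times the account was previously reported
    mutual_followers: int  # followers shared with the targeted user

# Illustrative keyword list; a real system would use learned classifiers.
HATE_KEYWORDS = ["genocide", "kill all"]

def risk_score(sender: Account) -> int:
    """Combine the signals mentioned in the interview into a crude score."""
    score = 0
    text = (sender.handle + " " + sender.bio).lower()
    if any(k in text for k in HATE_KEYWORDS):
        score += 2  # hateful handle or bio
    if sender.prior_flags > 0:
        score += 2  # history of flagged messages
    if sender.mutual_followers == 0:
        score += 1  # no social connection to the target
    return score

def should_hold_for_review(sender: Account, threshold: int = 3) -> bool:
    """Flag the message for review when enough signals stack up."""
    return risk_score(sender) >= threshold
```

The point of scoring rather than hard-blocking is the “oven” example above: the same image can be benign or hateful depending on who sends it and in what context, so a system needs multiple signals before it acts.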

Right. We’re going to talk about solutions and what to do and some of the tactics that these extremist groups use with Jonathan Greenblatt. He’s the CEO of the Anti-Defamation League. And increasingly, he’s going to have to focus on the tech solutions to these problems.


We’re here with Jonathan Greenblatt. He is the CEO of the Anti-Defamation League. It’s been a fascinating discussion about how tech companies are dealing with the onslaught of extremism, how extremists are using online tools quite effectively for organization, for spreading of hatred, spreading of their ethos. Talk about a few things that they do. You talked about the parentheses, but talk about some of the more egregious things recently.

Sure. Well, one of the things we’ve seen ... we’ve seen different ... So No. 1, on Twitter we’ve seen extremists specifically pursue journalists. So it’s a technique. They try to shut people down. They try to push people to self-censor. And they do it by doxxing journalists.

Which is well known.

Which is well known. And so they’ll put up that information, so suddenly — if you’re the head of the ADL, you expect to get harassed on Twitter, but when you’re a freelance journalist writing for GQ, you don’t expect that your personal cellphone is going to start to ring with horrible messages, or that you’ll receive snail mail to your home saying, “We know where you live,” which are the kinds of things that have happened repeatedly to journalists. So one of the techniques is to target journalists. And they did this during the campaign, after journalists would write things about the Trump campaign, and used doxxing and sort of cyberbullying to try to shut them down.

The second thing that we’ve seen them do is, really, when someone does something they find questionable, just jam them with all kinds of messages. And I don’t know what the term for this is, but literally people see their Twitter feeds flooded with hateful messages. And they’re using cyborgs and bots to do that. Like, no person can ... The level of incoming I’m talking about is absolutely paralyzing.

And in terms of communicating with each other, what are the preferred areas? There’s Reddit, obviously.

Yeah. It’s sort of Reddit and 4Chan and 8Chan where they can be a little bit more hidden than on services like Facebook or Instagram or Twitter. We’ve also seen them move to services like Telegram and others that are — WhatsApp — that are harder to track, they’re more point to point versus many to many. And, you know, there was a piece in Wired last month that started to talk about this new kind of alt-right internet that they’re attempting to create. Like to recreate many of these services for their own community.

So they can talk to each other ... but I think the more effective thing for them is not talking to each other, it’s talking to a lot of people.

That’s their idea. So, you know what’s interesting about all of this? So what they’ve really tried to do, and I got to be honest, the campaign and then this presidency has given them a pathway forward to normalize. And this is what I think we need to be most worried about.

I agree.

Yeah, it’s not the Twitter handle @WhiteGenocide — as disgusting and revolting as that might be — it’s Richard Spencer who says ...

Verified on Twitter.

Verified on Twitter. “I’m a free-speech advocate, so you should let me speak at your university. Or you should welcome me at your event.” Or people like Alex Jones who literally are fellow travelers with these people because they recycle their ridiculous conspiracy theories. And then suddenly the Megyn Kellys of the world interview them and give them an imprimatur of respectability.

The argument to me is that you should hear what they’re saying, that you should hear the voices, because the media tends to make it more benign than it should be. People should actually listen to what their actual words are.

I think that’s right. I think that was one of the ... I would really give props to the folks at Vice for what they did in Charlottesville in August, because what they didn’t do is glamorize these people. They just put the cameras on them and let you hear them say, “Jews will not replace us.” They put the cameras on them and let them say all these just absolutely revolting things.

Is that normalizing or let’s just show you what they’re like?

Well, so it’s interesting. There’s a fine line between normalizing and glamorizing these people. When you put Richard Spencer on without any context, when you just interview him with his sort of short haircut, wearing like a suit and a polo shirt, you almost make him seem like he’s a respectable member of the intelligentsia.

When you layer in, though, some B-roll of him doing the heil Hitler salute and saying the kind of outlandish things he says about Jews and African Americans and Mexican Americans, that’s when you expose his intolerance for what it is. So, yeah, look, we’re free-speech advocates at the ADL. We believe you’ve got to expose this stuff in order to understand it. You’ve got to hear them in order to grasp the threat that they represent, but it needs to be done in a clinical kind of way, not in a way that unintentionally, or by the way intentionally, elevates these people.

Right. Right. Exactly. It’s a difficult thing.

It is hard because we’re also in a media environment where we’re always looking for equivalence. Like I call it the “Crossfire effect.” You have to have someone on the right, you have to have someone from the left. Look, there’s not right and left around bigotry. And yeah, we need to be able to acknowledge that someone like ...

Well, isn’t cable news built on that? I mean really. Honestly. I won’t go on any of those panels. I refuse. I’m not going to have someone who’s ignorant be on the other side of something.

Exactly, because what that essentially does is anoint them as if they were a credible voice. And again, it’s not that we shouldn’t ... look, you need to understand that people think there’s a flat earth. Or people think there are aliens in Area 51.

They’re not? Okay.

But we should accord those kinds of conspiratorialists the same place we should accord people ...

Right. So Twitter this week got into a lot of trouble around Rose McGowan, taking her off while talking about free speech. She did put up a phone number, but other people — including Donald Trump — have put up phone numbers, too, and they didn’t get kicked off. You can see this happening over ... you can talk about this particular case, but it happens over and over again. And now Jack has tweeted he’s going to put up new rules and more new rules and rules of rules. And it seems either just a lot of talking or ineffective, either of which is pointless in some ways.

Well, look, we participate on Twitter’s Trust and Safety Council; I think we’re the only civil rights group to do that. And they have definitely, Twitter specifically, made progress in the past year. They’ve introduced new things, some new tweaks to the product and the platform, to, again, reduce the risk of some of this stuff blowing up in the way that it was doing a year ago.

But the Rose McGowan thing and the MeToo campaign just point out how complicated this is. And I would say, think about the newspaper industry for a minute, or broadcast and news and media more generally. Decades ago, generations ago, they introduced ombudsmen, much like federal agencies have inspectors general, to provide some oversight and acknowledge with a little bit of humility that we need someone as a voice for the people, or a voice for the public. It would seem like we’re in a moment today where these platforms and these large companies need ombudsmen as well, who can help to provide oversight and be a bit of a check and balance on the kind of bizdev groups, if you will, the investor relations groups, who would say, “No, no, no. Just grow, just grow, just grow.” Responsible growth seems like a more sustainable strategy.

Which they don’t want to do. All right. I want to finish by talking about some of the things that you think you need to get better at as a group, and what are the things you’re most worried about. And I want to focus on tech, because a lot of this stuff will proliferate via tech. Is it VR? Is it ... what kind of things are ... machine learning? What are the things that you need to fight extremism — you’re never going to stamp it out, I’d suspect — but what are the things that are critical for organizations like yours? And then, what are the things you’re worried about?

If you think about the advocacy, education and law enforcement fronts: No. 1, on the advocacy front, I’m definitely worried about the convergence of a traditional civil rights agenda with a digital environment. So what do we do when sort of big data gerrymanders people, if you will, by class or by race or by religion? Not just unintentionally but invisibly, because things are being algorithmically served to us that we don’t even know about.

Oh, yeah. Your race and what you look like. All the AI stuff, all the ... the other day, someone sent me something about an app that could tell if you’re gay and it’s like ...

I heard about this. I heard about this.

Or anything. They could obviously do color, they could do racial facial characteristics.

So you could easily, in a “Minority Report” sort of way, serve up ads to people unbeknownst to them; they’re not seeing what other people are seeing. And again, digitally gerrymander folks in ways that constrain them from choices they don’t even know about. So I worry a lot about that on the advocacy front.

Were you worried about Apple’s facial recognition software that’s going into the phones?

We’re watching it closely. Again, I think we have to be vigilant about all of this. If we’re not vigilant about the rights that we have and the privilege we enjoy, we shouldn’t expect to keep them. So I think we need to look at all of these things very carefully, very cautiously.

You know, Tim Cook stepped up after Charlottesville and gave us a big seven-figure gift in support of the ADL for the first time ever. And yet again, Apple has an awful lot of control and an awful lot of ...

It’s a big issue for him.

Think about ... We have a Google Home in our ... I have a Google Home.


But the privacy considerations with things like this: How is it monitoring what we’re saying? Is it really ... There was a story that broke about the Google Home Mini last week, you probably saw that, where it was actually recording everything that was being said in the reviewer’s home.

By reporters. Oops.

Yeah, exactly. So advocacy is one thing.

I unplug mine.

Did you really?

I always unplug mine. All of them.

You unplug them when? When you don’t want to use them?

Yeah, I unplug them.


I cover my computer screen. I block them.

Yeah. You have to be wary.

I just block. I just don’t even know. That’s my plan. And then I’ll take it off when I want to use it and then I put it back on. It’s just a small little moment of victory for myself.

But I’ll tell you something. If you have kids, they love to interact with Siri or Google. They think she’s a person.

Well, not Siri. Siri’s not the smartest one in the group.

She’s going to be the student. The problem child.

She’s the problem child. So one is recognition ...

Just to come back, so that’s that. So there’s a whole set of issues, a host of these things. And then of course I continue to worry about the normalization of extremism. And that shows up in the way that elements of the right, as we’ve been talking about, are not only trying to insert themselves into the mainstream, they are doing so.

Look what’s happened in the Austrian elections this past weekend. Look what’s happened in the European elections. And again, what’s happening right here. I worry about in 2020 and in 2022 you’re going to start to see slates of candidates who come from this kind of worldview. It will be very problematic, I think, for the public good.

And then I would be remiss if I didn’t point out there are issues on the left as well. Sort of rethinking free speech and clamping down on the way that ideas are allowed to circulate specifically on the campus, which is also crazy.

It is.

Like I might not agree with everything that Ben Shapiro has to say, but he has the right to say it. And we need to, again, protect the privileges that we have if we want to keep them.

Yeah, that is an unusual thing happening on these campuses.

It’s a real problem. It’s more prevalent than you probably realize. Where, again, in a world of microaggressions, in a world of ... it starts to look a little bit like thought police.

On the education front, look, I think the anti-bias, anti-bullying work we do is critical. We need to work out how to digitize it, how to Khan Academy it. How do you bring it to far more people than we can reach with face-to-face training?

Are you getting help from Melania Trump on this?

I will leave that alone.

Just saying because that’s her thing, right?

I suppose it is.

And then, thirdly, on the law enforcement front: How do we use AR and VR to enhance training? We’ve been asked by several metropolitan police departments in big cities to add to the work we do around training them on extremism and hate, to also do intrinsic-bias training. Which is encouraging, because we know there are real issues there.

So imagine if you could use virtual reality to put a police officer in the shoes of a young black male. What it feels like to be pulled over for “driving while black.” What it feels like to be a young Mexican national on the other side of an ICE kind of raid. And I think technology allows us to do really interesting things that would enhance our ability to help law enforcement.

So empathy via technology?

Empathy. And understand the communities they’re trying to serve.

Do you do anything around the taping of police officers? There’s some interesting stuff going on around language. They’re taping language and then showing how they talk to different people.

No, I haven’t seen that.

Yeah, it’s really interesting. It’s Oakland. So they’re taping, versus just the body cams, and so you can tell — the computers can tell — what race they’re talking to.

Is that right?

By the words they choose. It’s very clear on the words they use for African Americans versus white people.

That’s interesting.

And it’s absolutely different. It’s data. You know what I mean?

So the last thing I want to talk about — we just have a few more minutes — is the idea of what data is. And you have all this data, and people don’t care about actual facts. Pushed by, again, this administration, this idea of fudging what facts are. Just today, there were lies said, and then everyone’s now talking about not the lie, but whether it’s true. You know what I mean? Like, you start to do that.

So how do you do that when you have all this data? What happens to ... because one of the strengths, presumably, of ADL is data. This many assaults, this many of this. I had a relative — there was some fact, and I was like, “This is an actual piece of data.” “Well, so you say.” And I was like, “But it is.” You know what I mean? It was just like ... it’s a fascinating thing. So you’re a company — not a company, an organization — that traffics in data that is critically important, and presumably new data initiatives would help you as you begin to really see patterns and where things are happening. How do you combat that when data isn’t data anymore?

Yeah, it’s very challenging to be in a post-fact society, where Stephen Colbert’s version of the truth seems to prevail. On both sides, by the way. ADL has always been an incredibly fact-driven organization. Data-driven, fact-based. And we are in an environment where people want their own facts. I think one of the things we need to do is ... let’s just acknowledge that data is just that. It’s numbers, bits, ones and zeros. And they’re very hard to make any sense of until you contextualize them and embellish them with more information.

So we need stories to support and supplement the data. We need images to enhance kind of the ones and the zeros. So I think we’re going to have to find ways to — through visualization and through kind of the infographic and other techniques like that and videos etc. — to make things really come alive. So now we’re back to the VR piece we talked about just a minute ago.

I think VR could be very effective.

Incredibly powerful. So it’s one thing for me to say to you, “Okay, last year we saw 990, or this past year we saw 997, anti-Semitic incidents in the first half of the year.” It’s a whole other thing if I could put you literally in the body of a 14-year-old when she is being harassed, when kids are throwing pennies at her at school or she walks back to her locker to find a swastika on it, and you’re literally in that girl’s ...

Seeing the experience as if you were that girl will make this come alive in a way that was never possible. And, you know, we have to acknowledge that these issues are real, and if we can find new ways to leverage the technology to transform those experiences and give you a degree of insight that just a piece of paper can’t, maybe that’s how we change this.

Yeah. Yeah. So last question. What would you like Silicon Valley to invent for you?

What would Silicon Valley invent for me?

To help your work.

I think there are probably a few things. I think it’s interesting ...

Because you’re going to have to soon be defending cyborgs. Have you seen “Blade Runner 2049”?

I haven’t seen it yet. I think we’re going to see it this weekend.

It’s real long.

That’s what I heard. I heard it’s three hours.

Yeah, I just interviewed Jared Leto, who was in the movie.


Yeah, he plays a trillionaire.

They say a trillion is the new billion or something like that.

Apparently he’s really quite good. It was interesting because at one point, he was talking about ... I was asking about robot rights, obviously, because eventually, when these cyborgs start to really look human, are they human, or are they a new life form? And he’s a creator of a lot of these cyborgs, and so this cyborg comes out of like a baggie almost, essentially, and drops down in a bunch of goo and stuff. And he’s been trying to get them to procreate. That’s what he’s been working on.

The cyborgs?

The cyborgs. Because there is one that it worked for, and so he’s trying to replicate this to see if he can make more and more cyborgs more quickly. And this particular cyborg, it didn’t work with, and so he kills her. Like, just after this cyborg’s been birthed, essentially. With a knife, just kills her. And I said, “That was a super disturbing scene, that you just discarded this creation that you made.” And he said, “It was like breaking my iPhone. That’s how I thought about it as an actor.” And he’s like, “You throw an iPhone against the wall because it didn’t work.” And I was like, “What?” It was a great way to think about how he was thinking about his character, but eventually that’s the kind of thing we’ll be thinking about.

Probably. I mean, these questions of consciousness really get raised and you start to try to think about ...

Yeah, you will be defending ... the ADL will be defending robots someday. Just get ready for it.

It’s interesting. It’s a brave new world.

Yeah, so what would you like them to do or make? What is the thing that you would ... if you had an ask for these companies, Google, Facebook, Twitter, what would you want if they could do it? Besides a ton of money.

Well, I think I ... So I guess I have a couple quick thoughts, one of which would be ... it would be interesting, wouldn’t it, if you could sign up for — I don’t want to call it a premium version, let’s say a clean version of a Facebook or Twitter. Like, look, we turned Showtime off of our cable package because I’ve got little kids and it’s gross. The movies, it’s really bad stuff. But our kids can watch the Hallmark Channel and they can see clean stuff. Now, by the way, it might not be a view into everything that society has to offer. It might not be the highest form of art, but you know, for my kids it’s okay. So it would be interesting if you could create clean versions of these kinds of social platforms.

I’ll tell you something else, from a design perspective, it’s very interesting. Did you see that Facebook acquired tbh the other day?

Yes, I did.

Yeah. Have you ever used tbh?

So it’s fascinating. It’s a fascinating app that is very popular with middle schoolers and high schoolers. And basically, the parameters of it are that you can answer polls about other kids, but only positive things. Only positive things. So it minimizes the kind of bullying dynamic that can be so prevalent on these apps. And so you start to realize what happens if you embed in the design of your products, in the architecture of these platforms, a bias toward good.

So if I could ask for anything from Google and Facebook and Twitter, I would ask for that. A bias toward good. Now let’s acknowledge, it wouldn’t be perfect. There would be biases. We’d have to work them out. But if you started with the premise like, “I’m going to protect my IP. I want to protect the public interest. I want to create a bias toward good.” I think that would lift up all of us.

That’s called Instagram. You know, it’s interesting because some of the services are ... Snapchat is a much more pleasant place to be. They designed it that way.

Yeah, so again, I think it’s interesting now that you mention that. So if you think about Snapchat for a moment, it’s post-Twitter, if you will. And it’s designed with an eye toward a younger audience, trying to create interactions that are more positive.

Or not negative, really. I don’t know if it’s necessarily positive because some of it’s silly.

Fair enough. That’s the point about a bias toward good. It’s not negative. And it kind of ... it reduces the ability of someone to go in and hack it for the wrong reasons.

Yeah. That’s a really good point. 100 percent. That’s a great ask. That’s actually a great ask. I think they spend a lot of time designing for addiction, but that’s a different story. We had a great ...

What’s that guy’s name? Tristan Harris?

Tristan Harris. Yeah, we had him at Code last year.

He’s so interesting.

He is. He used to design for Google.

Tristan Harris. That’s right. And he talks about the addiction and how these ...

A slot machine for attention.

And the feeling you get, not just the kind of endorphins, but the physical feeling from your finger when you touch your phone. Or the sound of chimes. So think about if we could create a bias toward good that would, again, mitigate the harmful stuff.

So addiction for good? Fantastic. Jonathan, this has been a fantastic conversation. Thank you so much for coming.

You’re welcome. Thank you for having me.

Here with Jonathan Greenblatt, the head of the ADL. Can you tell people, if they want to donate, where they should go?

Go to

And anything they want to do to help or ...

Anything. Look, there’s lots of ways we can use help. So go to to learn more.

And your office will be opening in two months in Silicon Valley?

Yep. Yep. We’ll be opening in a few months in Silicon Valley.

Where are you locating?

We’re working out the details now.

Okay, cool. That’ll be great. I’ll be there at your opening.

Thank you.

This article originally appeared on