
What the hell happened at The Markup? Part 1: Former editor-in-chief Julia Angwin on Recode Decode

Angwin was fired Monday evening, and most of her staff resigned in solidarity. “I have to brush up on my coup literature,” she joked.

Julia Angwin, former editor-in-chief of The Markup, talks with Recode’s Kara Swisher at a live taping of the Recode Decode podcast in Washington, DC, on April 26, 2019.
Kyle Gustafson for Vox

Last year, award-winning journalists Julia Angwin and Jeff Larson left ProPublica to start a data-driven media startup called The Markup. This week, the future of the not-yet-launched site appeared to be in jeopardy: Angwin was fired by Larson and their other co-founder, Sue Gardner, and most of their reporters publicly resigned in solidarity.

On the latest episode of Recode Decode with Kara Swisher, recorded in front of a live audience at the Line Hotel in Washington, DC, Angwin said the firing stemmed from a fundamental disagreement: whether The Markup should be based on reporting or on an explicit anti-tech advocacy agenda. After people in attendance at the taping shared some of Angwin’s comments online, Larson and Gardner called Recode to dispute her claims; you can read their objections here.

“Once we started getting into it, it just became clear that she was taking a much more anti-tech position than I,” Angwin said of Gardner. “And I’m obviously known for being very skeptical about tech, but I’m a reporter. I go in with the facts.”

She also recalled several meetings where Gardner and Larson told her she was “failing as editor-in-chief and bringing down The Markup,” but not offering ideas for how she could improve in that role. Instead, Angwin said, they urged her to be a columnist, which she didn’t want to do.

“I’m not really a columnist,” she said. “I mean, every once in a while I’ve written an op-ed here or there about an investigation I did. So, I just said, ‘That doesn’t make sense.’ ... I didn’t feel like I could stay there and not be the person in charge of editorial.”

“I don’t know how this plays out,” Angwin added, referring to the fact that The Markup’s staff is now a fraction of the size it was last week. “I have to brush up on my coup literature. ... Actually, somebody told me, ‘The coup’s already happened. This is the countercoup.’”

You can listen to Recode Decode wherever you get your podcasts, including Apple Podcasts, Spotify, Google Podcasts, Pocket Casts, and Overcast.

Below, we’ve shared a lightly edited full transcript of Kara’s conversation with Julia.

Kara Swisher: Hi everybody. First of all, I want to know what the hell you’re doing out here at this time of day. I don’t know what I’m doing. I got excited to drive my kids to work … to school, to work, whatever. Anyway, I’m sorry for being dressed like this. I’m freezing, so I’m gonna stay dressed like Johnny Cash this morning, so you’re gonna have to deal with it.

I’m really excited to do this. I love Vox. Recode has recently been united with Vox, and we’re doing all kinds of cool things together, and we’re very excited, including these events. It’s the fifth anniversary of Vox itself. It’s, I think, the 59th anniversary of what Recode has been over the years, but we’ve been going for a long, long time.

We’ve just started doing these amazing live podcasts, and so when they asked me to do this, I thought, “Who could we get where we could talk about journalism and where things are going, and perhaps there’s a little controversy we could talk about at the same time?” And so I brought up someone I worked with for many years, Julia Angwin from the Mark — well, not from The Markup.

She’s gonna come up. We’re gonna talk about a lot of things. We’re gonna talk about The Markup and a bunch of other things.

Sit in my gold chair. My golden chair. So anyway, Julia and I worked at the Wall Street Journal together 20 years ago, 30?

Julia Angwin: I don’t remember, gosh.

We worked in traditional journalism for a long time, and both of us left traditional journalism to do other things. Let’s get Markup out of the way. Tell us what happened there. Explain what The Markup was supposed to be, what happened there, and any other incredibly awful details you can bring to mind.

It’s great to be here, guys. I never knew being fired from the company I founded would be so good for my social media presence. I left ProPublica a year ago to found The Markup. I have been a tech journalist along with Kara for 25 years, and I had built up a specialty in doing investigative journalism alongside and with computer programmers who would help me build big data sets, and analyze data to do really deep investigative work.

Before we get to that, how did you decide to do that? You were a traditional media ... you were in media. I was in the early internet and one of the few people who was covering tech long ago. Talk about how you got into that. You had covered just companies, right?

Yeah. I remember I covered Jim Bankoff when he ran AOL, and I covered AOL. You obviously were the original AOL person, wrote the book on it, literally. But I remember, actually, I wrote a story about Jim Bankoff that contains the first reference ever to someone being quoted using the term “social media.” It’s in the OED.

He was running the content stuff for AOL.

Yeah, right. The way that I ended up in this weird sort of fusion of programming and journalism is that I grew up in Palo Alto and started programming in fifth grade, so my parents were the early, early ...

You went to Palo Alto High School?

I did go to Palo Alto High School, also known as Paly. I actually grew up in the personal computer revolution. Computers have gone from the size of the stage to the size of this, and everybody was super excited, including my parents who drove out there in their VW in 1974, and said, like, “Let’s join this.”

I never had a typewriter. I learned to code in fifth grade, actually, because of Steve Jobs. He had done a program in the Palo Alto schools to have all the kids learn how to program in BASIC. I actually thought there were only two life choices: Hardware, software.

So you were going to get into computer tech.

Yeah. I went to college. I studied math at the University of Chicago. They didn’t have a CS degree, so I took CS classes. I spent my summers working at Hewlett Packard and there was no reason I wasn’t gonna go back except that I fell in love with the college paper and started writing for it. I thought, “I’ll just do this for a couple years,” just as like a rebellion against tech.

That worked out for a little while. I was here in DC after college. I covered the Hill. Eventually, in 1996, the San Francisco Chronicle hired me to cover tech, because it became clear that there were no reporters who knew anything about it, and so they were like, “Oh wait, you’ve used computers before, please cover technology.”

That’s about it. Yeah. That’s the qualification.

That was it.

You had this computer background, had you ever thought of moving into tech itself, like, getting a job at Google or wherever?

No. After my summers in Hewlett Packard in college … I mean, to be completely candid with you, Kara, my boss there was sexually harassing me. I was so young that I didn’t know there were any options, and I don’t even know if there were options at that time, so one of the main reasons I left tech is I was really, early, pushed out by #MeToo.

When I had a job waiting for me after college I thought, “I just can’t go back to that. I’m gonna go into journalism.” That was a good choice for me, I think.

At that time, tech was dominated by men, as it is today. You decide to get into journalism, you went to the Chronicle, and then to the Journal.

Yes, right.

How did you get into the idea of using computers to do this? People had been doing it for crime statistics and everything else, but you shifted it in a different way.

Most newsrooms have like a data desk. Actually, I don’t know if all of you guys know, but it was called the Computer-Assisted Reporting desk, the CAR desk. That field is still called that, which is kind of terrifying. What happened was, I went on book leave to write a book about MySpace, because I thought social networking was gonna be big. I was right about that. I was wrong about which one to write a book about.

When I came back I thought ... One of the many things I was shocked about while writing that book was the dawning realization that there was a market for personal data. That’s really what the social networks were doing: monetizing your data. And so, I thought, I want to start an investigative project on that topic.

I started reading the literature, and I found that there was this programmer in graduate school at Berkeley, Ashkan Soltani, who had sort of done this scan of the web to see how much tracking was going on. I convinced my boss to let me hire him just to do that research again for me. That’s how it started. I was just like, “Oh, that seems cool, that seems like an investigative project,” and I hired him.

That spawned a whole series of articles called “What They Know,” where I continued to hire him, and I stole programmers off the graphics desk, and I stole them from wherever I could to do all sorts of different types of analysis. What I found was that that type of reporting, it could lead to more concrete results, because the fact that you diagnosed the problem so clearly and you released your data set meant that people could really clearly identify the problem, and there was a way to solve it. I mean, obviously, we haven’t solved any of those problems, but ...

When you were writing those things, people weren’t that upset about it. There was some anger, but not much. They were doing this wholesale taking of data, not stealing, because you gave it up. It was celebrated. It has been celebrated for a long time.

I was too early for the outrage. People were like, “Why are you writing about this? It’s just creepy ads.” I think it had to get to the point where ... The election was where people realized, “Oh, this is affecting our common discourse,” in the elections. That’s why I feel like people woke up in 2016. When I was writing in 2010, Jeff Jarvis blogged, like, “this is so dumb, you’re taking down the innovation economy, like, what a stupid series of articles.” That was kind of the common tech view of it.

That this was a good thing. We finally found a business plan.

That works and that people don’t care, and they want to willingly give up their information.

Right. That was the thought about it. I’m gonna fast-forward to what happened at Markup. You had gone to ProPublica, which is a fantastic organization that does investigative ... and had done this, and come across a story around Facebook.

Basically, we were looking into Facebook and what Facebook knew about you. We offered readers a tool that let them download all the things that Facebook said it knew about them, and we noticed that Facebook was profiling people by race. It would identify you as “African American affinity.” The description was actually “affinity,” meaning, like, you liked black people, which was weird.

Somebody tipped me off to the fact that meant, if advertisers could choose that category, they could probably discriminate in their ads by race, so we thought, “Oh, let’s see if we can break the Fair Housing Law, and make a housing ad that’s only targeted to white people,” and we put it through the system and it went through. We wrote an article like, “Wow, didn’t know you could break the Fair Housing Law, that’s cool.”

Facebook said, “Oh, we’ll fix it. We’ll build an algorithm.” Whatever. They built the algorithm, they released it, we tested it again, and we could still break the Fair Housing Law. Then they were like, “We’ll try to fix it again.” HUD began an investigation and then we noticed other things. We actually tested other things. We were able to buy ... we didn’t actually buy the ads. We noticed actually that employers were putting age categories in their ads, so their ads would only be targeted to people like 18-24, which also is a violation of age discrimination laws.

We started looking at more and more aspects of the way that you could discriminate in advertising, and after about two-and-a-half, three years, actually, it was only a few weeks ago, Facebook finally said it would stop offering what I call the “dropdown racism menus” that they were offering before.

Right. They don’t call it that. It’s really helpful to people to target their ads properly.

Right. It’s a service.

Were they very, very sorry?

They were!

They were very, very sorry.

Incredibly sorry. You know what they were gonna try to do?

Have an “I’m sorry” tour?

They were gonna do better. They were gonna do better.

It’s like this: “We didn’t mean to do this, and we’re gonna do really, really better.” They have hand signals when they do it … not Mark. When you went into this, what was the attitude toward Facebook? Now, you had written about MySpace in your book. Had you seen the power of Facebook? Had you thought about what was happening there?

You mean prior to the advertising?

Actually, back at the Wall Street Journal, we had done a story, literally the same story as Cambridge Analytica, about a company that was taking your voting information, stealing data from Facebook and using it to target ads. It’s just that we were too early. It was literally the same thing, and Facebook, by the way, said they were sorry and that they were gonna change the third-party controls so that people couldn’t steal this data anymore. I’d been increasingly concerned about this data and market for a long time.

Why do you think they have that attitude? You’ve approached them from a computer point of view, which is something they understand. Why do you think they continue to do this? Today, there’s news they just announced they’re gonna pay a $3 billion to $5 billion fine, which is, they have it in their drawer, it is not a fine. It is not fine. It won’t touch them in any way.

Why do you think they’re like this, from your perspective?

It’s always hard to prove intent.

We’ve noticed that.

I have to say that I feel like there was a little bit of an engineering mindset. I feel like whenever I would talk to the people at Facebook they’d be like, “Well, if you choose to target ads, what is the difference between targeting towards a group versus having a dropdown menu to exclude a group?” In the engineering mindset, they literally were like, “Well, if you could target ads to people with brown hair, like, why couldn’t you have a dropdown menu to exclude your ads from ever being shown to a black person?”

It was this lack of context about humans and laws that I feel like maybe there was just a lack of education about it?

Their response to you was that? That this was … fine?

It evolved over time. The very initial response was, “I don’t think you understand ad-targeting.” And then, it became, “Oh, this was a mistake, we’re gonna fix it.”

What do you imagine is gonna happen today to Facebook, and companies like this? Because I think a lot of people feel that people have outrage over privacy, but I don’t think people do. I think they’re gonna get fined, and they’re gonna move on and find ever-more-nefarious ways to spy on you.

If you look at Google, they paid an almost $6 billion fine to the EU last year. It literally didn’t move the stock, make a blip in their earnings, it didn’t change any behavior. We’ve seen that these companies can weather these kinds of fines. I don’t know. I try to be optimistic. I do think that public ... when they’re embarrassed, they try to eventually sort of fix things, but it doesn’t feel systemically there’s a fix that anyone is pursuing. I don’t know how that works, because these are companies that are almost ungovernable.

They’re absolutely ungovernable.

They’re bigger than any nation. They regulate speech, around the world. Their decisions about what people can say to each other is the decision in any country. And every country is struggling with how can I... “This is something bigger than me that I can’t control.” It’s a force that I don’t think we even know how to deal with in the world.

Right. How would you deal with it?

I’m not that great at solutions. I’m super good at problems. But, I do feel that there has to be ... something structural has to change, and I don’t know what it looks like, but I do feel like governments have to have some control over how speech happens in their countries.

Which happened in Sri Lanka. They just shut it down.

Yep. That happens. And Germany is doing a pretty good job of keeping the Nazis off Facebook. So, it’s weird. And Twitter, too. You can go there and have a Nazi-free experience on the internet. And so you realize, there is a world where you could have a Nazi-free experience, which is cool.

Not in this country. They’re trending. So, let’s talk about what you decided to do. You just said this is about journalism, because one of the things that’s being impacted by Facebook and Google and the changes they’ve made in targeting is to suck up all the digital advertising dollars.

Let’s talk about ... you decided to go off from ProPublica where you had the very traditional, even though ProPublica’s more of an outlier than the Wall Street Journal, than the San Francisco Chronicle, to start The Markup.

Yeah, yeah.

What is the concept?

What I wanted to do was I had a little team at ProPublica, two programmers and a researcher, and we were doing our investigations. And one year we did Facebook, one year we did software that was used to assess criminals and predict their future criminality, which we showed was biased against black defendants. Big surprise. And one year we did car insurance, how red-lining worked. But each year we had to pick. And I felt like, everything is happening all at once and, I want four teams like this!

Because technology is not just impacting the companies that we think of as tech — Facebook and Google — but every bit of our lives is being algorithmically decided. And some of these decisions, like the criminal risk scores, have enormous consequences: incarceration or not. And so my idea was, I wanted to scale up this work and sort of build a field around tech accountability journalism. Tech journalism’s origins were really very much fanboy. Right? And so, the field is evolving, but I wanted to build that investigative wing and really make a model for the field of how this could be done, using technology to investigate technology.

To investigate things. So, you went around to raise money and got how much money?

More than $23 million.

Mostly from Craig Newmark, who pledged 20 million.

Right. Who ruined classified advertising for the San Francisco Chronicle, for example. Craigslist founder.

So, he was taking his money, and he’s talked about this. He was taking his money that wrecked the newspapers to try to do something about it. He said that to me.

I mean, I’ll let you put words in his mouth.

I shall. Because he said them. So, $23 million to form a team to look at ... instead of just what had been done, which is piecemeal, or else someone like me, which I just stand outside of tech companies and yell at them from the side.

Which, by the way, is very effective.

I know it is. It’s really effective. And I will continue to do so. So, you decided to do this, you got two partners, one was someone you worked with at ProPublica.

Yeah, my colleague Jeff Larson. He and I had been doing these investigations together for years. And then, he and I recruited this woman, Sue Gardner, who’d run the Wikimedia Foundation, to be our business partner, because we had heard that journalists were not good at running businesses, and so we thought we needed someone to help us with that.

Right. And, you’d seen people trying this thing? I did it. There’s lots of efforts in this area. What did you think was going to ... Vox itself was within the Washington Post and then moved to here. What did you think about going into business as a reporter? The word is “reportrepreneur,” in case you’re interested.

Oh, jeez, I didn’t know that.

Please don’t use it ever again. So, go ahead.

I won’t.

So, did you worry about that part of it?

I did worry about it, but I actually felt like the nonprofit model that we were pursuing had a little bit more hope. I feel like the for-profit model has led to ... it’s just been really hard because it trends towards clickbait, you know? And you have to really push against that because the ad model is really this incredibly transactional model and it’s not based on the quality of your audience at all.

I was hopeful that we could build a nonprofit model by getting our readers to understand that we weren’t going to have any tracking on the website. It was going to be very privacy-protecting. It was going to feel like a service. Because a lot of places charge subscriptions, but then they still track you and still advertise to you, and it’s like, come on, guys. So, we felt like, okay, maybe people would donate the amount that they would have subscribed somewhere because they understood that we were on their side. We weren’t selling their data, or collecting it at all.

So, it’s like ProPublica, a nonprofit model that you would have people who supported you, which is like a subscription.

Essentially. And then, money from rich people.

Yeah. Right. Combination. You got to always have a billionaire in journalism these days.

Yeah. Yeah. So, you did that and then moved into this. So, I guess the only question is, what the hell happened?

Yes. So, I made a kind of founder’s mistake, which was I recruited Sue to be the business partner. And we didn’t talk about what our roles would be until we were a couple of weeks away from closing the gift from Craig. And then, she gave me an ultimatum that she would quit if I didn’t make her CEO and my boss. And I was scared. I thought, “Oh my God, we’re going to lose this money. How do I go back to them and say we’re starting again with a different team?”

And so, I said, “Okay, well, I’ll agree to that.” Because honestly, I didn’t want to do the business-side stuff. Most of that job is actually stuff I don’t want to do. But I need an employment guarantee. I need a contract so that I can’t be fired at will. And she said, “Sure, let’s totally do that. But it’s going to take a couple of months. Let’s just close this stuff and we’ll get to it. We’ll get to it.” And we never got to it.

And so, that’s why I was fired by email on Monday.

Right. Right. So, what was the problem? I know starting things, like I started mine with Walt Mossberg, and we had plenty of fights. We had plenty of issues. We had fewer fights than agreements, which is the way it should work, essentially. But in starting these things, it’s really ... and we had a for-profit model and we had events and complex things. What happened there? What was your fault? What did you do wrong? Like you said, first of all, you didn’t get this guarantee.

Yeah. Yeah. I mean, I think we weren’t aligned on the vision as much as I thought, right? Because once we started getting into it, it just became clear that she was taking a much more anti-tech position than I. And I’m obviously known for being very skeptical about tech, but I’m a reporter. I go in with the facts. Right? And we had a number of meetings where she was talking about how we should have a take, write a policy paper about our position on tech, how we should be a cause, not a publication. She built a spreadsheet ranking all of the employees that I was thinking about hiring by how skeptical they were on tech and how negative, and wanted them to be more negative. And I felt like not only were these morally questionable, they were legally very risky.

And so, we had a lot of conflict about that and I started to feel very nervous about where this was going. Was this going to be an advocacy organization? Because honestly, I could sort of see why, from a financial point of view, probably it’s easier to raise money if you’re like, “We’re out there swinging.” Right? And so, from a business-side perspective that may have been a better play.

Well, talk about that. The difference between skepticism and being negative. Because I get accused of being super negative all the time. Constantly. And I think it’s fair. It’s a fair point of view. I think I’ve been with them long enough to be warning people about what’s happening. Talk about that idea, because there are advocacy publications. I guess you might put Mother Jones in there. You might put some others. But you didn’t think of yourself as that.

No, because I actually had this other idea. So, I really felt like journalism is always put on the pedestal of objectivity, which is this weird neutral tone and point of view, and it really has led basically to false equivalency. Right? On the one hand, climate change is happening. On the other hand, some random person says it’s not. And, the truth is that’s not a fair representation of what the reality is. The fair representation is that 99 percent of the science suggests that climate change is happening and 1 percent of people who don’t have credentials say it’s not.

So, I wanted to move a little bit more towards what I call the scientific method approach, which was, you have a hypothesis. Hypothesis: Facebook allows a dropdown menu for racism that lets advertisers break the law. Test that hypothesis. Okay, how much data do we need to collect? In that particular case, buy one ad and you’ve basically proven it. Some hypotheses need thousands of data points, right? For criminal risk scores, we collected 18,000 scores of defendants. And then you basically say, “Here’s our finding. Our finding is x, and here are our limitations.” So, the limitation of our finding is we couldn’t test every ad on Facebook. Right?

And so, I felt like that was our approach. And that is different than Mother Jones and the Nation. But it’s also different than normal journalism. It was just an idea of, could we bring a more scientific approach? Because I feel, right now, despite all the craziness in the world, I do feel data changes the narrative. When you bring data to the table, people are willing to take it onboard, and it does lead to change and impact and policy changes. And so I felt like that was our calling, as a journalist, was to bring that data to the table so that we could make change.

And, what happened?

So, I don’t know why I was fired. Right? She would never give me a reason. The reason that’s been out there is management issues, leadership; that’s what she said publicly.

You didn’t hire fast enough.

Did you not hire fast enough?

I wish we had been hiring faster, but we had a pretty aggressive hiring schedule, so I felt like we were definitely on track to launch in July. We had some investigative stories that were coming down to the finish line. I felt excited about it, but she took me aside in January with Jeff, the two of them, and they said, “You’re not suited to be editor-in-chief.” And the reasons were things like “You don’t like meetings.” That’s true. But I did go to all of them. I went to all the meetings, but I just didn’t like them.

Another reason was I wouldn’t agree to take a personality test. I don’t believe they are based on evidence and she was really insistent that she needed me ...

A personality test?

A personality test.

Which one?

Actually, I think she wanted me to take ... there were a couple. The Enneagram or something, I don’t know. I don’t know what these things are. And so there were a whole bunch of reasons like that that I wasn’t “suited” to leadership. But it wasn’t a performance improvement plan, like “here are the ways you can grow as a leader” or anything like that. It was just a negative thing.

And this is someone who had not run a publication previously.

I mean, I think she ran some ...

... the Canadian Broadcasting Corporation internet site. So she’d been in journalism, like radio and TV. So that was disturbing. That’s when I sort of realized that, oh, this isn’t going that well. Right?

Right, right.

And then, we continued to have conflicts about the mission and the advocacy. And then, at the end of March, she and Jeff took me to dinner again and said that I was failing as editor-in-chief and bringing down The Markup. And once again, there was no plan for how I could improve. It was just a declaration. And they said, “You’re probably better suited as a columnist.” And I was like, “Oh.” Well, I mean, I’m not really a columnist. I mean, every once in a while I’ve written an op-ed here or there about an investigation I did. So, I just said, “That doesn’t make sense.”

So, you would not step down in the way they wanted you to. They asked...

Yeah. Honestly, though, they didn’t even offer me a job. It was just like, “You might be suited as a columnist,” but there’s no job description of what that would look like and what would your role be or anything. So, I wrote them a letter with my lawyer saying, “Look, it seems like you’re reneging on your agreement to give me my employment contract. It’s been sitting with you, my lawyer’s given it to you. But I’m not going to step down as editor-in-chief because I have promised our donors and employees that we’ll pursue a particular vision and I don’t have faith that you’re going to carry that out.”

You thought they were going to do much more advocacy.

Yeah, that’s what it seemed like. And I don’t actually know what they’re going to do because there’s two journalists left in the newsroom and I don’t know what that publication is at this moment. But that’s where it looked like it was heading. And I didn’t feel like I could stay there and not be the person in charge of editorial. And so I wrote that letter, and then she fired me.

Then she fired you.

So, what happens now? Now, Jeff has written a very problematic memo, a very defensive memo, as I said on the internet yesterday, about what’s going on, and begging people to come back at the same time, which was odd.

Yeah, it’s worth pointing out. So, there were seven reporters, five of them resigned after I was forced out.

Quit. Right. So, most of the staff is gone. Craig has given this money. Now he’s written a note and tweeted that he’s thinking of reconsidering it. So what happens?

I don’t know. I don’t know how this plays out. I have to brush up on my coup literature.

I can help you there. I can help you a lot, actually.

Right. Actually, somebody told me “The coup’s already happened. This is the countercoup.”

The countercoup, yes. It’s the countercoup. We’ve got to get some Mother of Dragons into you. I can help. So, what do you want to do now then? Then we’ll get some questions from the audience here.

Well, I just want to do the thing I was doing. It would be awesome if I could just do that. We had some great investigations. I feel like we were going to launch. I had some great people who were going to come. The people who were there are great. If I could do this somehow, somewhere, that’s what I would do. I want to build this field. This is too important an issue for there not to be a team like this doing this type of work. It doesn’t have to be at The Markup. It can be somewhere, but I’m going to try to figure out a way to make it happen.

Questions from the audience? Right here.

Audience member: Hi. I’m really interested in what you said about it being easier to raise money for a journalism organization that’s focused more on advocacy. Can you talk more about that and maybe suggest an alternative business model or model in general for journalism that can be more objective, but also investigative?

Julia Angwin: Yeah, I’m actually not sure about that but I do feel like ... I could imagine that people donate to their favorite causes and I could understand why you might want to position your journalism as a cause. What I feel about that, though, is that from my experience doing this type of work, it undermines your findings when you go in with an agenda. You have to be willing to follow where the data leads you, and so I think it’s a dangerous road. I can see it’s tempting, but I think it’s dangerous for the pursuit of truth.

Because in the end, “Facebook sucks” is your end point, right?


For example, which probably is true. It’s probably accurate, but you don’t want to start with that. You want to give them the benefit of the doubt when you start something. You could go down another road, for example.

Yeah. No, and many times our stories have taken surprising turns, right? So we were looking at Amazon. I had been told that if you shopped on your mobile phone versus desktop, you would get different prices. So we put all this testing software up, in the Amazon Cloud, of course, and ran tests on Amazon from Amazon Cloud, and we found nothing. We were like, “Oh, there’s nothing here.” I thought, “This is really disappointing,” but it happens.

Then I went to drinks with Barry Lynn, the guy from Open Markets Institute, and I was like, “Yeah, we looked into Amazon. We couldn’t find anything.” He said, “Oh, well have you tested ... The real question with Amazon is how do they treat themselves as a seller on their own platform that they control?”

So we tweaked our hypothesis. We said, “Okay, let’s ask that question. We already have all this stuff running,” and boom, we were like, “Oh my gosh, they give themselves a huge advantage when they’re the seller,” or one of their favored sellers, what are they called? I’ve forgotten the name, but the ones who pay them fees to be in their warehouses, Fulfilled by Amazon. So then we were like, “Oh,” we had a huge finding. That’s where you just let the facts lead you where they go.

Right, so they were up to something, you just didn’t find the thing they were up to.


Right. One thing. It would make more sense for them to advantage themselves.

Right, and also ...

That’s one of the big issues.

... we would have been fine with no finding, right?


If there’s nothing to find, then there’s nothing to find.

Right, but you keep looking and pushing at various parts. But advocacy is fine, too. There’s a lot of people with points of view who do reporting, reported analysis.

I actually feel like we’re not lacking right now for opinions about tech online. That space was fully owned, and it was great. Everyone’s doing it. We could bring a different piece to the table.

Okay. Over here.

Audience member: Thanks so much for coming out. I’m interested in your take on media literacy for youth and how we are educating high schoolers, in particular, to consume the news. How does that relate to the work you’re doing with data, where a lot of people in general, and young people in particular, don’t really care about data? They just want the stories, and they sometimes don’t respond to data in the ways that we want them to. So how do we teach young people to care more about what the data is saying rather than other, frivolous things?

I will push back at you. I have a son who literally is, I call him Wikipedia, Walking Wikipedia, because he has so many, well, not those facts, but he’s so factually oriented that he will not talk about anything else, so it’s an interesting change. But how do you…?

Julia Angwin: Yeah, I’m not sure that hypothesis is true. I’d like to see the data for that, but I do think that media literacy in general is a challenge and I actually just feel like that’s a classic case of pushing the burden onto the user, right? The fact is that if you’re being completely spammed with lies all the time, is it actually your responsibility?

I feel like one thing that we kind of haven’t paid attention to enough is the literature around persuasion and the fact that all of us are very persuadable, and we used to have information gatekeepers who had certain standards and they didn’t publish things that were untrue, and the reason was they were at risk for a lawsuit, right?


The Wall Street Journal, where Kara and I worked together for years, every story in there we could be sued for. We could be sued for the letters to the editor. Those were fact-checked, right?


The advertisements we were actually responsible for. The internet companies got a special exemption in the 1996 Telecom Act so that they’re not liable for anything that anyone writes on their platform.

If you’d like to know, because I’m going to write about it in the New York Times next week and they’re very nervous: Section 230 of the Communications Decency Act gives internet companies broad immunity for anything that flows over their platforms. It essentially treats them like a phone company. And it was done at the time, and I was there, I wrote about it for the Washington Post, actually, in order to allow these companies to grow.

They were small startups and they didn’t want to be sued out of existence from the very get-go. AOL was a big part of pushing for that, as were other sites at the time. So what they did was get this broad, broad immunity, and it was a gift. I just interviewed Nancy Pelosi about it, and she said it was a gift that they’re abusing. It certainly was a gift. And so the question is, do we want to allow the people who are now the richest people on earth to keep using this gift to abuse it further? I think it’s been chipped away over issues like FOSTA and ...

Yeah, there’s only one chip in it so far.

Yeah, so the question is, do you remove it for the large companies? Do the world’s richest people deserve immunity for their behavior? Because what’s resulted is sort of like giving kids sugar all the time. Sure, you can have sugar. You can keep your room messy. You can do this. What do you imagine is going to result? This is why we have what we ... This is my belief, but other people have different beliefs, so it’s going to be a big question. But as to younger people, I do think they get inundated. It’s like sludge. You can’t be protected from it. The government really should be protecting people from this, or lawyers, suing.

In the US, we usually choose lawyers.

Yeah. So question right here.

Audience member: Hi. I’m a huge fan of your podcast.

Thank you.

Audience member: My question is, if there’s a politician that starts getting rhetorically and legislatively very tough on tech, do you believe that the tech companies will kind of resort and hunker down like the hydrocarbon companies did in the ’80s and ’90s?

They’ve hired a lot of lobbyists. You might look at that. There’s a lot of data on that in terms of they didn’t have lobbyists before and now they have a ton of lobbyists, first of all. I think their approach is a little different than, say, Big Oil or Big Banking or stuff like that. They’ll show up, they’ll apologize, they’ll have meetings, they’ll have dinners, they’ll have “thought time.”

Mark is having a lot of dinners with smart people, like getting his Harvard education now, which is kind of fascinating. He’s having these ... Everyone will go. If you’re, I don’t know, the guy who wrote the Hamilton book, Ron Chernow, you’d go to dinner with Mark Zuckerberg, right? Why not? So they’re doing it that way. I think they’re allowing the discussions to go on and then secretly, behind the scenes, sort of trashing people, correct? I don’t know.

I think they seem to be ... It’s a different approach, but there’s no question they’re doing heavy-duty lobbying on lots and lots of issues that affect them. I think the question is who can they ... They try to find people they can work with on their side. Like you have somebody like Senator Warner, who’s very tough on them on some issues, or Senator Klobuchar, who’s tougher on them. Senator Bennet is moving in that direction. Senator Wyden is very interested in voting machines; that’s more his area. So they tend to try to co-opt them.

When I wrote a column this week about Sri Lanka, the first call was a Facebook person saying, “Hey girl, want to chat?” I’m like, “No! Call me Friday,” and then Friday, I’ll not pick up the phone. So it’s a much softer approach. I can’t explain it but it’s just as effective. I think Mark’s piece about his wanting legislation was fascinating. You should read between the lines of that particular thing, I think.

Okay, quickly here and here.

Audience member: I’m interested in the notion of scientific journalism in an era in which there’s a lot of examining of science itself. Between p-hacking and replication problems, academic science has started examining how the apparent rigor isn’t actually as real as we thought.

Julia Angwin: You know, that’s a great question. I say “scientific journalism” and what I mean by that is it’s better than the normal journalism, which is “three anecdotes and you’re out,” right? So when we had our meeting with Craig he was like, “Okay, let me just see if I understand this. Basically all you’re talking about is increasing the sample size?” I was like, “Yes, that’s basically it.” That’s not entirely true, right?

So we have a data ethics policy — or we had — at The Markup which actually said we will not do p-hacking, and also that we aim for replicable results and we will publish our data and our code as often as possible, which is what I’ve done all my career. So I do think journalism can aspire to the standards of science, but we are journalists, right? And so I think of us as the first people out of the trenches, right?

We’re doing the first draft and then actually science usually comes in and does a lot of followup work to validate and solidify the results. This happened with our work on criminal risk scores. We put up the data set, there’s been ... I think I have more academic citations than my husband, who is a professor, because it’s been replicated and written about so much and it’s really moved the field in terms of the field of computer science fairness and algorithms.

Okay, we have to go, but the very last question: What would you investigate right now if you had a publication to work for? You will.

Trying to break my heart over here? God, there’s so much. But you know, one of the things I’m really upset about is the use of algorithms to score people in ways that actually aren’t really seen as tech. Your resume is sent through an algorithm, right? People are hired by algorithms, fired by algorithms, everywhere. And I just think it’s worth pointing out that the people who are scored and sifted by algorithms are, strangely, often people of color and poor people. So we’re getting into a world where some people get human judgment and some people are judged by machines, and that’s a very upsetting thing.

Great answer. Julia Angwin.
