On this episode of Too Embarrassed to Ask, Recode’s Senior Social Media Editor Kurt Wagner talks with Kara Swisher and Lauren Goode about how Facebook is trying to fix its “fake news” problem — by showing users less news and asking them to rank the credibility of media outlets.
You can read some of the highlights from the discussion here or listen to it in the audio player above. Below, we’ve posted a lightly edited complete transcript of their conversation.
If you like this, be sure to subscribe to Too Embarrassed to Ask on Apple Podcasts, Spotify, Pocket Casts, Overcast or wherever you listen to podcasts.
Kara Swisher: Hi, I’m Kara Swisher, executive editor of Recode.
Lauren Goode: I’m Lauren Goode, senior tech editor at The Verge.
KS: You’re listening to Too Embarrassed to Ask, coming to you from the Vox Media podcast network. I’m actually at the Vox Media headquarters in D.C. doing this. This is a show where we answer all of your embarrassing questions about consumer tech.
LG: It could be anything at all, like will Kara Swisher change at all now that she’s an anchor monster on MSNBC.
KS: I will be even 10 times the Kara Swisher I’ve been previously.
LG: “Anchor monster” is your phrase, by the way. I just want to put that out there. I am not calling you anchor monster.
KS: It was a phrase used against someone ... It’s a long story. It’s a terrible one.
LG: We have time. Let’s hear it.
KS: Someone wanted me to use it against someone and they didn’t want to use their name on the record and I declined. They said this person is a monster.
LG: I like you as an anchor monster. Let’s go with it. Well, it’s fantastic. Actually, quickly tell everyone when your show is going to be airing.
KS: Soon. Soon. This week. I have a show on the future of work. It’s a series I started with MSNBC called “Revolution” and we taped it in San Francisco with Google’s CEO Sundar Pichai and Susan Wojcicki, the CEO of YouTube, the first one. We talked about lots of stuff, AI, nondisclosures, workplace culture, robotics. Not robotics, self-driving cars. All kinds of things. It was good. It was hugely attended and it was fun.
LG: It was really fantastic. I was there. I covered it.
KS: It was really good. At one point one of the producers was like, “That was a lot of substance.” That was very funny. It was very substantive. There was a lot of talk. I thought they were great to come forward at a really difficult time for tech and really be more frank than many, many people want tech companies to be, but it was good.
KS: Anyway, for this show, send us your questions. Find us on Twitter or tweet them to @Recode or myself or to Lauren with the #TooEmbarrassed.
LG: Because this is Kara’s favorite show. We also have an email address. Our email address is email@example.com. A friendly reminder, there are two Rs and two Ss in embarrassed and we really do like your emails. Sometimes when things come through in social media, we don’t know where they’re coming from. We might not see them. Email us if you have questions and we’ll be sure to answer them.
KS: Yes. Yes, and I am in D.C., as I said, this week. Here for a bunch of podcasts, all kinds of things, and to see my kids and things. I’m away from Lauren, which is always a painful time for me.
LG: It’s very hard. It’s hard for the both of us. I’m back here in San Francisco just holding down the fort. Making sure that ...
KS: Down the fort.
LG: Yeah, exactly. Today on Too Embarrassed to Ask, we’re going to be talking about something that’s a little bit inside baseball for us media people, but also impacts your experience on Facebook, and that’s all about the News Feed. Last week, the social media company, which is definitively the world’s biggest social network, said going forward that news would make up ... Like actual news would make up about 4 percent of people’s News Feeds content instead of around 5 percent, which is what it’s been at for a while.
KS: They’re tweaking the algorithm.
LG: They’re tweaking the algorithm. Facebook also said it planned to let users rank news sources based on credibility and trustworthiness.
KS: Worrisome. Elliot Schrage, who’s the head of their policy at Facebook, I saw at an event in Germany this week and talked a little bit about it. It made a lot of people nervous. There were a lot of questions about that. Earlier this week, Facebook also admitted in a series of blog posts that maybe the social media giant and social media in general is bad for democracy. I don’t think they blame themselves, which comes as a surprise to no one who’s been following the stories about foreign influence on Facebook during the 2016 election, and I think Twitter also admitted there were more Russian bots or problems on Twitter this week, too.
We’re delighted to have Recode’s Senior Social Media Editor Kurt Wagner back on the show to unpack all of this for us and tell us what it all means. Hey, Kurt.
Kurt Wagner: Hello.
LG: Hi, Kurt.
LG: Thanks for joining us.
Yeah. My pleasure. I’m here in San Francisco with Lauren, by the way.
LG: Bye, Kara.
We are holding down the fort.
LG: Thanks for the introduction. Kurt and I are just going to take it over now.
We may not let you back in, actually, when you come back.
LG: Go back to your two-hour flights to Germany. You mentioned earlier you took like a couple hour flight to Germany. Were you on the Concorde? The second-generation Concorde?
KS: I’m a citizen of the world, Lauren. I know where I am needed and so I went to Germany to talk to the Germans about social media.
LG: I’m not making this up. She literally said it was just a couple hours longer than an original flight, right? I was like, how is that possible? All right. Well, before we get started, we should note that we’re taping this episode on Tuesday, January 23rd, and Facebook has been at the center of news for what feels like every day in 2018. Not really, but close. By the time you hear this podcast, things may have changed yet again. Let’s just keep that in mind.
KS: Let’s hope not. There’s so much news. Anyway, there’s so much news, Kurt. News is such a big thing.
A lot of blog posts.
KS: A lot of blog posts, but just in general there’s so much news. It feels like every day is like a hair-on-fire day. People love news these days because there’s so much of it and it’s so interesting. Let’s take the basics. What are they doing? Why now? When are these changes going into effect? Sort of give us the whole ...
Sure. Well, let’s go back two weeks, to the first big announcement they made, in which they said, “Hey, we’re going to change this News Feed algorithm,” which is the software that determines what you see in News Feed and what you don’t. They said, “We’re going to show people more stuff from their friends and family.” As a result, that means they’re going to see fewer posts from publishers or businesses.
The drive for that was that they said, “People enjoy spending time on Facebook more when they’re actually interacting with other users. We are going to try and show them more posts that lead them to comment and ‘Like’ and actually engage with people and not just scroll through aimlessly. We found that the stuff that does that is stuff from your friends and family. You’re probably more likely to maybe comment on something that your grandmother posted or your mom or your cousin or whatever than you would something from the New York Times.” As a result, that was the announcement. They kind of framed it as saying, “This is going to be good for the well-being of our users because they’re going to be more engaged with each other and they’re going to have a better time on the site.”
Of course, publishers are kind of freaking out, right, because a huge part of business for a lot of these publishers is distributing their stuff via Facebook. That’s where they either get a lot of traffic or a lot of their audience lives. That was step one. Step two, you kind of touched on briefly, Kara, was last week they said, “Hey, we’re also going to tweak this algorithm and we’re going to show you more stuff from publishers that are deemed ‘trustworthy.’” The way that they’re going to determine who is trustworthy and who isn’t is they’re going to poll their user base and they’re going to survey people and figure out who they trust and who they don’t.
People that get good scores theoretically are going to show up higher in the feed than others. Then the last thing was just yesterday, they came out with a long blog post, which I thought mostly rehashed stuff that we’d already talked about last year which we can get into, but they kind of ended ... The overall theme was, “We’ve built this thing, social media, it can be used for good and it can be used for bad. We can’t really control it, but we’re doing the best we can to make sure that it’s used for good. Sometimes it doesn’t always work out that way.”
LG: For a while Facebook had this initiative where it would ask its users to flag disreputable news. Then based on that, Facebook would literally attach some type of symbol or icon.
A disputed tag.
LG: A disputed tag. Then it would make it known to its readers that hey, this is coming from a disputed source. Is that still ongoing? Is that part of this or has that fallen by the wayside?
It’s in the same vein. I believe it was in December they said they weren’t going to use the disputed tag anymore. Instead, they were going to ...
KS: I don’t believe they did it in the first place.
Well, there were a few. I don’t know how far along they got. They definitely did it in the sense that they tested it, because we saw it in the wild. Did it expand to all of Facebook? No, I don’t believe so. Instead of doing a disputed tag, which they claim actually made people dig in even more, right? If you read a story that you think is true and then someone tells you it’s disputed, I think they found that people actually got more defensive. They said, “Oh, well, now I believe this even more than I did before.” Instead, what they’re doing is, rather than claiming something’s disputed, they will show related articles.
They say, “Well, here’s one article, but here’s three additional ones that you might want to read about the same topic.” Theoretically those other three should, if it’s disputed, solve the issue by reporting the news accurately or at least in a different frame.
KS: They’ve been trying to get at this. I mean I think they’ve been trying ... They understand the problem. They’re sort of, I would say, stumbling to figure out how to deal with something. It was interesting because at DLD this week, Elliot Schrage was ... He sort of took the “mistakes were made” position. That’s how they tend to want to phrase it, that mistakes were made.
LG: The passive voice, right?
KS: Yeah, exactly.
LG: Passive tense.
KS: We know we’ve made mistakes, but sort of wanting to move along without talking about that. This new thing is this trusted news source, which I have a lot of problems with, I can tell you. Talk about that. Talk about what they’re doing and then we can discuss what they’re ... I know what they’re trying to get to, but it seems ham-handed in lots of ways.
LG: Like how will people actually rank the trusted news sources?
I have issues with it as well. I think most people probably in the journalism profession do. What they’re trying to get at ... I guess the goal of this is to figure out who is most accurately and consistently reporting the news, right, and show people more of that stuff. Because if they can identify who everyone trusts and believes and they show them more of that, all of a sudden the fake news thing goes away, right? People don’t believe that they’re ... Yeah, presumably. A lot of issues with that, right? Number one for me is, why are they relying on regular people, their users, to tell them who is trustworthy and who isn’t?
KS: Yeah, they don’t want to take responsibility. It’s the same thing.
Correct. They don’t want to be the one to decide that the New York Times is more reliable than Fox News, who is more reliable than whoever, right? Let’s be honest. I mean, people don’t know what good news is. That’s why we’re in this problem to begin with, right, is people believe everything that shows up in front of them. To ask them to objectively say that the New York Times is more reliable than the Washington Post or Recode or Bloomberg or whatever seems silly because most people don’t actually have that kind of knowledge, right? They just simply ... if it shows up in their feed, they probably believe it.
KS: It’s also gameable.
LG: It doesn’t sound like it would do much for a confirmation bias in the sense that an article can technically be accurate, the facts reported may not be inaccurate or the number used or whatever it might be, but there could still be bias injected into the piece. There’s still the potential for confirmation bias, I think is my point.
Totally. Other issues, right? What if you’re an up-and-coming news organization that’s just getting started? We’ve seen a bunch of those. Even in our industry, Axios, The Information, even Recode a couple years ago was brand new, right? Most people didn’t know what Recode was. If that shows up on a survey, who’s going to say, “Yeah, Recode’s reliable and trustworthy,” if it’s been around for three months? Probably not a lot of people.
It makes it in theory ... again, it hasn’t happened yet so it’s hard to say what is actually going to happen in practice, but in theory it’s also going to make it hard for people to start new journalism ventures, even ones that are really, really good or started by a really strong journalist.
LG: Kara, what were you going to say about rigging the system as well?
KS: Well, it’s riggable. It’s riggable. Like you could see people just flooding votes to different things, to sites. You could see it being abused. It’s one of these things like, “Let’s vote what’s good.” I think what it goes to, to me, is the heart of they just don’t want to take the responsibility or make the decisions. Again, I had this back and forth with Elliot. The thing is, are you a media company? I think they’re not a regular media company, but they certainly are a media company of a new sort. That means taking responsibility for what’s on your system. I was pointing out that Snapchat takes responsibility. They curate and pick.
You know what I mean? Sometimes I don’t like the Daily Mail in Snapchat, but you know what? I don’t like my kids reading it, but guess what? It’s a news source. Using the words “trusted news source” is ... What I want is real news source. Like an actual place, not like these weird Denver Post or whatever. I think the word “trusted” is so loaded then you get partisan about it.
Because look, Fox News is a news source, and it’s a credible news source, so is the New York Times, so is Breitbart. Whatever you think of it, those are news organizations and they should be seen on Facebook. You know what I mean? What they need to take off is these other ones that are just not news sources. For some reason they really do not want to take responsibility for their platform. I find that odd.
LG: It’s kind of a unique problem because of the size of their user base. I mean, Facebook now has two billion people around the world using it. You can’t name a single media organization I think in modern history that has had that kind of audience. Even if you look at a list of the most televised events in history, they’re somewhere in the range of ... I don’t know. How many? Single-digit millions? Tens of millions maybe? When you look at the pure ...
KS: The numbers.
LG: Yeah, the pure scale. How do you moderate content in a media platform that size?
KS: They want to be seen as a utility. “We’re just a platform. We’re just this ...” But they don’t want to be regulated like a utility; they want to be regulated like a news organization, right? They want freedom and yet they want to be seen as a utility. It’s like they want all of it.
I hate to agree with Rupert Murdoch, but maybe they should pay for credible news organizations. Now listen, I seldom agree with him. I find it a little bit ironic that he’s doing this, but I think he’s making a larger point and he’s doing it in a way that’s loud-mouthed. They should pay for great publishers if they want to use their material. I’ve always thought that.
LG: This was Rupert Murdoch, by the way, suggesting that Facebook pay trusted publishers in the same way that cable companies pay carriage fees, right?
Correct. Yeah. What we’ve seen is that Facebook actually accepts the “we are a media company” role in some instances. They don’t say this outright, but they accept the role, right? Last year they paid a bunch of publishers to use Facebook Live, which was their live broadcasting tool.
LG: Yeah, of course. It’s good for them.
Exactly. They came out and said, “We want to get live video to become this new really popular medium. In order to do that, we’re going to seed the ecosystem by paying all these publishers to do it.” That’s an editorial decision. They chose which publishers they wanted to work with and they kind of gave them guidelines around creating content that had to live on Facebook. Those are the kinds of things that media companies do and they didn’t shy away from that, right?
When it comes to determining who’s trustworthy and who’s not, that’s a decision that clearly falls on the other side of that fence. I think the issue with most publishers is that that fence is always moving. Three or four years ago, choosing publishers to pay for Facebook Live might have been too far for them to go. Now it’s not. That is the issue, that Facebook is constantly evolving. They move incredibly quickly and the media world’s just trying to keep up and it makes it really hard.
LG: If there was a relationship tag for a publisher’s relationship with Facebook, “it’s complicated.”
LG: It’s complicated.
LG: Let’s talk about the blog posts that were published as a part of Facebook’s Hard Question series, the ones you referenced earlier. What was your take on these? Did Facebook own up or still not enough?
I mentioned earlier that to me a lot of it rang true to what we’d already heard from them over the past year. You mentioned I think at the very beginning you were like, “It feels like Facebook’s been in the news every day of 2018.” I feel like Facebook’s been in the news every day of 2017 too, with this whole Russia election issue and the fake news stuff. Overall, high level is that they came out and they kind of bulleted a bunch of different parts of the service that they try to ... Fake news, the use of democracy and the power of Facebook, and kind of pushing democracy either for good or bad.
I think again the biggest takeaway for me was the simple realization and admission that social media amplifies everything, whether it’s good or bad. If you have something positive to say, social media is great because you might get to share it with a million people, but guess what? If you have something terrible to say, it’s the same issue. Simply keeping the positive on Facebook and eliminating the negative is not an easy task when you have two billion users and a bunch of different people who believe some things are right and some things are wrong. That’s where they’ve gotten themselves into this weird pickle is that it’s like you can’t promote one without the other.
We’re finally seeing ... I shouldn’t say finally, but more publicly than ever we’re seeing the negative impacts of what social media can do versus just the positive ones.
LG: It still feels like after reading some of the ... I didn’t read all of the blog posts.
It was long. It was like six pages or something.
KS: It always is with Mark Zuckerberg.
LG: I mean, there were a lot of sources in it. There were a couple of people who work within Facebook. There were experts from outside of Facebook. I still noticed there were references to things ... Like they wrote about the damage the internet can do or sort of fessed up to the fact that Facebook wasn’t able to quickly identify things. Those phrases jumped out at me as things that were still sort of passive in their approach to how Facebook is handling things. To underscore Kara’s earlier point, this idea that it’s just a platform. You guys are doing this. We’re supposed to identify it, but still not a lot of focus on ... I don’t know. It feels like Facebook’s going through therapy, to me.
KS: Yeah, it is.
LG: It’s been going through therapy for a while, but we’re still not quite there. We’re not at that aha moment yet.
KS: What’s interesting, one of the things when I asked Elliot that onstage, I said, “Is there something wrong in your management that you just ... You don’t have any irritants there that are disagree ... Like you don’t see dangers like Facebook Live having murders on it or suicides or whatever, or not seeing that this might be a bigger problem or problematic.” One of the things he did is, he said, “You know, we did some wrong things.” Because he couldn’t help himself, he goes, “But you know, the government didn’t get it either.”
I was like, can you just say we were wrong and not have to put caveats all over the place? But the government, the whole federal government didn’t get it. The agencies didn’t get it. It just opens them up. At one point you’re just like, “You know what? We made a mistake.” Not “let’s move on.” “We made a fucking mistake.” It’s really an interesting ... I think it’s a management issue that they’re a wonderfully cohesive group of people there who’ve been together ... I don’t know. Kurt, how long? 10 years?
I mean, Facebook’s 14 years old in just a couple weeks, right? I would say ... man, the vast majority of their managers have been there more than 10 years.
KS: Together as a top leadership. There’s no irritant in there. There’s no one saying, “What? Huh?” I think it’s so cohesive that ... That’s what I was asking, is there a management problem that nobody’s disagreeing with each other? I think that’s really important. This thing is I think they feel under siege and Silicon Valley people are not used to being under siege. They don’t like it. They feel badly. You get a sense that they do feel badly and you can hear it from ex people who are talking up ... a lot of ex-Facebook people. They want people to stop blaming them like, okay, we’ve had enough being yelled at.
LG: It’s almost like if they didn’t know what to do, like if Facebook’s top executives at this point were in a position where they’re saying to themselves, “You know, we really don’t know how to fix this problem,” they probably aren’t in a position where they feel like they can admit that because of all of the issues of foreign influence and what’s going on. Admitting that kind of vulnerability would be tough. It feels like they’re taking the time to try to figure it out, but during that time things keep progressing on the platform and not always in a positive way.
KS: Well, it’s an interesting management problem, I think. I think it’s more that they don’t anticipate disaster well enough. They only anticipate opportunity, which is, I think ... that’s a Silicon Valley personality trait. It doesn’t matter, because the rest of the world is impinging on them and is coming in and wanting to have answers.
Let’s talk about Facebook and Twitter’s ongoing battle with Congress. It isn’t really a battle, but again, more with the Russian trolls, memos, things like that. Can you talk about that? Also, Kurt, one of the things that was brought up in Germany for sure was that it’s not just U.S.-centric. It has impact for people around the world, especially in the Philippines, for example. They have impact there that I think has not even begun to be understood compared to the U.S.
Yeah. We spend a lot of time thinking about the impact on the U.S., of course, given the companies are here and we’re here and the election was such a big deal. This is going to be an issue in elections all around the world. Not just with Facebook proper. I was actually just speaking with someone the other day who was like, “Keep an eye on WhatsApp too,” right? I mean, WhatsApp is this encrypted messaging app that’s owned by Facebook and it’s massive in Brazil and India. What happens when people start to disseminate fake news via these ... There’s groups on there, hundreds of people in these groups, links are just being shared.
They’re not public in the way that Facebook is. At the very least you can see things on News Feed. With WhatsApp, because of the encryption, it’s all happening kind of behind closed doors. I don’t think that this is unique to the United States and I actually don’t even think it’s unique to the ... Well, it’s kind of unique to the Facebook News Feed given the algorithm and everything we talked about, but the point is that there is a lot of technology being used in a lot of bad ways. Facebook has been kind of the biggest culprit so far, but it’s certainly not the only one. In terms of ...
Yeah. I was going to say. In terms of Congress, we haven’t really seen a whole lot of follow-up since the hearings that they had back in November. In November they went in front of a couple different Congressional committees and testified about their role that they played unknowingly in the election. There was a feeling that perhaps Congress could come in and say, “Well, we want to start to regulate these companies.” We haven’t seen anything like that yet. I think it’s very unlikely that we would at this point.
I think that Facebook, for all of the trouble that it has caused, seemed to be the most prepared of the three companies — Facebook, Google and Twitter — that showed up. They kind of handled the bulk of all the questions. They seemed to at the very least be putting the most time and resources behind trying to figure this out, but I don’t think that there’s going to be some massive hammer that comes down on them from Congress as a result of all this. I think people are still trying to figure out, how do we prevent it from happening next time?
KS: Right. Absolutely. All right. We want to get to take calls from readers, but where does it go next, Kurt? Does it just keep going? Because I don’t feel like Congress is going to do anything about this.
KS: Again, they can hardly keep the government open. They have ongoing battles on all kinds of issues, from immigration to other things. This keeps popping up, but nothing seems to be happening. I know you and Tony Romm, who writes about Washington for us, have written a lot about that.
Tony is a wizard about all of this stuff. He’s been great. If I had to predict what’s going to happen next with Facebook, to me it feels like it’s going to be product related. They’ve talked about some of this, right? Some of it’s honestly pretty boring and most general users aren’t going to ... For all the complaining that happens on Facebook, I bet most people don’t even utilize these features.
For example, at the midterms, Facebook has promised that all political advertising campaigns are going to be searchable so you’d be able to see who is providing the finances for these ads, why am I seeing this ad, who paid for it, where’s the page on Facebook that this ad originated from, so those kinds of things, so that people can check and make sure that ads aren’t coming from bad sources. At the same time, like I said, I doubt many people actually utilize that, but the fact that it’s available might keep some bad actors away.
I think more likely, though, is what we’re going to see is this News Feed-related stuff Facebook is testing in ... I think it’s six different countries and I can’t think of them off the top of my head, but separating completely the friend feed from the news and page feeds. You literally go to one section of the app and hear from Mom and Grandma and cousin. You go to a second section of the app to hear from the New York Times, Recode and your favorite retailer. The hope is that by completely separating the two, they kind of eliminate the spread of fake news because it’s just going to be less ... It’s not going to be intertwined with all of the stuff that you see from your friends and family.
Obviously, publishers are freaking out about that because the fear is, well, why would anybody go to that feed full of businesses and publishers trying to get my attention when I can just hang out with my friends? I don’t think at this point Facebook really cares that much. I think they’re so worried that they’re going to alienate their users that publishers and businesses are ... We’ve already committed. We’re not leaving Facebook, right? We need Facebook. At this point, they kind of have all the cards and they can say, “Well, this is just the way it works.”
KS: Although it’s also certainly gone down. I don’t know. I think publishers have moved their attention elsewhere.
Yeah. I think Apple News or ...
LG: I mean personally, I don’t really feel like I need Facebook. I just really need Instagram.
KS: Which is interesting. There’s a lot of questions about that, of what works best on that.
LG: He’s like, “Shall I break it to her?”
Yeah. Yeah. Whoops.
KS: All right. In a minute we’re going to read some questions about Facebook from our readers, we got a lot of them, and listeners. Kurt Wagner, whom I call Philip for reasons I can’t get into here, is going to answer them, or Barbara. I don’t know. Lauren?
LG: #Money. #FacebookDetox.
KS: We’re back with Recode’s Senior Social Media Editor Kurt Wagner talking about Facebook and the future of democracy, and they’re incredible.
LG: There you have it. Well, it’s been a great ride.
KS: It’s a pretty lightweight episode. You’re going to be taking some questions from our readers and listeners. Lauren, would you like to read the first question?
LG: I would love to.
KS: Thank you.
LG: This person’s Twitter handle is Azeem_the_Dream. Azeem asks, “Is anyone at Facebook losing their job or getting disciplined for being negligent when the events of the 2016 election were taking place?” He’s trolling.
We want a head to fall. No. Certainly nobody that you would have ever heard of. Part of the reason is probably because this falls on Mark Zuckerberg and Mark Zuckerberg’s not going to lose his job, right? I mean, he’s the guy. The answer is no. I think that there is a legitimate amount of self-reflection, concern. People are losing sleep or certainly were losing sleep over this. I don’t think it’s being ... It’s not being taken lightly. That’s for sure. At the same time, I don’t think anyone is sitting there saying, “Well, someone’s head needs to roll for this.” As Kara mentioned, the point is more like, “Well, we missed it, yeah, but so did everyone else.” You know what I mean?
KS: “Our bad.” The only thing is, they’re so big. It’s like, “Oh god.”
LG: That’s the problem. To what I was saying earlier, they’re so big, but at the same time, when you have an audience this size, it’s like how do you even begin to moderate that?
KS: Yeah, but they put themselves on the same plane as Twitter. Twitter is no easier, but it’s much smaller. It’s like saying, “Our bad. Oh well. So what? Everybody else did it.” I’m like, “Yeah, but you’re huge.”
The problem — and Kara, I think I’ve heard you say this a lot — is you can’t sit there and beat your chest and say, “We’re changing the world. Look at how great we are,” if you’re not willing to take the same responsibility when you screw up. I think what it comes down to is there were mistakes made. Were people aware that Russia was trying to do this? Certainly not many people publicly. Maybe some people in the government were. You can’t sit there and pretend like you can’t take the blame if you’re willing to take all the credit when things go well. I think that’s the issue.
KS: That’s exactly right, Kurtis.
I’ve heard you say it. I basically just said whatever you’re saying.
KS: I say it. Yes. I always quote the “great power, great responsibility” quote.
LG: Spider-Man or Voltaire?
KS: It’s Voltaire, and I said that to Sundar Pichai, who was inaccurate about that. I said, “You might want to try looking it up on Google,” and it’s true. It’s Voltaire. Many people have said that Winston Churchill did a version of it. Lots of people did, but it was originally Voltaire.
LG: I think that you should tweak it a little bit for your mayoral campaign.
KS: Yes, that’s true.
With great power comes great responsibility.
KS: How about with great power give me all the responsibility? How about that? Give me all the power and the responsibility. All right. I will run this city like a ... It would be fascism, but yes, that’s true.
All right. Chris Davies, @C_Davis: “What if you’re someone who uses Facebook for the news? Is there going to be a way to say ‘give me more of this and fewer updates from my distant aunt?’” I would agree. It’s called Twitter, Chris, but go ahead.
LG: Another reader just wrote in, David Lensly, who says he’s been trying to make Most Recent the default view for his News Feed. No matter how many times he chooses that option, it goes back to Top Stories. Can you tweak the News Feed?
You can. You can, actually, but it takes a manual effort. You could go to all the pages that you really enjoy. In this case, I would imagine publishers, since this person wants to hear news. If you go to their page, you can set them as what’s called “See first.” Any time they post, it will show up at the top of your feed before the algorithm kicks in. You can do that, I believe, with as many pages as you want. You could go to the Times, to Recode, to your favorite brand or business, hit “See first” on there first, and then every time they post, at the very least it will show up at the top.
You get to see those posts before you get to the algorithmic stuff. It’s a hassle because you got to go do it manually on each page, but it is possible.
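Kurt’s “See first” description boils down to a two-tier ordering: pinned pages go on top, and the algorithmically ranked remainder follows. Here is a toy sketch of that logic in Python — the post fields and the newest-first stand-in for Facebook’s real ranking are invented for illustration:

```python
def order_feed(posts, see_first_pages):
    """Return posts with 'See first' pages pinned above the ranked rest."""
    pinned = [p for p in posts if p["page"] in see_first_pages]
    rest = [p for p in posts if p["page"] not in see_first_pages]
    # Stand-in for Facebook's real ranking algorithm: newest first.
    pinned.sort(key=lambda p: p["timestamp"], reverse=True)
    rest.sort(key=lambda p: p["timestamp"], reverse=True)
    return pinned + rest

posts = [
    {"page": "Aunt Carol", "timestamp": 3},
    {"page": "Recode", "timestamp": 1},
    {"page": "The Times", "timestamp": 2},
]
feed = order_feed(posts, see_first_pages={"Recode", "The Times"})
print([p["page"] for p in feed])  # pinned publishers appear before Aunt Carol
```

Even though Aunt Carol’s post is the most recent, the two “See first” pages jump ahead of it, which is the behavior Kurt describes.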
LG: I learned something new today.
There you go.
LG: I didn’t know you could do that.
Yup. I’ll show you.
KS: It should be simple, though. Next question, Lauren.
LG: Next question. These are questions about local news. Dazed asked, “What would it mean if this actually trickled down to local media outlets — such as some I have here, like the Province and the Vancouver Sun — being featured prominently, making Facebook more like a local town hall?” Someone else asked, “So if Facebook is not a media company but would rank media companies in our feed, preferring old, well-known publishers over new niche ones, how will local or niche news survive?”
KS: “How will we avoid the information bubble?”
It’s funny that you asked that, because they just announced the local news initiative about two days before this initial revamp of their News Feed algorithm. A quick summary: They’re going to have a local section. For your city, you would be able to go in there, see local news stories, see events, see local groups, whatever it may be. Kind of imagine a Facebook News Feed, perhaps, but tailored to your specific city. The challenge is, are people A) going to find it and B) actually spend time there? Because right now you go into Facebook, you’re immediately sucked into the main News Feed. If you go to their menu, there’s about 30 different options.
I’ve been writing about Facebook for five years and I could probably name eight of the 30. I mean, it’s a total overload of different kinds of sections of the app that you could find yourself in. One of those 30 will be local. It’s just a test right now, but they want to make that thing a reality. I think the question is, will people A) use it and B) be able to find it.
KS: They could just serve it up to you. Google does that all the time. Google knows exactly where I am if I’m signed in. It serves up very pertinent information all the time.
It could, right?
LG: Facebook does that too. I notice when I travel, I’ll be on the road for work and then let’s say I’m in Arizona or wherever and then all of a sudden the Events Feed on the right-hand sidebar changes into local events that I might want to go to.
KS: It could just ask you. “Do you want information about Arizona politics” or whatever?
It might, right? At this point, I mean, they literally announced it three weeks ago. It’s brand new. It’s a total test. Let’s fast-forward three years. Maybe we’re all going to be living in our own little local news feed bubbles. I doubt that, but there’s a possibility, right? It is possible.
KS: I’m tired of the bubble thing, Kurt. The bubble thing. We’ve been in bubbles since cable. By the way, cable was ... Ever since we didn’t have three national network shows that everybody watched, we’ve been in a bubble. I don’t know if that’s ... I don’t know. I’m just like this bubble thing is ... I don’t care about the bubble.
KS: I’m sorry. People always self-select, unfortunately.
All right. Next question. Ken Haggerty. “Hi, Lauren.” Lauren?
LG: He just said hi to me. That’s it.
KS: Hi, Lauren.
LG: Sorry, Kara.
KS: “What do you think it will take to get Facebook beyond the News Feed?” It sounds like a show, “Beyond the News Feed.” The slot machine of interfaces. What do you think, Kurt?
What will it take to get Facebook ...
KS: To get beyond the News Feed, because that was such a big deal. That was such a big deal for Facebook’s growth, the News Feed.
Oh yeah. It’s still their largest moneymaker, too. I mean News Feed’s not going anywhere, right? That’s where all the ads come from.
KS: It’s also what made the company.
Yeah. I honestly think Facebook’s trying to figure that out right now. If you look at the other parts of their business, right? I mean, granted Lauren mentioned Instagram. Instagram is kind of a new version of News Feed. It’s a very different version of News Feed because it’s more just your photos and videos. It’s not so much news. There’s Instagram. There’s WhatsApp, which we talked about briefly. They have Messenger, which is a separate messaging app. They have Oculus, which is VR. I think very much they are aware that the vast majority of their business relies on one single product that is now currently under attack. I think that they want to find an alternative.
I’m not saying that they want everyone away from News Feed. I think that’s not the case. But I do think that they’re aware of ... we need to figure out something else here in case this trend continues where people feel that News Feed is a troublesome way to spend your time.
LG: Next question is from KevinIto920. “Do you believe Facebook will face (no pun intended) similar scrutiny as Apple over having a consumer-facing product that some may consider addicting?” Yeah, that’s a really good question.
By the way, I did this interesting video for The Verge a few weeks ago now where I spoke to a few different researchers and scientists about smartphone addiction. One of the things that a researcher I spoke to said, Larry Rosen, who wrote a great book called ... He co-wrote a book called “The Distracted Mind,” said that most people are not actually addicted. They might be obsessed.
Addiction is a very, very strong word and it’s just being used a lot lately to describe the way we use these platforms and applications. Granted, some people probably are truly addicted, but it’s probably a really small percentage of people.
KS: It’s going to be a big deal this year.
LG: It’s like a compulsion in this strange way, and it depends on the kind of kick you’re getting from it, too. Sorry. I just went off on a little tangent. Some people consider this addicting, right? Some people have said smartphones are addicting, right? Apple’s come under fire for that. Do you think Facebook is really going to face some type of serious repercussion over all this?
Yeah. I mean, I think they are under fire for it in the same way already. The question is what does that actually mean when someone feels that they’re addicted to Facebook, or a family member is addicted to Facebook. Up until now that hasn’t actually resulted in anything negative for Facebook other than talking about it, and it’s not like people are using it less.
I use my iPhone 24/7 probably, right? Yes, maybe I’m obsessed or addicted or whatever and I can complain about it and I can point to Apple and say, “This is your fault,” but really my behavior is not changing. I don’t think that Facebook is going to necessarily suffer from this.
You can see that they’re trying to get ahead of this a little bit. They had a ton of blog posts — shocker — at the end of last year where they said, “Scrolling aimlessly through your News Feed could be bad for your health. It could be bad for your well-being.” They kind of used that research as part of the reason for these changes we’ve been talking about all day, saying we want people to engage more with other people, not just aimlessly scroll, because we think it’s better for your health. They’re trying to get ahead of it. They’re trying to say, “Hey, we know that some people don’t always get the best out of Facebook, and we’re going to change that.” I don’t see it changing the bottom line.
KS: Again, another issue of responsibility. They’re not the cigarette companies, but I have to say, whatever it is — Twitter, whatever — there are addictive qualities to it that they encourage. You don’t die having lung cancer from it, but it’s definitely going to be ... I think it’s going to be a much bigger issue. Because I’m not a smoker or drinker, but I definitely have ... There’s something going on that you can’t get away from it that I think is ... I take a lot of responsibility for myself, but there’s something they’re doing that is ...
LG: A lot of these apps are designed with kind of a rewards-based system in them. They’re intermittent rewards, so you don’t know when you’re going to get the “Like” or the heart or whatever it might be. You check constantly because it might not be every time, but it might be.
KS: They hire people to do this. They hire psychologists. If someone really wanted to sue, they have so much proof of people trying to get you to ... It’s like a slot machine. It is.
Notifications in and of themselves, right?
KS: They have a thousand engineers talking about it.
Do I need to get a push to my home screen every time someone likes my ... Of course not, right, but they do it and I open the app more and there you go.
LG: I mean, Snapchat just like overtly has something called streaks. Just keep up the streak. Just keep it ... Just keep going.
People love it. People love it.
KS: Well, why wouldn’t you? You’re like a mouse with a ...
It’s a game.
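What Lauren is describing is the classic variable-ratio reward schedule from behavioral psychology — the same mechanism behind slot machines. A minimal simulation (the 30 percent hit rate is an invented number, not anything Facebook has disclosed) shows the dynamic: any given check might pay off, so every check feels worth making:

```python
import random

random.seed(42)  # fixed seed so the simulation is repeatable

def check_app(hit_chance=0.3):
    """One app check: sometimes there's a new 'Like,' sometimes nothing."""
    return random.random() < hit_chance

# Simulate a day of compulsive checking. Rewards arrive on an
# unpredictable (variable-ratio) schedule, so no single check can be
# ruled out in advance -- the slot-machine dynamic Kara describes.
checks = 50
rewards = sum(check_app() for _ in range(checks))
print(f"{rewards} rewarding checks out of {checks}")
```

Because the payoff is unpredictable rather than guaranteed, the behavior is much harder to extinguish than it would be if every check (or no check) produced a “Like.”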
KS: It’s a game. All right. Lauren’s got to go soon, and we have so much to talk about with Facebook. It’s the cause of all our problems in the world, apparently. “How much of the ‘Facebook problem’” — this is in quotes, this is from Liz Weeks — “is a result of their algorithm versus the lack of internet literacy? For example, it’s one thing if an algorithm catapults truly false news into view, but our democracy is predicated on rational founder’s language, human beings using their faculties, to think about information put in front of them. I’m not convinced, despite my frustration with Facebook’s lack of responsibility, that this is wholly on them or that it’s productive to act like solving Facebook’s ...” I agree with this. I think Liz is being very rational here.
LG: By the way, Facebook and others have been involved in the News Literacy Project, which our friend Walt Mossberg is now sitting on the board of. They are starting to do things to address media literacy.
KS: Very important.
LG: Especially among young people, it’s hugely important.
KS: I was just talking to someone at Google about this.
We said this at the very beginning with the ranking thing, right? I mean, do you want regular internet users ranking your news sources from most trustworthy to least trustworthy? I personally don’t, because I don’t trust people to make that decision. I think that’s the crux of this question.
KS: It is citizens’ responsibility.
People are sharing this stuff, right? Facebook gave them the tool to do so, but they’re still making an active decision to share it and spread it. That’s on human beings as well.
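Kurt’s worry about letting users rank outlets can be made concrete: a crowd-sourced trust score is easy to swing. In this toy aggregation (the 1-to-5 rating scheme is hypothetical, not Facebook’s actual survey), a small coordinated group of hostile raters drags a well-regarded outlet’s score down sharply:

```python
from statistics import mean

def trust_score(ratings):
    """Crowd-sourced trust: mean of 1-5 user ratings (hypothetical scheme)."""
    return round(mean(ratings), 2)

organic = [5, 4, 5, 4, 4]        # genuine readers rating an outlet
brigaded = organic + [1] * 5     # plus a coordinated down-rating campaign

print(trust_score(organic))   # 4.4
print(trust_score(brigaded))  # 2.7
```

Five hostile ratings cut the outlet’s score from 4.4 to 2.7 — which is exactly why handing the ranking to “regular internet users” worries Kurt.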
KS: Human beings. That’s the problem. We got to get robots to replace everybody.
Man, I know. We stink.
KS: Yeah, we stink. We stink, but it’s really ... Listen, Facebook ... We also know that people can be awful. It’s not all their problem. The way people used to complain about TV many years ago making us ... idiot boxes. I don’t know. You guys don’t remember, but I do. It was people ... that was the problem of staring at those things.
KS: All right. We got two more quick questions. Go ahead, Lauren. Next one.
LG: It’s this big wooden box. You couldn’t record anything.
Okay. This is from @TechieShark: “Will Facebook ever have something like Google’s ad grant platform to support nonprofits? It would be great to see more voices like that and less like Russian-funded election ads.” I don’t know what the ad grant program is, admittedly, but Facebook does have an entire department that’s Facebook For Good, right?
Yeah. They have a social good team. It’s led by an executive who’s been there a really long time.
KS: I think Sheryl was the one that pioneered it at Google. This is something right up her alley. Sheryl Sandberg. I’m pretty sure she did.
Facebook does quite a bit of stuff that probably doesn’t get a lot of attention around nonprofits, donations, safety of use. I mean, they get a bad rap, but they’re pretty responsible for most of what they do. I don’t know that exact program either, but I’m sure they do something.
KS: I’m pretty certain Sheryl Sandberg was very much responsible with the Google one and she would carry it right over to Facebook. It’s really right up her alley and that kind of stuff. All those companies have done that for a long time. I think what happens is that, one, “nonprofit” is not quite as exciting as the others.
KS: Okay. Last question. David Glen Walker: “How does Facebook plan to increase trust in its brand and social networking as a whole? According to recent figures, only one in four people trust social networks.” Great. They’re just like the media. Kurt, what would you do if you were running Facebook? That’s our last question.
Well, I think I’d take my CEO and I’d go all around the country having chats with small-town business owners ... [coughs]. No. It’s so hard, right, because once you lose trust, it’s almost impossible to win it back. You could be trustworthy for 10 years and you burn someone once and all of a sudden that’s what they’re going to remember. I think this is a really, really tough challenge for Facebook. It’s going to be an uphill battle. I think we’re going to have to have a few more elections that don’t result in Russian meddling before people say, “Oh okay. Maybe Facebook has that problem fixed.” That is years away.
I don’t see Facebook shedding this problem for a long time. I do think that people forget and move on very quickly. With this issue so far, that hasn’t been the case despite everything happening with the president and the White House. I think a reputation is hard to shake. I don’t think this is going anywhere.
KS: It is. They will carry it with them. We all carry all our mistakes on our back. You young people. Just so you know. They’re scars. They’re scars.
That’s why I don’t make mistakes.
KS: Oh, Kurt.
It’s my simple solution.
LG: Kurt, practice now. “Mistakes were made.”
KS: You will make a perfect internet CEO.
“I’d like to apologize for other people’s behavior.”
KS: Wouldn’t it be shocking if he just goes, “Yeah, we did it. Whoops. Our bad.”
“Whoops. What are you going to do about it?”
KS: No, not that. No. No. Kurt, no. That’s when you stop. You stop right there. All right. We’re so sorry. We’re so sorry. So sorry.
LG: What if Facebook, what if social networks just like all went dark for a day? What if they had like a day of darkness once a year and they were just trying to see ...
A day of darkness.
KS: That was my thing. Close down their ad thing for a week and take the hit. You know how Bezos is doing this headquarters thing? It really has taken focus away from him killing retail. You know what I mean? It sounds crazy, but a little bit of goodwill kind of thing. If they closed down Facebook — not Facebook, the ad platform — for a week, for example, it would have been quite the wow. All the press, like the lemmings that we are, would have gone, “Oh wow, look what they did.” I don’t know.
LG: I just mean the whole thing just being down for a day.
KS: You just want that. You wanted to go back.
LG: Here’s the thing, it is good for ...
KS: That’s called 1991, just so you know.
LG: I was around then, Kara. I was around and I remember life pre-Facebook. I should say I do have a public page on Facebook, my journalist page, but on my personal page I find that most of the time now I’m mostly engaging in groups and community stuff, which is actually what Mark Zuckerberg was talking about. We have a family vacation group.
KS: Your knitting club, for example.
LG: My cat lady club.
KS: Sorry to miss that group.
LG: I don’t have a cat lady club. I just like them all on Instagram.
KS: All right. This is an ongoing issue. Facebook is still so important. Then someday when it’s not, we’ll be like, “Remember Facebook?” — the way they used to be like AOL. That kind of thing. That’s the one thing we understand about these things. Every company has to really do a good job to hold on to their jiminy. Anyway, this has been another great episode of Too Embarrassed to Ask. Kurt, Philip, thanks.
LG: I need to hear the whole story.
KS: Someone ran into someone — him — someone who works for Recode, and thought they were a totally other person.
LG: Someone who works for Recode?
Yeah. We just made nicknames for each other.
LG: Is there a Philip? There’s no Philip on staff.
KS: I wanted to call Kurt Barbara, but he didn’t like that.
This article originally appeared on Recode.net.