
We should opt into data tracking, not out of it, says DuckDuckGo CEO Gabe Weinberg

On the latest Recode Decode with Kara Swisher, Weinberg explains why it’s time for Congress to step in and make “do not track” the norm.


DuckDuckGo CEO Gabe Weinberg talks with Recode’s Kara Swisher at a live Recode Decode taping in New York City on May 16, 2019.
Keith MacDonald for Vox Media

People don’t realize just how much they’re being tracked online, says DuckDuckGo CEO Gabe Weinberg — but he’s confident that once they learn how much tech companies like Google and Facebook are quietly slurping up their private data, they will demand a change.

“They’re getting purchase history, location history, browsing history, search history,” Weinberg said on the latest episode of Recode Decode with Kara Swisher. “And then when you go to, now, a website that has advertising from one of these networks, there’s a real-time bidding against you, as a person. There’s an auction to sell you an ad based on all this creepy information you didn’t even realize people captured.”

DuckDuckGo offers a privacy-minded search engine that has about 1 percent of the search market share in the US (Google’s share is more than 88 percent), as well as a free browser extension for Firefox and Google Chrome that blocks ad networks from tracking you. But rather than waiting for a comprehensive privacy bill to lurch through Congress over many years, Weinberg has proposed a small, simple tweak to US regulations that might help: Make not being tracked by those networks the default, rather than something you have to opt into.

“The fact that consumers have already adopted it and it’s in the browser is just an amazing legislative opportunity, just give it teeth,” he said. “It’s actually a better mechanism for privacy laws because once you have this setting and it works, you don’t have to deal with all the popups anymore. You just set it once, and then sites can’t track you.”

You can listen to Recode Decode wherever you get your podcasts, including Apple Podcasts, Spotify, Google Podcasts, Pocket Casts, and Overcast.

Below, we’ve shared a lightly edited full transcript of Kara’s conversation with Gabe.

Kara Swisher: We’re going to very quickly bring up Gabe, come on up, Gabe Weinberg. He’s from DuckDuckGo. We’re going to talk about the awfulness of Google now.

Gabe Weinberg: Hello.

This is Gabe. He’s the founder and CEO of DuckDuckGo. This is another search engine, you do have a choice in this world, whether you know it or not.

Yes, you do.

All right. Gabe, we were just talking about inequity and wealth and stuff like that. Let’s talk about inequity of information, because that’s really what’s happened. We have given over control of our information to one company, really, in this world. I’m going to tell one quick story before you start, so you get a sense of it.

I was walking, in the early days of Google, with Larry or Sergey, I often can’t tell them apart, and there was a room full of televisions, like a Circuit City. I looked in and they were all on, all these dozens of televisions. I said, “What are you doing?” I think it was Larry who said, “We’re recording all of television.” I was like, “Why?” He said, “So we can figure out a way to search it.”

I said, “Have you gotten the copyright from those people to do that, have you actually reached out?” He said, “Why should we do that, why do we need to do that?” I said, “Well, because if you record it, then you’ll have it recorded, then you’ll have the search for it, and then nobody else can do it, and then you’ll dominate it.” He was like, “Uh-huh.” I was like, “That’s wrong to do that.” He was like, “Okay,” and we moved on. It was a really interesting moment for me that they really were super interested in owning every piece of information on the planet. It was a revelatory moment for me.

Talk about what we have now. Talk a little bit about DuckDuckGo first, about what you’re trying to do, because you have been a search engine for how long?

Eleven years.

Right. Talk about what you do precisely and how it contrasts with Google.

Yeah. DuckDuckGo is a general internet privacy company at this point, and we help you essentially escape the creepiness and tracking on the internet. We’ve been running this non-tracking search engine alternative to Google for 11 years. We’re doing about a billion searches a month, it’s about 1 percent of market share in the US now, fourth-largest search engine.

Then we also operate a mobile browser and browser extensions for Chrome and Firefox that block trackers across the internet. As you’re expressing, Google and Facebook are the largest purveyors of these trackers, but we block trackers from hundreds of companies, and then we also enable more encryption on the internet. When you go to a website, there’s an HTTP version and an HTTPS version. Sometimes sites have both the unencrypted version and the encrypted version, but they send you to the unencrypted one, so we force you to go to the ...

That’s in order to follow you.

Yeah. We force you to go to the encrypted version, which helps stop your ISP from tracking you. It’s all one download that just helps you escape tracking on the internet.
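The HTTPS upgrading described here is, mechanically, just a rewrite of a URL’s scheme before the request goes out, for sites known to serve an encrypted version. A minimal sketch in Python (the known-hosts set and function name are illustrative, not DuckDuckGo’s actual ruleset):

```python
# Illustrative "HTTPS upgrading": rewrite a URL's scheme to https
# before the request is made, for hosts known to support encryption.
# KNOWN_HTTPS_HOSTS is a stand-in for a real upgrade ruleset.
from urllib.parse import urlsplit, urlunsplit

KNOWN_HTTPS_HOSTS = {"example.com", "duckduckgo.com"}

def upgrade_to_https(url: str) -> str:
    parts = urlsplit(url)
    if parts.scheme == "http" and parts.hostname in KNOWN_HTTPS_HOSTS:
        return urlunsplit(("https",) + parts[1:])
    return url  # leave it alone if no encrypted version is known
```

A visit to http://example.com/page would be redirected to https://example.com/page, so the ISP sees only the domain, not the full path or content.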

Why did you decide to do this? Because a while ago, everyone was welcoming this idea of convenience. I think Margrethe Vestager said that to me, is that we’ve traded convenience for something better, and she was talking about search engines she uses in Europe that are different. Google’s convenient, Google has mail, Google has maps, Google has this. I’m using that because you’re in the search business. Why did you decide to do this in the first place?

Well, I have a tech policy background, my graduate degree is in tech policy from MIT, and I originally got into search because I was interested in search, but immediately after launching it, I started getting privacy questions. This was back in 2008, well before Snowden. So, I did my own investigation and found two really interesting things. One, searches are essentially the most private thing on the internet. You just type in all your deepest, darkest secrets ...

Right, absolutely.

... and search, right? The second thing is, you don’t need to actually track people to make money on search. Google, still to this day, makes most of its money the same way DuckDuckGo makes money: contextual advertising, nothing to do with following you around. There’s been congressional testimony on this; at the hearing I was at, the representative from Google said it himself. It’s keywords on the search page. You type in “car,” you get a car ad.

And you can do that without tracking anybody, so what I realized pretty quickly is that’s a better user experience, and I just made the decision not to track people. At the beginning, that wasn’t the main differentiator, because we weren’t as aware of all the privacy harms, but as time has gone on, it’s become the main driver for people to adopt DuckDuckGo.

To adopt that. Talk about contextual advertising, because this is just a basic business, you type whatever. If you think about Google or what you are, search is like a database of human intentions, right?

Yes. There’s a great history of advertising here, and I think it helps to explain the current market. It used to be the case, as you’re saying, from when the internet started all the way up until the mid-2000s, that contextual advertising was basically all advertising. On search, it was just the context of the page, but it was also the case on publisher sites.

I’m sure you remember, sites used to sell their own ads, they would put advertising based on the content of the article. And then in the mid-2000s, it switched to behavioral advertising, which is the creepy ads, the ones that kind of follow you around the internet. Two companies dominate that because they have all the data on people: Google and Facebook. But there’s been no real proof that it’s actually better, and it’s arguably way worse for the publishers, who have ceded all sorts of money to Facebook and ...

Talk about that shift to behavioral advertising, because when you did it ... Years ago, at Google, also there used to be a ticker that used to run across when you entered Google about what people were searching for at that moment, and they stripped it of dirty stuff, which was quite a bit, apparently. But you would see things like “horses jam French,” and you’d be like, “What the fuck is that person searching for?” I would sit there and I’m like, “I don’t even understand that query.” Talk about how it shifted from that to the idea of behavioral. Why go there?

What happened is you had publishers selling the biggest inventory, the top of the page, banner advertising, and then they still wanted to make some more money, so they ceded the bottom of the page to ad networks for the inventory they didn’t want to sell themselves. Google’s was the biggest, AdSense, based on a company they actually acquired.

Called DoubleClick.

Before that.

Before DoubleClick.

Yeah, the one before that, and then they acquired DoubleClick in 2007. Slowly, publishers ceded some of their page over to it, and then Google ended up having all this behavioral data, so if you searched for something, they could then follow you around with that search. Those advertisements became more lucrative, and then slowly, publishers ceded most of their page over to this behavioral advertising. However, my proposition is that there’s been no innovation in contextual advertising in the last 10 years, and it may be just as lucrative.

You can imagine videos, articles, similarly parsing out what the real content of it is and putting ads just based on that, not based on you. There’s been a little evidence of this after GDPR. New York Times, for example, got rid of all the behavioral advertising in Europe and saw an increase in revenue. Now, partly that’s because they just got rid of the middleman, but partly that’s because it actually is useful advertising. Like, you write an article on airplanes, and you have an airplane ad.
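Contextual matching of the kind Weinberg describes can be as simple as scoring keyword overlap between the words on a page (or in a query) and each ad’s keywords, with no user profile involved. A toy sketch (the ads and keywords are made up for illustration):

```python
# Toy contextual ad selection: pick the ad whose keywords best
# overlap the words on the page or in the search query. No user
# profile is consulted at any point.
ADS = {
    "car ad": {"car", "sedan", "dealership"},
    "flight ad": {"flight", "airline", "airplane"},
}

def pick_ad(text: str) -> str:
    words = set(text.lower().split())
    scored = {name: len(words & keywords) for name, keywords in ADS.items()}
    best = max(scored, key=scored.get)
    return best if scored[best] > 0 else "house ad"  # fallback ad
```

So an article about airplanes gets the airplane ad from its own words alone, which is the whole point: the ad follows the content, not the reader.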

Right. Why the shift then, to behavioral? Because they just decided to do it?

Yeah. It’s strategic for their companies, because if you think about their position, they’re the ones with the data monopolies, so if they are going to run an ad network, they should run it based on behavioral because no one can compete with them on that. The real reason why that’s enabled is because there’s been no real regulation in tech that would have prevented that.

No, there’s zero regulation, right.

There’s the one from 230, but that’s it.

Right. Talk about 230. Do people know what Section 230 is? For those who don’t, it’s part of the Communications Decency Act, which I wrote about for the Washington Post 109 years ago. It was an act they put through, most of which was later struck down as unconstitutional. It was started by people who were worried about dirty stuff on the internet, essentially.

Within the Communications Decency Act was Section 230, which gave internet companies broad immunity from anything that happened on their platform, and it was designed so that these companies would be able to grow and not be sued to death, essentially, and their businesses would be able to ... They were small businesses at the time, and it continues to protect big companies like Google and Facebook [and] YouTube.

And when you mention the idea of ... There’s been some eking away at 230 around sex trafficking and some other things, but essentially, if you mention the idea of removing 230 to internet companies, they start immediately vomiting on their shoes because it would mean they would be subject to legal attacks, which would be unprecedented, presumably. Correct?

Yeah. I think it’s the specter of any regulation, right?


But the idea that there would be no regulation on digital forever is ridiculous. I mean, every other area of technology has regulation to pull back some of these externalities. So, I think it’s inevitable, just a matter of what it looks like and when.

All right. They moved to behavioral advertising. I want to get to that legislation in a second, because you were proposing your own legislation, correct? They get this behavioral information. Talk a little bit about what it does, because people are very unclear about what happens when this occurs.

Essentially, these trackers exist across the web. When you go to a website like Recode, there are trackers hiding behind it. You think you’re just interacting with the site you’re on, but really, there are companies like Google and Facebook and many others slurping up your information and your browsing history. Through these various mechanisms, they’re getting purchase history, location history, browsing history, search history, and when you add all that together ...

And made more important by mobile because ...

Yeah, exactly. It’s harder to block stuff on mobile, you get location, much more granular. Some of these are sending like hundreds of data points a day, so you get a really robust profile of you, and then when you go to, now, a website that has advertising from one of these networks, there’s a real-time bidding against you, as a person. There’s an auction to sell you an ad based on all this creepy information you didn’t even realize people captured. I think that’s what people are finally starting to realize, and they’ve been going on the idea that they were just interacting with this one website, but once they find out that all this tracking is going on, they become incensed, effectively.
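The real-time bidding described above can be pictured as a sealed-bid auction that runs in the milliseconds before an ad loads: each network values the impression using the profile it holds on the visitor. A toy sketch (the bidders, profile fields, and second-price rule are illustrative simplifications, not any network’s actual logic):

```python
# Toy real-time bidding: each ad network bids on an impression using
# the profile it has compiled on the visitor; the highest bid wins
# and, in the common second-price design, pays the runner-up's bid.
def run_auction(profile: dict, bidders: dict) -> tuple:
    bids = {name: bid_fn(profile) for name, bid_fn in bidders.items()}
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner = ranked[0][0]
    clearing_price = ranked[1][1] if len(ranked) > 1 else ranked[0][1]
    return winner, clearing_price

# Illustrative bidders valuing the same visitor differently: the
# network that knows this person searched for cars bids much higher.
bidders = {
    "network_a": lambda p: 0.50 + (1.00 if "cars" in p["searches"] else 0.00),
    "network_b": lambda p: 0.80,
}
winner, price = run_auction({"searches": ["cars", "loans"]}, bidders)
```

The network holding the richer profile outbids the rest, which is why the profiles themselves are the competitive advantage Weinberg keeps pointing at.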

Do you think people are actually mad?

I do, yeah.

Explain that to me, because a lot of the tech companies, they don’t care about this, they don’t care about privacy. I just interviewed Scott McNealy recently, who’s a ...

He’s a famous one, yeah.

He’s famous for saying ...

For “privacy is dead.” No, “Get over privacy, you have none of it?”

“You have no privacy, get over it.”


Privacy is dead, get over it.

We track this very closely, we do national surveys, and it’s increased again and again: once people understand what’s going on, they want to take action. There is a setting, which our legislation is based on, called “do not track.” In most browsers, if you delve into the privacy settings, there’s something that says “do not track,” and by our measurements (and not just ours, because Gizmodo measured their sites too), between 10 and 25 percent of people, depending on what you look at, have enabled that setting. People say, “No one ever goes into settings and looks at privacy.” That’s not true. Literally, tens of millions of Americans have gone into their browser settings and checked this thing.

So, people do care, and that has just climbed up and up and up as the knowledge of the tracking has gone on. Because before you knew about it, you were okay with it because you didn’t realize it was so invasive, but after Cambridge Analytica and all the stories about the tracking, that number just keeps going up and up and up.

One of the things a lot of people do bring up with me still, though, is, “Well, I don’t really care. I don’t have much to hide. It doesn’t matter.” I get that all the time. Like, who cares if they know if I went to Best Buy and bought a, whatever I bought. Talk to why that might be not the best way to think about it.

There’s two answers to that. One is philosophical, in that privacy is a fundamental human right, and so you don’t need to care or hide anything to exercise your rights. You wouldn’t say that for speech. Just because you have nothing to say doesn’t mean you should never have free speech. That’s kind of on the philosophical side.

On the harm side, there are some that people don’t realize. A lot of people really don’t like the creepy ads following them around. Some people seem to be fine with that. At a deeper level, there’s this thing called the filter bubble, which is that recommendation algorithms, and in particular, search results, are tailored to you, and that means that you’re not seeing what everyone else is seeing, and that actually distorts the democracy. That’s a real harm to individual people and society.

Then there’s just the general identity theft and data breaches, which is happening over and over again, which is one of the main drivers for adopting stuff like DuckDuckGo.

You’ve created legislation that you would like someone ...

Model legislation, yeah.

... that you would like someone to submit to Congress. Explain what you want to do.

There’s a whole landscape of privacy legislation, and I would love to get into the nuance of it ...

Now, just to understand, we do not have a national privacy bill in this country, at all. Other countries do, and much more stringent, like GDPR in Europe, but we don’t have one. We have one in California that’s ...


... about to come online, but the lobbyists are trying to de-fang it rather substantively. There are 10 others in states across the country. There are certain states that are doing that, I think Louisiana has one. There’s all kinds of states.

Vermont has a data broker law. A lot ...

Let’s hope Alabama doesn’t have one, but go ahead. Sorry, but Jesus Christ. All right. The idea is there’s too much of a patchwork of them across the country, correct? There’s not a national bill.

There’s no national bill. We really should have a national bill like GDPR, but one of the problems with GDPR is that it’s been operationalized through a lot of consent dialogs, that’s called notice and consent, and people just end up clicking the consent.

We think a better mechanism is this thing that I was talking about earlier, this “do not track” setting in the browser that tens of millions of people already put on, and the fact that consumers have already adopted it and it’s in the browser is just an amazing legislative opportunity, just give it teeth. It’s actually a better mechanism for privacy laws because once you have this setting and it works, you don’t have to deal with all the popups anymore. You just set it once, and then sites can’t track you.
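The “do not track” setting referred to here is transmitted with every web request as a standard HTTP header, DNT: 1. A minimal sketch of how a site could honor it server-side, assuming a plain dictionary of request headers:

```python
# When the browser's "do not track" setting is on, every request
# carries the header "DNT: 1". A site that honors it can simply
# skip loading its trackers for those visitors.
def should_track(request_headers: dict) -> bool:
    # Header names are case-insensitive in HTTP; normalize first.
    headers = {k.lower(): v for k, v in request_headers.items()}
    return headers.get("dnt") != "1"

assert should_track({"DNT": "1"}) is False        # user opted out
assert should_track({"User-Agent": "x"}) is True  # no preference sent
```

The point of the proposal is exactly that this check is already trivial for sites to implement; the law would just make honoring it mandatory.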

Should it be set from the beginning as “do not track?”

This is a debate, whether it should be default on or default off. I would love it to be default ... have to opt out of it, but I’d be happy if it was opt in as well, because I think people will ...

Oh, people keep tripping here.


... the second trip. Hello, sorry. Okay. It was the same guy, okay. It’s the same guy tripping, all right. Now he’s just fucking with me. All right, go ahead, sorry.

No worries. I think people will opt in ...

He’s Google. I don’t care.

I think people will opt in if they have the opportunity, and they can opt out of this tracking. What we’re hoping is, there’s two things. One, as you said, California’s getting de-fanged, the California bill. As that happens, the pressure for a federal bill is going down this year because if it’s all de-fanged, there’s no pressure to pass something that preempts it.

But still, all the people in the country really want something passed, so one option here is this would be a much simpler thing to pass. Just give a “do not track” mandate for no tracking for that setting. Much easier than comprehensive legislation.

The other thing is that any comprehensive legislation that gets passed, do not track can be the mechanism. So, we’re hoping that it gets added to any larger bill as the mechanism to help people opt out.

Why do people have to opt out? Why do they have to opt into it, I guess, opt into it, because you don’t have to opt into clean water, you don’t have to opt into, like, “I think I’d like my water clean or dirty.” It’s kind of crazy, the stuff that consumers have to do in order to protect themselves compared to almost any other thing they use. You don’t, again, opt in, “I’d like the tires that don’t fall off,” opt in, “I’d like the food that isn’t tainted, please.” Why is that mentality around?

I think the mentality is around because of lobbying arguing that it would distort the advertising business model. I would love it to be opt in by default, but realistically, I think if it was operationalized as a way to opt out, that would be effective, because, as you said before, the other argument is that some people don’t care, and this really gives people the choice.

That they just could leave it there?


How is it, operating against a site like Google or a big site where you’re 1 percent? How do you do that?

It’s been interesting. We try to educate people about privacy and that there are alternatives. Our main issue is just not everybody knows about us. There’s 20 percent of people that we think would be interested in switching to DuckDuckGo, but it’s hard to convey all these privacy concepts.

I’ll give you an example. We’ve been talking about the filter bubble for years. In 2012, we ran a study on Google that we think influenced the 2012 election, that’s how long ago it was, but nobody ... we had to speak for 10 minutes to explain what the filter bubble was back then. But after 2016, in the last two years, now we can talk about the filter bubble, just name it and people know what it is, generally. How many people know what the filter bubble is, I’m just curious?

Explain the filter bubble.

Well (first of all, that percentage is very high, so I like that), it’s the idea that for search in particular, as an example, when you search, you expect to get the same results, right? If you searched for gun control or abortion, if we both searched at the same time right here, you would expect to get the same thing. But that’s actually not what we found when we did a study on Google.

Yes, there could be different search results.

Yeah, and people don’t realize that. So in addition, we found that it varies a lot by location, and so if you take that to the extreme, let’s say that voting districts are getting different results for candidates or issues, it can skew the polarization of that district very easily over time. Because people who are undecided are actually searching for these topics, and people generally click on the first link, and if you’re controlling that first link in that district, that’s what people are going to learn about.

So what do you ... Anand was just talking about the idea of people who shouldn’t have this kind of power having this kind of power. How do you look at it? Because it is a group of maybe 1,000 people making these decisions in Silicon Valley, pretty much, if that many.

Recently, I did an interview with Tristan Harris, and he called what we’re going into the “climate change of culture.” The only positive part about it is there are only 1,000 people whose minds you have to change on this. How do you look at that small amount of people making decisions for the entire world, really? What can we do about that?

Yeah, I look at it structurally, and I think that on the consumer side that we’ve talked about, there needs to be a way to opt out, which would lower the power, right? On the kind of structural business side, at the core of this problem, at least the one that we were talking about here, is data monopolies, right? It’s the collection of data profiles and there are ways to split those up.

So some people are talking about extreme measures, which I’d be in favor of, like spinning off companies, but there are other ways to do it, such as not allowing data to be shared between different business units. So if you’re on Instagram and you are browsing something, that data cannot be shared and used on Facebook.

Which was precisely the reason Facebook bought that company.



That’s why it should have been rejected in the first place. But at this point, one thing you could do is legislate, and part of our proposal is that if you go to that site, you can’t share data back to the other sites. So that would effectively add more competition and less power in these decisions.

Except they sit next to each other. They happen to be on that same ridiculous campus they have, they share the same kombucha stand, so it’s hard to ...

That’s why the argument for separating them more strictly is pretty solid.

What about opening up their data, that they have collected? Both companies, for example — because essentially we’re talking about Facebook and Google pretty much, and Amazon’s just starting to get into that business, the advertising business. It’s two companies right now, correct?

Yeah. In the digital advertising market, absolutely. The jury is kind of out on interoperability, which is what that is. It may help in the social network space; it’s less important in the search space because, as I was saying earlier, you actually don’t need these profiles to do good search results. They’re really using search data to do ads on YouTube and Gmail and all these other places. So it could help in social networks, to help someone start a social network and be able to switch all your friends over a little quicker, but it probably wouldn’t solve the Google problem.

There hasn’t been a major social network started since 2011. Which is amazing, if you think about that. There’s a reason, because why do it? And then, and for the rest of it, I think it’s just Snapchat, correct? [Evan Spiegel’s] the chief product officer for Facebook right now, as far as I can tell, because they steal all his ideas. So how do you then shift that? How do you stop that? Just separating them? Which of the ways you think should happen?

I really think you have to get to the core of this data monopoly. That’s the key, and I think part of the problem actually has been market definition: people think of it as a search market, a social network market. That’s what you saw in Congress, too, but it really is a digital advertising market; that’s where the domination is.


So you have to change the digital advertising market. You can do that by putting in more Chinese walls between the companies. You could do that by doing something like do not track, which would force companies to be more contextual. So all of a sudden you’d be steering the industry back to contextual from behavioral, because for the 25 percent of people with the setting on, advertising would have to be contextual; behavioral would be outlawed for them. So that would just break up the competitive advantage, basically, of the data profiles.

When you have behavioral, do you think that’s even possible? Because the idea, with AI, is adding more and more behavioral data in order to track you, and companies in countries like China and other places are using that behavioral data very ... they’re using a lot of it, including facial recognition. The companies here, I think, are just dying to get into facial recognition, and they know the controversies around it. So it doesn’t seem like they’re backing away from behavioral; they seem to be doubling down on behavioral.

That’s why I believe you need government regulation in it, and to your point, you can look at China as an example, and so not only should you push things back to contextual, but you probably should have some lines, bright lines, that you can’t cross in behavioral. Right? You saw San Francisco ban it the other day.

This week San Francisco banned facial recognition software being used in the city. Is that correct?

Yeah, that’s right. For government.

For government.

Another good example, which isn’t talked about that much, is political advertising on Facebook. So you know, there’s ...

Oh it’s talked about a lot, yeah.

Yeah, well not this part, there’s this disclosure, but arguably there should be a ban on behavioral advertising to some level of people, just blanket for political advertising.

To some level of people. What do you mean?

Like 1,000 people, or 10,000; you could pick your cutoff and then say you cannot target ads at a population less than that amount. Because what’s happening now is the behavioral advertising is targeting just you, right? Or a very small amount of people.

And it’s using, it’s manipulative in two ways. You can manipulate the targeting, so I can select just the three or five people that I think would totally be triggered by this. You could also do A/B testing and change words and images to get like the perfect manipulation. Tristan Harris has better words for this, I forget what he says, but the brainstem ...

Yes, that goes down the brainstem, yeah.

Right, and that kind of stuff should be, probably, outlawed. Especially for political advertising.

To individualized people?

Yeah, but it would be managed at some threshold that’s significantly high. They can make the argument, “Oh, this is just like TV, our disclosures are going to be like TV.” Well, you can’t target on TV (well, you might be able to eventually, which also should be outlawed), but on TV, you weren’t targeting down to the individual person.

Right, you were targeting great groups of people.


You also couldn’t just put ads up the way they do on Facebook.

Right, and do all this testing you couldn’t do, you can’t test 1,000 ads.

So do you think our regulators are up to this task? So you’re writing your own legislation, I’m assuming you’re running your company at the same time and you don’t, you’re not a legislator?

That’s true.

So how do you look at our legislators? Is there more strength elsewhere in the world or in this country? Let’s talk just about this country, because I think most people saw the Facebook hearings.


Mark looked great in that hearing, and he’s not the most articulate of people. It’s largely because most of the legislators looked so bad and so ignorant. So when you say Mark looked so good, it was because it was such a low, low bar that it was hard not to. I think they could have put a ham sandwich there, and it would have looked pretty good.

But that said, I have talked to legislators who are quite smart. There’s a lot of them in Washington, and especially in the regulatory agencies and elsewhere. What’s the problem from a regulatory ... is it the money? These companies are throwing it at lobbyists, at them, or is it just a lack of will, or is it ignorance? How does that change?

Right. So to answer your first question, there are very smart people there, and just like running a company, they hire staff. There are lots of senators who have hired really good tech staff, a handful of them, maybe five to 10, and they’re trying to write some of this legislation.

So I don’t think you need the whole Senate writing it, for example. So I’m actually pretty encouraged in that regard, the level of thought that’s been going into some of these proposals. I think that writing something like GDPR is complicated, Europe was doing it for 20 years, I mean, it was an update of a 1995 law and then they took five years to update it, basically.


So we’re just getting started this year, so there is some amount of time that it will take. That’s one reason we proposed this legislation, because you can do this right now. This is an easy thing you could do. Solving a lot of it all at once is difficult. I prefer to do it a little bit piecemeal as opposed to doing just one thing and then being done.

That’s the other thing I don’t really like about our government system is, we pass something and then we don’t touch it for 20 years, which is what happened to the CDA. It would be better if we pass something every year or two and kind of tweaked it.

So what do you imagine that they will do? You mentioned, of the many different things that are happening? Like right now the FTC is considering fining Facebook $5 billion, which I called a parking ticket.

Yes, I agree with you.

Thank you. I’m correct.

Need another zero.

Two zeros, actually, I’ve decided.

That would do it.

Two zeros would, yes, that would do it. But the concept is fining them. Taxing them is another way. Regulatory guardrails is another way, and breakup — antitrust action, presumably to break them up — and then not letting them buy anything.

Of those things, what do you imagine is the most effective right now?

I think not letting them buy things would be good.

Anything that gets at — if you’re talking about Google and Facebook and the digital advertising market — anything that gets at the data monopoly would be good. Of those, breaking up could do it, a Chinese wall could do it, do not track could do it. All of those would do something.

How much time do you imagine it will take? There is a techlash going on right now, largely because of the 2016 election and the idea that the Russians were customers of these companies. Do you imagine that will continue, or do you think it’s gonna peter out? And then next, how dangerous is it for the next elections that these continue to be issues? Do you think these companies have finally sort of come to Jesus and said, “Oh dear, we’ve made some errors here”?

I think you’re seeing that, because all of their announcements have just been centered around privacy, right? So they’re feeling this is a real thing at this point. Maybe Scott McNealy is still saying privacy is dead, but everybody else is now embracing at least the word “privacy,” whether they mean it or not.

I think you’re gonna see the techlash continue until people think there’s some meaningful change. I worry about the breakup because that takes a really long time, historically.

Some of these other things can be done much quicker, so I’m hoping that something will get done quicker. I’m a little saddened by what’s happened in California, because I think that’s slowing things down.

But what I haven’t heard anyone talk about, which is interesting, is that it got started by a ballot measure, for people who don’t know.

Everything in California gets started like that.

One person.

Sure, just so you know.

Yeah, one person did that.

There’s nothing to stop somebody from doing it again if it gets watered down.

That is, someone could put something else on the ballot.

Yeah, or just put out the same thing again.

You know, if it gets totally watered down, just do the same thing.

And yet we don’t have privacy in place.

So talk just very quickly, we just have a few more minutes. What are some of the things people should be doing, besides using DuckDuckGo, to protect themselves online, in the ways they don’t realize they’re being tracked — besides, please, please don’t buy one of their internet home devices.

Yeah, that would be good.

Yeah. My son goes around and unplugs them all in our house. Yeah, they don’t work. They make lovely paperweights, but go ahead.

Facebook is an interesting challenge. I haven’t been a Facebook user for a long time, and there are plenty of studies that show that’s a healthy choice.

But there aren’t great alternatives to it, and so that one, I think you should leave, and a lot of people have.

Just leave Facebook.

Just leave it, yeah. For Google, there are actually alternatives in every category. So we’re in search, but there are alternatives in email, alternatives in ...

Such as?

ProtonMail is a big one; we use FastMail at DuckDuckGo. And in docs, there are things like Zoho. They’re not all super private like us, but they’re often paid-but-cheap alternatives that put privacy first. So I would leave the services, because the idea that alternatives don’t exist is just nonsense.

Well, although it’s inconvenient, because on your phone or whatever you have, they all seamlessly work together.

You know, I’ve been out for a long time, and it’s totally productive. I mean, you just click on a different app on your phone. They’re on your home screen; you go into a different one.

Okay. What else should people do to protect themselves?

So on the devices themselves, there are a bunch of privacy settings that actually matter, and we have a blog with advice and tips — education — where we wrote it all up. There are things like turning off ad tracking — on Apple, for instance — and you can use end-to-end encryption. You can basically encrypt your devices — on Apple the phone is encrypted by default, but you can do your laptop, too. These things take like an hour, but you run through the checklist and then you’re a lot more private.

So encryption, encrypting your laptop.

Encrypting your phone or having a phone ...

Encrypting everything, and basically doing all the opt-outs you can do — we have a list, and you can run through the settings. Then switching off of these services, kind of voting with your feet.

What about mapping?

So we use Apple Maps for our service.

They’re such bad maps.

They are — they have gotten a lot better. I’m sure of it. I would give them a try.

All right. Okay. Because they’re terrible maps. That’s why, I mean, it’s hard not to use a Google Map because they’re so good.

I would give them a try.

Okay. All right. Anything else? Any other things?

Those are the top ones, because I don’t want to scare people into thinking it’s difficult. Really, you spend a few minutes on this and you can be out. I think people think it’s such a gargantuan thing to leave these companies, and I don’t think that’s actually true.

What about cameras and audio? Just so you know, when I was visiting Facebook I noticed Mark had his camera covered, had his audio covered, he had everything covered.

Yeah, absolutely. People conflate privacy and security a lot, but I totally agree. So on the security side, you should definitely have a webcam cover and use two-factor authentication — you know, that thing that texts you all the time — for all services. Of the services you have, your email is the most important one to set up with two-factor authentication, because most of the hacks that have happened, a lot of the identity theft, have come from phishing on your email: you click on something, type in your password, and two-factor authentication is just another layer that prevents that from happening.

All right, and covering... When you get into things like facial recognition and other issues, as people start to use VR and AR and things like that, what would you advise people?

Facial recognition is hard. There are ways to actually change, like wear things that change your face so you don’t get captured by the cameras, but I think that ...

Wear things and change your face?

Yeah, you can like ...

Mask, it’s called a mask. But go ahead.

Yeah, more minor things — I forget what it was, but you can put on aluminum foil or something that freaks the systems out. I’m not recommending that; you don’t see me wearing it. I think the problem with facial recognition is you’re going to need laws for some of this stuff, and so San Francisco’s great, starting the trend there.

Okay. To not use — and for companies to not use — facial recognition.

Yeah. That one is going to need to be solved at a more societal level.

Do you think people realize how much facial recognition is used in this country?

No, not at all. There was the Privacy Project at the New York Times — I assume you’re aware — and a really interesting story two weeks ago. I don’t know if you read that one, but they took camera ...

I was part of the project, but go ahead.

You were part of that story?

Yeah, no, that project, but go ahead.

Oh yeah, so they took live webcam footage that was just on a public camera, put it through Amazon’s off-the-shelf facial recognition stuff, and were then able to identify a bunch of different people. They called them up and were like, “I saw you walking here.” It took $100 and like three hours.

Right. Well, exactly. They’re also using this much more seriously in workplaces, to watch your face as you work, and also to decide whether to hire you based on your facial expressions. From what I understand, some of the new software will track your face and expressions during an interview, match them against top performers whose faces they were also tracking, and then not hire you if your expressions don’t match the top performers’.

It’s crazy. It’s another area — bias in algorithms in general. But this gets to your earlier point: a central point about regulation is that there’s been no regulation for 20 years, right? So there’s just a dearth of regulation. And you’re not going to solve it in one shot.

So this bias in algorithms probably needs to be a separate bill, right? AI in general, this area, facial recognition. We need to tackle these problems separately, but they are the biggest problems and so we should be doing it year after year. I worry that it’s going to be a checkbox, that we pass something and then it’s going to be done. We’re just going to have to keep the pressure on.

Do you ever imagine — and then I’ll get to questions from the audience — that we’ll ever escape the enormous power of something like Google? Because of what it’s become: it is the answer machine for everything, really. Even though you have 1 percent, that’s ... They have 90 percent everywhere.

Yeah, I do. I think if we have these structural changes, you could see the market open up. We are a small company — we’re only 55 people — and people ask, “How do you compete?” I mentioned earlier we use Apple Maps, right? We could debate how good they are. But in each of these verticals there are actually really good answers. So Yelp for restaurants, which we use. We use Wikipedia, just like Google does, for ...

But Yelp is getting crushed by Google, they’ve talked about it.

Yeah, exactly. But it’s not like their answers aren’t good, right? So if the market opened up a bit, these companies could thrive more. And we use them to put together good search results, just based on all these other companies. The idea that they’re this magic AI no one can compete with is false, I think, because if you look around in all these categories, there actually are good alternatives.

But consider the amount of money they have, the amount of money they make, and the fact that consumers like them, like using them. I think one of the issues around all these tech companies is that Amazon’s so great about delivery even though they’re ruining the lives of retailers across the country. I’m sorry about that.

Or costing people jobs ... Or their contractors are treated badly within the stores. But gosh, they delivered my iced tea really quickly and it’s delicious. Or Google — they’re a horrible monopoly, and yet wow, they were fast on giving me that answer about whether so-and-so is still alive. Or whatever they decide. I always search people to see if they’re dead or not, so go ahead.

Trust in Facebook has precipitously dropped in the past year, and so I don’t think it’s inevitable that trust always stays at high levels, even if people kind of like using it.

Right, trust in Facebook has dropped, usage has not.

Yes, well, that’s because there has not been a great alternative, right? People don’t necessarily believe, or want to believe, in the healthiness of quitting. But trust itself has eroded, and I think trust could erode in these other companies as well.

Are you worried about who runs these companies at some point? Because one of my worries has always been ... Many years ago, I wrote a story about Google trying to take over Yahoo search — I think I’ve talked about this before. It was going to get them 90 percent of the market at that point; Yahoo still did a substantially large search business, and Microsoft was the third one. I was struck by it: they can’t have 90 percent. Why isn’t our government stepping in to do something about that? That’s a ridiculous amount of market share.

So a line I wrote was, “At least Microsoft knew they were thugs.” Google pretends they’re all happy with their funny balls and their crazy eating habits and their weird clothes and stuff like that, but as adorable as they are, they’re still just as evil as Microsoft was.

So I was making that point and I think it was Eric Schmidt who called me up and said, “That’s really mean, that you say we’re thugs.” And I said, “Well, I think you’re worse than thugs because you don’t know you’re thugs. And you are thugs, and therefore you’re worse than thugs.” And he was like, “We’re not thugs, we’re really good people. We’re really good people.” And I said, “I get that, but I can’t imagine a world where you have a company of this much power over information. What if someone ... “

Of course it’s like three clicks to Hitler, but that’s what I said. It was like, “But what if Hitler ran Google? What if someone who wasn’t so nice ran Google?” And it was like, “Well, they don’t!” And I was like, “Yes, but what if they do?” And he said, “But they don’t!” So are you worried about the concentration of power in the hands of, again, a very small amount of people?

Yeah. Absolutely. I worry when you’re basically saying they intend good, right? It’s like, “The road to hell is paved with good intentions.” We did this filter bubble study, and I believe they did influence the 2012 election. Now, a lot of people on the left liked the outcome.

What we found, by the way, was this thing called magic keywords: if you searched for something and then searched for something subsequent, you would get extra results inserted based on the previous search. So “Obama” was a magic keyword — if you searched for “Obama” and then “gun control,” you’d get three extra gun control results in the results. “Romney” was not a magic keyword. So all these people ...

No, no he was not a magic keyword.

Yeah. So all these people searched for Romney and Obama and there were tens of millions of extra Obama results inserted across the country for that entire run-up to the election. We don’t know what that changed, but I presume it actually did change a lot. Because people were searching just random issues that they wanted to hear about and they were just getting Obama’s take on it, not Romney’s take.

Do you believe that was purposeful?

No, I don’t. I think it’s totally unintentional.

It was a result of the algorithm. They were questioned on it, and they were basically like, “Oops.” The answer was, “When we created these magic keywords, Romney was less popular at the time” — it was a year before — and that just happened. But it was probably a tiny change that no one even knew about that had a big impact.

Okay. Questions from the audience. Lots of them, let’s start here.

Audience member: Thanks for being here. I have a question about how this conversation could or should change once it moves from the digital world. So a lot of times when we talk about privacy, especially tonight, we’re talking about digital-first problems, or problems that are because of digital-first companies. But specifically the advertising example, now we’re seeing addressable advertising on TV, where you can be just as targeted as you are online in your TV screen. And that can change depending on who they think are in the room at that time.

Yeah, and streaming. It’s because of streaming.

Audience member: Yeah. All of it. How does that conversation change now that it’s not just digital and it’s really everywhere?

Gabe Weinberg: In a sense, it is digital, because streaming — the conversion from analog to digital on TV — is a digital conversion. It just doesn’t feel digital, because it doesn’t look like you’re browsing the internet, but a lot of it is still going over the internet. So if you had something like do not track, it could apply to all of these mechanisms, as long as it remains digital. So I think it would apply to that situation, as well as the billboards — you know, there’s been talk about individualized billboards.

Well, you saw the Tom Cruise movie, Minority Report, right? They scanned his eyes and then it says, “Hello, Mr. Yakamoto. Would you like more fleece culottes?” Or whatever the heck he wanted last time.

Audience member: Well, addressable ads from cable boxes, that comes through the data that cable companies have. Doesn’t necessarily ... I mean, yes, if you probably could have some digital information coming and bleeding into that, but it’s grown ...

Well, they want to get the same information Google gets. Cable companies and telcos are now able to get that kind of data — they’re like, “If they can get it, why can’t we get it?” — whereas usually telcos were prohibited from getting it. And the point is, you’re right: if Google gets it, why shouldn’t they get it?

Gabe Weinberg: Yeah, I agree with your premise that they’re trying to copy the tracking business model everywhere to compete. And if it’s outlawed in one place, it should be outlawed everywhere.

Yep. Next question. Oh, you had one here, you didn’t ...

Audience member: I have a silly question.

Audience member: Why’d you name it DuckDuckGo?

That’s a good question! That’s not a silly question.

Gabe Weinberg: Hi. It is a good question, and I wish I had a good answer, yeah.

I do know the answer to that, but go ahead.

Gabe Weinberg: Which I don’t.

It’s a really fun game.

Gabe Weinberg: Yeah. Duck Duck Goose.

Audience member: It was a good question then.

Gabe Weinberg: Yeah.

It is. Go ahead, go ahead, you had no ...

That’s exactly it.

Where were you? You were sitting there and going, “Oh, it’s either going to be like Blue Geese or Duck Duck Go?” What?

So I was on a walk with my wife.

This was pre- even having a company. I was going to start something, and I was like, “That’s a cool name,” and didn’t know what the company was.

Who thought of the name?

Me. Just popped in my head.

Okay. All right, all right, there you have it. All right, next. Right here.

Audience member: Do we have the right to be forgotten, and should we have the right to be forgotten?

Good question.

Gabe Weinberg: We currently don’t in the United States. It’s part of GDPR, and it’s something we comply with. It’s an interesting question of where to draw that line, and it hasn’t been taken up very much in the US yet.

Because of the First Amendment.

Gabe Weinberg: Yeah.

It’s never going to happen here.

Audience member: So then if you were going to write an amendment, what would the amendment be to make the right to be forgotten?

Gabe Weinberg: You would probably need a constitutional amendment, because of the First Amendment.

Yeah. To change the First Amendment — that’s not happening. But there’s also the question of whether you want it or not. If there’s good and bad information about a good person, a bad person could take down the good information about them. So it gets into really thorny issues. It’s the same thing around editing tweets right now — that’s their excuse. There are lots of ways they could do it, by the way; they could keep ...

Audience member: Well, it’s also like “ban the box.” If someone has a history and they’ve paid their debt to society, should they keep getting hit with that time and time again and not be able to move past it?

Yeah. Well, you know, they’re not going to change that. There’s the line from the movie The Social Network: “The internet’s written in indelible ink.” So it’s an excellent question; I just don’t think, because of the First Amendment, it will ever get any traction here. I think every single thing you do online, every drunken college picture, is going to stay there for ...

Gabe Weinberg: I haven’t heard anyone talking about that here.

What, drunken college pictures? I don’t have any, I’m really old. Go ahead, right here.

Audience member: How are we going to help the advertisers who are getting progressively more and more addicted to data, and have built these really robust data sets, even without the Googles and the Facebooks of the world?

Yeah, they have Acxiom, others.

Audience member: Yeah, you can work with Acxiom, and you can get lookalikes; you can deliver to them digitally or traditionally. So I wonder if the issue is also the advertisers — the P&Gs of the world — who are paying. They’re the ones who are subsidizing and encouraging this by paying for the ads.

Gabe Weinberg: Yeah, data’s starting to come out about this. The data so far is that they’re paying a lot more for not much. Behavioral advertising has not been that effective.

Audience member: But they don’t necessarily believe that that’s true.

Gabe Weinberg: Right. I agree we need a lot of empirical evidence, but I think once they realize they haven’t been getting much for what they’ve been paying, they’ll be totally happy to embrace contextual advertising again.

They’ve also given up their relationship with the consumer. Yeah, I had a really interesting conversation with the CTO of Ford, and I said, “What’s your biggest problem?” And he said, “Well, the internet companies want to suck every bit of data and chomp it back out, and we want a relationship with the consumer over time.”

They have a very different mentality towards their consumers because they buy things from them. And essentially Google and Facebook are pass-throughs. They don’t care about you. So it’d be interesting to see if they can regain power, because you’re right, it’s not effective for P&G. They do start to realize this, I think they ...

Gabe Weinberg: They don’t have any agency right now.

Audience member: And will they be upset because of the lack of transparency? Sorry.

That’s okay.

Audience member: It’s a very interesting subject.

Yeah, it is. And the question is, will these big advertisers ... The problem is the big advertisers are also getting disrupted by Amazon. Like, Amazon’s selling all its own Amazon goods.

Audience member: Including advertising.

Yes, everything. Yeah. Yeah, we didn’t even talk about Amazon.

Gabe Weinberg: Yeah, right now they don’t have a lot of choice because all the audience, if you want to reach billions of people, you’ve got to go to Google and Facebook. But if they’re ...

And Amazon.

Gabe Weinberg: Yeah, and Amazon. But if it’s broken up a bit and contextual is back, there can be other ad networks that aren’t Google and Facebook.

Because they are the only road. Okay, any more questions? Last one, right here.

Audience member: Can there be a DuckDuckGo for YouTube?

Can there be a DuckDuckGo for YouTube?

Gabe Weinberg: It’s a good question. It is one category, which I didn’t hit on, that honestly doesn’t have a great answer right now. There are great sites to put your video on — Vimeo is a good example, which I use for privacy with my kids’ videos. But unfortunately, if you’re a creator, all the eyeballs are on YouTube. It’s another network effect, and to break that network effect is very difficult.

It’s impossible. There’s no other choice, and the expense of doing it is so high. So they can create monopolies without creating monopolies, which is really the question for our legislators.

Yeah, that’s another legislative question we didn’t address, but that one’s an argument for treating it as a utility at that point. If it’s such a natural monopoly that nothing can really compete with it, you need to regulate it as a utility.

Or change the law — change the way we do antitrust law. There are interesting people like Lina Khan and others who are talking about how to change the idea of what a monopoly is: it doesn’t have to cause consumer harm. In fact, it could do consumers good — you all like all the freebies you get from all these companies. You really do. That’s the problem here: it’s not harmful to consumers. Except it is. It’s harmful to society. But it’s not ...

Well, there’s an interesting argument. There was another New York Times article, from Brian Chen, who left Facebook. It’s one anecdote, but it was interesting: he saw his purchases on his credit card go down by 50 percent. You can say, well, that’s because he wasn’t seeing a lot more ads, right? Because he wasn’t on Facebook seeing all the ads. And that’s definitely part of it. But some part of it was also that the ads were probably manipulative — they were kind of manipulating him into buying stuff. So that’s arguably a direct consumer harm: money out of his pocket he wouldn’t have spent otherwise, from being manipulated.

Addictiveness is something a lot of people are talking about. I mean, I don’t use Instagram — I don’t use any of these services, by the way — but I was on Instagram the other day, and I was like, “I must buy that strange little object that they’re ... ” I don’t know what it was; it was some bra that was better than other bras, and it was not. But it was weird, and I don’t know what it was doing to my brain. I was fascinated.

Yeah, they probably ran a thousand ads on that thing to like ...

It was weird.

... get the exact emotional triggers.

I didn’t buy it. But I was this close.

Close, but not quite. So close. Anyway, thank you so much, Gabe. This is Gabe Weinberg from DuckDuckGo.

Recode and Vox have joined forces to uncover and explain how our digital world is changing — and changing us. Subscribe to Recode podcasts to hear Kara Swisher and Peter Kafka lead the tough conversations the technology industry needs today.
