“How do successful companies create products people can’t put down?”
That’s the opening line of the description for Nir Eyal’s best-selling 2014 book Hooked: How to Build Habit-Forming Products. Hooked became a staple in Silicon Valley circles — it was even recommended to me when I started Vox — and Eyal became a tech celebrity. So I was interested to see him releasing a second book that seemed a hard reversal: Indistractable: How to Control Your Attention and Choose Your Life.
But Eyal doesn’t think Big Tech is addictive, and he sees the rhetoric of people who do — like me — as “ridiculous.” He believes the answer to digital distraction lies in individuals learning to exercise forethought and discipline, not demonizing companies that make products people love.
Eyal and I disagree quite a bit in this conversation, but it’s a disagreement worth having. Life is the sum total of what we pay attention to. Who is in control of that attention, and how we can wrest it back, is a central question of our age.
You can listen to our full conversation by subscribing to The Ezra Klein Show wherever you get your podcasts, or streaming it below. A partial transcript, edited for length and clarity, follows.
Talk to me about the impetus to write your new book. In 2014, you wrote Hooked, which was the handbook for getting people hooked on your products. It was even handed to me when I was thinking about starting Vox. So how do you go from being the guy who teaches the tech industry how to hook people to the guy who’s trying to teach people how to unhook?
Somebody once told me that wisdom is found in the apparent contradictions. It might seem like there’s a contradiction here, but really there’s not. Hooked wasn’t about teaching Facebook and YouTube how to use these techniques — they already knew how to use them. The idea was to democratize these techniques so that companies could build the kind of products and services that people use because they want to.
The reason I said there isn’t a dichotomy is because I think you can teach people how to build the kind of products that build healthy habits — exercise habits, nutrition habits, education habits, etc. — while also having the insight or knowledge to help people understand how to avoid distraction at the hands of some of these products as well. That’s what Indistractable is about.
I wrote it because I found that I was becoming distracted by some of these technologies — that I couldn’t stop looking at my device even when I was supposed to be spending quality time with my daughter. That’s when I realized that if I’m struggling with this — and I actually understand how these products are designed to hook — other people might have this problem as well.
I want to push on this because it doesn’t seem totally fair to say that Hooked came out and people only use that book to do good things. It was part of a broader culture here in Silicon Valley that got very excited about hijacking people’s mental processes to hook them to products. I wonder if that’s something we should rethink in a much bigger way when we are discussing responsible product development.
Well, the case study in the back of Hooked is not about some gaming company or a social network — it’s the Bible app. I chose that app very specifically for that book. Many people forget that this app is one of the most widely used pieces of software in the world. I wrote about [the Bible app] because if you’re the kind of person that believes that forming a habit with the Bible is a good thing — that it brings people closer to their faith, that it gives them purpose and meaning and solace in life — then you’ll be in favor of creating a habit around the Bible app.
But if you believe that religion is not a good thing in the world, then you’re not gonna like the Bible app. I think we see something very similar when it comes to this question of social media or gaming. I don’t want to make any value judgment because I don’t think it’s up to me to tell people how to spend their time. What is it that makes Candy Crush somehow morally inferior to watching a football game on TV?
When I look at the system as a whole, I see that we’ve developed a lot of cognitive understanding of the way people’s minds work — how their reward systems can be hijacked. The people who understand this have the money to beta-test, carefully design, and advertise products. This creates an arms race between the companies trying to manipulate us and our attempts to control the environments around us, and it’s very hard for us to keep up because they’ve got people paid to figure out this research.
That’s the problem here. I feel that you can be reasonably value-neutral on the question of how people spend their time without being quite so neutral when it comes to this broader trend toward designing addictive products.
Let me agree and disagree with you here. I agree that helping people do the things they want to do with intent is important. That’s how I define distraction. The opposite of distraction is not focus — it’s traction. Traction and distraction both have the same Latin root, trahere, which means to pull. So traction is any action that pulls you towards what you want to do; it is things you do with intent.
What you do with intent is up to you — it’s based on your values. The opposite of traction is dis-traction. So the book is designed to help you do whatever it is that is consistent with your values. As an industry insider, I will tell you that if we don’t know how to become “indistractable,” then these companies are going to get you. They understand what makes you click and what makes you tick better than you understand yourself.
Where I disagree with you is that I wish people would stop using these terms like “addiction” and “hijacking our brains.” It is 100 percent complete rubbish. It is a great story that makes for wonderful headlines and feeds into our negativity bias and our confirmation bias and gives us a wonderful excuse to not do anything about the problem. The fact is — and I’m telling you this from inside the industry — these behavioral design tactics are good but they’re not that good.
We are not puppets on a string. People cannot be manipulated to do whatever you want them to do. They understand when a product harms them. There’s some categories where this doesn’t apply — people who are pathologically addicted, or children — but the vast majority of us either moderate our behavior or stop using the product altogether.
So I want to give people the tools to understand how to get the best out of technology without letting it get the best of us. But we have to stop using these terms “addiction” and “hijacking our brain.” This just makes the problem worse because of learned helplessness.
Let me decouple that into a few ideas here. One is whether or not we should use the term addiction. I take your point here that it can go too far. On the other hand, addiction is a spectrum behavior. The way you just tried to rule it out is not internally coherent. So, for instance, most people who try heroin don’t get addicted, but then we wouldn’t say, “Well, it’s rubbish to say heroin is addictive.”
We need to decide when we should and should not use the term “addiction.” As I understand it, the general definition of addiction is something people keep doing despite it having negative consequences on their life. It’s a complex, multidimensional disorder, but I view it as a very helpful concept for a lot of things in our lives.
And it’s not just tech products. It’s things with work. It’s things with food. I’ve seen many people say that complaining that these products are addictive is like complaining that restaurants put all this salt and sugar and butter into their food. But, of course, we have a huge obesity crisis because the food industry has gotten really good at hijacking our reward systems. It’s very hard to say no — it’s very hard for me to say no. I’m somebody who has struggled with my weight all my life. So, I think it’s important to recognize that we are all susceptible to different forms of addiction to different degrees.
For me, the most damaging myth is that we are in full control of our actions. That makes it really hard for us to realize when we’re being effectively manipulated.
We have to start with the difference between addiction and overuse. For the vast majority of us, what we’re talking about is overuse. Addiction is a pathology. It’s defined as a persistent compulsive dependency on a behavior or substance that harms the user. It is something that is incredibly hard to stop despite our attempts. When we say “I like something a lot” or “I overuse it sometimes,” this is very different from an addiction.
When we use this term [addiction] so loosely, it becomes meaningless. Addiction is a pathology that requires three things. It requires a person with a predilection for addiction. It requires the product. And it requires pain that they cannot cope with in a healthier manner. It’s only in the confluence of those three things that actual addiction occurs.
Take out one of those three factors and you don’t have an addiction. So when we make it sound like these things are addicting us, we are giving these companies more power and control than they deserve. There have been studies finding that the No. 1 predictor of whether someone can recover from an alcohol addiction is their belief in their own power to do so, even more than the chemical dependency itself.
So would you tell alcoholics not to call themselves addicts? That seems to be the implication of your argument.
Not necessarily. It’s not helpful to think about this in a binary way because lots of products that we use addict someone without addicting everyone. We have a glass of wine with dinner, we’re not all alcoholics. Many people have sex; they’re not all sex addicts. Many products can be addictive and not addict everyone. Any analgesic, literally anything that solves pain, is potentially addictive to someone. So, of course, if you have a product used by 2.5 billion people, like Facebook, somebody is going to get addicted.
But we can’t keep perpetuating this message to everyone because it makes it true. We are doing what these companies want us to do by telling people they’re powerless. “It’s hijacking your brain. It’s addictive.” That is bullshit.
I think you’re developing a definition here that is too narrow. One example is addiction to massive online multiplayer games. Not everybody gets addicted to these — just like not everybody gets addicted to alcohol or to other drugs. But some people do. That’s why I think it’s first important to be able to say that things can be addictive without everybody getting addicted.
But the point I find particularly unconvincing is that even if it’s true that these things can be addictive, and in fact are addictive to a bunch of people, we shouldn’t say it because it creates a learned helplessness problem. Are you really going to say that all those alcoholics who come out and say, “I’m an alcoholic and I am addicted,” have just developed a learned helplessness problem? Many of the major treatment modalities for alcoholics start from accepting helplessness over the addiction as the precondition for making changes. You have to stop believing that personal effort will be enough to change your life. Only by accepting that there is something happening that is somewhat outside of your control can you begin to make the large enough contextual changes in your life to gain some control on it.
So I don’t buy the idea that if we talk about addiction, we are helping the tech companies. I don’t think that’s how anything else in the addiction space works. I actually think that if we don’t talk about addiction, we are letting these companies off the hook. That’s what I think is dangerous.
I agree. There are people out there who are addicted to Facebook, just like there are people who are addicted to using Q-tips in their ears. This is not a joke. There are addictions for literally any analgesic, anything that solves pain. I’m not against telling folks that these things are potentially addictive, because they certainly are. In fact, I wrote an article years ago that companies have a responsibility to do something for addicts because they know who they are.
But I think what’s happening today is that we’re calling everything addictive. We’re saying “this addicts everyone.” Let’s not think about this in a binary way — good versus evil, big, bad tech doing this to everyone. That’s not helpful.
One of the places where the rubber meets the road is whether or not individual action is as powerful as we want it to be. Your book has a lot of wonderful individual advice, but it also has a pretty deep skepticism of systemic solutions. For instance, I was surprised to see a section extolling the company Slack for having an internal work culture that’s very protective and lets people be heard.
That’s a company that I like — [Slack co-founder] Stewart Butterfield has even been on the show — but it’s pretty clear to me that Slack has developed something incredibly distracting that has allowed work to enter every aspect of our lives. They have been innovators in creating ways to get people to spend more time looking at their phones and feeling connected to their workplace in ways that can make it hard to just live life. So it’s funny to me to see Slack get a positive profile in a book that is all about trying to step back from distraction.
More broadly, it felt to me like there was less systemic critique in here than I was expecting, given what a departure it is from Hooked.
I chose the Slack example very intentionally. It’s a company that most people point the finger at and say, “That’s exactly what’s distracting me.” I use the Slack example because if it really is the technology that’s distracting us, then the people at Slack should be the most distracted people on earth. How do they get anything done?
But, as you read in the book, the people who work at Slack don’t have this problem. The problem is not the technology. The technology is the proximate cause, not the root cause. The root cause of a constantly distracted workplace is [workplace] culture. That’s why I highlight Slack: It’s a company with a fantastic company culture. Despite their use of this technology, they don’t have that kind of distracted workplace. On the walls of company headquarters, they have a big pink sign that says “work hard and go home.” Everybody from the CEO on down lives out that ethos — they practice this culture of letting people shut down. The real problem is an environment where people can’t talk about these cultural issues.
I want to accept some culpability here because I am one of the people who brought Slack to my workplace. Yet, hearing that story, I found myself a little bit infuriated, because it’s great to build your culture and have big pink signs saying “work hard and go home,” but you’ve built a product, and you’re making billions off a product, that is full of ways to make sure people always feel like there’s more work to do. The smallest example is the fact that every new ping is in red, and every room that has a new message in it is in bold. It makes you constantly feel like everything is undone.
And it doesn’t have to be that way — that isn’t the default. You can make something that makes it easier visually to not feel like there’s always pressing work to get caught up on. But they didn’t do that.
You were actually in an article in the Guardian about the folks in Silicon Valley who invented these products and who have gotten very good at not using them, and who are even sending their kids to private schools where there are no screens. It’s all well and good for folks who are profiting off all this to have recognized the harms deep inside it. But outside of that circle, there are people who are not nearly as tech-savvy or schooled on it, and I think there’s culpability there.
Culpability with who? Do you think the company is responsible for making products that are less engaging? Are we going to shake our fist at Slack and Netflix and say, “Hey, your product is too engaging, Netflix.” Or, “Ezra, stop making these podcasts so good that I want to listen to them instead of being with my family.” This is ridiculous.
Ezra, this is the price of progress. We want these companies to make products that we want to use. What is the alternative? “Please make shitty products that I don’t want to use.” No, we want products to be engaging.
A product doesn’t need to be shitty to be not maximizing engagement at every moment. Let me give you an example. You can think of what my preferences are as a consumer in a couple of ways. One is what I do right now. Another is what I ultimately want to do. Often, we have a tendency to unthinkingly say — particularly with technological products — that whatever I’m going to do in the instant that something flashes in my face is what I really want to do. We say, “See, it’s just a great product because it’s super engaging.”
But then there’s this problem which is leading people like you to write whole books about how you’re ignoring your daughter ’cause you’re staring at your phone. That wasn’t because the phone was so good. It’s because on some level, the phone has a wrong view about how you want to live your life.
Take the media. At Vox, we could just put up the most “engaging” articles, but we shouldn’t and we don’t. We do a lot of articles that are not the most engaging because ultimately our audience wants to be informed, not just engaged. Often what we want in our lives is separate from what we will do if presented with the choice immediately before us.
So I really question the way you’re defining a good product. All of us who run big products, media organizations, enterprise software companies, need to ask ourselves this: Are we doing a good job building products to help people live the lives they want to lead, or have we somehow found a way to build it that is taking them off that path? I don’t think that’s a crazy thing to ask of people in power.
I think that is not only a good ethical imperative but a good business imperative. People aren’t idiots. They’re not puppets on a string. If people find over time that a product is not serving them, then they will stop using it. This is what human beings have always done. One of our most amazing traits is that we adapt and we adopt. This is what we have always done as a species. So instead of this moral panic — instead of putting blame on everyone else, and waiting for these companies to change — there are some simple things that we can do right now in order to prevent some of these harms.
Kierkegaard said that “anxiety is the dizziness of freedom.” I think that encapsulates what we feel right now. We have so many options, so much choice, unlimited articles to read, unlimited videos to watch, and websites to learn from that it’s a little dizzying. We are now in an adjustment period where we have to learn how to deal with all of this potential distraction. But instead of crossing our arms and blaming big, bad tech, there are some very simple things we can do to make sure that we put these technologies in their place while still getting the best of them. Why wouldn’t we do things right this minute that can help us have a better relationship with these technologies?
So let me agree with part of this and then disagree with part of it. I think you’re obviously right that solutions for these things operate at different levels simultaneously. At one of these levels, individuals are trying to build the capacities and create the contexts to live a less distracted life.
But there’s an existing context we all operate in, and I think it puts a lot on people to individualize things that are collective or ecosystem-oriented. So one way for me to be less hooked into some of these products is to not use them. But if everybody I know is on them — or if my workplace is on them — it’s actually not my choice to use them or not.
That’s a terrific point. When I started exploring the deeper psychology of distraction, I started with the individual. What can we as individuals do? That’s what the first half of the book is about, and it is very important to learn those techniques. The other half of the book is about this greater context of how we operate in an environment.
You can implement these techniques, but if your boss or editor calls you at 10 pm, then you’re distracted. That’s why it’s so important to understand that we do operate in a larger culture, in a larger environment. But I think the culprit is less the tools and more the workplace environment. As I said before, distraction is a symptom of a dysfunctional work culture.
“Time management is pain management”
Let’s talk a bit about this concept of distraction, and start actually at the individual level. You have a part of the book where you talk about the four psychological roots of distraction. Do you want to run through those?
The punchline here is that we as a species are not designed for satisfaction. The self-help and the personal development industry tells us that if we are not happy, we are not normal. Nothing could be further from the truth — evolution designed us to never be satisfied. So there are these four cognitive quirks, like hedonic adaptation and negativity bias, that keep us perpetually perturbed, and that keep us wanting more.
Can you explain hedonic adaptation a bit more for those who may not be familiar?
Hedonic adaptation is the idea that as soon as we have any improvement in our life, we tend to go back down to baseline. We see this with people who have won the lottery and those who experience traumatic events. They experience a decline [or increase] in happiness for a while but then return to their baseline happiness. So hedonic adaptation keeps us at this base level of happiness, and causes us to perpetually want more and more.
The other thing you write in this section is that “distraction is not about distraction itself. Rather it’s about how we respond to it.” You spend a lot of time on the root causes of distraction. One way of thinking about that is that the root cause of distraction is evolutionary, but you also have a lot of more tangible causes of distraction. Do you want to talk through some of those?
This is particularly relevant when it comes to kids. Throughout history, we’ve had moral panics around things manipulating our kids’ brains. When I was growing up, it was Super Mario Brothers. Before that, it was television. Before that, it was radio. And before that, it was comic books and the pinball machine. But [these technologies] are only proximate causes. They distract us from the real issue: Our kids are in crisis. Sure, they are overusing technology as an escape. But we don’t ask ourselves, what are they escaping from?
In my research, I spoke with Richard M. Ryan — the founder of self-determination theory — who believes that what’s going on has to do with what [he] calls “needs displacement hypothesis”: the idea that kids go online to fulfill the psychological nutrients that they’re not getting offline. So when we just blame the service, it is harmful because we don’t look for the deeper reason why our kids are behaving this way.
I suspect that what’s changed is not whether or not people need to escape, but how many opportunities we have for escape. Ten, 15, 20 years ago, it was actually a little hard to escape, but now with our phones, we always have escape right nearby. If you don’t like what you’re feeling right now, you can just escape it.
You talk in the book about just sitting with an uncomfortable feeling until it evaporates. But now we don’t have to sit with those feelings because of our devices. So we come to associate the device with relief from feelings like that spike of anxiety or boredom. And that is where I think we begin to develop unhealthy, overdependent associations.
That is such an important aspect of managing distraction — it’s the first step.
I started the book with the question: Why do we get distracted? It turns out, this is an age-old problem — even Socrates and Plato talked about it 2,500 years ago. The problem for most folks isn’t a knowledge gap. We know that if we want to be healthy, we have to eat right and exercise. We don’t have to buy a diet book for that. If we want to have healthy relationships, we have to be fully present with the people we care about. If we want to do well at our jobs, we have to do the hard work. We know this stuff. What fascinated me is why we don’t do what we know we should do.
To answer that question I had to start with, why do we do anything? What’s the nature of human motivation? Turns out that psychologically speaking and neurologically speaking, the source of all human motivation is pain. This creates what is called the homeostatic response.
We feel this physiologically all the time. If you feel cold, you put on a jacket. If you’re hot, you take it off. If you’re hungry, you eat. Physiological discomfort spurs us to action. The same thing applies to psychological discomfort. When we feel lonely, we check Facebook. When we are uncertain, we Google something. If we’re bored in the car, we’ll check the news or look at sports scores.
If all behavior is spurred by a desire to escape discomfort, this means that time management is pain management. That’s something I’d never realized before. We talk about all these life hacks and productivity tricks, but fundamentally, if we don’t control the discomfort we’re trying to escape, we will always get distracted by something, as people have always been. So, that has to be the first step.
But given, as you say earlier, that we are creatures who are built to be constantly uncomfortable, constantly in discomfort, how do we constantly manage pain? After all, most of us live in a state of material plenty unknown at any point in human history, but we still feel this pain constantly because of the way we are wired.
That’s a good point. There’s a simple answer that’s hopefully not too simplistic. It’s a mantra I repeat to myself daily: “The antidote to impulsiveness is forethought.” By planning ahead, we utilize a unique gift of our species — [that] we can see the future better than any other animal. So what we have to do is to plan ahead. No matter what the algorithms these companies have, we have the power to plan ahead. In the moment, it’s too late. If the chocolate cake is on the fork headed to your mouth, too late, you’ve already lost. You have to act in advance to make sure you don’t do the things you don’t want to do.
I agree with this. I still love the book Predictably Irrational by Dan Ariely. The whole idea is that if you understand when you are going to be irrational, you can predict it, and if you can predict it, you can plan ahead for it. But one of the reasons I pushed you on having a society-level critique is that your assumption is that you have the freedom and resources to plan ahead.
Something we see happening now is an unexpected version of the digital divide. Ten years ago, the panic was that rich kids would have iPads and iPhones and poor kids wouldn’t even have access to broadband internet. Now it’s turning out to be the opposite: Poor kids are spending much more time on screens because their parents don’t have the time to monitor their behavior or they don’t have access to other activities where they live. Whereas wealthier kids are increasingly only allowed to play with wooden blocks.
So I worry about the development of distraction inequality. Increasingly, focus will be an advantage in our society. But whether you develop that power depends a lot on context, and not everybody has that context. I think that should concern us.
I couldn’t agree more. I agree that the world is bifurcating into two types of people: people who allow their attention to be manipulated and controlled by others and those who stand up and say, “No, I am indistractable. I live with intent.” One of those “others” is certainly the technology we use. But another is the people in our lives — our bosses telling us what to do, our significant others, our kids. So we can either take the prohibition approach or we can [focus on] increasing people’s ability to get the best out of technology without letting it get the best of us.
One of the clear certainties of human existence is that technological innovation improves living standards over time. We want this to continue. I believe we should look at these technologies and ask ourselves, “Is this something that is serving me, or am I serving it?” But we don’t want that skepticism to turn into cynicism. I think that cynicism can be quite toxic.
That answer really tracks both where I agree with you and where I really disagree. When I hear that there will be two kinds of people, those who will “allow” others to distract them and those who won’t, I just don’t believe we have as much control as you believe we have. For instance, I’ve done a lot to reduce distraction in my life. I live in a place with beautiful weather. I have the means to go do the things I want to do. I have leverage at work to structure when I’m on and off. And I still fail at this constantly.
So for someone who doesn’t live in a safe area with nice weather, who doesn’t have leverage at their work, etc., the idea that they can just stand up and decide to be “indistractable” is asking more than many people can offer. It makes me very skeptical about willpower as the answer.
Well, I’m not advocating for willpower. I’ll be the first to admit that I have very poor self-control. That’s why I wrote this book. I think the solution is access. What I do want to be careful of is this prohibitionist approach. I think that’s an impossible answer. There is no putting this stuff back in Pandora’s box — it’s too late.
What we have to do is to learn how to live with it. And so the idea is we have to learn these tactics. We have to learn the truth about all distraction and how doing things that we don’t intend to do can be harmful in many forms, not just the technology aspect.