Beginning with Google’s development of targeted online ads, the most successful companies in the world have been powered by “surveillance capitalism” — a term popularized by the guest on the latest episode of Recode Decode, Shoshana Zuboff.
“All of the economic imperatives now that define surveillance capitalism are aimed at, how do we get better and better prediction products?” Zuboff told Recode’s Kara Swisher. “How do we win the most lucrative prediction products, so that not only are we predicting the future, but really increasingly, our prediction products are equal to observation.”
There are just a couple problems: One, when customers are fully informed about how their data is being used, they don’t like it. So, companies like Google and Facebook have decided to “take without asking,” Zuboff said. And whoever has all that data has a tremendous amount of power — so much so that the same people who unwittingly provided more data than they realized to tech companies can then be manipulated toward commercial and political outcomes.
“Right now, surveillance capitalists sit on a huge asymmetry of knowledge,” she said. “They have an asymmetry of knowledge, a concentration of knowledge unlike anything ever seen in human history ... We have an institutional disfiguring of these huge asymmetries of knowledge and power which are antithetical to democracy.
“You cannot have a well-functioning democracy with massive inequalities of knowledge and power,” Zuboff added. “That’s eroding democracy from the big institutional level, but now from the individual level, from the inside out. The fact that our autonomy is compromised, that these things are happening outside of our awareness, that they can take hold of our behavior and shift it and modify it in ways that we don’t know.”
Below, we’ve shared a lightly edited full transcript of Kara’s conversation with Shoshana.
Kara Swisher: Hi. I’m Kara Swisher, editor-at-large of Recode. You may know me as the surveiller of capitalists, but in my spare time I talk technology, and you’re listening to Recode Decode from the Vox Media Podcast Network.
Today in the red chair is Shoshana Zuboff, a professor emerita of Harvard Business School who has written several books about technology and economics. Her most recent book is called The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. That’s a lot there, Shoshana. That’s got a lot going on.
So, let’s talk a little bit about your background so people get a sense. Your book is getting a lot of attention, and I’ve been using the word surveillance quite a lot, especially about the surveillance economy, surveillance states, and things like that. A long-time issue in human history, but now it’s more important than ever. Why don’t we talk a little bit about how you got to writing this particular book and some things you’ve done in the past. So, love to hear a little bit of your background.
Shoshana Zuboff: Well, I think the impetus for this book, which has been a long time in the making, seven years just to produce this book but many years before in the ideas and development, the real driver here is the sense that our hopes and dreams for the digital future, our sense of an empowering and democratizing digital future...
Which it was, at the beginning.
Which it was at the beginning, and a sense that this dream was slipping away. And that the reasons why it was slipping away, the causes of this shift, were not really clear, not really well understood, with forces taking shape very much behind the scenes. It’s almost like we woke up and suddenly the internet was owned and operated by private capital under a kind of regime, a new economic logic that really was not well understood.
So my motivation, Kara, has come from really wanting to spend the time to understand, to name exactly what this economic logic is and how its own imperatives, its own compulsions created a completely different trajectory toward the digital future. Something that we didn’t buy into, that we didn’t expect, and because it’s so unprecedented, it is by its very nature difficult to perceive.
Absolutely. And also difficult to control. I like the word compulsion, because I think that’s a really good way to put it. It’s an emotional word, but it’s not. It’s actually, it has to do what it’s doing.
It has to do what it’s doing. It’s a machine that’s got to move in the direction that it’s moving. The people in it are not bad people. They’re not bad actors, but they themselves now are caught up in an economic machine that sometimes they even don’t understand very well, and where it’s driving, and what its imperatives are, and most important, what the consequences of those imperatives are.
Right. Right. Interestingly, I just did an interview on Twitter with Jack Dorsey. That was sort of a bit of a goat rodeo, but it was interesting because on a lot of the questions, I kept asking for specifics and he couldn’t give them. It was really fascinating. I think people found that part the most fascinating, besides the platform being terrible to try to conduct any kind of conversation on.
But, let me hear from your background. Talk about some of the things you’ve done before this, and then I want to get to the term “surveillance capitalism,” which I think is a fantastic way to put it. Let me hear the trajectory of your career. You started where to get to this kind of topic?
Well, as far as my professional career, I began studying the shift to the digital in 1978.
Were you born then, Kara?
I was. I’m very old. I look fantastic, but I’m actually quite old.
Well, hat’s off. I’ll date myself. I’ll come right out there and date myself. I started in 1978 interviewing office workers, Linotype workers, factory workers ...
As the shift was happening.
... who were the first, the front line of our workforce that was shifting to the digital medium. That, over time, led to my first book, In the Age of the Smart Machine: The Future of Work and Power.
Talk about that time for people to understand it. I had a Kaypro. I had a Trash-80. I know all these things, but talk about what was happening then. I had one of those suitcase phones and everything else.
Well, you know, what was happening then was typical of the 20th century story of capital, which was the real titanic struggles in society were between capital and labor, and the forces of capital came down in the economic domain, in our workplaces, on our lives as workers, as employees, even as managers eventually, you know, in our factories, in our offices.
And so, that was the front line where I first began to understand that this shift to the digital wasn’t only a change in the equipment that we use, but a change in the whole way that we construe reality and relate it to our own experience. The removal from the essential, or the removal from the embodied experience, toward more abstraction, toward the intellectualization of work and so forth. I understood early on that this was going to require profound “retraining.” I kind of hate that word because it so trivializes the real deal here, which is that ...
It’s beyond retraining. It’s a whole new idea.
It’s beyond retraining.
Well, even before we get to entrepreneurship, the idea that I’m working in a factory. I used to deal with a machine. It was a whole-body experience running that thing.
Mm-hmm. It was mechanical.
And now I’m looking at a screen, and I’m looking at information, and I’ve got to understand it. If I’m going to be included in the workforce, then I’ve got to have the intellectual skills to understand this new milieu and make a contribution.
Unfortunately, what happened in our society is that most businesses went the other direction. They didn’t include the workforce in this shift. Now, 30 years later, we’ve got so many people excluded from the workforce. So many people taking drugs and having no place to go and not being part of this future, and at the same time, we’ve got businesses who are complaining, “Hey, we don’t have a skilled workforce.”
Right, right, right. Absolutely.
”Where are our skilled workers?” This is the profound irrationality that we’ve seen developing in the system.
So why back then did it happen? I’m sorry to dwell on the past, but I think it’s very important to set the table of why we got here. Is there one thing that struck you at the time, an example of that?
Of the exclusion?
Well, I mean, it was ubiquitous. Honestly, Kara, there wasn’t a single company. I had researched companies all over the world going through this transformational process, and I rarely found a company that was taking seriously the deep-seated need for a new level of educational effort, and inclusion, and public/private collaboration around that. The bigger story here is, you know, I write about this in the new book, the neoliberal paradigm, the idea that we’re moving into — this is back in the 1980s — we’re moving into a shareholder value-maximization universe.
Everything is cost-down. Everything is cost-cutting. Everything is automation. Everything is offshoring. Everything is outsourcing. So really, the workforce and the development of a robust, smart, inclusive workforce that was going to carry us way into the 21st century, that was not on anybody’s radar. I really was kind of a voice in the wilderness on that subject. I think that has come back to haunt us now, to haunt our society, to haunt our people, but also, to haunt our competitive capacity.
I do think this idea that people would matter in this equation... I know if you remember a crazy movie, it’s one of my favorite movies, Desk Set with Katharine Hepburn and Spencer Tracy, where he ...
It’s a wonderful movie about that issue. It’s very much ...
For another century. Absolutely.
... For another century. They were all researchers, and they knew all the facts, and then the computer came in and would spit out the fact instantly, and so they could replace the entire thing. It was such a tricky little movie, actually.
It was trying to be light and bright, but it wasn’t at all light and bright. In the end, the humans prevailed, but they didn’t. You’re sort of like, “That’s going to last for five seconds.” They had a great cast; Joan Blondell, I think, was in it.
But the concept that I always think about when I first start to see these things, and especially saw ... I worked at a newspaper and I saw them around classifieds. I covered retail and I was like, “These people don’t need to advertise in retail.” You know? “They don’t need to advertise.” Classifieds are static, expensive. The person who’s taking the ad is a jerk, and they don’t work. These were the things. Your whole business model is terrible, and classifieds don’t [work]. The classifieds online do.
I thought that, say, a $70 million business — the San Francisco Chronicle, for example — would be collapsed to seven and never go back. It wasn’t going to be 70. Seven was the amount. What I kept thinking was that every single thing that could be digitized would be digitized. Of course it would be digitized, there was no question about it, and therefore so many jobs would be eliminated. So many people would not be able to be trained properly, and figuring out how to train them was really difficult. Very difficult to do, unless someone was really paying attention.
Yeah. And someone who’s paying attention, if they’re willing to spend some money.
Looking toward the future and all the things that we’re supposed to count on our institutions to do.
Right. Which they didn’t do.
Which they didn’t do, largely because of this ideology that swept in and ...
Maximized shareholder profit.
And now finally, again, all these years later, decades later, we’re finally getting the critiques of exactly the destruction and devastation wrought by that shareholder value-maximization paradigm that has had everybody by the throat for so long and ... Well. Okay.
No. Go ahead.
Well, I was just going to say that that’s kind of the segue to the new work in a way too, because that paradigm scraped the life out of so many of our institutions and our businesses, you know? So now, whether you’re trying to deal with an insurance company, or a telephone company, or an airline, or your health care provider, or even the school system, these institutions have been scraped to the bone, and it’s so hard for us to get the information and the support and the relationship and all the things that we’re looking for, let alone the voice and the influence. So the institutional world has become a very impoverished, frustrating place for most people, unless you’re super wealthy and you can buffer yourself from these things.
That’s really what drove us to the internet. You know, back in the day, in the late ’90s when the World Wide Web broke on us and ...
We called it the World Wide Web, remember?
You bet. And we rushed there really, looking for the succor, the voice, the influence, the information, the connection that we couldn’t get in these hierarchical silos that were just now cost-down. You know, you get seven minutes with your doctor, and you know?
And so forth. So we went to the internet looking for what had been taken from us in the real world, and for a while, that promise really was alive. You know? That you could get information that had been siloed away.
Right. From government or whatever.
You could contact people up in a hierarchy that would never pay any attention to you, and you could create connection and networking that ...
Whether it was medical, whether it was anything.
Exactly, and find like-minded people, or people with a similar ailment, or people trying to tackle a similar problem. The promise of empowerment and democratization was real for a few years. The way I read that history was that there came a moment, and I write about it, that a lot of it had to do with the financial emergency of the dot-com bust and so forth, where that began to turn. That’s when surveillance capitalism was discovered, invented, stumbled into.
That’s when Google was invented.
It happened at Google.
Well, Google was invented out of the bust. That was when it got ...
The Google that we know came out of the bust, and — to build on that vocabulary — it came out of the bust gangbusters because it had discovered surveillance capitalism. It had discovered this economic logic, and that’s what saved it, and that’s what spread from there. So, there was a window when our hopes and the promise of the digital milieu of a new information civilization, which reintegrated these principles of the individual and the democratization and so on, there was reality there.
But that window slowly closed. It was closing even before we knew it. We still were thinking that it was this one thing when it was already turning into something very different.
You know, the idea of it was that idea of reachability. It was a sort of a Star-Trekian version of ... You know what I mean? That we all shared information freely, and that it was easy to reach people, that you could connect.
I will never forget going to AOL. That’s got to be in the ’90s, ’96. No, earlier than that — ’94, ’95. ’94. There was a bunch of quilters there who had met online, on America Online, and they had made a quilt all together. It was such a metaphorical thing, with a big AOL symbol for Steve Case. They wanted to meet him because he had a personality with them online. They’d never met him. They had never met each other, but they had created this thing together.
I remember thinking, “What a hopeful idea. It’s a silly quilt, but at the same time, what a wonderful connect...” These people connected from all across the country. They brought cookies, and they petted him, and everything else. It was really like a moment. Like, this is a possibility. It was silly, but it was also very profound. I remember thinking, because it was about cooperation, and across borders, and across geographies, and across loneliness, all kinds of things, and it was fascinating. And then: no. You know what I mean? No.
A funny thing happened on the way to the forum.
On the way. That’s right. Exactly. That was a good movie. Talk about what happened, and how you sort of coined this term, which again, I love.
Thank you. Well, the way I tell the story … surveillance capitalism, like mass production capitalism, was invented at a time and place. You could say invented, discovered, cobbled together, trial and error, experiment. But the thing is, it was a human thing, and it was discovered, invented, elaborated, in a moment of emergency in Silicon Valley with the bursting of the dot-com bubble, a lot of pressure on all those young startups, all those fledgling companies. Google was right there. It had the best search engine, it had some of the smartest people, these brilliant founders, great values, and have ...
Allegedly great values. I was there.
Well, publicly stated great values. I can’t opine beyond that.
It was right after they got their first plane that everything fell apart, but go ahead.
Okay. Well, you’re the ...
I remember that. I was like, “Oh. I see.”
You’re the onsite reporter, girl, so.
No, I just have to say. When I saw that first plane, then they had several.
Tell the story.
I was like, “Oh, oh dear. It’s done.”
I hear you, I hear you. So, what happened was, even though it was widely understood that they had the best search engine, even they were now under tremendous financial pressure, and even their very swanky venture capitalists were threatening to withdraw support. So, long story short, they went through a dark night of the soul. They had been very public about rejecting online advertising as a disfiguring force, both in general on the internet and specifically for their search engine.
They did like the purity of it at the beginning, they really did. They really did mean that, and I do remember there’s a story in Fortune called “Chaos at Google.” I remember them doing the O’s with “chaos” in there. And I remember thinking, “Oh dear, now they’re going to have to ...” You know, there was pressure, you’re right 100 percent.
Go over. Yeah.
So, you know, this kind of pressure really changes the situation for people, and they’re not the only ones who have experienced this kind of thing. But you know, then you got to make some tough choices. And essentially, what they did was declare a state of exception.
That state of exception is a powerful concept. You get to suspend your principles. In politics, you get to suspend the parliament, suspend the congress, suspend democracy in order to operate under emergency. So, they declared a state of exception. And at that point, there was already a situation where they knew that they had a lot of collateral behavioral data that was left over from people’s searching and browsing behavior. The data was set aside, considered waste, not adequately stored or organized. So people had been fooling around with it and understood that it had a lot of predictive value.
Under the state of exception, what they decided to do was use these data logs, “data exhaust,” for their predictive power. Combine those with their already frontier computational capabilities. And even in those days, they were calling it AI. You know, AI is a moving target, as you know better than anyone. In every era it’s AI, but it keeps changing.
So, combine these unused data with their computational capabilities, and use that to predict a piece of future behavior — a future human behavior. In this case, where someone was likely to click. And what they were going to do is now sell this to their advertisers. Coming out of the black box, a product, a computational product that predicts this little piece of human behavior, where someone is going to click. So those online advertising markets suddenly were transformed.
Not just advertisers figuring out keywords and where to place their ads. Now they’re transformed into a different kind of market. These markets, if you just zoom out a tiny bit, what you see is that these markets are now trading in behavioral futures. They’re trading in these tiny products that predict future human behavior. Again, specifically here, click-through behavior.
So now we have a logic where surveillance capitalism is unilaterally claiming private human experience. Because of course, the folks who are searching and browsing didn’t know that they were exuding these collateral data, or that those data were being saved.
Right, which they were. Because they would put them up on the wall at Google. If you’ve ever been there early in the day, they have the scrolling queries.
That’s right, in the lobby.
And then you would watch them. And you could see that it was so valuable, it was like gold going ... and they spun it into gold, really.
They spun it into gold. That’s exactly what they did, Kara. And in fact, there’s stories about Larry Page actually worrying about that, that scrolling display in the lobby, that it gave away too much of exactly how intimate and how insightful and how personal these flows of data were.
So, the logic here becomes, unilaterally claiming this private human experience for a market dynamic. Now we’re taking it into the market. Once we take it into the market, it comes out the other side as behavioral data. We combine that behavioral data with computation. And out of that we produce these prediction products that tell us what you are likely to do now, soon, and later.
Right. And as they add more data into it, like location, or whatever you do. I used to call it to them, a database of human intention. You now have the database of human intention.
Okay girl, well then you got it in one.
It was fascinating.
This is the database of the human future. And those online targeted ad marketplaces were the first precursors really of what has become the dominant form of information capitalism in our time, where we are trading futures in human behavior. That has become how surveillance capitalism rose to dominance, how it makes its enormous revenues, how its companies have earned their market caps and become the largest, most powerful companies on earth. By convening these markets for business customers — not for us — customers that want to know what we are going to do in the future.
And all of the economic imperatives now that define surveillance capitalism are aimed at, how do we get better and better prediction products? How do we win the most lucrative prediction products, so that not only are we predicting the future, but really increasingly, our prediction products are equal to observation. Because ultimately, as you just mentioned Kara, first we go for scale, we need a lot of data. Then we go for scope, we need all different kinds of data, out from the online universe into the real world.
Where we’re going, all the sensors, all the cameras, all the devices, all the internet of things. Then we’re going deep into personality, emotions, facial recognition, voice. But then finally, we’re going beyond scale and beyond scope to something I call action, economies of action. How do we actually intervene in the state of play to shift, modify, tune, herd your behavior ...
To where we want it.
... toward our guaranteed outcomes, our guaranteed commercial outcomes. Because the more we can do that, the more powerful the predictive data.
Which was the premise of advertising in sort of a spray-and-pray method in the old days. Like, “Oh, this ad will make you want to use Kodak.” But it was very ...
Yeah girl, but without the digital. Now they’ve got a digital architecture of intense, detailed knowledge that is unprecedented in human history. Which also means an intense kind of power. What is this knowledge that has never existed before? And what is the kind of power that accrues to them with that knowledge, from all this ubiquitous architecture that allows them to know so much about us? What is the kind of power that accrues to that, that allows them to now use this architecture as a global means of behavioral modification, actually, to tune in ...
That is used in some places that way.
... yes, to tune and herd and shape us with methods that are designed to be out of our awareness.
Right. That’s exactly what I was just talking about, it’s that you don’t understand it and you shouldn’t have to understand it. You don’t understand why a car is unsafe. You don’t need to be an engineer to understand that you should be protected in that way. And what they do is, they force you to do ... I was saying this to Dorsey on the thing, I’m like, “You say we’re sick and then you force us to cure ourselves when you created the illness.” It was just fascinating, and they’re all like, “What?”
Which I think, one of the parts of it that I find really is the ... they push away the power they have. They pretend they do not have this power. And then what I began to realize recently and over the last year or two is that they’re incompetent to the task. They don’t have the skills necessary. They don’t have the ethical underpinnings, they don’t have the knowledge about society. They don’t have the emotional quotient to do it. The whole thing is so abstract that they can’t even begin to get what’s happening. The question ... talk about how you came up with the idea of surveillance. Because surveillance is a very … heavy word.
Loaded, it’s heavily loaded. It reminds one of China, surveillance, watching, spying, things like that. Talk about how you coined this term.
Because I think it’s completely appropriate, but talk about that.
All right, yeah. I hear you, and that’s a really good question. And I want our listeners to know that it’s not hyperbolic.
Mm-hmm. No, it’s not.
Yeah. And it was very intentional. Because, you know, think about the term mass production capitalism, which historians have used a lot, or later, managerial capitalism, which historians have used a lot. These adjectives that modify the capitalism, what they’re doing is, they’re pointing to the pivotal piece that is the value creation hub, that critical success factor for value creation that defines this unique market form.
So for mass production capitalism, it was the mass production system that was the source of value creation. In contrast to, say, mercantile capitalism. For managerial capitalism, it was the whole professional managerial hierarchy, the administration, all of that, that created the value that drove the economies of this new capitalism and made it so successful.
So when you’re saying surveillance, someone — I think it was Roger McNamee — said the other day, “Capitalism is like chicken. You can make it taste like anything.” And you add whatever the special factor is. And in this case, surveillance is it.
Well, what happened in this discovery process was, they realized that there were behavioral data all over the place that had tremendous predictive value. And it was more data than they needed to improve their products and services. So it was surplus data. So, how are we going to get this surplus data? Because people aren’t giving it to us.
Or, if they give it to us it’s by accident and they don’t know we’re taking it. If we ask them for it, they’re not going to give it to us. Because really, every piece of research going back to the early 2000s shows that any time you tell people about these practices of taking their experience, turning it into data, using it to project and so forth, nobody wants any part of it. As you said a moment ago, everybody wants security, everybody wants to be protected from it. Nobody wants to be part of this.
Though they like free things. But go ahead.
Well, that’s another story. They understood early on that if they’re going to get this surplus data, they had to do it surreptitiously. They had to do it through what I call the social relations of the one-way mirror. To take without asking. And early on, you look at many of those early patents and you see the scientists actually defining in a very positive way, “We can get data that people did not intend to disclose. We can get data that people don’t even know they disclosed, because we can fit together different bits and pieces and make deductions and inferences. Therefore, we can come up with profiles and insights and patterns about individuals and groups and so forth that people don’t even know they’re giving away and did not agree to give away.”
So from the beginning, for this thing to work to get that behavioral surplus, they had to do it secretly. They had to do it backstage. They had to do it with mechanisms that were designed to keep us ignorant, designed to bypass our awareness.
Mm-hmm. And then call it a black box.
Well actually, better yet, don’t call it anything.
Don’t call it ... right, right.
It’s like ...
“No, they’re doing it over there.”
Well, “We’re not doing anything, what are you talking about?”
Or it’s, “If we give you this map and you turn it on, you will have an even better experience.” And I’m like, “No.”
Right, so what they’re ...
And even using it, I’m disturbed, you know. And I don’t turn on any of the saving functions.
You know, one time I took a few weeks off and I got together all the manuals I could find that great magicians had ever written to describe their craft and how they actually pull off these incredible tricks. And what I learned from that was, the key pivot for a great magician is the idea of misdirection.
Right. So boom, I’m over here, you’re over there. Your eyes are there, I’m working over here. And then going back to look at the rhetoric and the practice of surveillance capitalists right from the beginning, it’s so clear that misdirection has been an essential piece of this: “We’re giving you free services. And we’re connecting the world, we’re making a community, and you can search for everything, democratization of knowledge.”
It’s not that some of that isn’t true, it’s just that it’s misdirecting us to this piece of the iceberg, when the whole other part of the iceberg is underwater, unavailable, uninspectable, obfuscated, intentionally hidden. And you know, fast-forward, 2012, 2013, the scholarly write-ups about the Facebook emotional contagion experiments. Where the smart people, they’re researchers from Facebook and academics, they write about the outcomes of this research in which they discovered that they can use subliminal cues online to manipulate offline behavior.
Online, we can do something that changes you enough to actually change your behavior in the real world. This is a very big deal. In the scholarly write-up they brag about this. They say, “Now we know that we can use the online medium to change behavior in the real world,” and they boast very clearly, very explicitly, “And we can do this bypassing the individual’s awareness.”
That is a critical success factor to this entire economic logic.
Right, you have to not know why you’re pushing that red button, but they make you.
Ergo, surveillance capitalism.
We’re here with Shoshana Zuboff, the author of The Age of Surveillance Capitalism. She’s a professor emerita at Harvard Business School and has written lots of books about technology and economics. And we’re just talking about this idea that they’re sneaking around, I mean, pretty much they’re sneaking around, and we don’t know what they’re doing, and we’re agreeing to it, tacitly, by not doing anything. Or being taken advantage of.
Which way do you look at it? Because I think people do accept ... you know, they accept, especially because they’re enormous companies. I was just talking to someone this week when Eero was bought by Amazon. I have Eero in my house. I like it. It’s a mesh network. My kids like it because it makes their whatever, Red Dead Redemption 26 work better.
It was bought by Amazon and I remember thinking, “Oh God, they got into my house.” I didn’t let any of them into my house and I like this mesh network. Or I had a Ring thing in the front of my house and Amazon bought that and then Google’s Nest was in my house and I had to take it out.
You’re going down, girl.
I know, they’re gonna get me. I don’t know what they’re gonna do.
They got you.
But they don’t. Interestingly, my kids unplugged the Nests. Like, “We don’t want them watching us.”
But they’re good products. They’re cool.
The temperature products are good.
They were before the economic logic hijacked them.
Right, exactly. Like, hey, it’s great to be able to manipulate your temperature on an app. Great. What a great product.
But then I realized the other day, they’re watching my temperature. I don’t know what use that is but there’s some use to it. There’s some fascinating use. What do we do? Because even, I literally am thinking they have me coming and going, and I’m pretty aware of this stuff.
I know they’re sneaky bastards. I got that. I know about them and then they ...
If anyone’s had the close-up, bird’s-eye view, it’s you.
The worst part, I think, when I talk to them, is that they don’t think they are. I’m like, “Are you lying? Or lying to yourselves?” It’s a really weird ... “I don’t know how this happened, Kara, I don’t know how we have all this data, I don’t know how we misused it.”
Then you get sort of essential bullshit from people like Mark Zuckerberg who’s like, “What we wanna do is bring you relevant ads,” and I’m like, “Said nobody to anybody ever. I do not want those.” Maybe I do, but not really. It’s not something I requested. What do we do?
Part of what you’re talking about here is the misdirection, the romance.
Weaving this romantic fantasy about it.
You like a magician. Who doesn’t like a magician? Who wants to see the girl ...
Reconnect you or relevant ads and we’re the new church.
But look, this is ...
I hate the word “relevant ads,” but go ahead.
This is economic history. This is big-time flows of capital. These are corporations. I think there are a couple of important things for our listeners to know. One is that there are some of what the philosophers call category errors that have been foisted upon us.
One is that this is how the digital works. Everything that we’re talking about here, this is just a consequence of digital technology.
It sucks up information.
You want the digital, this is what you get.
That is absolutely dead wrong.
Yeah, they can turn it off.
Right. Dead wrong. We know that there were wonderful models and reports and projects and early developments, the smart home, before surveillance capitalism became public when Google IPO’d in 2004 and we began to actually see this economic logic at work.
The whole idea was a simple closed loop. You got devices in the home. Those devices are producing useful information for the occupant of the home. Simple closed loop. Two nodes. The devices and the occupant. It’s the occupant that gets the data, it’s the occupant that decides what it means, with whom to share, and so on and so forth.
You fast-forward, you brought up the Nest thermostat. Analyses of the Nest thermostat now show that any vigilant consumer who’s got one needs to review a minimum of 1,000 privacy contracts because Nest is a hub for all these smart devices. Each one siphons your data to third parties and third parties and third parties in infinite regress.
This is an economic logic that is like a parasite just glommed onto the digital milieu and hijacked it in a completely different direction. What is this direction? We’re in the beginning of the 21st century. One of the things that I think is so important for us to think about is that we’re talking about … When we talk about surveillance capitalism, just as industrial capitalism gave us the culture and the quality and the moral milieu of our industrial society and our industrial civilization, right now surveillance capitalism dominates, and if we don’t stop it, it’s going to define the moral milieu and the culture and the nature of 21st century society.
Right now, what that looks like is an extremely unequal society where ... In an information society, we shift from an emphasis on labor and the division of labor as the key thing that organizes us, to learning and the division of learning as the key thing that organizes us. Who gets to know stuff? Who decides who gets to know stuff? Who decides who decides who gets to know stuff?
It all goes to Mark Zuckerberg, but go ahead.
These are the dilemmas of knowledge, authority, and power that define our 21st century society. Right now, surveillance capitalists sit on a huge asymmetry of knowledge. They have an asymmetry of knowledge, a concentration of knowledge unlike anything ever seen in human history.
And with that knowledge comes, as we’ve talked about before, the ability to actually shape and modify our behavior to tune us and herd us toward their commercial outcomes. This is now a new axis of social inequality that’s not only economic inequality — which is still critically important — but also knowledge inequality and the inequality of decision rights, the inequality of our capacity to be autonomous and self-determining, the inequality of human agency.
We have an institutional disfiguring of these huge asymmetries of knowledge and power which are antithetical to democracy.
You cannot have a well-functioning democracy with massive inequalities of knowledge and power. That’s eroding democracy from the big institutional level, but now from the individual level, from the inside out. The fact that our autonomy is compromised, that these things are happening outside of our awareness, that they can take hold of our behavior and shift it and modify it in ways that we don’t know.
And make it very noisy.
This is eroding our moral autonomy, our ability to claim our future for our own agency, for our own decisions, for our own choices, our own promises of where I wanna go and how I wanna get there.
Essentially, we’re stupid from the top and we have no choice and we’re being spied on from the bottom.
And being pushed around without our knowledge. Stupid and manipulated is what you’re saying.
These qualities of moral autonomy and individual sovereignty, these are the elements that are the constituent forces of democracy. You can’t imagine a democratic society without imagining people who have these qualities.
We’re getting eroded from the inside and from the outside and when we see something like Cambridge Analytica, which has been a big “aha” for a lot of people all over the world, what we see is this erosion in play, using exactly the methodologies of surveillance capitalism, just slightly pivoting them toward political outcomes instead of commercial outcomes, using them to change our behavior. And the only way they can do that is by mustering these huge asymmetries of knowledge, turning that into power to intervene on us and modify us and control us and manipulate us and undermining our individual sovereignty.
What do we do? We only have a few more minutes. What do we do? Regulation, what happens? What has to happen?
We’ve got sort of three big categories of what we do. No. 1, we need a sea change in public opinion. We need to wake up. We need to name what’s going on, we need to grasp it, we need to understand it. They have been allowed to develop in this direction for the last 20 years as democracy has slept. They have been unimpeded by law, unimpeded by regulation. That has to change.
And the way that’s gonna change is a sea change in public awareness. The outrage, the sense of intolerability, this is not okay. As we become aware as a public, we’re putting pressure on our democratic institutions. We need new law, we need new regulatory regimes that interrupt and outlaw the key mechanisms of surveillance capitalism, including the very principle of taking human experience unilaterally and translating it into data. Including the very question: do we want a dominant capitalism that trades in behavioral futures?
Is that the way we wanna make money in the 21st century? That’s No. 1.
No. 2, we need new forms of collective action. In the 20th century, we had collective bargaining, we had the institution of the strike, we had people coming together to create power, to balance capital. We need to do that now beyond the economic domain. We’re just called “users,” but we’re not just users. We have political, social, and psychological vested interest in what’s going on and in the possibility of a free and democratic future.
It’s interesting. “Users” is a term otherwise only used for drug addicts.
Think about it.
It’s their name for us, not our name for ourselves.
I had the most incredible meeting. I think it was Van Jones, speaking in front of a group of young African American kids in a church. I wouldn’t have said this, but he did, and it was really amazing. He said, “How many of you download stuff from the internet?”
And they said, “Oh, what a stupid old man. Yeah, of course we do. Everybody does.” Then he says, “How many of you upload things to it?” And they were like, “What?” And he goes, “You’re all digital sharecroppers.” It was an astonishing thing to say in front of ... But he was right.
You are being used by the powers that be to till their land. Your land is now their land and your information is now theirs. It was really an eye-opening moment for me and I was sort of like ... And then the kids of course got it. Like, “Oh. If we’re not part of the ownership of it, we are being used.” It was really fascinating.
What we are is the free source of raw material for this whole economic logic.
We’ve got changing public consciousness, outrage, intolerability, mustering democracy, new law, regulation, intervening, outlawing. We’ve got new forms of collective action.
And a third critical piece is the opportunity for competitive solutions. We get the new companies, the right companies, the new leadership to create the new ecosystems and alliances that really provide an alternative trajectory to a digital future, the kind of place that we wanted in the first place. The kind of place that is human. That we can call home.
And the tools are useful.
And the tools are for us, not for them. The knowledge is for us, not about us. If we get that new competitive solution, we’ve got ... Those new competitors, literally, Kara, have an opportunity to have every single human being on Earth as their customer.
Because there is no one on Earth who voluntarily wants to tangle with surveillance capitalism. They have foreclosed the alternatives. They have hijacked the internet. They have hijacked the digital milieu. They have hijacked our homes, our cars, and our bodies.
This is not okay. This is not how it’s supposed to be. It’s not healthy capitalism. It’s not a healthy 21st century society and it’s a deadly, deadly recipe for human freedom and for democracy. This is not the future we want for our kids.
You’re speaking my religion. But let me just end on this. I did an interview with Mark Zuckerberg, and one of the things he put forward and I was hammering him on all the things, these kinda things, saying exactly, not as eloquently as you have, but I was hammering him.
One of the things he said, well, you know, what they’re doing in China, they’re doing all this surveilling, this facial recognition, this and that, I’m thinking “you’d love to do that, Mark Zuckerberg.” But he essentially was putting out the term, it’s either Xi or me.
If we aren’t running the internet, if you constrain us big companies, you get the Chinese internet, where they do do facial recognition, where they do allow social scores and things like that. I was thinking when he said this, I’m like, “I don’t like either choice. I don’t like you, I don’t like China, I don’t like any of it.”
Once again, we’re back to misdirection, Kara, because ...
I thought so.
What that statement does is that statement has given up on democracy.
Some people may think ... These folks, the surveillance capitalists, think that we can substitute computation for democracy. Computation for politics. That’s what the Google City is: substituting computation for politics.
I believe in democracy. I believe that the values of the Enlightenment, in the arc of human history, these values were produced five minutes ago. That humankind has sacrificed for millennia in order to get to the ideas of human autonomy and individual sovereignty and democracy, that the demos can regulate itself, that we cannot let go of these ideas.
Every generation has to step up to the responsibility to reclaim, to fight, to resuscitate, to maintain the flourishing and the growth and the deeper rooting institutionalization of these ideas. We cannot let this go. Mark has already let it go. He’s a cynic on democracy, but I’m not. I don’t think you are.
No, not me.
And I don’t think most of our listeners are.
Yeah. Shoshana, this was fantastic.
It was great talking to you. Thank you for coming on the show. I urge you to read this book. It’s called The Age of Surveillance Capitalism: The Fight For A Human Future at the New Frontier of Power. It’s critical that we think of these issues and thanks to you all for listening.
This article originally appeared on Recode.net.