From the very first page, I was consumed with an overwhelming imperative: everyone needs to read this book as an act of digital self-defense.
Amy Goodman interviews Shoshana Zuboff, Democracy Now!, March 1, 2019
Corporations created a new kind of marketplace out of our private human experiences. That is the conclusion of an explosive new book that argues big tech platforms like Facebook and Google are elephant poachers, and our personal data is ivory tusks.
AMY GOODMAN: “Facebook planned to spy on Android phone users, internal emails reveal.”
That’s a headline in Computer Weekly.
“You Give Apps Personal Sensitive Information. Then They Tell Facebook.”
That’s from The Wall Street Journal. Those are just two of the headlines this past week.
This comes as a new report in Britain calls Facebook “digital gangsters.”
The book is titled The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power.
Its author, Shoshana Zuboff. She writes, quote, “At its core, surveillance capitalism is parasitic and self-referential. It revives Karl Marx’s old image of capitalism as a vampire that feeds on labor, but with an unexpected turn. Instead of labor, surveillance capitalism feeds on every aspect of every human’s experience.”
Shoshana Zuboff is professor emerita at Harvard Business School. She joins us now for the rest of the hour.
Welcome to Democracy Now!
SHOSHANA ZUBOFF: Thank you so much, Amy.
AMY GOODMAN: It’s great to have you with us.
SHOSHANA ZUBOFF: Thank you. It’s a privilege.
AMY GOODMAN: OK, well, let’s start at the beginning. Define “surveillance capitalism.”
SHOSHANA ZUBOFF: Surveillance capitalism departs in many ways from the history of market capitalism, but in a fundamental way it is continuous with that history. We know that capitalism has evolved by taking things that live outside of the market, bringing them into the market dynamic, transforming them into commodities that can be sold and purchased. So, famously, industrial capitalism claims nature for the market; it is reborn as real estate, as land that can be sold and purchased. It claims work for the market, reborn as labor that can be sold and purchased.
So, surveillance capitalism continues this tradition, but with that dark twist. In our time, surveillance capitalism claims private human experience for the market dynamic as a free source of raw material that is translated into behavioral data. These data are then combined with advanced computational abilities to create predictions—predictions of what we will do, predictions of our behavior, predictions of what we will do now, soon and later. And these predictions are then sold to business customers in a new kind of marketplace that trades exclusively in human futures.
This was first invented in the context of online targeted advertising at Google back in 2000, 2001, in the teeth of financial emergency during the dotcom bust. But the same economic logic has now traveled not only from Google to Facebook and throughout the tech sector, but now throughout the normal economy into virtually every economic sector.
AMY GOODMAN: So, comment on these last few headlines, just of the last week. I mean, for example, the report in Britain that calls Facebook “digital gangsters.”
SHOSHANA ZUBOFF: Well, since just about a year ago now—we’re coming up on the 1-year anniversary of the Cambridge Analytica revelations. One of the consequences of those revelations is not only that a lot of us all around the world have been put on alert that all is not well in the digital realm—that’s one thing—but a second thing is that, at least in the U.K., the government has taken this very, very seriously. And there’s been a parliamentary committee investigating Facebook.
This committee was able to get leaked documents, secret documents, from Facebook that had not been reviewed by the public. And just last week, they issued their 108-page report. It’s very powerful, very damning. And among other things, they refer to Facebook as behaving like “digital gangsters,” because they have understood that Facebook has been essentially stealing—in other words, as I’ve described, illegitimately taking—our private human experience for its production processes that create these prediction products, which is what they sell and how they make money.
The key thing that I want our viewers to know is that surveillance capitalism doesn’t stop at Facebook. And right now, it’s a hugely positive development that we are looking at Facebook with this kind of scrutiny and perhaps moving to finally regulate this corporation. But that is the beginning, not the end, of our challenge. Surveillance capitalism is an economic logic that includes, but moves far beyond, Facebook at this point in time. And so, we are going to need the social response that addresses, interrupts and outlaws this new economic logic, not just a single company or not just a couple of companies.
AMY GOODMAN: You write, “A global architecture of behavior modification threatens human nature in the twenty-first century.” Explain.
SHOSHANA ZUBOFF: All right. Well, once you understand that surveillance capitalism is an economic logic, it is not the same as technology. This is one of the big lies that has been perpetrated, that these methodologies are the only way that digital technology can work, that there is an inevitablist propaganda that has been fed to us. So we need to pull these issues apart.
We have digital technology, which we believed would be emancipatory, empowering, democratizing. And it still can be. In the last 20 years, it has been overtaken, hijacked by an economic logic whose economic imperatives put it on a collision course with democracy, both from below and from above. One of the things that surveillance capitalists learned is that the most powerful predictions of human behavior come from actually intervening in our behavior, touching our behavior, to nudge, to influence, to tune, to herd our behavior toward its commercial outcomes. And what this has done is made them take hold of the digital milieu, all of the devices, beginning with our phones and our laptops, but the sensors, the facial recognition, the smart dishwasher, the smart television set, the smart car, the smart city.
All of this digital infrastructure now has been taken by surveillance capitalism as a way to nudge and tune and herd our behavior toward its guaranteed outcomes. It does this with subliminal cues. It’s a highly scientific process. It does this in ways that, as it brags, are always outside of our awareness, so that we have no right of combat, we cannot resist, we cannot say no, and we cannot exit. So, this is what I call a global means of behavioral modification, where essentially this great digital architecture, that we built in order to be an emancipatory and life-giving process for us and help us in our lives, has now become commandeered by surveillance capitalism as a means to modify our behavior toward its commercial ends, which is a direct assault on human autonomy, a direct assault on our decision rights, a direct assault on the whole notion of individual sovereignty.
Back in the 1970s, there was a Senate committee that included people like Edward Kennedy and Sam Ervin. These folks met for many months, and they decided that behavioral modification was a pernicious action, that it was a complete defiance of democratic principles. And they decided that no federal money would fund any kind of program based on behavioral modification in prisons, in schools, in hospitals. Today, in the year 2019, we’ve just spent the last two decades where, as democracy slept, the private sector, under the aegis of surveillance capitalism, has been able to command the digital to create a, literally, ubiquitous means of behavioral modification, without anybody saying no, and without most of us even noticing or understanding what has occurred.
AMY GOODMAN: You talk about herding, so let’s go to herds, elephants, where you say that tech platforms like Facebook and Google are elephant poachers, and our personal data is ivory tusks.
SHOSHANA ZUBOFF: Well, we’ve been fed a lot of lies, a lot of euphemism, a lot of misdirection. Those are some of the strategies that have allowed surveillance capitalism to succeed.
One of them is the notion, you know, if it’s free, you’re the product. Right? Everybody has heard that cliché. I confront that head on. Once you understand that we are in the regime of an economic logic, not of technology itself, it’s like going backwards—like we’re in Wonderland.
Now we go backwards through the looking glass, and we come out in a place called reality, where we can start to see clearly. And when we start to see clearly, what we see is, first of all, these services are not free. We think the services are free. But they think that we’re free. We’re their free raw material. We think that we’re the product. But they understand that we are not the product, we are simply the free source of raw material, like those elephant tusks. Everything about us, like what our problems are, what our real needs are, what our real concerns are—everything about us is ignored. They have no interest in us. It doesn’t matter if we are happy or sad. It doesn’t matter if we’re doing well or poorly. It only matters that we do these things in ways that they can scrape the experience and turn it into data.
There are a few other interesting lies here. We think we’re searching Google; Google is actually searching us. We think that these companies have privacy policies; those policies are actually surveillance policies. We’re told that if we have nothing to hide, then we have nothing to fear. The fact is, what they don’t tell us, and what we are forgetting, is that if you have nothing to hide, then you are nothing, because everything about us that makes us our unique identities, that gives us our individual spirit, our personality, our sense of freedom of will, freedom of action, our sense of our right to our own futures, that’s what comes from within. Those are our inner resources. That’s our private realm. And it’s intended to be private for a reason, because that is how it grows and flourishes and turns us into people who assert moral autonomy—an essential element of a flourishing, democratic society.
AMY GOODMAN: We’re talking to professor Shoshana Zuboff. Well, she’s professor emerita at Harvard Business School. She has this remarkable new book, The Age of Surveillance Capitalism. I wanted to get your comment on this latest news headline: “A New York regulator is ramping up a promised investigation of how Facebook gathered sensitive personal information from popular smartphone applications, after a report by The Wall Street Journal revealed many such apps were sending the social-media giant data including users’ body weight and menstrual cycles.”
SHOSHANA ZUBOFF: All right, well, so we’re living in a time right now where every week there are a series of mini-scandals. And this is one of the mini-scandals this past week. There were several, and this was one of them. So, what happens is, we get mobilized around a mini-scandal. If you understand surveillance capitalism and you understand its economic imperatives, that it needs always scale, volumes of behavioral data, that it needs scope, varieties of behavioral data, that it needs the kind of behavioral data that comes from actually intervening and influencing our actions, as we talked about a moment ago, then all of these mini-scandals are utterly predictable as the routine, humdrum, please-pass-the-salt, everyday operations of any self-respecting surveillance capitalist.
So, these apps that The Wall Street Journal researched—and I cover this in depth in the book—just about every app that you download is shunting your data to third parties. Virtually every app is doing that. When you look at those third parties, the two Goliaths among those third parties are Facebook and Google. Most of the sites, the URLs that these data get shunted to, are owned by Facebook and Google. So what this means is that you download an app. Many of these apps are—you know, we use them to help us with our daily life, because we have needs that—you know, we need support. No one’s really helping us with our lives. Certainly, our institutions are not. So, we have apps that help us with our health, that help us with our fitness, apps that help us keep track of our menstrual cycle, apps that help us think about our mental health. This very personal information goes into these apps, but it doesn’t stop there. It is all going to third parties, primarily these Goliaths, Facebook and Google.
AMY GOODMAN: In this—we’re finishing up this segment right now.
SHOSHANA ZUBOFF: Yes.
AMY GOODMAN: Then we’re going to do Part 2 and post it online at democracynow.org. But what surprised you most as you did this research?
SHOSHANA ZUBOFF: At every stage of this research, there were times when I would be sitting in my study—I worked on this book for seven years—I’d be sitting in my study, and I’d start screaming, literally, out loud, often to no one but my beautiful dog, because there were so many revelations for me. I think the biggest one is understanding that we’re entering the 21st century now with a new domain of social inequality. We’ve been focused on economic inequality. It’s tremendously important. We now enter the 21st century, where private surveillance capital has institutionalized asymmetries of knowledge unlike anything ever seen in human history. They know everything about us; we know almost nothing about them.
All right. Well, surveillance capitalism is a further evolution of capitalism that follows in the old pattern of taking things that live outside the market, subordinating them to the market dynamic as commodities that can be sold and purchased—but with a dark twist. Surveillance capitalism unilaterally claims our private human experience as a free source of raw material for its own production processes. It translates our experience into behavioral data. Those behavioral data are then combined with its advanced computation capabilities, what people today refer to as AI, machine intelligence.
AMY GOODMAN: Artificial intelligence.
SHOSHANA ZUBOFF: Artificial Intelligence. Out of that black box come predictions about our behavior, what we will do now, soon and later. Turns out there are a lot of businesses that want to know what we will do in the future. And so, these have constituted a new kind of marketplace, a marketplace that trades exclusively in behavioral futures, in our behavioral futures. That’s where surveillance capitalists make their money. That’s where the big pioneers of this economic logic, like Google and Facebook, have become so wealthy, by selling predictions of our behavior, first to online targeted advertisers, and now, of course, these business customers range across the entire economy, no longer confined to that original context of online targeted advertising.
AMY GOODMAN: So how do you protect yourself?
SHOSHANA ZUBOFF: So how do we protect ourselves? All right. First thing, all of this has rooted and flourished in the last two decades, while democracy slept. And the question is: How did they get away with it? There are a bunch of answers to that question; I go into about 16 explanations for that. But there are a couple that are right at the top of the list, that we should talk about.
One is our ignorance, because key to this whole methodology, and why it’s called surveillance capitalism, is that all of this is conducted in secret. All of this is conducted through the social relations of the one-way mirror, ergo surveillance. The vast amounts of capital that have been accumulated here are trained to create these systems in a way that keeps us ignorant. Specifically, the data scientists write about their methods in a way that brags about the fact that these systems bypass our awareness, so that they bypass our rights to say yes or no, I want to participate or I don’t want to participate, I want to contest or I don’t want to contest, I want to fight or I don’t want to fight. All of that is bypassed. We are robbed of the right to combat, because we are engineered into ignorance.
So, what I’m saying is, first of all, if we are going to defend ourselves, we have to start by understanding what the heck this is. We have to start by naming it. Naming is power. Understanding is power. Getting past this ignorance is power. That’s the first step. Once we understand that we’re dealing with an economic logic, not digital technology—digital technology is perfectly easy to imagine without surveillance capitalism; impossible to imagine surveillance capitalism without digital technology. So we’re talking about the puppet master here, not the puppet. The digital is merely the puppet. Surveillance capitalism is the puppet master.
So, once we name and understand that this is an economic logic, then it’s our job, as citizens of democratic societies, to use our new understanding to summon the resources of our democratic institutions, to insist that our elected officials now go beyond naming to actually interrupt and outlaw these practices. Do we really want to be living in a society where the dominant form of capitalism is one that makes its money by trading in human futures? Because the consequence of that kind of business logic is on a direct collision course with democracy, in two ways.
First, it must take on human autonomy. It must—it must cast human autonomy as its enemy, because human autonomy means friction. It’s harder to take our experience, it’s harder to influence our behavior for the very best sources of predictive behavioral data—it’s harder to do that if we know what’s going on and we find ways to resist. So, it is against autonomy. It is against individual sovereignty and our decision rights over our own experience. That eats away at democracy from below, because we cannot have flourishing democratic societies without individuals who understand themselves as a moral center of critical thinking and autonomous action. That’s number one.
Number two, it assaults democracy from above, because it means that we enter the 21st century with a new kind of institutional paradigm that introduces extreme inequalities of knowledge. Under the aegis of private surveillance capital, we have institutionalized private companies with asymmetries of knowledge unlike anything we’ve seen in human history. They know everything about us; we know almost nothing about them. Their knowledge about us is used for others’ profit, not used to actually solve our problems and improve our lives. So this is a huge asymmetry of knowledge, which also gives rise to a huge asymmetry of power, because from great knowledge comes great power. In this case, the power to actually modify, influence our behavior in directions that are consistent with their commercial purposes. This is a pernicious, corrosive effect on democracy.
By the way, we saw these same methods being used by Cambridge Analytica with those revelations a year ago, with only a tiny difference. All they did was take these same everyday, routine methods of surveillance capitalism, pivot them just a couple of degrees toward political outcomes rather than commercial outcomes, showing that they could use our data to intervene and influence our behavior, our real-world behavior, and our real-world thinking and feeling, in order to change political outcomes.
AMY GOODMAN: And for people who don’t quite understand what the Cambridge Analytica scandal was, explain.
SHOSHANA ZUBOFF: Cambridge Analytica was a private company owned by the plutocrat Robert Mercer, who also funded the Donald Trump campaign. This company learned how to commandeer Facebook data at scale. They purchased access to Facebook data at scale, so that they could use these methodologies of surveillance capitalism to understand individual personality types and tailor subliminal cues and messages to individuals in ways that would manipulate their behavior, manipulate their political attitudes, and actually try to influence their real-world political action and voting behavior.
We know that they were very successful. The forensics are still being explored. We don’t know if we will ever understand completely every aspect of their influence. But we know that they had tremendous influence both in the Brexit vote of 2016 and in the U.S. presidential elections. And we know this because of a whistleblower who was the key architect of this strategy, a young man named Chris Wylie, who, for all of his bad deeds, had the courage to become our civilization’s prodigal son by finally fessing up to what he had done and trying to atone for it by explaining it to our public, to all the peoples of the world, so that we could be put on alert.
AMY GOODMAN: I want to turn to Eric Schmidt. In 2010, then-Google CEO Eric Schmidt spoke at the Washington Ideas Forum about Google’s future vision.
ERIC SCHMIDT: With your permission, you give us more information. If you give us information about who some of your friends are, we can probably use some of that information—again, with your permission—to improve the quality of your searches. … One of the things that eventually happens, in that proceeding line of reasoning, is we don’t need you to type at all, because we know where you are, with your permission, we know where you’ve been, with your permission. We can more or less guess what you’re thinking about.
AMY GOODMAN: Respond, as Google’s former CEO Eric Schmidt lays it out, the do-no-evil Google.
SHOSHANA ZUBOFF: Well, I took about a month off from these seven years that I was writing this book to read every manual that every great magician had ever written about their skill. And it turns out that one of the pivotal skills of a great magician is the idea of misdirection. You direct people’s attention in one way so that you can perform your trick with the other hand.
So, what you’ve just heard here is a bit of artful misdirection: “with your permission,” “with your permission.” That is a cynical lie, because “with your permission” means that you click on that “I agree.” That “I agree” is a box that we all click on because we have no choice, because for everyday, effective social participation, we have no choice other than to march ourselves through the supply chains that are the very channels through which Google and other surveillance capitalists scrape our private experience and turn it into behavioral data. So we have to play in their gardens in order to just get through our day. So, the idea that we are giving permission is one of the big lies. This is a little piece of kabuki here. I’ll be the sun, you be the moon. I’ll say I give you—you know, you give me permission; you say, “Yeah, I agree.” But we all know that it’s a lie.
Eric Schmidt has been quoted with another wonderful piece of misdirection, when confronted about the fact that security agencies were getting access to Google search data for some of their own tasks. One of the statements that he made—I think it was about a year earlier than that statement—he said everybody should understand that, quote, “search engines do retain.”
Another brilliant piece of misdirection. Search engines do not retain. Surveillance capitalism retains data, not search engines. This is another way in which we must separate the puppet from the puppet master, which these guys constantly want to confuse, because they want us to believe this is the only way the digital future could be, and therefore you’ve got to hunker down and you’ve got to get used to it. You have to resign yourself to this inevitability. That is a lie that we must confront and we must resist.
The good news about this, Amy, is that our societies have learned in the past how to successfully confront a rogue capitalism, the excesses of a raw destructive capitalism, and bring it to heel and tether it to the principles of a democratic society and tether it to the real interests of people.
We did it to end the Gilded Age. We did it during the Great Depression. We did it in the postwar era. We summoned democracy, with new law, with new regulatory regimes, with new forms of collective action. Back then, it was collective bargaining and trade unions, the right to strike. But we summoned democracy to stop the violent excesses of capitalism and turn it into some kind of equilibrium, however imperfect, that we could call market democracy.
We’ve done it before, and we can do it again.
AMY GOODMAN: And what would the building blocks of that be? In fact, I want to couple that with, and I know you have to go, but how you protect yourself, because you now know so much about how surveillance capitalism works. What do you do?
SHOSHANA ZUBOFF: Well, you know, I do what most people who are even a little bit informed do. You know, I have something that blocks ads, that blocks tracking, a browser that can scramble my location, and, you know, these various kinds of things. But, you know, this stuff really makes me angry, Amy, because essentially what we’re doing with this stuff is we are finding ways to hide in our own lives. And this makes me furious, that as citizens of the 21st century, as citizens of a democratic society, we have to find ways to disguise ourselves. We have some of our best young artists, who are coming out with, you know, fabrics and camouflage that you put on your face or put on your body, so that when you’re out on the streets you’re protected from facial recognition and voice recognition. This is intolerable. This is not the world I want my children to grow up in.
AMY GOODMAN: Which goes to this issue of how this kind of surveillance policing leads to self-policing, the squelching of creativity, of self-expression.
SHOSHANA ZUBOFF: The self-policing, the chilling effect, as scholars call it, the self-censorship that we’re familiar with online. But now what we’re seeing is we’re self-censoring in our real lives, in the real world, because anyone could have the camera on their phone running. As—
AMY GOODMAN: As we now—
SHOSHANA ZUBOFF: A good example is with Senator Feinstein, where they were able to get the camera running.
AMY GOODMAN: As the kids were confronting her to support a Green New Deal.
SHOSHANA ZUBOFF: As the kids were confronting her to support the Green New Deal. But for most of us, that is an intolerable invasion, that we can be recorded anywhere. And this is happening to young people all throughout their lives. So, young people have no place where they can be off stage, where they can be private and be nurturing and developing the inner resources that we need for the capacity of moral autonomy and individual judgment that is so necessary to a democratic society.
AMY GOODMAN: Which goes back to this whole issue of how do societies protect themselves, because you hold out great hope.
SHOSHANA ZUBOFF: I do.
AMY GOODMAN: I mean, we say the word “regulation,” but what does that mean? What has to be taken apart here?
SHOSHANA ZUBOFF: All right. Well, first of all, it’s not going to be the same regulation that worked a century ago. Right? A century ago, for example, we confronted child labor. We didn’t say, “All right, let’s have a negotiation. A 7-year-old can only work in a factory three hours a day.” We didn’t do that. We said, “No child labor, period. Those children are at home, and they go to school.”
Today, we can’t be talking about things like data ownership. That’s like saying 7-year-olds work three hours a day. Data ownership is after the horse is out of the barn, because so much of these data should not exist in the first place. These are data that are illegitimately taken from our lives, without—despite what Eric Schmidt says, without our permission, because it’s done without our knowledge, so we couldn’t possibly give our permission.
So, we have to have the new generation, that new century, of regulation, that interrupts these specific mechanisms, that says private human experience is out of bounds, is off base, it is not available to be the next virgin wood of capitalism. Private human experience is essential for a democratic society. It cannot be turned into a commodity. That’s number one.
We have to interrupt, and even outlaw, behavioral futures markets, because the consequence of a business that sells behavioral futures is a business that is on a collision course with democracy. There can be no free society in a society dominated by a capitalism that must take our behavior in order to sell predictions of our behavior for the benefit of business customers.
AMY GOODMAN: Were you shocked by how little politicians understand? A younger generation will understand more about the capacity of the digital world and how it can help you and how it can hurt you. But the lack of even understanding? Those that are being regulated—the Facebooks, the Googles, the Amazons of the world—when they are faced with any possible regulation, they write the regulations.
SHOSHANA ZUBOFF: I know that seems a little discouraging. You know, I learned as a young student—I used to sit in on Milton Friedman’s classes to figure out what he was teaching people at University of Chicago, when I was an undergrad. And even as a 17-year-old, I heard Friedman talking about the idea that public opinion today is legislation in 20 years. So there’s a lag.
Let’s say that if we can change public opinion today, it’s not going to take us 20 years to get to these new laws and new regulations, but, you know, it’s going to take us a few years. It’s not an overnight deal. But having that kind of perspective, honestly, Amy, I was not surprised by the sheer, blundering cluelessness of our lawmakers as they interviewed Zuckerberg and other surveillance capitalists.
We have work to do. But this is work that can be done. Our elected officials can be educated. If they don’t want to get educated, we can elect different people. This is stuff that we can fix. I do believe that. You know, even into the 20th century, we still had courts, judges who were making decisions that completely sided with the industrialists, whom we have now renamed as “robber barons.” In time, all of these actions are reinterpreted as history shakes out, and democracy finally finds its way through to the light. And that’s what I believe—that’s the process that we’re in now. This thing is 20 years old. We’re at the beginning, not the end. We name it, we tame it.
That’s the work now, to reignite our democracy, wake it up for this work of the 21st century.