The Age of Surveillance Capitalism: Excerpts
I finally finished reading The Age of Surveillance Capitalism by Shoshana Zuboff, and I must say it’s a masterpiece. During my reading journey, I highlighted the passages I found key so I could later put them together. I highly recommend reading the book to understand the full context of the following excerpts. I transcribed everything by hand from my e-ink reader, so excuse me if there are typos, bugs, or warnings. Feel free to express your thoughts! And fuck copyright, btw. If you want a 100% free copy of the book, ask for it!
Chapter 1. Home or Exile in the Digital Future
III. What Is Surveillance Capitalism?
Competitive pressures produced this shift, in which automated machine processes not only know our behavior but also shape our behavior at scale. With this reorientation from knowledge to power, it is no longer enough to automate information flows about us; the goal is to automate us.
V. The Puppet Master, Not The Puppet
It was in 1912 when Thomas Edison laid out his vision for a new industrial civilization in a letter to Henry Ford. Edison worried that industrialism’s potential to serve the progress of humanity would be thwarted by the stubborn power of the robber barons and the monopolist economics that ruled their kingdoms. He decried the “wastefulness” and “cruelty” of US capitalism: “Our production, our factory laws, our charities, our relations between capital and labor, our distribution—all wrong, out of gear.” Both Edison and Ford understood that the modern industrial civilization for which they harbored such hope was careening toward a darkness marked by misery for the many and prosperity for the few.
Chapter 2. August 9, 2011: Setting the Stage for Surveillance Capitalism
III. The Neoliberal Habitat
The absolute authority of market forces would be enshrined as the ultimate source of imperative control, displacing democratic contest and deliberation with an ideology of atomized individuals sentenced to perpetual competition for scarce resources. The disciplines of competitive markets promised to quiet unruly individuals and even transform them back into subjects too preoccupied with survival to complain.
IV. The Instability of the Second Modernity
A précis of [Thomas] Piketty’s extensive research may be stated simply: capitalism should not be eaten raw. Capitalism, like sausage, is meant to be cooked by a democratic society and its institutions because raw capitalism is antisocial. As Piketty warns, “A market economy… if left to itself… contains powerful forces of divergence, which are potentially threatening to democratic societies and to the values of social justice on which they are based.”
This is the existential contradiction of the second modernity that defines our conditions of existence: we want to exercise control over our own lives, but everywhere that control is thwarted. Individualization has sent each one of us on the prowl for resources we need to ensure effective life, but at each turn we are forced to do battle with an economics and politics for whose vantage point we are but ciphers. We live in the knowledge that our lives have unique value, but we are treated as invisible.
V. A Third Modernity
The digital milieu has been essential to these degradations. [Nancy] Kim points out that paper documents once imposed natural restraints on contracting behavior simply by virtue of their cost to produce, distribute, and archive. Paper contracts require a physical signature, limiting the burden a firm is likely to impose on a customer by requiring her to read multiple pages of fine print. Digital terms, in contrast, are “weightless.” They can be expanded, reproduced, distributed, and archived at no additional cost. Once firms understood that the courts were disposed to validate their click-wrap and browse-wrap agreements, there was nothing to stop them from expanding the reach of these degraded contracts “to extract from consumers additional benefits unrelated to the transaction.”
VI. Surveillance Capitalism Fills the Void
Eventually, companies began to explain these violations as the necessary quid pro quo for “free” internet services. Privacy, they said, was the price one must pay for the abundant rewards of information, connection, and other digital goods when, where, and how you want them. These explanations distracted us from the sea change that would rewrite the rules of capitalism and the digital world.
VII. For a Human Future
“The bankers may not know it,” Fito Montes reflected, “but the future will need the past. It will need these marble floors and the sweet taste of my gypsy cakes. They treat us like figures in a ledger, like they are reading the number of casualties in a plane crash. They believe the future belongs only to them. But we each have our own story. We each have our life. It is up to us to proclaim our right to the future. The future is our home too.”
Chapter 3. The Discovery of Behavioral Surplus
II. A Balance of Power
Finally, people often say that the user is the “product.” This is also misleading, and it is a point that we will revisit more than once. For now, let’s say that users are not products, but rather we are the sources of raw-material supply. As we shall see, surveillance capitalism’s unusual products manage to be derived from our behavior while remaining indifferent to our behavior. Its products are about predicting us, without actually caring what we do or what is done to us.
III. Search for Capitalism: Impatient Money and the State of Exception
Google had relegated advertising to steerage class: its AdWords team consisted of seven people, most of whom shared the founders’ general antipathy toward ads. The tone had been set in Sergey Brin and Larry Page’s milestone paper that unveiled their search engine conception, “The Anatomy of a Large-Scale Hypertextual Web Search Engine,” presented at the 1998 World Wide Web Conference: “We expect that advertising funded search engines will be inherently biased towards the advertisers and away from the needs of customers. This type of bias is very difficult to detect but could still have a significant effect on the market… we believe the issue of advertising causes enough mixed incentives that it is crucial to have a competitive search engine that is transparent and in the academic realm.”
IV. The Discovery of Behavioral Surplus
Advertising had always been a guessing game: art, relationships, conventional wisdom, standard practice, but never “science.” The idea of being able to deliver a particular message to a particular person at just the moment when it might have a high probability of actually influencing his or her behavior was, and had always been, the holy grail of advertising. The inventors point out that online ad systems had also failed to achieve this elusive goal. The then-predominant approaches used by Google’s competitors, in which ads were targeted to keywords or content, were unable to identify relevant ads “for a particular user.” Now the inventors offered a scientific solution that exceeded the most-ambitious dreams of any advertising executive: “There is a need to increase the relevancy of ads served for some user request, such as a query or a document request… to the user that submitted the request. … The apparatus, message formats and/or data structures for determining user profile information and using such determined user profile information for ad serving.” In other words, Google would no longer mine behavioral data strictly to improve service for users but rather to read users’ minds for the purposes of matching ads to their interests, as those interests are deduced from the collateral traces of online behavior.
VII. The Secrets of Extraction
Mass production was aimed at new sources of demand in the early twentieth century’s first mass consumers. Ford was clear on this point: “Mass production begins in the perception of a public need.” Supply and demand were linked effects of the new “conditions of existence” that defined the lives of my great-grandparents Sophie and Max and other travelers in the first modernity. Ford’s invention deepened the reciprocities between capitalism and these populations. In contrast, Google’s inventions destroyed the reciprocities of its original social contract with users. The role of the behavioral value reinvestment cycle that had once aligned Google with its users changed dramatically. Instead of deepening the unity of supply and demand with its populations, Google chose to reinvent its business around the burgeoning demand of advertisers eager to squeeze and scrape online behavior by any available means in the competition for market advantage. In the new operation, users were no longer ends in themselves but rather became the means to others’ ends.
The last thing that Google wanted was to reveal the secrets of how it had rewritten its own rules and, in the process, enslaved itself to the extraction imperative. Behavioral surplus was necessary for revenue, and secrecy would be necessary for the sustained accumulation of behavioral surplus. This is how secrecy came to be institutionalized in the policies and practices that govern every aspect of Google’s behavior onstage and offstage. Once Google’s leadership understood the commercial power of behavioral surplus, [Eric] Schmidt instituted what he called the “hidden strategy.” Google employees were told not to speak about what the patent had referred to as its “novel methods, apparatus, message formats and/or data structures” or confirm any rumors about flowing cash. Hiding was not a post hoc strategy; it was baked into the cake that would become surveillance capitalism.
Chapter 4. The Moat Around the Castle
I. Human Natural Resources
In this future we are exiles from our own behavior, denied access to or control over knowledge derived from its dispossession by others for others. Knowledge, authority, and power rest with surveillance capital, for which we are merely “human natural resources.” We are the native peoples now whose tacit claims to self-determination have vanished from the maps of our own experience.
II. The Cry Freedom Strategy
It is worth noting that an understanding of this logic of accumulation would have usefully contributed to the EU Commission’s deliberations on the WhatsApp acquisition, which was permitted based on assurances that the data flows from the two businesses would remain separate. The commission would discover later that the extraction imperative and its necessary economies of scale in supply operations compel the integration of surplus flows in the quest for better prediction products.
IV. Shelter: Surveillance Exceptionalism
During these years, scholars noted the growing interdependencies between the intelligence agencies, resentful of constitutional constraints on their prerogatives, and the Silicon Valley firms. The agencies craved the lawlessness that a firm such as Google enjoyed. In his 2008 essay “The Constitution in the National Surveillance State,” law professor Jack Balkin observed that the Constitution inhibits its government actors from high-velocity pursuit of their surveillance agenda, and this creates incentives for the government “to rely on private enterprises to collect and generate information for it.”
Indeed, “At Obama’s Chicago headquarters… they [Google] remodeled the electorate in every battleground state each weekend… field staff could see the events’ impact on the projected behaviors and beliefs of every voter nationwide.” Research by media scholars Daniel Kreiss and Phillip Howard indicates that the 2008 Obama campaign compiled significant data on more than 250 million Americans, including “a vast array of online behavioral and relational data collected from the use of the campaign’s web site and third-party social media sites such as Facebook…”
Political correspondent Jim Rutenberg’s New York Times account of the data scientists’ seminal role in the 2012 Obama victory offers a vivid picture of the capture and analysis of behavioral surplus as a political methodology. The campaign knew “every single wavering voter in the country that it needed to persuade to vote for Obama, by name, address, race, sex, and income,” and it had figured out how to target its television ads to these individuals.
Meanwhile, a list of Google Policy Fellows for 2014 included individuals from a range of nonprofit organizations whom one would expect to be leading the fight against that corporation’s concentrations of information and power, including the Center for Democracy and Technology, the Electronic Frontier Foundation, the Future of Privacy Forum, the National Consumers League, the Citizen Lab, and the Asociación por los Derechos Civiles.
Chapter 5. The Elaboration of Surveillance Capitalism: Kidnap, Corner, Compete
III. The Dispossession Cycle
People habituate to the incursion with some combination of agreement, helplessness, and resignation. The sense of astonishment and outrage dissipates. The incursion itself, once unthinkable, slowly worms its way into the ordinary. Worse still, it gradually comes to seem inevitable. New dependencies develop. As populations grow numb, it becomes more difficult for individuals and groups to complain.
The firm wants to enable people to make better choices, but not if those choices impede Google’s own imperatives. Google’s ideal society is a population of distant users, not a citizenry. It idealizes people who are informed, but only in the way the corporation chooses. It means for us to be docile, harmonious, and, above all, grateful.
V. Dispossession Competition
“Google has done incrementally and furtively what would plainly be illegal if done all at once.”
VI. The Siren Song of Surveillance Revenues
Writing in the New York Times, Democratic FCC appointee [Tom] Wheeler went to the heart of the problem: “To my Democratic colleagues and me, the digital tracks that a consumer leaves when using a network are the property of that consumer. They contain private information about personal preferences, health problems and financial matters. Our Republican colleagues on the commission argued the data should be available for the network to sell.”
In another trend, surveillance in the interest of behavioral surplus capture and sale has become a service in its own right. Such companies are often referred to as “software-as-a-service” or SaaS, but they are more accurately termed “surveillance as a service,” or “SVaaS.”
As the Wall Street Journal reports, new startups such as Affirm, LendUp, and ZestFinance “use data from sources such as social media, online behavior and data brokers to determine the creditworthiness of tens of thousands of U.S. consumers who don’t have access to loans,” more evidence that decision rights and the privacy they enable have become luxuries that too many people cannot afford.
Chapter 6. Hijacked: The Division of Learning in Society
I. The Google Declarations
The six declarations laid the foundation for the wider project of surveillance capitalism and its original sin of dispossession. They must be defended at any cost because each declaration builds on the one before it. If one falls, they all fall:
- We claim human experience as raw material free for the taking. On the basis of this claim, we can ignore considerations of individuals’ rights, interests, awareness, or comprehension.
- On the basis of our claim, we assert the right to take an individual’s experience for translation into behavioral data.
- Our right to take, based on our claim on free raw material, confers the right to own the behavioral data derived from human experience.
- Our rights to take and to own confer the right to know what the data disclose.
- Our rights to take and to own confer the right to decide how we use our knowledge.
- Our rights to take, to own, to know, and to decide confer our rights to the conditions that preserve our rights to take, to own, to know, and to decide.
III. Surveillance Capital and the Two Texts
The shadow text is a burgeoning accumulation of behavioral surplus and its analyses, and it says more about us than we can know about ourselves. Worse still, it becomes increasingly difficult, and perhaps impossible, to refrain from contributing to the shadow text. It automatically feeds on our experience as we engage in the normal and necessary routines of social participation.
IV. The New Priesthood
The top five tech companies have the capital to crowd out competitors: startups, universities, municipalities, established corporations in other industries, and less wealthy countries. In Britain, university administrators are already talking about a “missing generation” of data scientists. The huge salaries of the tech firms have lured so many professionals that there is no one left to teach the next generation of students. As one scholar described it, “The real problem is these people are not dispersed through society. The intellect and expertise is concentrated in a small number of companies.”
More than six hundred years ago, the printing press put the written word into the hands of ordinary people, rescuing the prayers, bypassing the priesthood, and delivering the opportunity for spiritual communion directly into the hands of the prayerful. We have come to take for granted that the internet enables an unparalleled diffusion of information, promising more knowledge for more people: a mighty democratizing force that exponentially realizes Gutenberg’s revolution in the lives of billions of individuals. But this grand achievement has blinded us to a different historical development, one that moves out of range and out of sight, designed to exclude, confuse, and obscure. In this hidden movement the competitive struggle over surveillance revenues reverts to the pre-Gutenberg order as the division of learning in society shades toward the pathological, captured by a narrow priesthood of privately employed computational specialists, their privately owned machines, and the economic interests for whose sake they learn.
Just over thirty years ago, legal scholar Spiros Simitis published a seminal essay on the theme of privacy in an information society. Simitis grasped early on that the already visible trends in public and private “information processing” harbored threats to society that transcended narrow conceptions of privacy and data ownership: “Personal information is increasingly used to enforce standards of behavior. Information processing is developing, therefore, into an essential element of long-term strategies of manipulation intended to mold and adjust individual conduct.”
Demanding privacy from surveillance capitalists or lobbying for an end to commercial surveillance on the internet is like asking Henry Ford to make each Model T by hand or asking a giraffe to shorten its neck. Such demands are existential threats. They violate the basic mechanism and laws of motion that produce this market leviathan’s concentrations of knowledge, power, and wealth.
Chapter 7. The Reality Business
I. The Prediction Imperative
Just as scale became necessary but insufficient for higher-quality predictions, it was also clear that economies of scope would be necessary but insufficient for the highest quality of prediction products able to sustain competitive advantage in the new markets for future behavior. Behavioral surplus must be vast and varied, but the surest way to predict behavior is to intervene at its source and shape it. The processes invented to achieve this goal are what I call economies of action.
Now the reality business requires machine-based architectures in the real world. These finally fulfill [Mark] Weiser’s vision of ubiquitous automated computational processes that “weave themselves into the fabric of everyday life until they are indistinguishable from it,” but with a twist. Now they operate in the interests of surveillance capitalists. There are many buzzwords that gloss over these operations and their economic origins: “ambient computing,” “ubiquitous computing,” and the “internet of things” are but a few examples.
Two vectors converge in this fact: the early ideals of ubiquitous computing and the economic imperatives of surveillance capitalism. This convergence signals the metamorphosis of the digital infrastructure from a thing that we have to a thing that has us.
IV. Surveillance Capitalism’s Realpolitik
The message is that surveillance capitalism’s new instruments will render the entire world’s actions and conditions as behavioral flows. Each rendered bit is liberated from its life in the social, no longer inconveniently encumbered by moral reasoning, politics, social norms, rights, values, relationships, feelings, contexts, and situations. In the flatness of this flow, data are data, and behavior is behavior. The body is simply a set of coordinates in time and space where sensation and action are translated as data. All things animate and inanimate share the same existential status in this blended confection, each reborn as an objective and measurable, indexable, browsable, searchable “it.” From the vantage point of surveillance capitalism and its economic imperatives, world, self, and body are reduced to the permanent status of objects as they disappear into the bloodstream of a titanic new conception of markets.
V. Certainty for Profit
Deloitte [Touche Tohmatsu Limited] acknowledges that according to its own survey data, most consumers reject telematics on the basis of privacy concerns and mistrust companies that want to monitor their behavior. This reluctance can be overcome, the consultants advise, by offering cost savings “significant enough” that people are willing “to make the [privacy] trade-off,” in spite of “lingering concerns….” If price inducements don’t work, insurers are counseled to present behavioral monitoring as “fun,” “interactive,” “competitive,” and “gratifying,” rewarding drivers for improvements on their past record and “relative to the broader policy holder pool.” In this approach, known as “gamification,” drivers can be engaged to participate in “performance based contests” and “incentive based challenges.”
VI. Executing the Uncontract
The uncontract is not a space of contractual relations but rather a unilateral execution that makes those relations unnecessary. The uncontract desocializes the contract, manufacturing certainty through the substitution of automated procedures for promises, dialogue, shared meaning, problem solving, dispute resolution, and trust: the expressions of solidarity and human agency that have been gradually institutionalized in the notion of “contract” over the course of millennia. The uncontract bypasses all that social work in favor of compulsion, and it does so for the sake of more-lucrative prediction products that approximate observation and therefore guarantee outcomes.
Chapter 8. Rendition: From Experience to Data
I. Terms of Sur-Render
A December 2013 letter from Google’s finance director to the US Securities and Exchange Commission’s Division of Corporate Finance provides a vivid glimpse of these facts. The letter was composed in response to an SEC query on the segmentation of Google’s revenues between its desktop and mobile platforms. Google answered by stating that users would be “viewing our ads on an increasingly wide diversity of devices in the future” and that its advertising systems were therefore moving toward “device agnostic” design that made segmentation irrelevant and impractical. “A few years from now,” the letter stated, “we and other companies could be serving ads and other content on refrigerators, car dashboards, thermostats, glasses, and watches, to name just a few possibilities.”
Extension wants every corner and crevice, every utterance and gesture on the path to dispossession. All that is moist and alive must hand over its facts. There can be no shadow, no darkness. The unknown is intolerable. The solitary is forbidden.
II. Body Rendition
As we know, once a third party captures your surplus, it is shared with other third parties, who share it with other third parties, and so on.
Chapter 9. Rendition from the Depths
I. Personalization as Conquest
“The team in there now is finding ways to activate commercial intent inside Messenger,” a Facebook executive reported. The idea is to “prioritize commerce-driven experiences” and design new ways for users to “quickly buy things” without the tedium of entering credit card information, flipping pages, or opening applications. Pop-up buttons appear during your conversations with friends whenever the system detects a possible “commercial intention.” Just tap to order, buy, or book, and let the system do the rest.
A digital assistant may derive its character from your inclinations and preferences, but it will be skewed and disfigured in unknown measure by the hidden market methods and contests that it conceals.
II. Rendition of the Self
There is compelling evidence to suggest that the unique dynamics of the Facebook milieu eventually complicated this picture of “real personality,” as we shall explore in Chapter 16, but in 2011 these early findings encouraged three University of Maryland researchers to take the next logical step. They developed a method that relies on sophisticated analytics and machine intelligence to accurately predict a user’s personality from publicly available information in his or her Facebook profile. In the course of this research, the team came to appreciate the magic of behavioral surplus, discovering, for example, that a person’s disclosure of specific information such as religion or political affiliation contributes less to a robust personality analysis than the fact that the individual is willing to share such information in the first place. This insight alerted the team to a new genre of powerful behavioral metrics. Instead of analyzing the content of user lists, such as favorite TV shows, activities, and music, they learned that simple “meta-data”—such as the amount of information shared—“turned out to be much more useful and predictive than the original raw data.”
In the name of, for example, affordable car insurance, we must be coded as conscientious, agreeable, and open. This is not easily faked because the surplus retrieved for analysis is necessarily opaque to us. We are not scrutinized for substance but for form. The price you are offered does not derive from what you write about but how you write it. It is not what is in your sentences but in their length and complexity, not what you list but that you list, not the picture but the choice of filter and degree of saturation, not what you disclose but how you share or fail to, not where you make plans to see your friends but how you do so: a casual “later” or a precise time and place? Exclamation marks and adverb choices operate as revelatory and potentially damaging signals of your self.
In his 2015 interview, [Michal] Kosinski observed that “all of our interactions are being mediated through digital products and services which basically means that everything is being recorded.” He even characterized his own work as “pretty creepy”: “I actually want to stress that I think that many of the things that… one can do should certainly not be done by corporations or governments without users’ consent.” Recognizing the woefully asymmetric division of learning, he lamented the refusals of Facebook and the other internet firms to share their data with the “general public,” concluding that “it’s not because they’re evil, but because the general public is bloody stupid… as a society we lost the ability to convince large companies that have enormous budgets and enormous access to data to share this goodness with us…. We should basically grow up finally and stop it.”
III. Machine Emotion
Clients are advised to “play your audience emotions to stay on top of the game.” The [Realeyes] company’s website offers a brief review of the history of research on human emotions, concluding that “the more people feel, the more they spend…. Intangible ‘emotions’ translate into concrete social activity, brand awareness, and profit.”
Back in 1997, [Rosalind] Picard acknowledged the need for privacy, “so you remain in control over who gets access to this information.” Importantly for our analysis, in the final pages of her book she expressed some concerns, writing that “there are good reasons not to broadcast your affective patterns to the world… you probably do not want it to be picked up by an army of sales people who are eager to exploit mood-based buying habits, or by advertisers eager to convince you that you’d feel better if you tried their new soft drink right now.” She noted the possibility of intrusive workplace monitoring, and she voiced reservations about the possibility of a dystopian future in which “malevolent” governmental forces use affective computing to manipulate and control the emotions of populations.
IV. When They Come for My Truth
Experience is not what is given to me but rather what I make of it. The same experience that I deride may invite your enthusiasm. The self is the inward space of lived experience from which such meanings are created. In that creation I stand on the foundation of personal freedom: the “foundation” because I cannot live without making sense of my experience. No matter how much is taken from me, this inward freedom to create meaning remains my ultimate sanctuary.
As the prediction imperative drives deeper into the self, the value of its surplus becomes irresistible, and cornering operations escalate. What happens to the right to speak in the first person from and as my self when the swelling frenzy of institutionalization set into motion by the prediction imperative is trained on cornering my sights, blinks, and utterances on the way to my very thoughts as a means to others’ ends? It is no longer a matter of surveillance capital wringing surplus from what I search, buy, and browse. Surveillance capital wants more than my body’s coordinates in time and space. Now it violates the inner sanctum as machines and their algorithms decide the meaning of my breath and my eyes, my jaw muscles, the hitch in my voice, and the exclamation points that I offered in innocence and hope. What happens to my will to will myself into the first person when the surrounding market cosmos disguises itself as my mirror, shape-shifting according to what it has decided I feel or felt or will feel: ignoring, goading, chiding, cheering, or punishing me? Surveillance capital cannot keep from wanting all of me as deep and far as it can go.
Chapter 10. Make Them Dance
II. Facebook Writes the Music
Most pointedly, Facebook’s declaration of experimental authority claims surveillance capitalists’ prerogatives over the future course of others’ behavior. In declaring the right to modify human action secretly and for profit, surveillance capitalism effectively exiles us from our own behavior, shifting the locus of control over the future tense from “I will” to “You will.” Each one of us may follow a distinct path, but economies of action ensure that the path is already shaped by surveillance capitalism’s economic imperatives. The struggle for power and control in society is no longer associated with the hidden facts of class and its relationship to production but rather with the hidden facts of automated engineered behavior modification.
III. Pokémon Go! Do!
By July 13, [John] Hanke admitted to the Financial Times that in addition to “in-app payments” for game kit, “there is a second component to our business model at Niantic, which is the concept of sponsored locations.” He explained that this new stream of revenue had always been in the plan, noting that companies will “pay us to be locations within the virtual game board—the premise being that it is an inducement that drives foot traffic.” These sponsors, Hanke explained, would be charged on a “cost per visit” basis, similar to the “cost per click” used in Google’s search advertising.
IV. What Were the Means of Behavioral Modification?
The new global means of behavioral modification that we see under construction at Facebook and Niantic represent a new regressive age of autonomous capital and heteronomous individuals, when the very possibilities of democratic flourishing and human fulfillment depend upon the reverse. This unprecedented state of affairs rises above debates about the Common Rule. It goes to the heart of our allegiance to the ideals of a democratic society, with full knowledge of the challenges that burden those ideals.
From the first lines of the preface of the subcommittee’s 1974 report, authored by Senator [Sam] Ervin, it should be evident to any twenty-first-century captive of surveillance capitalism that US society has undergone a social discontinuity more profound than the mere passage of decades suggests. It is worth reading Ervin’s own words, to grasp the passion with which he located the subcommittee’s work at the heart of the Enlightenment project, pledging to defend the liberal ideals of freedom and dignity: “When the founding fathers established our constitutional system of government, they based it on their fundamental belief in the sanctity of the individual…. They understood that self-determination is the source of individuality, and individuality is the mainstay of freedom…. Recently, however, technology has begun to develop new methods of behavior control capable of altering not just an individual’s actions but his very personality and manner of thinking… the behavioral technology being developed in the United States today touches upon the most basic sources of individuality and the very core of personal freedom… the most serious threat… is the power this technology gives one man to impose his views and values on another…. Concepts of freedom, privacy and self-determination inherently conflict with programs designed to control not just physical freedom, but the source of free thought as well. … The question becomes even more acute when these programs are conducted, as they are today, in the absence of strict controls. As disturbing as behavioral modification may be on a theoretical level, the unchecked growth of the practical technology of behavioral control is the cause for even greater concern.”
Just as surveillance capitalism was initially able to root and flourish under the protection of a so-called “war against terror” and the compulsion for certainty that it stirred, in the middle of the twentieth century the means of behavioral modification migrated from the lab to the world at large primarily under the cover of cold-war anxieties. Later, the behavior-change professionals of the 1960s and 1970s were summoned into civilian practice by a society turned fearful after years of urban riots, political protests, and rising levels of crime and “delinquency.” The senators reasoned that calls for “law and order” had motivated the search for “immediate and efficient means to control violence and other forms of anti-social behavior.” The interest in controlling violence replaced more time-consuming attempts to understand its sources.
The First Amendment, the subcommittee argued, “must equally protect the individual’s right to generate ideas,” and the right to privacy should protect citizens from intrusions into their thoughts, behavior, personality, and identity lest these concepts “become meaningless.”
Today’s means of behavioral modification are aimed unabashedly at “us.” Everyone is swept up in this new market dragnet, including the psychodramas of ordinary, unsuspecting fourteen-year-olds approaching the weekend with anxiety. Every avenue of connectivity serves to bolster private power’s need to seize behavior for profit. Where is the hammer of democracy now, when the threat comes from your phone, your digital assistant, your Facebook login? Who will stand for freedom now, when Facebook threatens to retreat into the shadows if we dare to be the friction that disrupts economies of action that have been carefully, elaborately, and expensively constructed to exploit our natural empathy, elude our awareness, and circumvent our prospects for self-determination? If we fail to take notice now, how long before we forget who we were before they owned us, bent over the old texts of self-determination in the dim light, the shawl around our shoulders, magnifying glass in hand, as if deciphering ancient hieroglyphs?
Chapter 11. The Right to the Future Tense
I. I Will to Will
In fulfilling my promise, I make it a manifest. This act of will is my claim to the future tense.
When we refer to the past, we see only objects, but the view to the future brings “projects,” things that are yet to be. With freedom of will we could have “left undone” but for our commitment. “A will that is not free,” [Hannah] Arendt concludes, “is a contradiction in terms.”
II. We Will to Will
This is the essence of the uncontract, which transforms the human, legal, and economic risks of contracts into plans constructed, monitored, and maintained by private firms for the sake of guaranteed outcomes: less contract utopia than uncontract dystopia.
So let us establish our bearings. Uncertainty is not chaos but rather the necessary habitat of the present tense. We choose the fallibility of shared promises and problem solving over the certain tyranny imposed by a dominant power or plan because this is the price we pay for the freedom to will, which founds our right to the future tense. In the absence of this freedom, the future collapses into an infinite present of mere behavior, in which there can be no subjects and no projects: only objects. In the future that surveillance capitalism prepares for us, my will and yours threaten the flow of surveillance revenues. Its aim is not to destroy us but simply to author us and to profit from that authorship. Such means have been imagined in the past, but only now are they feasible. Such means have been rejected in the past, but only now have they been allowed to root. We are ensnared without awareness, shorn of meaningful alternatives for withdrawal, resistance, or protection.
III. How Did They Get Away with It?
- Dependency: The free services of Google, Facebook, and others appealed to the latent needs of second-modernity individuals seeking resources for effective life in an increasingly hostile institutional environment. Once bitten, the apple was irresistible. As surveillance capitalism spread across the internet, the means of social participation became coextensive with the means of behavioral modification. The exploitation of second-modernity needs that enabled surveillance capitalism from the start eventually imbued nearly every channel of social participation. Most people find it difficult to withdraw from these utilities, and many ponder if it is even possible.
We have only gradually come to understand that the specific methods of domination employed by industrial capitalism for more than two centuries have fundamentally disoriented the conditions that support life on Earth, violating the most basic precepts of civilization. Despite the many benefits and immense accomplishments of industrial capitalism, it has left us perilously close to repeating the fate of the Easter Islanders, who wrecked the ground that gave them life, then fashioned statues to scan the horizon for the aid and succor that would never come. If industrial capitalism dangerously disrupted nature, what havoc might surveillance capitalism wreak on human nature?
If we are to rediscover our sense of astonishment, then let it be here: if industrial civilization flourished at the expense of nature and now threatens to cost us the Earth, an information civilization shaped by surveillance capitalism will thrive at the expense of human nature and threatens to cost us our humanity.
Chapter 12. Two Species of Power
VI. A Technology of Human Behavior
[B.F.] Skinner concluded that the literature of freedom and dignity “stands in the way of further human achievement.” He argued that the missing puzzle piece holding back the urgent development of the “instruments and methods” essential for a technology of behavior was the stubborn allegiance to these antique notions among people determined to preserve “due credit” for their actions. The belief in “autonomous man” is a regressive source of resistance to a rational future, an “alternative explanation of behavior” that obstructs the advancement of society.
VII. Two Utopias
Until the rise of surveillance capitalism, the prospect of instrumentarian power was relegated to a gauzy world of dream and delusion. This new species of power follows the logic of Planck, Meyer, and Skinner in the forfeit of freedom and knowledge, but those scientists each failed to anticipate the actual terms of this surrender. The knowledge that now displaces our freedom is proprietary. The knowledge is theirs, but the lost freedom belongs solely to us.
Chapter 13. Big Other and the Rise of Instrumentarian Power
I. Instrumentarianism as a New Species of Power
Big Other does not care what we think, feel, or do as long as its millions, billions and trillions of sensate, actuating, computational eyes and ears can observe, render, datafy, and instrumentalize the vast reservoirs of behavioral surplus that are generated in the galactic uproar of connection and communication. In this new regime, objectification is the moral milieu in which our lives unfold. Although Big Other can mimic intimacy through the tireless devotion of the One Voice—Amazon-Alexa’s chirpy service, Google Assistant’s reminders and endless information—do not mistake these soothing sounds for anything other than the exploitation of your needs. I think of elephants, that most majestic of all mammals: Big Other poaches our behavior for surplus and leaves behind all the meaning lodged in our bodies, our brains, and our beating hearts, not unlike the monstrous slaughter of elephants for ivory. Forget the cliché that if it’s free, “You are the product.” You are not the product; you are the abandoned carcass. The “product” derives from the surplus that is ripped from your life.
II. A Market Project of Total Certainty
Instrumentarian power, like Goethe’s Faust, is morally agnostic. The only moral imperative here is distilled from the point of view of a thin utopian gruel. If there is sin, it is the sin of autonomy: the audacity to reject the flows that herd us all toward predictability. Friction is the only evil. Obstruction in law, action, or rhetoric is simply reactionary. The norm is submission to the supposed iron laws of technological inevitability that brook no impediment. It is deemed only rational to surrender and rejoice in new conveniences and harmonies, to wrap ourselves in the first text and embrace a violent ignorance of its shadow.
Is this to be our home: the automation of the self as the necessary condition of the automation of society, and all for the sake of others’ guaranteed outcomes?
III. The China Syndrome
According to a report in China Daily, debtors on the list were automatically prevented from flying 6.15 million times since the blacklist was launched in 2013. Those in contempt of court were denied sales of high-speed train tickets 2.22 million times. Some 21,000 defaulters have missed out on executive positions at enterprises as a result of their debts. The Industrial and Commercial Bank of China said it had refused loans worth more than 6.97 billion yuan ($1.01 billion) to debtors on the list. No one is sent to a reeducation camp, but they may not be allowed to purchase luxury goods.
Chapter 14. A Utopia of Certainty
IV. Confluence as Machine Relations
“The intelligent edge,” Microsoft developers are told, “is the interface between the computer and the real world… you can search the real world for people, objects and activities, and apply policies to them….” Once people and their relationships are rendered as otherized, equivalent “things in the cloud,” 25 billion computational actuating devices can be mobilized to shape behavior around safe and harmonious “policy” parameters. The most “profound shift,” [Satya] Nadella explained, is that “people and their relationships with other people is now a first-class thing in the cloud. It’s not just people but it’s their relationships, it’s their relationships to all of the work artifacts, their schedules, their project plans, their documents; all of that now is manifest in Microsoft Graph.” These streams of total information are key to optimizing “the future of productivity,” Nadella exulted.
The result is that “policies” are functionally equivalent to plans, as Big Other directs human and machine action. It ensures that doors will be locked or unlocked, car engines will shut down or come to life, the jackhammer will scream “no” in suicidal self-sacrifice, the worker will adhere to norms, the group will swarm to defeat anomalies. We will all be safe as each organism hums in harmony with every other organism, less a society than a population that ebbs and flows in perfect frictionless confluence, shaped by the means of behavioral modification that elude our awareness and thus can neither be mourned nor resisted.
Chapter 15. The Instrumentarian Collective
III. The Principles of an Instrumentarian Society
[Alex] Pentland avoids the question “Whose greater good?” How is the greater good determined when surveillance capitalism owns the machines and the means of behavioral modification? “Goodness” arrives already oriented toward the interests of the owners of the means of behavioral modification and the clients whose guaranteed outcomes they seek to achieve. The greater good is someone’s, but it may not be ours.
Computation thus replaces the political life of the community as the basis for governance. The depth and breadth of instrumentation make it possible, Pentland says, to calculate idea flow, social network structure, the degree of social influence between people, and even “individual susceptibilities to new ideas.” Most important, instrumentation makes it possible for those with the God view to modify others’ behavior. The data provide a “reliable prediction of how changing any of these variables will change the performance of all the people in the network” and thus achieve the optimum performance of Skinner’s superorganism. This mathematics of idea flow is the basis for Pentland’s vision of a “plan” that dictates the targets and objectives of behavior change. Human behavior must be herded and penned within the parameters of the plan, just as behavior at Nadella’s construction site was continuously and automatically molded to policy parameters. Pentland calls this “tuning the network.”
Pentland ignores the role of empathy in emulation because empathy is a felt experience that is not subject to the observable metrics required for computational governance. Instead, Pentland subscribes to the label Homo imitans to convey that it is mimicry, not empathy, and certainly not politics, which defines human existence.
Chapter 16. Of Life in the Hive
I. Our Canaries in the Coal Mine
In a subsequent elaboration on the psychological consequences of experiencing oneself from the “outside looking in,” a 2017 survey of young British women ages 11-21 suggests that the social principles of instrumentarian society, so enthusiastically elaborated by Pentland and endorsed by surveillance capitalist leaders, appear to be working effectively. Thirty-five percent of the women said that their biggest worry online was comparing themselves and their lives with others as they are drawn into “constant comparison with often idealized versions of the lives, and bodies, of others.”
II. The Hand and the Glove
The magnetic pull that social media exerts on young people drives them toward more automatic and less voluntary behavior. For too many, that behavior shades into the territory of genuine compulsion.
III. Proof of Life
Young life now unfolds in the spaces of private capital, owned and operated by surveillance capitalists, mediated by their “economic orientation,” and operationalized in practices designed to maximize surveillance revenues. These private spaces are the media through which every form of social influence—social pressure, social comparison, modeling, subliminal priming—is summoned to tune, herd, and manipulate behavior in the name of surveillance revenues. This is where adulthood is now expected to emerge.
When News Feed was first launched in 2006, it transformed Facebook from a site where users had to visit friends’ pages to see their updates to having those messages automatically shared in a stream on each person’s home page. Hundreds of thousands of users joined opposition groups, repelled by the company’s unilateral invasion of privacy. “No one was prepared for their online activity to suddenly be fodder for mass consumption,” recalled the tech news site TechCrunch on News Feed’s tenth anniversary in 2016, as it offered readers “The Ultimate Guide to the News Feed,” with instructions on “how you can get your content seen by more people,” how to appear “prominently,” and how to resonate with your “audience.” Ten years earlier a TechCrunch reporter had presciently noted, “Users who don’t participate will quickly find that they are falling out of the attention stream, and I suspect will quickly add themselves back in.”
Facebook’s science and design expertise aim for a closed loop that feeds on, reinforces, and amplifies the individual user’s inclination toward fusion with the group and the tendency to over-share personal information. Although these vulnerabilities run deepest among the young, the tendency to over-share is not restricted to them. The difficulty of self-imposed discipline in the sharing of private thoughts, feelings, and other personal information has been amply demonstrated in social research and summarized in an important 2015 review by Carnegie Mellon professors Alessandro Acquisti, Laura Brandimarte, and George Loewenstein. They concluded that because of a range of psychological and contextual factors, “People are often unaware of how it can be used, and even in the rare situations when they have full knowledge of the consequences of sharing, uncertain about their own preferences….” The researchers cautioned that people are “easily influenced in what and how much they disclose. Moreover, what they share can be used to influence their emotions, thoughts, and behaviors….” The result is alteration in “the balance of power between those holding the data and those who are subjects of that data.”
Young people crave the hive, and Facebook gives it to them, but this time it’s owned and operated by surveillance capital and scientifically engineered into a continuous spiral of escalating fusion, amply fulfilling Shaffer’s five criteria for achieving an addictive state of compulsion. Potency is engineered according to a recipe dictated by the hidden attributes of those who crave valorization from the group to fill the void where a self must eventually stand.
IV. The Next Human Nature
A three-phase investigation in 2014 found that spending a lot of time browsing profiles on Facebook produced a negative mood immediately afterward. Then, upon reflection, those users felt worse, reckoning that they had wasted their time. Instead of walking away, they typically chose to spend even more time browsing the network in the hope of feeling better, chasing the dream of a sudden and magical reversal of fortune that would justify past suffering. This cycle not only leads to more social comparison and more envy, but it can also predict depressive symptoms.
Life in the hive favors those who most naturally orient toward external cues rather than toward one’s own thoughts, feelings, values, and sense of personal identity.
VI. No Exit
In the closing lines of Jean-Paul Sartre’s existential drama No Exit, the character Garcin arrives at his famous realization, “Hell is other people.” This was not intended as a statement of misanthropy but rather as a recognition that the self-other balance can never be adequately struck as long as the “others” are constantly “watching.” Another mid-century social psychologist, Erving Goffman, took up these themes in his immortal The Presentation of Self in Everyday Life. Goffman developed the idea of the “backstage” as the region in which the self retreats from the performative demands of social life.
I ask if this twenty-first-century work of self-presentation is really that much different from what Goffman had described: have we just traded the real world for the virtual in constructing and performing our personas? There is a lull as the students reflect, and then a young woman speaks: “The difference is that Goffman assumed a backstage where you could be your true self. For us, the backstage is shrinking. There is almost no place where I can be my true self. Even when I am walking by myself, and I think I am backstage, something happens—an ad appears on my phone or someone takes a photo, and I discover that I am onstage, and everything changes.”
Chapter 17. The Right to Sanctuary
I. Big Other Outruns Society
The same themes appear from the perspective of psychology. Those who would eviscerate sanctuary are keen to take the offensive, putting us off guard with the guilt-inducing question “What have you got to hide?” But as we have seen, the crucial developmental challenges of the self-other balance cannot be negotiated adequately without the sanctity of “disconnected” time and space for the ripening of inward awareness and the possibility of reflexivity: reflection on and by oneself. The real psychological truth is this: If you’ve got nothing to hide, you are nothing. One empirical study makes the point. In “Psychological Functions of Privacy,” Darhl Pedersen defines privacy as a “boundary control process” that invokes the decision rights associated with “restricting and seeking interaction.” Pedersen’s research identifies six categories of privacy behaviors: solitude, isolation, anonymity, reserve, intimacy with friends, and intimacy with family. His study shows that these varied behaviors accomplish a rich array of complex psychological “privacy functions” considered salient for psychological health and developmental success: contemplation, autonomy, rejuvenation, confiding, freedom, creativity, recovery, catharsis, and concealment. These are experiences without which we can neither flourish nor usefully contribute to our families, communities, and society.
III. Every Unicorn Has a Hunter
“No exit” is the necessary condition for Big Other to flourish, and its flourishing is the necessary condition for all that is meant to follow: the tides of behavioral surplus that will meet every market player with guaranteed outcomes, the bypass of trust in favor of the uncontract’s radical indifference, the paradise of effortless connection that exploits the needs of harried second-modernity individuals and transforms their lives into the means to others’ ends, the plundering of the self, the extinction of autonomous moral judgment for the sake of frictionless control, the actuation and modification that quietly drains the will to will, the forfeit of your voice in the first person in favor of others’ plans, the destruction of the social relations and politics of the old and slow and still-unfulfilled ideals of self-determining citizens bound to the legitimate authority of democratic governance.
Chapter 18. A Coup from Above
I. Freedom and Knowledge
The combination of knowledge and freedom works to accelerate the asymmetry of power between surveillance capitalists and the societies in which they operate. This cycle will be broken only when we acknowledge as citizens, as societies, and indeed as a civilization that surveillance capitalists know too much to qualify for freedom.
III. The New Collectivism and Its Masters of Radical Indifference
Among the few reports that have managed to assess Facebook’s operations, the theme is consistent. This secret workforce—some estimates reckon at least 100,000 “content moderators,” and others calculate the number to be much higher—operates at a distance from the corporation’s core functions, applying a combination of human judgment and machine learning tools. Sometimes referred to as “janitors,” they review queues of content that users have flagged as problematic. Although some general rules apply across the board, such as eliminating pornography and images of child abuse, a detailed rulebook aims to reject as little content as possible in the context of a local assessment of the minimum threshold of user tolerance. The larger point of the exercise is to find the point of equilibrium between the ability to pull users and their surplus into the site and the risk of repelling them. This is a calculation of radical indifference that has nothing to do with assessing the truthfulness of content or respecting reciprocities with users. This tension helps to explain why disinformation is not a priority. One investigative report quotes a Facebook insider: “They absolutely have the tools to shut down the fake news….”
IV. What Is Surveillance Capitalism?
[Thomas] Paine argued for the capabilities of the common person and against aristocratic privilege. Among his reasons to reject aristocratic rule was its lack of accountability to the needs of people, “because a body of men holding themselves accountable to nobody, ought not to be trusted by any body.”
Like the adelantados and their silent incantations of the Requerimiento, surveillance capitalism operates in the declarative form and imposes the social relations of a pre-modern absolutist authority. It is a form of tyranny that feeds on people but it is not of the people. In a surreal paradox, this coup is celebrated as “personalization,” although it defiles, ignores, overrides, and displaces everything about you and me that is personal.
Its solution to the increasingly clamorous demands for effective life pivots on the gradual elimination of chaos, uncertainty, conflict, abnormality, and discord in favor of predictability, automatic regularity, transparency, confluence, persuasion, and pacification. We are expected to cede our authority, relax our concerns, quiet our voices, go with the flow, and submit to the technological visionaries whose wealth and power stand as assurance of their superior judgment. It is assumed that we will accede to a future of less personal control and more powerlessness, where new sources of inequality divide and subdue, where some of us are subjects and many are objects, some are stimulus and many are response.
This “seventh extinction” will not be of nature but of what has been held most precious in human nature: the will to will, the sanctity of the individual, the ties of intimacy, the sociality that binds us together in promises, and the trust they breed. The dying off of this human future will be just as unintended as any other.
VI. Be the Friction
So it is for me and perhaps for you: the bare facts of surveillance capitalism necessarily arouse my indignation because they demean human dignity. The future of this narrative will depend upon the indignant citizens, journalists, and scholars drawn to this frontier project; indignant elected officials and policy makers who understand that their authority originates in the foundational values of democratic communities; and, especially, indignant young people who act in the knowledge that effectiveness without autonomy is not effective, dependency-induced compliance is no social contract, a hive with no exit can never be a home, experience without sanctuary is but a shadow, a life that requires hiding is no life, touch without feel reveals no truth, and freedom from uncertainty is no freedom.