What is Behavioural Surplus?
Behavioural surplus is the extra information that can be recorded as data whenever we interact with an app, platform or website. Nobody considered it important until Google discovered in the early 2000s that a lot of money could be made with it. The surplus can be processed to predict an individual's future behaviour with greater certainty, and that certainty can then be sold to advertisers.
This business model, invented by Google, now permeates the Internet and our electronic devices, and even determines what types of products are developed next, according to American scholar and author Shoshana Zuboff in her book The Age of Surveillance Capitalism published in 2019.
I am just over half-way through the book and it is a thrilling read. Zuboff presents a thorough, well-researched argument. We live in an age of surveillance capitalism, which, like industrial capitalism, has had its pioneers, time of discovery, experimentation and spread, all of which has led to normalisation and acceptance by society.
The author is critical of the element of secrecy that underlies the gathering and selling of behavioural data, and she exposes in detail the methods that Big Tech corporations use to systematically side-step governmental attempts at regulation.
What type of behavioural data can be collected?
In chapter three, titled "The Discovery of Behavioral Surplus", Zuboff describes the extra information that became available when users interacted with the Google Search engine in the early 2000s:
For example, in addition to key words, each Google search query produces a wake of collateral data, such as the number and pattern of search terms, how a query is phrased, spelling, punctuation, dwell times, click patterns, and location.
A careful reader may notice that I wrote behavioural in the above subtitle, but retained the author's American spelling behavioral when quoting her chapter title. The fact that I use British spelling may or may not be interesting information on its own, but if added to an accumulating pile of similar tidbits of experiential and behavioural information about me, it could be useful in predicting my future behaviour.
This is what behavioural surplus is. While companies like Google trivialise the importance of this data in public debate, calling it digital 'waste', 'exhaust' or 'breadcrumbs', Zuboff argues that this data is central to how their money is made.
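To make this more concrete, here is a minimal, purely hypothetical sketch in Python of what one query's collateral-data record, as described in the quote above, might look like. The class and field names are my own invention for illustration; they are not taken from Zuboff's book or from any real Google system.

```python
# A purely hypothetical example of the "collateral data" a single search
# query could leave behind, beyond the query text itself. Field names
# are invented for illustration; they do not describe any real system.
from dataclasses import dataclass, field


@dataclass
class SearchQueryRecord:
    query_text: str            # what the user actually asked for
    term_count: int            # number and pattern of search terms
    spelling_variant: str      # e.g. "behavioural" vs "behavioral"
    dwell_time_seconds: float  # how long the user lingered on results
    clicked_results: list[int] = field(default_factory=list)  # click pattern
    approximate_location: str = ""  # coarse location of the request


# The query text is the service the user cares about; everything else is
# surplus that can be accumulated into a behavioural profile over time.
record = SearchQueryRecord(
    query_text="ergonomic desk chair",
    term_count=3,
    spelling_variant="British",
    dwell_time_seconds=42.5,
    clicked_results=[2, 5],
    approximate_location="NL",
)
```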
While this surplus data was at one time processed by Google in order to improve the user experience, it is now the core of its business: selling predictions. Considering how large and successful Google is today, behavioural surplus is big business.
A striking claim in Zuboff's book is that under surveillance capitalism, new product development is driven by the motivation to find ways to collect more behavioural data and also to discover new types of user behaviour to mine. This has led to physical wearable products like smartwatches and smart glasses, but also to networked household items and cars.
These items of convenience are created and designed to learn more about us, in order to be able to predict individual behaviour more accurately. Zuboff explains that:
The idea of being able to deliver a particular message to a particular person at just the moment when it might have a high probability of actually influencing his or her behavior was, and had always been, the holy grail of advertising.
Advertising space on our devices is sold through live, automated bidding wars, run by tools that Google also invented in the same period.
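As a toy illustration only (this is a sketch of the general idea, not a description of how Google's actual ad exchange works), a live bidding war for a single ad impression can be imagined roughly like this, with each advertiser's bid driven by a behaviour prediction. The bidder names, scores and second-price rule are my own assumptions.

```python
# A toy sketch of a real-time bidding (second-price) auction for a single
# ad impression. Bidder names, scores and pricing rules are invented for
# illustration; real ad exchanges are far more complex and far more opaque.

def run_auction(bids: dict[str, float]) -> tuple[str, float]:
    """Return the winning bidder and the price they pay.

    In a second-price auction the winner pays the second-highest bid,
    a common design in online advertising.
    """
    ranked = sorted(bids.items(), key=lambda item: item[1], reverse=True)
    winner, _ = ranked[0]
    price = ranked[1][1] if len(ranked) > 1 else ranked[0][1]
    return winner, price


# Each advertiser's bid could be driven by a behaviour prediction:
# the more certain they are that this particular user will respond,
# the more the impression is worth to them.
bids = {
    "chair_retailer": 0.9 * 1.20,     # predicted response probability * value
    "travel_agent": 0.2 * 3.00,
    "streaming_service": 0.6 * 0.80,
}
winner, price = run_auction(bids)
print(f"{winner} wins the impression and pays {price:.2f}")
```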
A tale of secrecy: the product and the customer
You may have heard the saying that if software is free, then you are the product. Shoshana Zuboff adds important nuance to this cliché when she states:
We are no longer the subjects of value realization. Nor are we, as some have insisted, the "product" of Google's sales. Instead, we are the objects from which raw materials are extracted and expropriated for Google's prediction factories. Predictions about our behavior are Google's products, and they are sold to its actual customers but not to us. We are the means to others' ends.
The type of capitalism that has dominated the last century is fairly transparent. Companies try to make and sell products and/or services to other companies and individual customers. If they are unsuccessful, they fail. This form of capitalism is at play in the advertisements we see at home and outdoors, in our shopping experiences, and in news stories about the successes and failures of companies. It is not difficult to understand that, no matter how friendly or pleasant a company, a hotel or a travel agent may be, profit is always the bottom line.
While I know the names of some traditional big businesses (Shell, Unilever, Johnson & Johnson), before today I couldn't name a single data broker that buys and sells user data in the shadows, even though this market is apparently huge (ever heard of Equifax, Nielsen Marketing Cloud or Data Axle?). Nor do I know what a Google behaviour prediction profile looks like as a product, or understand how companies can be certain that these live bidding wars will lead to more revenue. The mechanics of surveillance capitalism are hidden.
Zuboff argues that secrecy is key to this business model. I recently looked into buying an ergonomic desk chair and was quickly discouraged by the exorbitant price tags. Imagine a furniture company that sells ergonomic desk chairs with built-in WiFi chips and data-collecting sensors, at a price close to the production cost. If the company clearly explained what it does, namely recording when and how often I sit on the chair, my body temperature and my heart rate, I might still decide to buy this chair because, at such a low cost, this feels like a fair exchange. But if it were to sell the same cheap data-gathering chair without informing me of its true purpose, I could only remain satisfied as a customer for as long as I am left in the dark.
You could argue in hindsight that I should have been more skeptical about the low cost of an otherwise expensive product, but then: why was secrecy necessary in the first place? Hiding something as significant as surveillance from your customers does not seem like a smart thing to do if you want to ensure a long-lasting relationship with them.
Clarity
Reading The Age of Surveillance Capitalism has been an eye-opener for me. I have been writing about various aspects of data privacy (as a parent, a partner, a hobbyist tinkerer, a teacher, an employee) for nearly three years now. My motivation to resist invasive data scraping methods, and eventually to write about my experiences, came from a gut feeling that something is not right about devices and software that constantly communicate with company servers in nontransparent ways, and that it is not right to pay for a device like a smartphone and yet have so little control over it.
Shoshana Zuboff's book is helping me bring into sharp focus what happens to personal data on a corporate level, and why it is so difficult to find out and understand how these mechanisms actually work. The insight that companies like Google, which thrive on selling user experience, might actually design new products specifically to gather more types of behavioural data, has clarified my understanding of how the world around me operates.
Zuboff explains that these developments have been able to take place more or less by stealth because surveillance capitalism is unprecedented—we have had nothing to compare it to that might have rung alarm bells and made us wary of engaging with these exciting new products. Like most other people, I fully embraced all the wonderful free or very cheap software tools that came out during the first decade of the 2000s, like Gmail, Google Drive, Evernote, Dropbox and all social media platforms.
I wonder why it never occurred to me to ask how these companies were making money. It was an unprecedented experience, but I worry that the birth of surveillance capitalism and the scraping of my behavioural data were made possible partly by a sense of individual entitlement. Perhaps I never stopped to ask critical questions about the economics of these massive-scale free software tools because I felt I somehow deserved these freebies, forgetting that even in traditional capitalist exchanges, companies don't give things away without financial motivation.
Zuboff, however, argues that it was (and still is) the companies' own entitlement that enabled them to steamroll and side-step regulation and resistance. Her analogy with European colonisers in history is food for thought.
Conclusions
We now find ourselves in a position where data gathering for profit has become ubiquitous. I have seen some resistance to AI models being trained on human-created work, but there is also a pervasive mood of defeatism, because we are once again catching up to the facts. Plus, AI is a new shiny toy we can play with, and mostly for free.
The creepy thing is that surveillance capitalism always hungers for more, and now Google and all the companies that have followed its lead are finding ways to place behaviour- and experience-scraping sensors in our real-life environments: smart TVs, health sensors, cars, all the 'Internet of Things' products that are crowding in to get into our lives. Zuboff presents the image of a future dystopia in which we are fully immersed in a computerised surveillance environment, instead of having to reach for our smartphones or laptops: a likely and truly frightening prospect.
I look forward to reading the rest of her book, hopefully to also find out what we can do about the structural changes that have happened all around us in part because we tend to let our guards down when presented with shiny new toys.
Certainly, naming the organisations and exposing the element of secrecy is a starting point. I have just read that the EU is trying to get Big Tech companies to expose their algorithms with the Digital Services Act of 2022, in an effort to protect individual privacy, among other goals. The trouble, as I have found out by writing up this article, is that the hidden financial system of behavioural surplus scraping is not something that is easily explained.
Documentation
Shoshana Zuboff Wikipedia page
Digital Services Act Wikipedia page
The Ultimate List of Data Brokers To Watch Out For