Summary: In this blog post we discuss the rising awareness of surveillance capitalism and how existing and in-research privacy tools can help alleviate it, offering a ray of hope for data security. The term “surveillance capitalism” was coined by Shoshana Zuboff, a privacy activist, author, and professor at Harvard Business School. Zuboff also wrote the book “The Age of Surveillance Capitalism”, a must-read if you’d like to dig into this topic further.

This article is an open call for social & political scientists, STS researchers, philosophers, and more to join our community and the discussions around privacy technology. We welcome your skillset. See our Welcome Package here.


What does 'Surveillance Capitalism' mean?

“Surveillance capitalism” is a new kind of business model that runs predictive models on your behavioural data from social media and uses those predictions to grow the business. Put more formally, it refers to an economic system centred around the commodification of personal data with the core purpose of profit-making.

If you're already familiar with surveillance capitalism, you can skip ahead to our discussion on whether privacy tech is a solution. If you aren't familiar with this area, let’s get acquainted with a few terms first:

  1. Behavioural surplus: Closely related to behavioural economics, behavioural surplus is the data collected about a person beyond what is needed to provide an online product or service. It can include information related to a person's location, age, profession, lifestyle, habits, and a range of personal and professional preferences.
  2. Instrumentarian Power: Zuboff's concept of "instrumentarian power" is the power of governments and corporations to use technology and infrastructure to manipulate people in subtle but effective ways. It turns people into "instruments" that are used in predictable ways to achieve the government's and the corporations' goals.
  3. Behavioural futures markets: In Zuboff's words: “Prediction products are then traded in what I call ‘behavioural futures markets,’ where surveillance capitalists sell certainty to their business customers. Google's ‘click-through rate’ was the first globally successful prediction product, and its ad markets were the first to trade in human futures.”
  4. Surveillance Dividend: There is no formal definition of this term yet, but the way The New York Times describes it is: “Google created the first insanely lucrative markets to trade in human futures, what we now know as online targeted advertising, based on their predictions of which ads users would click. Between 2000, when the new economic logic was just emerging, and 2004, when the company went public, revenues increased by 3,590 percent.” This startling number represents the “surveillance dividend”. Please follow this article for more information.

According to Zuboff, these companies have been collecting behavioural surplus from our internet activity and selling it to companies that run predictive machine learning models on it, predicting future behaviour, or the product someone will be interested in at that very moment (or later), and sending recommendations or advertisements accordingly. These prediction products are sold on so-called “behavioural futures markets”, where surveillance capitalists sell “certainty” to their business customers, who are essentially paying for, and putting their trust in, the “instrumentarian power” of these capitalists. That is how they exponentially increase their “surveillance dividend”.

Some of the most common business models of surveillance capitalism according to Zuboff are:

  1. Voting Manipulation: According to Zuboff, the 2016 Trump election campaign, as well as the Brexit campaign, enlisted the help of a data analytics company called “Cambridge Analytica”, which used the data of millions of people to build predictive models of their future behaviour and even to send manipulative recommendations intended to change how they would vote.
  2. Guaranteed footfalls: Zuboff has also brought to light the fact that fast food chains exploited data from a famous augmented reality game in which players chase characters that appear, through their smartphones, in the real world. The game was manipulated to serve these chains' need for “guaranteed footfalls” outside their stores.
  3. Using subliminal advertising on data from social media and internet activity: This is one of the most common uses of user data: detecting emotions from the images people upload to a social media platform, or from other internet activity, and then advertising products accordingly (a practice known as subliminal advertising).

All of the above-mentioned examples can be tied to the hypothetical “5-factor personality test”; click here for more.


Now let’s get into the technology side: what do we mean by privacy-preserving techniques? Keep in mind that when we use the term 'privacy preserving' here, we mean preserving the identity of the data holder, or protecting the sensitive information that could otherwise be leaked from the data. Some of these technologies are:

  1. Differential Privacy: Differential privacy is a system for publicly sharing information about a data set by describing the patterns of groups within it while withholding information about individuals (see the first sketch after this list).
  2. Federated learning: Federated learning is a way of training predictive models on someone’s personal data without pulling that data to a central server: the model is instead sent to the local devices where the data lives, trained there, and only the results are sent back to the server (see the second sketch below).
  3. Encrypted Deep Learning: Encrypted deep learning sits at the intersection of deep learning and cryptography. It uses the same core technologies as federated learning and encrypted federated learning, and applies them to the prediction step of a deep learning model: the goal is to predict on encrypted data while the model itself is also encrypted (see the third sketch below).
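
To make these concrete, here are three toy sketches in plain Python/NumPy. First, differential privacy via the Laplace mechanism; the function name dp_mean and all the values here are our own illustration, not any particular library's API.

```python
import numpy as np

def dp_mean(values, lower, upper, epsilon):
    """Release the mean of `values` with epsilon-differential privacy
    via the Laplace mechanism."""
    values = np.clip(values, lower, upper)       # bound any one record's influence
    sensitivity = (upper - lower) / len(values)  # max change one record can cause
    noise = np.random.laplace(loc=0.0, scale=sensitivity / epsilon)
    return values.mean() + noise

# Release a group's average age without exposing any individual's age
ages = np.array([23, 35, 41, 29, 52, 38])
print(dp_mean(ages, lower=18, upper=90, epsilon=0.5))
```

The smaller the epsilon, the more noise is added and the stronger the privacy guarantee: the analyst learns the group's pattern (the average) while any single record stays hidden in the noise.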
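Second, a minimal federated averaging loop (in the spirit of FedAvg) on a linear model. The helper names local_update and federated_average are illustrative; a real deployment would use a framework such as OpenMined's PySyft rather than raw NumPy.

```python
import numpy as np

def local_update(w, X, y, lr=0.1, epochs=5):
    """One client's step: gradient descent on a linear model, run where
    the data lives -- the raw data never leaves the device."""
    w = w.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)  # mean-squared-error gradient
        w -= lr * grad
    return w

def federated_average(global_w, clients):
    """Server step: average the clients' updated weights, weighted by
    how many examples each client holds."""
    updates = [local_update(global_w, X, y) for X, y in clients]
    sizes = [len(y) for _, y in clients]
    return np.average(updates, axis=0, weights=sizes)

# Two simulated devices, each holding its own private dataset
rng = np.random.default_rng(0)
clients = [(rng.normal(size=(20, 3)), rng.normal(size=20)) for _ in range(2)]
w = np.zeros(3)
for _ in range(10):
    w = federated_average(w, clients)
print(w)  # a model trained without centralising anyone's raw data
```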
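Third, the core trick behind encrypted prediction: two-party additive secret sharing, one of the building blocks used in encrypted deep learning (homomorphic encryption, covered in reference 12, is another route). This is a deliberately simplified illustration.

```python
import random

Q = 2**61 - 1  # large modulus; shares live in the integers mod Q

def share(x):
    """Split x into two additive shares; neither share alone reveals x."""
    a = random.randrange(Q)
    return a, (x - a) % Q

def reconstruct(a, b):
    v = (a + b) % Q
    return v if v < Q // 2 else v - Q  # map back to signed integers

# A linear "model" scoring an input it never sees in the clear
w = [3, -2, 5]                         # model weights
x = [1, 4, 2]                          # private user input
x_shares = [share(xi) for xi in x]

# Each party computes the dot product on its own shares only
score_a = sum(wi * a for wi, (a, _) in zip(w, x_shares)) % Q
score_b = sum(wi * b for wi, (_, b) in zip(w, x_shares)) % Q

print(reconstruct(score_a, score_b))   # 3*1 - 2*4 + 5*2 = 5
```

In a full system the weights would be secret-shared as well, so that neither the input data nor the model is ever exposed in the clear.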

So, could privacy-preserving tools help mitigate the problems posed by surveillance capitalism?

The privacy-preserving technologies that exist today are mostly about protecting the identity of the people whose data is used to make group-level predictions.

Privacy-preserving technologies could help preserve the privacy of the individual, in the sense that they could prevent the aggregation and storage of large amounts of your personal data for the purpose of ad targeting. At OpenMined, the more we read about this, the more we think that although privacy tools can solve an important part of this problem, the answer might be:

No.

Privacy-preserving tools alone won't be able to solve all aspects of these multi-faceted issues.

Privacy-preserving techniques are tools, and although they might allow machine learning models to be trained in a privacy-preserving way, they don't prevent the insights gained from the data from being used in intrusive ways, such as voter manipulation.

In fact, it's possible that privacy tools could make it even easier for companies to implement highly targeted advertising or voter manipulation, since they would enable the predictive models to be trained on ever-more sensitive data.

When you invent the ship, you also invent the shipwreck. What will be the shipwreck of privacy-preserving technology? We think it's important for the privacy tech community to keep actively discussing this with philosophers, social scientists, political scientists, and science & technology studies (STS) researchers, to make sure the tools we're building don't cause more harm than good. OpenMined began as a community of technical people building open-source code, but it will take a variety of skillsets to truly address the privacy issues we see today, and to help ensure our code is used for good. This article is an open call for social & political scientists, STS researchers, philosophers, and more to join our community and the discussions around privacy technology. We welcome your skillset.

We'd love to hear your input. If you'd like to join the conversation around this, join us on the writing team at slack.openmined.org.

If you're new to privacy-preserving tech, you can read this explanation series we made for beginners: Privacy-Preserving Data Science, Explained.


References:

  1. "Behavioral Surplus" Is Not Evil – It's Essential to Customer Experience
  2. Surveillance Capitalism Wikipedia page
  3. Privacy in the era of Data Bureaucracy
  4. Explainer: what is surveillance capitalism and how does it shape our economy?
  5. New York Times article “You Are Now Remotely Controlled”
  6. Evaluation of Privacy-Preserving Technologies for Machine Learning
  7. Surveillance Capitalism Core Concepts
  8. A Threat to Global Democracy: How Facebook & Surveillance Capitalism Empower Authoritarianism
  9. Derren Brown Tricks Advertisers With Subliminal Messaging
  10. HiNative question (Would anyone explain the meaning of “instrumentarian power” by Shoshana Zuboff?)
  11. Project Syndicate blog on surveillance capitalism
  12. OpenMined’s tutorial on Homomorphic Encryption