Summary :- In this blog post we're going to discuss the rising awareness of surveillance capitalism and how existing and in-research privacy tools can help alleviate it and offer a ray of hope for data security. The term "Surveillance Capitalism" was coined by Shoshana Zuboff, a privacy activist, author, and professor at Harvard Business School. Zuboff also wrote a book called "The Age of Surveillance Capitalism", which is a must-read if you'd like to dig into this topic further.

"Surveillance capitalism", as Shoshana Zuboff explains it, is a new kind of business model that runs predictive models on behavioral data harvested from social media and uses those predictions to enhance and grow the business. More colloquially, it refers to an economic system centered on the commodification of personal data with the core purpose of profit-making. Because personal data can be commodified, it has become one of the most valuable resources on earth.

(GIF via GIPHY, by MOST EXPENSIVEST)

Before we get started, let's get acquainted with a few terms :-

  1. Behavioral surplus :- Behavioral surplus is a big concept, closely related to behavioral economics, but for starters it is data that goes beyond online product and service use. It can include information about a person's location, age, profession, lifestyle, habits, and a range of personal and professional preferences.
  2. Instrumentarian power :- Zuboff's concept of "instrumentarian power" is the power of governments and corporations to use technology and infrastructure to manipulate people in subtle but effective ways. It turns people into "instruments" that are used in predictable ways to achieve the governments' and corporations' goals.
  3. Behavioral futures markets :- In Zuboff's words, "Prediction products are then traded in what I call 'behavioral futures markets,' where surveillance capitalists sell certainty to their business customers." Google's click-through rate was the first globally successful prediction product, and its ad markets were the first to trade in human futures.
  4. Surveillance dividend :- There is no formal definition of this term yet, but the New York Times describes it as follows: "Google created the first insanely lucrative markets to trade in human futures, what we now know as online targeted advertising, based on their predictions of which ads users would click. Between 2000, when the new economic logic was just emerging, and 2004, when the company went public, revenues increased by 3,590 percent." This startling number represents the "surveillance dividend". Please follow this article for more information.

So, according to Zuboff, these companies have been collecting data, or behavioral surplus, from our internet activity and selling it to companies that run predictive machine learning models on it to predict future behavior, such as the product someone will be interested in right now or later, and to send recommendations or advertisements accordingly. These prediction products are sold on the "behavioral futures markets", where "certainty" is sold by these surveillance capitalists to their business customers, who are essentially paying for, and putting their trust in, the "instrumentarian power" of these capitalists. That is how they keep growing their "surveillance dividend".

(GIF via GIPHY, by Hani Mitwasi)

Some of the most common business models of surveillance capitalism, according to Zuboff, are :-

  1. Voting manipulation :- According to Zuboff, the Trump campaign as well as the Brexit campaign took help from a data analytics company called "Cambridge Analytica", which used the data of millions of people to build predictive models of their future behavior and even to send manipulative recommendations to change their minds about who to vote for.
  2. Guaranteed footfalls :- Zuboff has also brought to light the fact that fast-food chains have exploited the data of a famous augmented reality game in which players chase characters that show up on their smartphones in the real world. The game was manipulated to meet these chains' need for "guaranteed footfalls" outside their stores.
  3. Subliminal advertising on data from social media and internet activity :- This is one of the most common uses of user data: detecting emotions from the images people upload to a social media platform, or from their other internet activity, and then advertising products accordingly (also known as subliminal advertising).

All of the above-mentioned examples can be part of the hypothetical "5-factor personality test"; click here for more.

(GIF via GIPHY, by Ryan Seslow)

Now let's get into the technology side: are the privacy-preserving tools that already exist in the market or in research good enough? Yes, they can be, and they are already solving a lot of the world's problems related to data privacy, including those in healthcare and finance. We have to keep in mind that these are privacy-preserving technologies; they come into play only in protecting the identity of the data holder, or the sensitive information that could otherwise be leaked from the data! Some of these technologies are :-

  1. Differential privacy :- Differential privacy is a system for publicly sharing information about a dataset by describing the patterns of groups within the dataset while withholding information about individuals in it (a minimal sketch follows after this list).
  2. Federated learning :- Federated learning is an algorithm developed by Brendan McMahan et al. It is a rather counterintuitive way of training predictive models on someone's personal data without pulling that data to central servers: instead, the model is sent to the local devices where the data lives, trained there, and only the results are sent back to the server (see the second sketch after this list).
  3. Encrypted deep learning :- Encrypted deep learning is a field that sits at the intersection of deep learning and cryptography. It is a brand new field that uses the same core technologies as federated learning and encrypted federated learning, but applies them to the actual prediction step of a deep learning model: the goal is to predict on encrypted data while the model you're using for prediction is also encrypted (see the final sketch after this list).
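To make differential privacy concrete, here is a minimal sketch of the Laplace mechanism, the classic way to answer a counting query with ε-differential privacy. The dataset, query, and epsilon value are toy assumptions for illustration, not anyone's production code.

```python
import numpy as np

def laplace_count(data, predicate, epsilon):
    """Answer a counting query with epsilon-differential privacy.

    A count changes by at most 1 when one person is added or removed,
    so its sensitivity is 1 and Laplace noise with scale 1/epsilon suffices.
    """
    true_count = sum(1 for row in data if predicate(row))
    noise = np.random.laplace(loc=0.0, scale=1.0 / epsilon)
    return true_count + noise

# Toy dataset: each record is one person's age (illustrative values only).
ages = [23, 35, 41, 29, 52, 36, 27, 44]

# "How many people are over 30?" -- released with epsilon = 0.5.
print(laplace_count(ages, lambda age: age > 30, epsilon=0.5))
```

The analyst only ever sees the noisy count, so any single person's presence in the dataset is hidden, while statistics over the whole group remain useful.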
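Next, a minimal sketch of the federated averaging idea from McMahan et al.: each client trains on its own data locally, and only model parameters travel to the server, never the raw data. The linear model, the two synthetic clients, and the learning rate are all made-up assumptions to keep the example self-contained.

```python
import numpy as np

def local_train(weights, X, y, lr=0.1, steps=20):
    """One client's local update: plain gradient descent on its own data.
    Only the updated weights leave the device, never X or y."""
    w = weights.copy()
    for _ in range(steps):
        grad = 2 * X.T @ (X @ w - y) / len(y)   # MSE gradient for a linear model
        w -= lr * grad
    return w

def federated_average(client_weights, client_sizes):
    """Server step: average client weights, weighted by local dataset size."""
    total = sum(client_sizes)
    return sum(w * (n / total) for w, n in zip(client_weights, client_sizes))

# Toy setup: true relationship y = 3*x1 - 2*x2, split across two clients.
rng = np.random.default_rng(0)
true_w = np.array([3.0, -2.0])
clients = []
for n in (60, 40):
    X = rng.normal(size=(n, 2))
    clients.append((X, X @ true_w))

global_w = np.zeros(2)
for _ in range(10):                               # a few federated rounds
    updates = [local_train(global_w, X, y) for X, y in clients]
    global_w = federated_average(updates, [len(y) for _, y in clients])

print(global_w)  # approaches [3, -2] without the server ever seeing the data
```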
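Finally, one of the core cryptographic tricks behind encrypted deep learning is secure multi-party computation via additive secret sharing. The sketch below is my own toy illustration, not any particular library's API: it splits a private input into shares that are individually just random noise, lets each worker apply a public linear layer to its share locally, and reconstructs only the final result. Keeping the model weights encrypted as well requires further protocols (e.g. Beaver triples for multiplying two shared values) that are omitted here.

```python
import numpy as np

Q = 2**31 - 1  # large prime; all shares live in the integers mod Q

def share(x, n_workers=2):
    """Split an integer vector into additive shares: each share alone is
    random noise, but all shares sum back to x (mod Q)."""
    shares = [np.random.randint(0, Q, size=x.shape, dtype=np.int64)
              for _ in range(n_workers - 1)]
    shares.append((x - sum(shares)) % Q)
    return shares

def reconstruct(shares):
    return sum(shares) % Q

# Toy "model": one public linear layer (illustrative weights only).
W = np.array([[2, 0, 1],
              [1, 3, 0]], dtype=np.int64)

# The data owner's private input, encoded as small non-negative integers.
x = np.array([4, 7, 1], dtype=np.int64)

# Each worker applies the linear layer to its own share locally...
x_shares = share(x)
y_shares = [(W @ s) % Q for s in x_shares]

# ...and only the combined result is ever revealed.
print(reconstruct(y_shares))   # equals W @ x = [9, 25]
print(W @ x)                   # sanity check against the plaintext
```

Because a linear layer distributes over the shares, each worker can compute its part without ever seeing the real input; this is the same building block that encrypted federated learning and encrypted inference stacks build on.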

Surveillance capitalism has become an integral part of society now, and as I said earlier, the technologies that exist are mostly about protecting the identity of the people whose data is being exploited for making predictions. Almost every application in the tech industry these days requires huge amounts of data; machine learning and data science are an integral part of the whole ecosystem, and all of that, taken together, is a small fraction of "surveillance capitalism". These technologies can help, but only in the privacy-preserving part of this entire workflow, and that in itself solves a lot of the world's problems. Internet companies have been using behavioral economics since the dawn of the dot-com boom, so it is part of the entire workflow now; but if you're wondering whether your identity is secure while your data is being used to train these machine learning models (business models), a lot of research is going into that domain. Some of the names that come to my mind when I think of research in privacy-preserving technologies are :-

  1. OpenMined (openmined.org) :- OpenMined is an open-source community whose goal is to make the world more privacy-preserving by lowering the barrier to entry to private AI technologies.
  2. Google :- Google has been doing a lot of research work in privacy-preserving AI using differential privacy (more information in this link). It has its own library for differential privacy, which contains a set of libraries of ε- and (ε, δ)-differentially private algorithms that can be used to produce aggregate statistics over numeric datasets containing private or sensitive information. The functionality is currently available in C++, Go, and Java. Apart from that, Google also takes its privacy-preserving measures seriously; it recently banned several malicious apps (check this article for more information).
  3. Facebook :- Facebook has been doing some really impressive work on data privacy. As announced by Zuckerberg himself at F8, Facebook's annual conference, last year, the company is currently focusing on six principles to improve privacy: private interactions, reduced permanence, encryption, safety, interoperability, and secure data storage. Some of the steps the company has taken include end-to-end encryption on its messaging platforms and ensuring privacy in its products before shipping them.

Follow this link to see the progress in privacy preserving machine learning that was made in the previous year.

I'd like to conclude by saying that "surveillance capitalism" is one of the business models that internet companies use to make money, and behavioral economics has been around for a long time. The downside is the business model itself: selling predictions to business customers who use them to win votes in presidential campaigns (NYTimes article) or to get guaranteed footfalls for food joints from augmented reality games! These are the places where eyebrows rise. On the other hand, it is a little funny to think that we are accomplices in this whole process; no matter what happens, we cannot give up using social media, right? Neither can we give up using smart devices, because they're useful and an integral part of our lives.

Credits :-

  1. "Behavioral Surplus" Is Not Evil – It's Essential to Customer Experience :-  https://blog-idcuk.com/behavioral-surplus-for-cx/
  2. Surveillance Capitalism wiki page :- https://en.wikipedia.org/wiki/Surveillance_capitalism
  3. Privacy in the era of Data Bureaucracy :- https://medium.com/secure-and-private-ai-writing-challenge/privacy-in-the-era-of-social-bureaucracy-16f34da710b3
  4. Explainer: what is surveillance capitalism and how does it shape our economy? :- https://theconversation.com/explainer-what-is-surveillance-capitalism-and-how-does-it-shape-our-economy-119158
  5. NY times article “You Are Now Remotely Controlled” :- https://www.nytimes.com/2020/01/24/opinion/sunday/surveillance-capitalism.html
  6. Evaluation of Privacy-Preserving Technologies for Machine Learning :- https://medium.com/outlier-ventures-io/evaluation-of-privacy-preserving-technologies-for-machine-learning-8d2e3c87828c
  7. Surveillance Capitalism Core Concepts :- https://www.youtube.com/watch?v=FoEiiKbFrdw
  8. A Threat to Global Democracy: How Facebook & Surveillance Capitalism Empower Authoritarianism :- https://www.youtube.com/watch?v=XwXffKw73jA
  9. Derren Brown Tricks Advertisers With Subliminal Messaging :- https://www.youtube.com/watch?v=43Mw-f6vIbo
  10. HiNative question (Would anyone explain the meaning of "instrumentarian power" by Shoshana Zuboff?) :- https://hinative.com/en-US/questions/12597995
  11. Project Syndicate Surveillance Capitalism blog :- https://www.project-syndicate.org/onpoint/surveillance-capitalism-exploiting-behavioral-data-by-shoshana-zuboff-2020-01
  12. OpenMined’s tutorial on Homomorphic Encryption :- https://www.youtube.com/watch?v=2TVqFGu1vhw&feature=youtu.be

If you want to join our mission of making the world more privacy-preserving :-

Join OpenMined slack

Check OpenMined's GitHub

OpenMined Welcome Package

Placements at OpenMined

Get your tickets for Pricon 2020