The panopticon was first conceived by brothers Samuel and Jeremy Bentham in the late 1700s. While stationed in St. Petersburg, Russia during his time in the Navy, Samuel began writing to his brother Jeremy about prisoners being constantly observed, an idea that Jeremy would later develop into what we understand today as the Panopticon: a circular building with a watchtower at its center, where the inmates of a prison (he would later extend the idea to schools, hospitals, and governance) could potentially be observed at all times. The idea was that if you know you are being watched, you will behave differently, even if you are unable to see who is watching (Steadman).
Panopticon Computer Model by Myles Zhang. Narration by Tamsin Morton
Retrieved from YouTube on December 02, 2020.
In 1975, philosopher Michel Foucault revisited this idea and explored the authoritarian ideas embedded within it. Foucault believed the Panopticon revealed four main things:
First, he saw pervasive power: a one-way power dynamic in which those in power had the power to see everything (“Foucault 2: Government Surveillance & Prison”).
Second, he saw obscured power. There was no way for the prisoners to look back, so there was no way for them to know who was watching them or why they were being watched (“Foucault 2: Government Surveillance & Prison”).
Third, he saw direct violence replaced with structural violence. Violence no longer had to be carried out in a physical form; the building itself, just by being there, was now its own form of violence (“Foucault 2: Government Surveillance & Prison”).
And last, he saw that structural violence was now profitable, and that working toward profit became the only option. Because the facility saved money by reducing the number of "guards" or "watchers" in the Panopticon, it no longer mattered why people were inside the prison in the first place. Obedient workers at a lower cost meant profit over reform (“Foucault 2: Government Surveillance & Prison”).
Bentham, however, ultimately wanted reform, and an ideal state in which the Panopticon was no longer needed. Foucault instead saw a place where profitability outweighed reform: authoritarianism would ultimately set in, because it was a founding principle, and profit over people would win (Pease-Watkin).
Today, this idea is explored further, and even challenged, when it comes to our digital world. We need to look at the impacts of Digital Panopticism and Surveillance Capitalism (a new model introduced by Shoshana Zuboff) on our lives.
Bot: A software application that runs automated tasks (scripts) over the internet (Cloudflare).
In his 2015 TechCrunch article about the digital panopticon, Arthur Chu says:
"Foucault said you can build a prison without walls just by letting your prisoner know he's always being watched. In real life in 2015, you can totally control someone's behavior by training them to tweet or instagram every tiny thing they do and see, and see how many likes it gets... In real life in 2015, we can all be each other's wardens as members of the amorphous mob that hands out likes and dislikes."
That was six years ago, and things have only gotten worse. The 2020 Imperva Bad Bot Report shows that almost 40% of all traffic on the internet is bots. As predicted (or designed), who or what is watching us may not be there at all, yet it still has a massive impact on our lives.
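A "bot" in this sense is simply a script that performs an automated task over the internet with no human at the keyboard. A minimal sketch in Python (the URL and schedule here are purely illustrative):

```python
# Minimal sketch of a bot: a script that repeatedly performs an
# automated task over the internet with no human involved.
import time
import urllib.request

def check_page(url):
    """Fetch a page and return its HTTP status code."""
    with urllib.request.urlopen(url) as response:
        return response.status

def run_bot(url, interval_seconds=3600, checks=24):
    """Re-fetch the same page once an hour, as a simple monitoring bot might."""
    for _ in range(checks):
        status = check_page(url)
        print(f"{url} returned status {status}")
        time.sleep(interval_seconds)

# Example (not run here): run_bot("https://example.com")
```

Real bots range from helpful crawlers such as search-engine indexers to the malicious scrapers and spammers measured in the Imperva report; mechanically, they are all some version of this automated request loop.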
Stephanie Hare: Thinking We’re Being Watched Changes Our Behavior
AlJazeeraEnglish, & Rae, A. (2019, July 03). Do Biometrics Protect or Compromise Our Security? | All Hail The Algorithm. Retrieved from YouTube on December 02, 2020.
Stephanie Hare: Surveillance & Changing of Governments
AlJazeeraEnglish, & Rae, A. (2019, July 03). Do Biometrics Protect or Compromise Our Security? | All Hail The Algorithm. Retrieved from YouTube on December 02, 2020.
DIGITAL PANOPTICISM & SOCIAL MEDIA
One question that keeps arising today is where social media falls in the world of digital panopticism. Social media is often called a reverse or inverted panopticon because we use it to be the observer. In this scenario, we have the ability not only to gain power but to affirm our actions: we control what we share, and we can watch and look back at people in positions of power. But is this true? If we look at the dynamics of power that Foucault outlined for us, social media is not a reverse or inverted panopticon at all, but simply a panopticon in the truest sense of the word.
First, we need to look at veillance. Veillance simply means to watch. There are different types of veillance, and each of those gives power to who is doing the watching. The watcher is not a passive participant. They have a role and are actively involved. This is important to power dynamics. The most common types of veillance are:
Surveillance. Sur- meaning over. Surveillance is oversight. This is what we normally think of when we think of people or someone watching. The few watching the many. This removes power (Mann).
Sousveillance. Sous- meaning under. Sousveillance is undersight: the many watching the few. This restores power to the people. Examples are body cams on police, and people turning their own cameras on police during protests (Mann).
Self- or Autoveillance. Auto- meaning self. Autoveillance is self-sight: self or participatory surveillance, where people monitor each other and themselves. This both removes and restores power (Mann).
If we use Foucault's lens to view social media, social media reveals the same authoritarian structure as Foucault's panopticon regardless of the application.
If we look at his first principle: Social Media has pervasive power. The platforms we use have the power to see everything. Because of the predictive nature of their algorithms, they can even anticipate our behavior before we act.
Looking at his second principle: Social Media has obscured power. Because of the nature of the platforms we use, we have no way to see who is watching us. Except this time it isn't only guards or Big Data; it can potentially be anyone, from friends and potential employers to complete strangers and foreign governments.
In his third principle, he saw physical violence replaced with structural violence. The guards in this prison no longer had to threaten prisoners with physical violence. In the world of Social Media, doxxing (publishing personal or identifying information on the internet with malicious intent), hacking (gaining unauthorized access to data in a system or computer), and cyberstalking (using the internet to harass an individual or group) all use data to find and cause harm to people in real life. We also have to pay very real attention to disinformation and information warfare, and to how our data is being used against us.
Last, he speaks of the profitability of this structural violence, which brings us to Surveillance Capitalism.
SURVEILLANCE CAPITALISM
Shoshana Zuboff introduces the idea of surveillance capitalism in her book The Age of Surveillance Capitalism. It refers to the scraping of our personal and private data to make behavioral predictions about us. She explains it further in this interview with Matt Frei of Channel 4 News in the United Kingdom.
Channel4News, & Frei, M. (2019, September 23). Shoshana Zuboff on 'surveillance capitalism' and how tech companies are always watching us. Retrieved from YouTube on December 02, 2020.
Shoshana Zuboff: A Profound Misconception of What is going on
VPROinternational, & Duong, R. (2019, December 20). Shoshana Zuboff on Surveillance Capitalism | VPRO Documentary. Retrieved from YouTube on December 02, 2020.
Shoshana Zuboff explains how targeted ads and personalized services are only a small piece of the data that is being collected about us.
Training Models & Behavioral Surplus
Shoshana Zuboff explains how some of this data is used to improve services. However, even more of it is used to train models to predict how you will behave not only now but later on.
VPROinternational, & Duong, R. (2019, December 20). Shoshana Zuboff on Surveillance Capitalism | VPRO Documentary. Retrieved from YouTube on December 02, 2020.
We now have a definition of Surveillance Capitalism, but how does it really affect us?
What are they really able to do with the data they collect on us?
And how are they actually profiting from it?
Who are They?
Let's take a look at how we got here, and try to answer some of these questions by taking a deeper look into how things began with one of the biggest "Theys": Google.
Founded by Larry Page and Sergey Brin in 1998, Google started as an idea to build a better search engine. Instead of ranking a page by how many times the search terms appeared on it (which is what search engines did at the time), they created an algorithm with Scott Hassan that analyzed the relationships among websites (Hosch). This focus on relationships, on finding connections that make better predictions for searches, along with the data cloud or trail we leave behind, turned into a business model that would not only change the way we find things on the internet but would have very real consequences in the world.
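The core of that ranking idea can be sketched in a few lines. This is a simplified illustration of a PageRank-style computation, not Google's actual implementation; the link graph, damping value, and page names are made up for the example:

```python
# Simplified PageRank-style ranking: a page's importance comes from the
# link relationships among pages, not from keyword counts on the page.
def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}          # start with equal rank
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outgoing in links.items():
            if not outgoing:
                # A page with no outgoing links spreads its rank evenly.
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                # Each page passes its rank along its outgoing links.
                for target in outgoing:
                    new_rank[target] += damping * rank[page] / len(outgoing)
        rank = new_rank
    return rank

# Three toy pages: A and C both link to B, so B ends up ranked highest.
ranks = pagerank({"A": ["B"], "B": ["A", "C"], "C": ["B"]})
```

The page that more of the graph points to rises to the top, which is the insight that let Google return more relevant results than keyword-frequency ranking could.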
These clips explore the information that Google mines (collects), the Google business model, as well as Google’s history.
Alastair Mactaggart: Google Mining the Data of Your Life
PBSfrontline. (2019, December 02). In the Age of AI (full film) | FRONTLINE. Retrieved from YouTube on December 02, 2020.
Shoshana Zuboff: Google History & Business Model
PBSfrontline. (2019, December 02). In the Age of AI (full film) | FRONTLINE. Retrieved from YouTube on December 02, 2020.
Behavior Restriction
Does collecting all of our personal data really give us all of the options that Google tells us it does? Jaron Lanier (computer scientist and philosopher, also considered one of the founders of virtual reality) believes that micro-targeting ads and options online not only reduces our options but leads to a lack of freedom and to behavior restriction on the internet.
VPRO documentary. “The Real Value of Your Personal Data.” YouTube, uploaded by VPRO, June 11, 2017, www.youtube.com/watch?v=dW7k_GZYLwk. Retrieved on February 23, 2021
Facebook Took Google’s Model to the Next Level
Before you think that not using social media (or Facebook specifically) means that it doesn’t impact you:
Instagram & WhatsApp are owned by Facebook. If you use Instagram, you use Facebook.
Though Google & Facebook are being highlighted, it isn’t just these companies taking, buying and selling your data. This is the new business model for all companies. This information can also be taken from store credit cards, loyalty cards, apps, etc.
If a service is free, you are the product. If you downloaded a free app, you are paying with the data the company collects and sells about you, which you consented to when you clicked “I Agree” on its Terms of Service (ToS).
Most important: It no longer matters if you use the internet or social media at all. These things have real-world impacts, and consequences.
Roger McNamee (Author, Early Facebook Investor): Facebook
PBSfrontline. (2019, December 02). In the Age of AI (full film) | FRONTLINE. Retrieved from YouTube on December 02, 2020.
Roger McNamee explains more about Facebook and the data Facebook began collecting and purchasing about you (left).
Cathy O’Neil speaks about the issues with the Facebook algorithm (right).
Cathy O’Neil (Author & Data Scientist): The Facebook Algorithm
VPROinternational, & Kieft, M. (2018, October 26). Algorithms Rule Us All - VPRO documentary - 2018. Retrieved from YouTube on December 02, 2020.
Social Contagion Experiments
In 2010, Facebook began conducting social contagion experiments to see if they could change real-world behavior. Not only were they successful, they also found they could do it without user awareness (Meyer).
Shoshana Zuboff: Changing Real World Behavior
PBSfrontline. (2019, December 02). In the Age of AI (full film) | FRONTLINE. Retrieved from YouTube on December 02, 2020.
Jaron Lanier: The Facebook Algorithm Threat
VPROinternational, & Kieft, M. (2018, October 26). Algorithms Rule Us All - VPRO documentary - 2018. Retrieved from YouTube on December 02, 2020.
THREAT TO DEMOCRACY
Without care, intervention, and responsibility, we are at the whim of whoever owns the technology or whoever can pay for it. These technologies aren't just a threat to our privacy; they are a threat to any democratic society. As Stephanie Hare mentioned in the previous video, governments can change. Even though we may feel safe with these technologies under the current administration or regime, that can change. Technologies that once provided comfort, ease, and protection can, under different control, easily be turned authoritarian. As Cathy O’Neil explained previously, the Facebook algorithm is especially susceptible to misuse. So, what does the Facebook algorithm look like in real-world situations? Jaron Lanier explains what happens with the Facebook algorithm, and why we don’t always get the outcome we expect during protests and movements such as the Arab Spring.
Sophie Richardson (China Director, Human Rights Watch): Authoritarian Governments
PBSfrontline. (2019, December 02). In the Age of AI (full film) | FRONTLINE. Retrieved from YouTube on December 02, 2020.
We can already see these technologies being used by authoritarian governments to keep certain populations imprisoned. One example that Sophie Richardson explains is the humanitarian crisis of the Uyghurs in China (left).
Yoshua Bengio discusses further the overall threat these technologies pose to democracy (right).
Yoshua Bengio (Pioneer of Deep Learning): Democracy Threat
PBSfrontline. (2019, December 02). In the Age of AI (full film) | FRONTLINE. Retrieved from YouTube on December 02, 2020.
RESPONSIBILITY & CHANGE
So, is it all just doom and gloom? I hope not, but real change needs to happen now. Technology is advancing so quickly that we don't have time to wait to make these changes. We must act now.
Yoshua Bengio (Pioneer of Deep Learning): Scientist Responsibility
PBSfrontline. (2019, December 02). In the Age of AI (full film) | FRONTLINE. Retrieved from YouTube on December 02, 2020.
Michal Kosinski (Computational Psychologist): Warning
VPROinternational, & Kieft, M. (2018, October 26). Algorithms Rule Us All - VPRO documentary - 2018. Retrieved from YouTube on December 02, 2020.
Cathy O’Neil (Author & Data Scientist): Singularity
VPROinternational, & Kieft, M. (2018, October 26). Algorithms Rule Us All - VPRO documentary - 2018. Retrieved from YouTube on December 02, 2020.