The central idea of surveillance capitalism is that consumers, in exchange for mostly free services, allow giant companies to monitor their behavior so that they can be shown ads aligned with their interests. For Shoshana Zuboff, this new “information capitalism” serves to predict and modify human behavior in order to increase companies’ sales and return on investment. Is this something new? Of course not, and surveillance is not exclusive to capitalism; it has existed since time immemorial and under every system, sometimes even to silence dissenting minds. Such was the case in Romania in the 1980s, where the communist regime decreed that everyone who owned a typewriter had to report to the police once a year and provide a sample of their machine’s type, so that forensic profiles could be built in case someone decided to write against Nicolae Ceaușescu’s government[89].
Although surveillance capitalism is associated with the commodification of personal data, its roots go much further back, to the creation of the modern State, when state bureaucracies began recording data about their citizens. Just as churches kept birth records centuries ago, this function later passed to States, along with the issuance of personal identification numbers, tax records, school report cards, medical histories, and many other records that paved the way for easily collecting each individual’s personal attributes.
Consumers are easily drawn in by social media companies that capture their attention, and these companies often share some of our information with third parties to build smarter advertising profiles about us. So far, after the Facebook and Cambridge Analytica scandal[90], nothing new. Without forgetting the benefits that social networks provide, by this point we also know the economic, political, and social risks they entail. What do we do about that? That’s another matter entirely, but the consequences of doing nothing could well be worse than choosing another path.

So, let’s get to the question. If we know that social media algorithms are harmful to personal health and social well-being, and we already regulate cigarettes, which are harmful to health, and even the content we watch on television, why can’t we regulate the algorithms that capture our attention? It doesn’t matter whether we’re talking about the health of our lungs or our mental health. Health is health. This question was raised by Melina Masnatta, co-founder of Chicas en Tecnología and former Global Director of Learning and Diversity at Globant, at a meeting organized by Global Shapers Buenos Aires at the Centro Cultural Recoleta.

Maybe we don’t like to admit that a set of algorithms governs our lives today; no one likes to admit being outsmarted by another person, let alone by a machine. Like it or not, these algorithms can now trigger chemical reactions in our brains that keep us hooked, shaping our behavior and emotions. Before writing the previous sentence, I wondered whether that was acceptable or not, because reading between the lines, it seems I said we should choose what the algorithms show us daily. To my surprise, I realized that this is already standard practice in every country through its education system, which chooses which texts to teach and which not to.
Mathematics is universal. The values and connotations of right and wrong attached to the main historical events of humanity are another matter. Is it right? Is it wrong? Will we get people to stop fearing vaccines and stop believing in conspiracy theories like flat Earth if we restrict our algorithms from surfacing certain texts? I don’t know. But I also don’t know whether we have the authority to decide that, even if it is for the greater and common good of society as a whole. This question pits the desire to live in a world where we all abide by scientific truths against one where thought is completely free and not coerced by a single narrative, even when that narrative’s postulates are true.
I have no doubt that Artificial Intelligence must be regulated; I am not the first to say it, and I will not be the last. Any expert will say it, from Elon Musk in the West to Kai-Fu Lee in the East. However, my question is, to what extent? It is clear to me that we must prevent biases against different social groups. What is not clear to me is whether we should impose a single story, a single truth. Being human does not imply having all the answers, nor doing everything correctly according to some fashionable paradigm. At least it has been that way until now, but saying that things must remain the same because we have done them that way historically is also not correct. Too complacent, statist, and traditional for my taste.
In fact, I wonder what states will do when a foreign power knows, perhaps through its corporations, all our preferences, mental weaknesses, and health conditions. What happens when it knows this about all of a country’s politicians, businesspeople, judges, and journalists? Can we still call that state independent? We usually think of surveillance as something the upper classes exercise over the middle and lower layers of society, but this is about to change. It is especially the elite who will have to be more careful in this regard; just ask Jeff Bezos, whose mobile phone was hacked via a file he received on WhatsApp from Mohammed bin Salman, the crown prince of Saudi Arabia[91].

At this point, surveillance can be external or internal, and the market is full of spyware sold to the highest bidder. A clear example is the Pegasus software developed by NSO Group and sold to several governments, including Germany[92], El Salvador[93], Hungary[94], India[95], and many more. The terrible thing about Pegasus is not only that it has repeatedly been found monitoring human rights activists, political opponents, and journalists, but that, by exploiting critical and publicly unknown vulnerabilities in the major mobile operating systems, it can be installed remotely on victims’ devices without tricking them into downloading a malicious file and without physical access to the device. To top it off, even though the software was developed by a private company, because it is a cyber weapon, the Israeli Ministry of Defense ultimately decides which countries may buy it. So much so that an extensive investigation published in The New York Times[96] showed how the governments of Panama and Mexico changed their votes on issues related to Israel at the United Nations once they were given access to Pegasus; this cyber weapon is thus also reshaping the status quo of geopolitics.
How independent will countries be in the face of this new digital colonialism? Would a UN motion to regulate the use and transfer of digital espionage tools be of any use? We have reached a point where, with enough information, there is no need to send troops into enemy territory.
[89] Zsiros, S., & McMahon, M. (2019). The Brief: Romania’s revolution – searching for answers 30 years on. Euronews. Viewed March 24, 2023, at https://www.euronews.com/my-europe/2019/12/18/the-brief-romania-s-revolution-searching-for-answers-30-years-on.
[90] Clark, B. (2018). Facebook and Cambridge Analytica: Here’s what you need to know. The Next Web. Viewed February 18, 2023, at https://thenextweb.com/news/facebook-and-cambridge-analytica-heres-what-you-need-to-know.
[91] Kirchgaessner, S. (2020). Jeff Bezos hack: Amazon boss’s phone “hacked by Saudi crown prince”. The Guardian. Viewed February 19, 2023, at https://www.theguardian.com/technology/2020/jan/21/amazon-boss-jeff-bezoss-phone-hacked-by-saudi-crown-prince.
[92] Deutsche Welle. (2021). German police secretly bought Pegasus spyware. Viewed March 24, 2023, at https://www.dw.com/en/german-police-secretly-bought-nso-pegasus-spyware/a-59113197.
[93] El Salvador: The Inter-American Commission on Human Rights will hold a hearing on the abusive use of the Pegasus spyware. (2022). Amnesty International. Viewed March 24, 2023, at https://www.amnesty.org/es/latest/news/2022/03/elsalvador-pegasus-iachr.
[94] Hungary: The government must provide a meaningful response to the Pegasus scandal. (2021). Amnesty International. Viewed March 24, 2023, at https://www.amnesty.org/en/latest/press-release/2021/07/hungary-the-government-must-provide-a-meaningful-response-to-the-pegasus-scandal.
[95] Dhillon, A., & Safi, M. (2021). Indian supreme court orders inquiry into state’s use of Pegasus spyware. The Guardian. Viewed March 24, 2023, at https://www.theguardian.com/news/2021/oct/27/indian-supreme-court-orders-inquiry-into-states-use-of-pegasus-spyware.
[96] Bergman, R., & Mazzetti, M. (2022). The Battle for the World’s Most Powerful Cyberweapon. The New York Times. Viewed March 24, 2023, at https://www.nytimes.com/2022/01/28/magazine/nso-group-israel-spyware.html.