Human data sovereignty

 

Let’s pause for a second. Suddenly you’re Messi, playing the World Cup final against France, and after a long tied match, you decide to lift your head and think about what you can do differently to change the score. When we talk about data, the constant is that Nation-States and big companies own it. But why? We’ve become so accustomed to this being the default option that we’ve stopped questioning whether it’s right or wrong. In a context where cryptocurrencies invite us to be sovereign over our money, why don’t we do the same with our data?

 

Let me give you an example: imagine you’re going to a nightclub. It’s normal for a security guard to check your ID to verify that you’re of legal age. Sometimes our ID is even photographed or scanned by a computer. But why do we show our ID card or driver’s license at all? Let’s analyze the situation. The only thing the nightclub’s security needs to know is whether we are over 18. Everything else, such as our name and surname, the gender we identify with, our city of birth, our address, the type of vehicle we are licensed to drive, and our identification number, is useless for the nightclub’s purposes. Yet we are accustomed to handing over this sensitive information. The same thing happens when we deal with the State and it asks for more information than is truly necessary. This system could be replaced by one in which we are the true sovereigns of our information: a system where our data can be verified only temporarily, through a token that refreshes at a fixed interval, just as our banks require a special token, valid for 60 seconds, when we perform certain operations on their online banking platforms. We could do the same with our online presence.
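To make the idea concrete, here is a minimal sketch, in Python, of how such a short-lived attestation could work. It borrows the mechanism behind the 60-second bank tokens mentioned above (time-based one-time passwords, in the spirit of RFC 6238); the shared secret, the 60-second window, and the "over_18" claim are illustrative assumptions for this example, not an existing identity standard.

```python
import base64
import hashlib
import hmac
import struct
import time

# Illustrative sketch only: a short-lived "over 18" attestation code, built the
# same way as the 60-second bank tokens mentioned above (TOTP-style).
# The shared secret, the 60-second window and the claim name are assumptions
# made up for this example, not part of any real identity system.

ISSUER_SECRET = base64.b32decode("JBSWY3DPEHPK3PXP")  # known to issuer and verifier
WINDOW_SECONDS = 60                                   # each code lives for one minute

def attestation_code(claim: str, at: float | None = None) -> str:
    """Derive a 6-digit code that proves a single claim for the current window."""
    counter = int((time.time() if at is None else at) // WINDOW_SECONDS)
    message = claim.encode() + struct.pack(">Q", counter)
    digest = hmac.new(ISSUER_SECRET, message, hashlib.sha256).digest()
    offset = digest[-1] & 0x0F
    number = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return f"{number % 1_000_000:06d}"

def verify(claim: str, code: str) -> bool:
    """The verifier learns only whether the claim holds right now, nothing else."""
    return hmac.compare_digest(code, attestation_code(claim))

if __name__ == "__main__":
    code = attestation_code("over_18")        # generated on the citizen's phone
    print("code :", code)
    print("valid:", verify("over_18", code))  # checked at the nightclub door
```

The point of the sketch is that the verifier receives a yes-or-no answer that expires within a minute, instead of photographing and storing the entire document.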

 

But why insist on being absolute sovereigns of our information? During my participation in the annual meeting of the World Economic Forum in Davos, on a panel moderated by Carlo Ratti, director of the MIT Senseable City Lab, where we debated the use of Artificial Intelligence for facial recognition in the streets and how to maintain public trust, I had the opportunity to ask the panelists: what will we do in the future when these technologies are compromised, whether by a hacker or by an internal mole, as happened with Snowden at the NSA? How will we live once our entire public life is leaked? In 2020, photos of a woman sitting on a toilet, taken by her floor-cleaning robot[197], ended up being posted on Facebook. Beware of giving up freedom for convenience and security, only to end up losing both!

 

Participation of Facundo Cajén in the World Economic Forum in Davos 2022[198]

 

In the video above, you can see the responses from all the panelists, but to summarize, one of the most sincere and interesting answers I received was that once a system is compromised and the information is published, it becomes public domain. We can’t put it back in the vault, because today we have no way of knowing how many copies of our data exist. In fact, in a conversation with Esteban Ordano in Davos, he illustrated this with absolute clarity. His example was simple: if we write a password on a piece of paper and immediately lock it in a safe, we might think it is secure. If someone breaks into our safe, it would be easy to know that our information has been compromised. But what happens if someone else has a copy of the key, opens the safe, takes a photo of our password, and then locks it back up? We could look at our safe every day and think it hasn’t been touched, yet our information has been compromised and copied. How many times? We currently have no way to know. Perhaps it was forwarded and stored in ten different places, or maybe there’s only one copy in circulation. Either way, it’s out of our control.

 

In that sense, going back to the panel moderated by Carlo Ratti, one of his final remarks was that today we already live in absolute transparency toward companies like Google or Facebook. Has the time come to live transparently with the rest of society? While this clearly doesn’t apply to our passwords, the observation is still interesting. That said, I feel obliged to repeat the idea proposed by Angela Oduor Lungati on that same AI panel. Angela pointed out that people should be able to choose for themselves whether or not to participate in these AI programs, rather than being enrolled by their governments by default. In practice, though, there are places where this is the default configuration and there is no way to avoid it, such as when we enter a new country through its airport. Indeed, as I write this, the United Kingdom is trying to update its Investigatory Powers Act with an amendment that would require messaging apps to obtain government approval for the security functions of their protocols, in effect creating backdoors that facilitate state surveillance. In response to this proposed change, not yet approved, Apple announced that if the amendment is applied, it would stop offering its iMessage and FaceTime services in that country[199].

 

For me, while the advent and application of these technologies will require citizen participation in their control and regulation, using them should remain optional, just as no one forces you to use Instagram or Google. In this sense, it seems worth pointing out that researchers at the University of Maryland have been working, successfully, on clothing with prints that confuse AI systems designed to detect people[200]. Only if we ensure that the benefits of these technologies outweigh their potential consequences will society choose to embrace them as a fundamental tool for its order.

 

Invisibility cloak
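For the curious, the cited paper optimizes printable patches against full object-detection pipelines, which is too involved to reproduce here. The short sketch below shows only the simpler, related idea of an adversarial perturbation against an image classifier (the fast gradient sign method of Goodfellow et al.), using PyTorch; the file name person.jpg, the epsilon budget, and the choice of ResNet-18 are assumptions for illustration, not the method of the cited work.

```python
import torch
import torch.nn.functional as F
from PIL import Image
from torchvision import models, transforms

# Sketch of an adversarial perturbation (FGSM), a simpler relative of the
# printable "invisibility cloak" patches: nudge every pixel slightly in the
# direction that most increases the model's error, so the prediction can flip
# while the image still looks normal to a human.
# "person.jpg", epsilon and ResNet-18 are illustrative assumptions.

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT).eval()
to_tensor = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),                # keeps pixel values in [0, 1]
])
normalize = transforms.Normalize(mean=[0.485, 0.456, 0.406],
                                 std=[0.229, 0.224, 0.225])

image = to_tensor(Image.open("person.jpg").convert("RGB")).unsqueeze(0)
image.requires_grad_(True)

logits = model(normalize(image))
label = logits.argmax(dim=1)              # whatever the model currently sees
loss = F.cross_entropy(logits, label)
loss.backward()

epsilon = 0.03                            # perturbation budget, barely visible
adversarial = (image + epsilon * image.grad.sign()).clamp(0, 1).detach()

print("before:", label.item())
print("after :", model(normalize(adversarial)).argmax(dim=1).item())
```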

 




[197] Guo, E. (2022). A Roomba recorded a woman on the toilet. How did screenshots end up on Facebook? MIT Technology Review. Viewed on February 19, 2023, at https://www.technologyreview.com/2022/12/19/1065306/roomba-irobot-robot-vacuums-artificial-intelligence-training-data-privacy.

[198] AlphaGo – The Movie | Full award-winning documentary. YouTube. (2020). Viewed on June 19, 2021, at https://www.youtube.com/watch?v=WXuK6gekU1Y.

[199] Macaulay, T. (2023). New UK law could spark “default surveillance of everyone’s devices”. The Next Web. Viewed on July 22, 2023, at https://thenextweb.com/news/uk-investigatory-powers-act-default-surveillance-devices-privacy.

[200] Wu, Z., Lim, S.-N., Davis, L., & Goldstein, T. (2019). Making an Invisibility Cloak: Real World Adversarial Attacks on Object Detectors. ArXiv.org. Viewed on October 3, 2022, at https://arxiv.org/abs/1910.14667.