by Julius Wiedemann, Feb 09, 2021
Imagine you are being interviewed by a border official in a foreign country, and you are suddenly struck with the reality: they know everything about your life. They start asking about the last destinations you travelled to, and just a few minutes later start inquiring about your lover in Indonesia, someone you have exchanged a few messages with, and whom you met only a couple of times. At this point it should be crystal clear that you cannot hide anything anymore. It doesn’t matter if it is something you believe is completely private, such as the pill of ecstasy you took last night or a bank account you hold in the Cayman Islands. There is almost no information that cannot be accessed, given the time and resources employed these days. This is the ugly reality of maximum control, as described by the French philosophers Deleuze and Guattari in their work on “capture machines”. That is ugly enough when it’s run and controlled by states, but even uglier when it’s led, collected, controlled, and used by private companies, with few limits on how data can be collected and employed. But in this particular case, it is a real story. Regulatory bodies traditionally lag behind the actions of private institutions. In this case, however, they might be condemned to stay far behind, because of the pace of innovation and the complexity of the situations surrounding the use of data and the privacy of citizens.
Data can indeed be used to improve products and services. There is no doubt about that. We have been doing it for a long time, maybe in a slightly naïve way, for instance when you buy something and the shop displays related products recommended for you. In the case of user interfaces, customer feedback is absolutely vital, so data collection is mandatory. We change systems as they are being used. This is a classic example of changing the wings of the plane while flying. In the digital world this is the daily reality, and a necessity. There is no application or website that can be improved without observing where its users fail.
Failure in the digital world is just another word for a designed experiment. When Xerox PARC first created the user interface with graphic elements on the screen, which was later copied and improved by Steve Jobs for the first Mac, the doors opened for much easier interaction between users and systems. Graphical interfaces were later adopted by every single system, for the simple reason that they make software easier to popularise and increase adoption beyond nerds, so that everyone can use it. It was only a matter of time before people with no knowledge of technology, or technological skills, like myself, could use nearly every platform. Just a few years ago, if you wanted to be a taxi driver in London, you would have to spend a couple of years learning the streets to be able to reach the right destination, and you would be using a printed map. These days, you just have to sign up for a platform like Uber or Cabify and use GPS-embedded systems such as Google Maps or Waze to pick up and drop off passengers anywhere, better than a trained taxi driver could ever do.
But what does that have to do with privacy? Collecting data is only worthwhile once there is a critical mass of users, because only then does personal information become an asset. The more people adopt digital technologies, the more interesting the business of trading in data becomes. Every online business has to rely on data, and therefore needs to convince users to give away personal information in exchange for something. No free platform is really free. Facebook is not free, and neither is Instagram. YouTube is not free either: even though it now offers a subscription package, it is still using your data.
Scrutiny is not even a currency in the digital world; it is like the air we breathe. In countries where a number of platforms are blocked, people choose to use VPN servers to get around the prohibition. But then you are going to be scrutinised by the other side anyway. The amount of information globally created, captured, copied, and consumed in 2020 is estimated at around 59 zettabytes. A zettabyte is 10 to the power of 21 bytes (1,000,000,000,000,000,000,000). That sounds like a lot, and it is a lot. Most of us do not think about the connections between a credit card transaction in a restaurant and a water bill, or between the flight ticket you bought and the taxes on your apartment. But it is only a matter of time before someone connects one thing with another and extracts a business model from it.
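To get a feel for the scale of that 59-zettabyte figure, a quick back-of-the-envelope calculation helps. The snippet below is a rough sketch, not an official statistic: the world-population figure is an approximation, and the result is only an order-of-magnitude estimate.

```python
# Rough scale check: how much of 2020's estimated data volume
# corresponds to each person on Earth?
ZETTABYTE = 10 ** 21          # one zettabyte in bytes (decimal SI prefix)
total_bytes = 59 * ZETTABYTE  # the 59 ZB estimate cited above

world_population = 7_800_000_000  # approximate 2020 figure (an assumption)
per_person = total_bytes / world_population

print(f"{per_person / 10**12:.1f} TB per person")  # → 7.6 TB per person
```

In other words, the estimate amounts to several terabytes of data created, captured, copied, or consumed per human being in a single year.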
Privacy is primarily about money and power. And in a capitalist system, power is also about money. Every piece of information that has been extracted from you has a destination, which is the pockets of somebody's shareholders. That in principle shouldn’t be a problem. The problem arises when the system is not transparent, and when your data is collected and used without your permission. Tech companies are constantly sued by governments for collecting data they were not supposed to have access to. Regulatory bodies are starting to create stricter privacy laws, yet companies seem to be always ahead of the curve, both predicting what governments will do and creating new tools to circumvent restrictions.
The new questions arising from the use of 5G and its implications for privacy have to do with the fact that hardware is the layer through which all types of data are funneled into different channels. This is one dispute that will not go unnoticed in the development of our networks. If all types of data must necessarily pass through a single channel, that should indeed be worrisome. Real privacy should be considered a new luxury in the 21st century. For most of us, it is just a question of how much influence we are willing to let companies knowingly exercise over us.
Read more from the series Digital Legacies where our columnist Julius Wiedemann investigates the many aspects of digital life.