by Julius Wiedemann · Oct 06, 2020
There are now countless documentaries about data. I have myself edited five books about data visualisation, a part of design that has grown exponentially due to the necessity of converting chunks of information into digestible concepts and ideas, which in turn can help information become knowledge. The use of information has always had two purposes: the first is to understand patterns in order to predict action, and the second, much deeper, is to tweak the information available in order to conduct behaviour. Manipulation is part of human nature, and it would be no different with the technological resources we build. The difference with digital technologies is the depth, the volume, the speed, and how far we are able to spread these tools to profoundly push the limits of our own nature, both good and bad. These new dynamics can serve us to discover new medicines for a pandemic through simulations of a virus, or they can be used to change elections and create new dictatorships. The nature in us, meaning our impulse to collect this data, is much the same in both cases, but with completely different possible outcomes.
The idea of democratising tools to advance technology is a noble one, but it doesn't come without the fear that we may commit great mistakes and generate capabilities for mass disruption. That is not new, as we have seen it already with the development of atomic energy. As soon as we democratise a tool, we allow different members of society to create different outcomes. Again, that is not new at all. Apple did it when it outsourced creativity with its developer kit, producing its library of apps. The problem is that, more often than not, those strategies come without an insurance policy.
Authors like Sam Harris are probably right to voice their doubts about our lack of understanding of the consequences of building artificial intelligence without considering what it might be constituted of. Do we want it to think like human beings? Do we want it to be part of the capitalist machinery to sell advertising? Do we want it to be part of academia and institutions so that we can create positive social and economic change? The answer is that we want all of these to play a role, but we often forget that some of them will have more commanding power than others. And they will be in different hands at different times.
Human beings, as individuals, are not prone to self-regulation. Morality is a complex issue. For that, we created institutions that are supposed to guard our impulses against each other. But institutions are also formed by people, usually put there by other people, with specific intentions and objectives, and more often than not with a strategy to achieve those goals. Predictability is not exactly a creative tool, but it gives us the possibility of avoiding many disasters. Data is no longer merely an object of study or one ingredient of what makes us part of this world. It is now an inherent part of the relationship between a person and nearly everything else. It is the very fabric of every transactional relationship. From sensors in our houses to refrigerators, to dictation tools running online, to searches for books, to geolocation, the common thread used as an argument is the improvement of services. Without innovators like Charles Joseph Minard we wouldn't have converted so much information into great knowledge about diseases, population, immigration, social welfare, economic progress, trade benefits, and so many other things. But all these histories of beneficial understanding don't come without their pitfalls.
It seems to me that if data is the new currency, then it should be treated like money, and maybe we need an institution like a central bank to regulate how data is used and manipulated. Advocates like Tristan Harris and the Center for Humane Technology have been pushing for legislation that could create an environment where information is used wisely. Artificial intelligence is already here, and we are pushing the boundaries to convert great processing power into a new form of decision-making machine. And we should be careful, meaning wise, as to how we apply it to ourselves and to other people.
One important thing not to forget is that we should keep educating ourselves to distinguish data from information, information from knowledge, and knowledge from wisdom. When we banalize knowledge acquired through unreliable means, we confuse opinions with facts, and start creating "alternative facts" that only serve the selfish interests of a few. We are much better than that and should in turn be vigilant about the sources we are exposed to. I am mostly positive about our obsession with data, and what we can do with it. A 2010 article from The Economist, titled "The Data Deluge", which inspired me to bet on the first book I edited on data visualisation, stated that "Everywhere you look, the quantity of information in the world is soaring. According to one estimate, mankind created 150 exabytes (billion gigabytes) of data in 2005. This year (2010), it will create 1,200 exabytes". It is the abundance of data that is allowing for so much progress in understanding human behaviour, from hierarchical structures to neural networks. If we are able to fully understand how the brain connects the dots, we will be ready to solve a lot of puzzles. And it will be good if we, as humanity, have a moral compass about it.
Read more from the series Digital Legacies where our columnist Julius Wiedemann investigates the many aspects of digital life.