by Julius Wiedemann
Dec 15, 2020
When we look at a digital photograph taken with a state-of-the-art mobile camera these days, we are looking at an agglomeration of millions of small squares, pixels, each with its own colour. If we look at each one individually, we have one perspective, an in-depth experience of that small territory or fragment captured by the camera lens. As soon as we start zooming out, we begin to understand that there is a bigger picture, a composition that makes up an image, which we can understand in a different way. The ground-breaking thing, however, is how fast we can go from one pixel to 12,000,000, and through all the steps in between; how fast we can move from deep down to high above. This is certainly one of the most amazing features of digital technologies. I have caught myself using two fingers to zoom in on the screen of my computer, and, worse than that, trying to zoom in on a printed newspaper, so used have I become to the feature on digital devices. My brain already seems to assume that this is a common feature of all things now.
The metaphor of the digital image serves here to explain how digital technologies have empowered us to look at data from macro to micro. This possibility, on a massive scale, should not be underestimated. When we look at an XML database, we are basically looking at the idea that we can scrutinise one small set of information, but also have access to a larger view, and consequently make the links between one and the other. Statistics, a pure science, deals with the perspective of projecting outcomes from a set of data, or identifying the best predictors, generating opportunities for a better understanding of realities, strategies and planning.
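The micro-to-macro idea can be made concrete with a toy example. The sketch below, in Python, uses an invented XML fragment (the element and attribute names are illustrative, not from any real database): first it inspects a single record, then it zooms out to an aggregate over the whole set.

```python
# A minimal sketch of the micro/macro view of an XML data set.
# The <sales>/<sale> structure and its attributes are invented for illustration.
import xml.etree.ElementTree as ET

data = """
<sales>
  <sale city="Paris" amount="120"/>
  <sale city="Paris" amount="80"/>
  <sale city="Lyon" amount="95"/>
</sales>
"""

root = ET.fromstring(data)

# Micro view: scrutinise one small set of information, a single record.
first = root[0]
print(first.get("city"), first.get("amount"))  # -> Paris 120

# Macro view: zoom out to the bigger picture, an aggregate over every record.
total = sum(float(sale.get("amount")) for sale in root)
print(total)  # -> 295.0
```

The link between the two views is simply that both are computed from the same underlying records; zooming in or out is a matter of which query we run.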
What working in a digital era entails, though, is the use of real-time, complete sets of real data, instead of samples of information that help predict the global picture. Monitoring real data has brought a new way of thinking to product development. It allows both mass customisation and individual tailoring, as well as mass deployment based on recognised patterns. These two edges of the same sword are often the backbone for providing more efficient services and generating customer satisfaction; what lies between them is simply the cost of going one way or the other. Whereas in the past companies had to rely on samples of data to optimise products and services, drawing conclusions from generalisations, now they can customise based on real information.
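The contrast between the old sample-based approach and computing over the entire data set can be sketched in a few lines. The figures below are invented for illustration: a synthetic "complete record" of transactions stands in for real-time data.

```python
# A hedged sketch: estimating an average from a sample versus computing it
# over the complete data set. All numbers here are synthetic.
import random
import statistics

random.seed(0)  # make the illustration reproducible

# Pretend this is the complete, real-time record of every transaction.
population = [random.gauss(100, 15) for _ in range(10_000)]

# The old approach: infer the average spend from a small sample
# and generalise from it.
sample = random.sample(population, 100)
sample_mean = statistics.mean(sample)

# The digital-era approach: compute directly over the entire set.
true_mean = statistics.mean(population)

print(round(sample_mean, 2), round(true_mean, 2))
```

The sample mean only approximates the true figure; with the whole data set available there is nothing left to estimate, which is the shift the paragraph above describes.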
With the help of data visualisation software, and of better professionals doing the analysis, a new landscape of possibilities has arisen. From counter-terrorism to selling bags on the Champs-Élysées in Paris, data-supported analytics is now the underpinning tool for understanding human behaviour. Our long interest in complexity has driven us closer and closer to what we have defined as chaos theory. According to the Encyclopaedia Britannica, chaos theory is, "in mechanics and mathematics, the study of apparently random or unpredictable behaviour in systems governed by deterministic laws. A more accurate term, deterministic chaos, suggests a paradox because it connects two notions that are familiar and commonly regarded as incompatible. The first is that of randomness or unpredictability, as in the trajectory of a molecule in a gas or in the voting choice of a particular individual from out of a population". To avoid unpredictability, we have devised wise ways of using sets of data, large and small, trying to circumvent chaos in order to achieve control.
One of the big dilemmas of almost every research project is whether to embrace a qualitative or a quantitative approach. With better use of machine learning and large sets of good data, we are now able to bring these two extremes closer together. Research is also about asking the right questions, so that answers come in a more standardised way, facilitating conclusions from similar perspectives and preventing analysts or analytical tools from committing mistakes that can be both semantic and contextual. Very soon we will be adding fMRI analytical tools on larger scales, allowing neuroscientists to uncover and unpack mechanisms that were previously held to be impossible to reach.
Our data-collecting capability has changed so dramatically that we are now able to make changes on the fly, adapting products and services as they are being developed. It is like changing the wings of an aeroplane while it is flying, and this is no exaggeration. Companies now employ agile practices so that changes can be made as quickly as possible. Algorithms suggesting customer preferences have become an important complementary part of sales strategy, and they depend heavily on good use of data, with both micro and macro approaches.
Read more from the series Digital Legacies where our columnist Julius Wiedemann investigates the many aspects of digital life.