

Digital Legacies: Fingers

Julius Wiedemann delves into the shift in how we access and interact with our screen-based world, and observes that fingers might now become obsolete.

by Julius Wiedemann | Published on: Mar 16, 2021

Everything is about touching these days. My one-year-old daughter already knows how to pinch and slide on a mobile phone. She is not zooming yet, but that will come fast. The small screen, derided by David Lynch in a spoof, easily found on YouTube, as no way to watch a film, has now become ubiquitous, and it seems there is no turning back. From the small mobile phones that were sold by their weight in Japan, when I lived there in the late 90s, to the BlackBerrys with their smaller physical keyboards, to the iPhones that turned the touchscreen into an industry standard (even though Apple did not invent it), only a couple of decades have passed. The transformation is enormous.

QWERTY keyboard Image: Courtesy of Julius Wiedemann

Typing is now done mostly with thumbs. No one anticipated that. In January 2021, about 98 per cent of active Facebook users accessed the social media platform via some kind of mobile phone. History is full of ironies. The QWERTY keyboard design, still in use today, was named after the order of the first six keys on the top left; it was created for the Sholes and Glidden typewriter and sold to E Remington and Sons in 1873. To think that every time we type on a keyboard, meaning a traditional one, we are still going back to a design from the 19th century makes me smile. But this is what technologies are about: an accumulation of achievements.

QWERTY keyboard design of BlackBerry Image: Courtesy of Julius Wiedemann

The iPad turned 10 in April 2020. The idea then was that its screen could be used partly as a computer, partly as a communication tool, and partly as an e-reader to compensate for smaller mobile screens, which has indeed worked to a great extent. The only problem is that tablets don't fit in our pockets. There were other tablets before the iPad, some from about a decade earlier, which never really took off for a number of reasons. Even Apple released a precursor, the Newton, in 1993; such devices were then technically categorised as PDAs, Personal Digital Assistants. The first Kindle, from Amazon, was released in 2007 with a completely different intent. E-books were becoming popular, but few were willing to pay a relatively large sum for a device only to read books. For a heavy reader those devices were a solution to avoid carrying a lot of weight, but for a regular reader a Kindle would only pay off after a good number of books. For them, paper was still something tactile and important to have on their shelves. The medium was as important as the content. Even if the math made sense, the attachment to paper was still relatively strong. Proof of that remains today: e-book sales in mature markets are relatively stable and no longer growing much.

A multitouch screen Image: Courtesy of Julius Wiedemann

One very well-known evolutionary theory identifies the differentiator of Homo sapiens as the ability to use the thumb and the index finger to pinch things. By doing that intelligently, we were able to grab and build objects efficiently, and with that create new technologies. It makes sense. We still use these movements today. And curiously, in artificial hands these are the two digits that make the difference for someone to return to useful actions after losing a hand. Little by little we will understand how the brain connects to our hands, and through that we will understand how our brains could connect to artificial hands in the future for full capability. But we will probably also understand how we evolved.

The first finger-controlled touchscreen is widely acknowledged to have been developed by EA Johnson in 1965 at the Royal Radar Establishment in Malvern, UK. In the 70s, capacitive touchscreens gave way to resistive ones, which the prominent American inventor Samuel Hurst created almost accidentally while studying atomic physics at the University of Kentucky. The screen revolution was just starting, and for the wider audience screens meant almost exclusively TV. The first human-controlled multi-touch device was then developed at the University of Toronto in 1982 by Nimish Mehta. Since the first iPad, Apple has kept adding features that use multi-touch capability, and some of its applications are still to be developed.

Fingers might now become obsolete. The text you just read was dictated entirely on a mobile phone, without any typing; only corrections and minor editing were done on a computer. As we start to connect controls with audio, we are also starting to expand the applications of verbal commands. Dictation, for instance, is helping me write better in other languages: I often need to write emails in German or Spanish but do not know exactly how some words should be written, and only after dictating can I actually see the correct spelling. Older technologies may not disappear, but in the digital world there is always something new to make us think about how we can do things differently, more efficiently, and sometimes in a more natural way.

Read more from the series Digital Legacies where our columnist Julius Wiedemann investigates the many aspects of digital life.

