Technology has undoubtedly changed the way we live. It has changed our social habits, our relationships and the way we access information, not to mention the sheer amount of that information. If my 39-year-old self could revisit my 16-year-old self and present him with the latest iPhone, or the even more advanced Microsoft Lumia 950 range, it would blow his fledgling little mind.
As I journeyed from the late nineties and my Philips 'Spade' of a mobile, with space for ten '140-character' text messages, to my smartphone today, which has a 20-megapixel camera and can actually read my iris (yes, the one in my eye, like in the science fiction films circa that spade phone), it may have felt like a natural progression. A quick look at the history of technological development, though, shows just how rapid that progression really was.
From the end of the Second World War, when families dusted off rescued valve-powered radios and stereo systems, right through to the mid-eighties, people knew very little of microchip-based technologies. For most of that time, people's experience of the screen was at the cinema, with those privileged enough to afford home movie equipment the exception.
As the eighties approached, the screen was still very much the static, valve-based, filament-tube, flickering marvel in the corner of the room. In the UK, regular colour transmissions began on BBC2 in 1967 and reached BBC1 and ITV in 1969, yet even as late as the early nineties over 20% of the UK population still only had a black-and-white TV set.
Our relationship with the screen didn't really change until after the millennium, when Ericsson brought the first colour-screen mobile to the European mass market in the shape of the T68. The other big players of the time, Nokia, Samsung, Siemens and Motorola, soon followed, trumping each other with minor tweaks until the day Apple dropped its iPhone H-bomb.
Whatever your view of Apple these days, whether it's how they treat their suppliers, how their suppliers treat their workers, or the questionable authenticity of their marketing and PR practices, like it or not they disrupted the technology world big time. They freed the screen from its shackles and set it loose, allowing it to take over our lives and, according to some, potentially destroy us.
Sitting in a hipster joint just north of Tallinn's old town, I'm struggling to see much human interaction. The advantage of this is that, bar the inoffensive rare beats pumping out of a reclaimed 1970s valve set (ironic, no?), there is very little noise, even though the place is nearly full.
It's evening and the sun has all but faded. Scented candles light the main area, but they are barely needed: most of the light I use to distinguish the moustachioed waiter from the clients is coming from screens. That's right: smartphones, laptops, tablets and, of course, the ever-so-useful illuminated Apple logos. It strikes me that we are living in a time when we have everything, and yet we potentially have nothing.
This screen, once relegated to the corners of sitting rooms across the world, is now in our hands. It doesn't give us programming planned for our entertainment, selected by a handful of metropolitan liberals for the good of the people; it gives us pretty much anything we want, when we want it. It also gives us a lot more besides, most of which we never need to see, read or hear.
The screen is responsible for the latest incarnation of noise pollution: the kind of noise that clouds our minds and makes us ignore potential soulmates, new best friends, and the old best friends who are probably sat opposite us. At worst it makes us drive our cars into other people, career into them in the street, and walk in front of buses or trains. Singling out the screen as the Achilles' heel of the technological advances of the past 20 years gives rise to some very interesting thinking.
Yves Béhar, founder and principal designer of Fuseproject, recently wrote in Wired magazine about the use of what he calls 'invisible interface technology'. His concern is one shared by many these days: why is the very technology that has the potential to enhance our world currently distracting us from it? Screen-free (or at least screen-reduced) interaction with today's technology presents a set of challenges for developers and adopters alike.
There has long been a belief that well-implemented cutting-edge technology should be as natural to use as flicking a light switch. The challenge here is how to change a mindset, albeit a recently developed one, towards a more natural interaction with our tech, alongside the obvious challenge of building the infrastructure to support it.
The main focus is creating a technological environment that we as humans can interact with as naturally as we do the physical world. To make this happen there must be a solid infrastructure in place in our surroundings, allowing us to access the kinds of information we are used to through gesture and a range of other natural means of interaction. Creating this kind of world will take time, but it is possible; just look at the city-wide availability of Wi-Fi. I'm sure my 16-year-old self would have had the same doubts about that.
The use of sensors, responsive devices (yes, of course that will include screens; we will never* not need them) and tailored tech that not only enhances our lives but, more importantly, doesn't distract us from the actual process of living as human beings is shaping up to be the next big thing. There is a very real and deep concern among many people from all walks of life that we as humans cannot sustain this level of antisocial behaviour, that it will essentially plant the seed that bears the vine that strangles us.
We are on the verge of living in a technological landscape that plays our tune, that works in time with our rhythms. This brave new world may not only halt the decline in the sharpness of our senses; it may even reverse it, leaving us more in tune with our environment, our world and each other than we as a human race have ever been.
*Ok, I know. Never say never.
First appeared on Irish Tech News, 3rd April 2016