Love, social networks and other avatars

  • Published 7 June 2011
  • ISBN: 9781921758218
  • Extent: 264 pp
  • Paperback (234 x 153mm), eBook

WHEN CONSIDERING TECHNOLOGY, few people immediately think of happiness and joy. Most find computing challenging or irritating. It is common to hear ‘it was a computer error’ or ‘it’s the computer’s fault’. Generally, computers don’t make mistakes – people do. And computing plays such a large part in our lives, permeating virtually every facet, that it is almost impossible to go through a day without interacting with some form of computer.

The children and young adults of today are the most technologically savvy humans in history, entering classrooms and universities around the world with more technological know-how than any other generation. This generation of eager users is, unfortunately, focused on social networking and mobile computing. Social networking has caused a revolution in communication, enabling people to connect with anyone, anytime, anywhere. But are we now too dependent on technology for social interaction? How has technology influenced our social behaviour and our ability to be happy?

In the 1940s and ’50s, there were only a few computers in the world. They filled entire rooms, could undertake only basic calculations, and few people could actually use them, let alone repair or program them. Now computers are smaller and faster than ever, ready to solve any problem, undertake any chore. They have become ubiquitous. Smartphones enable us to make calls, take pictures, download and play music, interface with commerce and share information with friends across the globe. It won’t be long before our mobile devices can interface with any appliance, any network, any building and amenity in the world. Fifty years ago, few would have thought that most people in the world could seamlessly pick up a complex computing and communications device and use it in sophisticated ways, but humans have broken the ‘technology pain barrier’.

Computing was once considered a domain dominated by geeks and nerds. The use of computing and technology was fraught with difficulties, especially for those whose world was turned upside down when their daily business or regular work was hijacked by technology or ‘computerised’. The fear that technological progress would somehow cause the wholesale loss of jobs, as people were replaced by computers and robots, can now be seen as unfounded – or at least outweighed by the new jobs created. But, to many people, having a computer run their business or components of their workplace was petrifying. What if something went wrong, what if it couldn’t be fixed, what if years of data simply disappeared?

The hysteria of doomsday computing was never more extreme than during the feared Y2K fiasco, on the eve of the millennium. People worried that planes would fall from the sky, bank vaults would open at the dawn of 2000 and release the riches inside, transport would halt and essential computing systems would stop dead. But nothing happened – well, nothing bad. Instead something very good occurred. Countless thousands of computer programmers, technologists and specialists were gainfully employed on very good salaries in the lead-up to 2000, to ensure that the computers would run smoothly when clocks struck midnight on 31 December 1999.

All fears were allayed. Computing spawned indulgent years of investment in technology and the web, leading to the dotcom era and the bursting of the big technology bubble. The reputation of computing has still not fully recovered. The public has little faith that careers in information technology and computing science will last. There are tens of thousands, and in some countries millions, of projected job vacancies and an unprecedented skills shortage in the coming decade. But enrolments in computing courses continue to dwindle, falling by 61 per cent in Australia since the dotcom bubble burst a decade ago. Society may have crossed the technology pain barrier, and people are more comfortable interacting with technology than ever before, but computing is so ubiquitous that it may have become mundane.

 

DESPITE THE LACK of interest in learning how computers work, there is a spike in their use. This seemingly miraculous technology is matching, and in some cases reuniting, people from across the world in casual and serious relationships. People no longer need to walk into a bar to meet a prospective mate. The stigma of going online to find a partner is disappearing, and the cliché of the pale, frail male sitting at home staring at a computer screen is becoming outmoded. The web has become a popular way of meeting people and a common way of sharing information about ourselves with close (and not so close) friends. The epidemic of social networking is replacing older modes of social interaction. While young and old enjoy the company of those around them, they are also incessantly texting and accessing their social networking sites from their mobile phones. It appears that the company they are in is rarely enough, and needs to be supplemented by virtual friends. The ability to access anyone while in the company of friends may now be a new definition of happiness.

Given the success of The Social Network and the pleasure Facebook has brought to so many people, it is likely that virtual connectivity is here to stay. It is, however, certainly not in its final evolution. Many keep in touch virtually by ‘chatting’ over the internet, and increasingly by ‘video chat’. The next stage is chatting to virtual beings that can interact socially or provide assistance in a corporate setting. The technology already exists for a virtual being (a ‘cyber twin’) to chat using artificial intelligence. Imagine a computer program that monitors the way you chat to your friends and learns from your idiosyncrasies. Then, while you sleep, it can accept ‘chat’ invitations from friends overseas and reply as if it were you… This technology is already used as a customer-service tool by major companies around the world – an intelligent computerised ‘virtual human’ that might answer frequently asked questions about banking, for example. People love chatting, texting and social networking, but is it as much fun to talk to an artificially intelligent being?

Artificial intelligence might be considered the final frontier (so far) of computing and technology. Frequently associated with robotics, AI is everywhere, embedded in many products and places – from cars to washing machines – in so many applications that it might be considered nearly indispensable. For instance, cutting-edge technology powered by AI automates postal-address recognition for mail sorting. Computers that can read handwriting have been in development for over half a century, and the problem has been a major challenge for researchers, particularly those working in automated pattern recognition. An adult human recognises a character, a word or a sentence in a way that seems almost innate, but a computer requires a certain level of intelligence: of sophistication, memory and computing power. For a computer to distinguish simple shapes, or the difference between a handwritten ‘a’ and a ‘b’, is highly complex; it relies on analysing the lines, curvature and contours written on a page. In most cases complex AI sits behind this, and the applications contribute to human happiness – such as when a child receives a birthday card from their grandparents, thanks to the rapid and efficient sorting of the mail by an automated process.

Recognition of simple shapes, such as handwritten words and sentences, is nothing compared to other complex real-world problems. Systems for detecting, counting and tracking people on our beaches are now being developed. There is a Big Brother stigma attached to surveillance cameras, as most uses of the technology have negative connotations. If the cameras that monitor beaches (locations associated with happiness and fun) could report swimmers in danger and so assist lifeguards, surely the stigma would melt away and the cameras would be seen as life-savers: a good application with obvious benefits. The research in this area is advanced, but the problem remains unresolved – although cutting-edge work suggests that automated behaviour analysis of swimmers in the ocean is not far away.

AI and pattern-recognition technologies are also far advanced in health applications such as medical imaging. Early detection of disease through non-invasive methods is already available in some spheres. In the not-so-distant future, detailed analysis of microscopic images of brain tissue may yield results for the diagnosis of neuro-degenerative diseases such as Parkinson’s or Alzheimer’s. The technology under development analyses 3D digital images to examine individual neurons. There are billions of neurons in the human brain, and the first step is to ‘untangle’ them so that single cells can be isolated. With hundreds of neuron types, there needs to be a way of distinguishing the subtle differences between cells, so that only those that need to be examined are separated. AI-based technology can already do this automatically, and experiments have shown that the computerised methodology outperforms human experts in distinguishing between different neuron types. This amazing technology is only the beginning of providing assistance to humans and enhancing their day-to-day happiness – thousands more examples exist.

 

AI FASCINATES BECAUSE of the prospect that a computer might one day mimic a human. Robots already walk, build cars, navigate and even talk, but there is a missing element. Robots cannot yet hold a normal conversation indistinguishable from a human’s, learn broadly from experience, or gather and collate information from their environments using sensors that resemble human senses. Nor have they acquired the ability to undertake creative tasks such as painting or writing poetry; they can do these things only partially. The ‘cyber twin’ concept has still not passed the test formulated by the famous mathematician Alan Turing in the 1950s. He proposed that if a computer and a human were located in separate rooms and chatted through another medium, and the human was unable to distinguish whether they were talking to another human or a computer, the test would be passed. No computer system can yet pass this seemingly simple test. There is software that can learn from experience; however, it focuses on learning about, and solving, only one problem. For example, the technology can identify unusual behaviour in a crowded public place, but it certainly won’t also make a cup of coffee.

Creative computers, or those with emotions, are the final frontier. Some AI research can detect human emotion, paint pictures automatically and even make up new words or phrases in a seemingly creative fashion, but this has only scratched the surface. Expressing complex emotions or painting like Monet is beyond even the most intelligent computers. The simple pleasures that people convey through emotional engagement, and the happiness they invoke, remain far off for even the most sophisticated AI technology. The pleasure of gazing at the Mona Lisa has yet to be matched by any drawing produced by a computer. But can pleasure and happiness really be measured in this context? Just as human happiness has evolved through the introduction of ‘social networking’, perhaps we will continue to adjust our perception of happiness on the basis of technological advances.

About the author

Michael Blumenstein

Michael Blumenstein is an Associate Professor and the Dean (Research) of the Science, Environment, Engineering and Technology Group at Griffith University, where he previously...
