Trends in human-computer interaction

This page outlines trends that people in the Human-Computer Interaction (HCI) field are beginning to notice.

Increased computer-to-person ratio

The relationship between people and computers has shifted radically since computers entered common use in the mid-1900s. Most markedly, the number of computers per person has drastically increased.

Whereas the 1960s saw a many-people-to-one-computer relationship, the 2010s are seeing a one-person-to-many-computers relationship.

Figure: from Being Human: Human-Computer Interaction in the Year 2020, published by Microsoft Research

Ubiquitous computing

Computers are now often embedded in a huge range of materials and artifacts. This allows people to interact with computers and digital data in far more contexts than was previously possible. Some examples:

  • shoes that keep track of your running stats
  • scannable books in stores and libraries
  • scannable passports
  • computing with biological organisms and organic matter
  • fingerprint scanners on weapons
  • surveillance systems with facial recognition
  • flexible displays
  • plastic electronics
  • organic electronics
  • biological sources of power and computation

Your digital shadow

Huge amounts of data are being collected about people as they go about mundane, everyday tasks.

  • tracking online behavior by advertisers, merchants, ISPs and government
  • textual analysis of email and social media content
  • video and photo facial recognition
  • tagging of user-generated content online
  • credit card and store loyalty card purchase histories
  • telephone call analysis
  • location tracking via car and phone GPS, transit cards, cell phone tower location, ATMs, etc.

This data is often analyzed and can be stored indefinitely, which leads to interesting questions about what can and will be done with such a huge amount of personal information.
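To make the idea of a digital shadow concrete, here is a minimal sketch, using entirely made-up records, of how two independently collected data streams from the list above (location pings and purchase histories) can be merged by timestamp into a rough timeline of one person's day. The data, labels, and merging logic are illustrative assumptions, not any real system's pipeline.

    from datetime import datetime

    # Entirely fabricated records of the kinds listed above, keyed by timestamp.
    location_pings = [
        ("2020-03-02 08:15", "transit card tap, 4th St station"),
        ("2020-03-02 12:40", "phone GPS, Riverside Park"),
        ("2020-03-02 18:05", "ATM withdrawal, Main St branch"),
    ]
    purchases = [
        ("2020-03-02 12:45", "credit card, food cart at Riverside Park"),
        ("2020-03-02 18:10", "loyalty card, grocery store on Main St"),
    ]

    # Merging the two streams by time yields a rough reconstruction of one day.
    timeline = sorted(location_pings + [(t, "purchase: " + what) for t, what in purchases])
    for timestamp, event in timeline:
        hour = datetime.strptime(timestamp, "%Y-%m-%d %H:%M").strftime("%H:%M")
        print(hour, "-", event)

Even this toy example shows how little analysis it takes to turn scattered traces into a coherent profile.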

Rise of the robots

Robots and applied machine learning techniques are already here. For example, they are used to calculate car GPS directions, to automatically build software systems, in drone warfare, in applications that track people's behavior and recommend products and services they may want to buy, on industrial factory assembly lines, and in situations that are unsafe for humans, such as nuclear disasters and bomb disposal.
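As a hedged illustration of the recommendation systems mentioned above, the sketch below implements a very small item-based collaborative filter using cosine similarity. The purchase matrix, product names, and scoring scheme are all invented for the example; real recommenders are far more sophisticated.

    import numpy as np

    # Toy purchase history: rows are users, columns are products (all made up).
    purchases = np.array([
        [1, 1, 0, 0],
        [1, 1, 1, 0],
        [0, 0, 1, 1],
    ])
    products = ["running shoes", "fitness tracker", "e-reader", "desk lamp"]

    def recommend(user_index, purchases, top_n=2):
        """Score products by their similarity to those the user already bought."""
        norms = np.linalg.norm(purchases, axis=0)
        similarity = (purchases.T @ purchases) / np.outer(norms, norms)  # cosine similarity between product columns
        user = purchases[user_index]
        scores = similarity @ user           # closeness of each product to what the user already owns
        scores[user == 1] = -np.inf          # never re-recommend something already bought
        return [products[i] for i in np.argsort(scores)[::-1][:top_n]]

    print(recommend(0, purchases))  # suggestions for the first user

Small as it is, the pattern is the same one behind commercial recommenders: infer taste from logged behavior.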

But robots and artificial intelligence algorithms are infamous for being unable to perform many tasks that humans find easy, such as understanding people's intentions and emotions and generally dealing with uncertainty.

The fall of the GUI

Graphical User Interfaces (GUIs) have been extremely useful in helping people manipulate computers and digital data. Although they have always had their problems, we have found many ways of preserving their relevance.

However, with the increasing ubiquity of non-traditional embedded computer interfaces, human-computer interaction must increasingly be done through non-graphical means.

  • using video tracking to understand gestures (e.g. Wii and Kinect)
  • using eye tracking to understand foci of attention (e.g. Google's creepy glasses)
  • allowing multi-touch for more expressive device input (e.g. iPhone, iPad)
  • using speech recognition to control computers (see the sketch after this list)
  • implanting devices into the brain and body to control machines (e.g. monkey controlling motorized prosthetic arm)
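As a minimal sketch of the speech-recognition item above, the following assumes the third-party Python SpeechRecognition package (plus PyAudio for microphone access) and network access to Google's free web recognizer; the command phrases and the actions they trigger are invented for illustration.

    import speech_recognition as sr  # third-party SpeechRecognition package

    recognizer = sr.Recognizer()

    # Capture one spoken utterance from the default microphone.
    with sr.Microphone() as source:
        recognizer.adjust_for_ambient_noise(source)
        audio = recognizer.listen(source)

    try:
        command = recognizer.recognize_google(audio).lower()  # send audio to Google's free web API
    except (sr.UnknownValueError, sr.RequestError):
        command = ""

    # Map a couple of invented phrases to actions -- no GUI involved at any point.
    if "lights on" in command:
        print("Turning the lights on")
    elif "lights off" in command:
        print("Turning the lights off")
    else:
        print("No matching command for:", repr(command))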

Question

What are the implications of these trends for designers of interactions?

