
The future of human-computer interaction:

overview of input devices

Fabrizio Fornari

School of Computer Science

Háskólinn í Reykjavík - Reykjavík University

Reykjavík, Iceland

November, 2012

Abstract

It is 2012, and we are still using the mouse and the keyboard to interact with a computer. The world of Computer Science has seen many changes in performance, in design, and in the way we interact with the computer. Different input devices have been developed around the computer, from the most recent touchscreens, through webcams and microphones, back to the oldest devices: mice and keyboards. The aim of this research is to let the reader imagine a new way to interact with the computer. To reach this purpose, we introduce advanced technologies such as Speech and Voice Recognition, Electronic Perception, Eye Tracking, and Brain-Computer Interfaces. We present examples of these technologies that may change the paradigm in which, until now, the keyboard and mouse have been the leading input devices.


1 Introduction

Since the computer's birth1, we have seen many changes in the world of Computer Science: changes relating to performance, design, and human-computer interaction [49]. A few years ago, the words "input device" evoked in our minds only two specific objects, the keyboard and the mouse: the main instruments used to provide data to a personal computer. The keyboard and the mouse are, in fact, two of the first input devices in the history of computing. Nowadays, with the evolution of computers, we have a large set of input devices that have changed the way we interact with the computer. For example, we have touchscreens, which allow users to interact with a computer using only their hands, and webcams and microphones, used together to make video calls, among many others. Although other input devices have been introduced, we are still using the mouse and the keyboard. Why?

We want to provide an overview of input devices to let the reader imagine a world without mice and keyboards. We start with the definition of an input device and the histories of the mouse and the keyboard, and then introduce instruments and technologies that may replace them, such as Electronic Perception Technology [70]: the ability of electronic components to form a 3-D map of their surroundings and see what their users are doing. We also refer to Eye Tracking [47]: the ability of a computer to track the eye movements of users in order to perform tasks such as moving the pointer on a screen or scrolling a page while the user reads. We describe a new technology called "Tobii Gaze" [38], an example of Eye Tracking compatible with the new release of Windows 8 [69]. Other technologies that we discuss are Speech and Voice Recognition [58, 59], the abilities that allow a computer to understand speech and to recognize who the speaker is. We introduce one of the most popular software packages for Speech and Voice Recognition, called

1 With the term computer, we refer to a personal computer (desktop, laptop, etc.)


Dragon NaturallySpeaking [32], and we focus on the pros and cons of using our own voice to communicate with a computer.

Finally, we analyze a different approach to improving human-computer interaction, called the Brain-Computer Interface (BCI) [35]. A BCI consists of a collaboration between a brain and a device that enables signals from the brain to direct external activities, such as the control of a cursor. We describe the different types of BCI (invasive, partially invasive, and non-invasive), focusing on an example of a non-invasive Brain-Computer Interface called the Emotiv EPOC [14] (winner of the 2012 Advance Global Australian Award for Information and Communication Technologies [11]). For each cited technology, we provide a definition and an example that may lead the reader to discover new ways to interact with the computer.


2 Background

In this section we provide the definition of an input device and describe the history of the keyboard and the mouse, to give the reader the opportunity to learn more about these two devices.

2.1 Input Device

From Wikipedia: "An input device is any peripheral - piece of computer hardware equipment - that we use to provide data and control signals to an information processing system such as a computer or other information appliance." [50]

In our case, we refer to an input device as a device that converts the user's actions and analog data (sound, graphics, pictures) into digital electronic signals that can be processed by a computer. It is the input devices that let a user control a computer, interact with it, and take advantage of the computational power for which the computer is well known; a computer without input devices would not allow any interaction with a user.

2.2 Keyboard and Mouse

Since the birth of the computer, input devices have continuously evolved, and nowadays we are used to many different types of them. Despite the variety of available input devices, the most used ones until now are the keyboard and the mouse.


2.2.1 The Keyboard

One of the first input devices developed for the computer is the keyboard. A keyboard is a typewriter-style device which uses an arrangement of buttons or keys to input letters or numbers into a computer [43]. The first keyboards were very different from the idea of a keyboard that we have nowadays. The modern keyboard derives largely from two devices, teleprinters [62] and keypunches [51], and it was from such devices that modern computer keyboards inherited their layouts. From the 1940s until the late 1960s, typewriters were the main means of data entry for computing, becoming integrated into what were known as computer terminals [44]. The keyboard remained the primary, most integrated computer peripheral well into the era of personal computing, until the introduction of the mouse as a consumer device in 1984. However, keyboards remain central to human-computer interaction to the present day. Indeed, even mobile personal computing devices such as smartphones [57] and tablets [60] adopt the keyboard as an optional, virtual, touchscreen-based [63] means of data entry. Nowadays we can see different kinds of physical keyboards: standard, laptop-size, thumb-sized, software, foldable, and optical keyboards [43].

Studies such as the one presented in [33] have demonstrated that a keyboard is a place where bacteria tend to accumulate. An article by the BBC [31] underlines the danger of using a keyboard: from a sanitary point of view, a hospital keyboard can harbor very dangerous bacteria, because different people touch the same keyboard, including people who have been in contact with dangerous bacteria. This can be a good motivation for trying to find a replacement for the keyboard.


2.2.2 The Mouse

We introduce the definition and history of the mouse, taken from Wikipedia [54].

In computing, a mouse is a pointing device that functions by detecting two-dimensional motion relative to its supporting surface. Physically, a mouse consists of an object held under one of the user's hands, with one or more buttons. The mouse sometimes features other elements, such as "wheels", which allow the user to perform various system-dependent operations, or extra buttons or features that can add more control or dimensional input. The mouse's motion typically translates into the motion of a pointer on a display, which allows for fine control of a Graphical User Interface (GUI) [48].

The computer mouse was invented by Douglas Engelbart of the Stanford Research Center in 1964 and was originally called the "X-Y Position Indicator" for a display system. Made of wood and featuring a bright red push button, the mouse looked more like a matchbox than a computer device. Still, the "mouse" (so named because the cord looked like a tail) took off, and new versions popped up in research and development labs and in stores as the PC and computing itself became more mainstream. Whereas today's mice are sleek and curved, early models were blocky and bulky. Many companies, including Microsoft, Apple, Hewlett-Packard, and Logitech, have produced innovative models from the 1980s until now, and the mouse has become ever more efficient and comfortable. The types of mice vary by technology as much as by color: mechanical mice, gyroscopic mice, 3D mice, optical mice, tactile mice, and, most recently, wireless mice and touch mice.

Over the years, as the mouse evolved, studies have been conducted on mouse-related health problems [4, 12, 18, 20, 25]. Their results show how the use of a mouse can cause traumas such as Carpal Tunnel Syndrome [22], tendonitis [1], or bursitis [2]. If the computer mouse can cause these kinds of problems, why do we still use it?


3 The Alternatives

We want to enlighten the reader about the world of input devices, focusing on the devices that we think are the strongest candidates to replace the mouse and keyboard. We start from well-known devices such as touchscreens, microphones, and webcams, and continue with new technologies such as Electronic Perception Technology and the Brain-Computer Interface.

3.1 Touchscreen

A touchscreen [63] is an electronic visual display that can detect the presence and location of a touch within the display area. The term generally refers to touching the display of the device with a finger or hand. Touchscreens can also sense other passive objects, such as a stylus. Touchscreens are common in devices such as game consoles, all-in-one computers, tablet computers, and smartphones.

Fig.1 Example of touchscreen tablet with virtual keyboard

The recent diffusion of tablets and smartphones corresponds to the diffusion of touchscreens. Tablets and smartphones adopt this technology which, combined with virtual keyboards, makes touchscreens the main competitors of the physical mouse and keyboard. However, until now we have not seen a complete replacement of the mouse and keyboard. People who use the computer for work generally prefer a physical keyboard or a physical mouse to moving their hands and arms across the screen, especially if


it is located vertically, as Forlines C. points out in [10]. Furthermore, with a touchscreen it is difficult to accurately point at objects that are smaller than one's finger.

3.2 Microphone

A microphone [53] is an acoustic-to-electric transducer [64] or sensor [56] that converts sound into an electrical signal. We refer to a microphone that can be attached to a computer and used for applications such as teleconferencing [61], video conferencing [65], digital dictation [46], in-game chat, and music recording [9].

The microphone plays a fundamental role in Speech and Voice Recognition, the abilities that allow a computer to understand speech and to recognize who the speaker is. It transforms the voice into a digital format that a computer can process.

We want to introduce, as an example, one of the most popular software packages for speech recognition: Dragon NaturallySpeaking [32].

• Dragon NaturallySpeaking is a software package developed by the Nuance company that allows a user to "talk" to a computer, for instance to execute vocal commands or to translate speech into digital text. Moreover, it implements a text-to-speech function that allows the user to review the words by listening to them spoken by the computer.

Advantages of using Dragon NaturallySpeaking include the possibility of working without physically being in front of a computer, the avoidance of traumas caused by long use of the keyboard and mouse, and the possibility of controlling the computer while doing other things. Studies [27] have demonstrated that professional typists can type more words per minute with Dragon NaturallySpeaking than with a keyboard.


Disadvantages of using Dragon NaturallySpeaking include the difficulty, and the long time required, for a new user to practice with the software until reaching the same performance the user has with the keyboard-mouse combination; the need for an environment without other speakers and without noise that may disturb the dictation; and the fact that the software is not 100% accurate. A brief review of the latest version of Dragon NaturallySpeaking can be found on TopTen Reviews [41], a website specialized in software reviews.
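The command-execution side of such software can be imitated in miniature by mapping recognized phrases to actions. The sketch below is our own illustration, not Nuance's design: the phrases, actions, and confidence values are invented, and the recognizer itself is stubbed out, since that is the hard part the commercial software provides.

```python
# Minimal command dispatcher for a speech interface.
# The recognizer is stubbed: real systems return a transcript together
# with a confidence score, which we imitate here with hand-picked values.

COMMANDS = {
    "open browser": lambda: "browser opened",
    "scroll down": lambda: "scrolled down",
    "close window": lambda: "window closed",
}

def dispatch(transcript: str, confidence: float, threshold: float = 0.8):
    """Run the action for a recognized phrase, but only when the
    recognizer is confident enough -- mis-fires are worse than silence."""
    action = COMMANDS.get(transcript.lower().strip())
    if action is None or confidence < threshold:
        return None  # unknown phrase or too uncertain: do nothing
    return action()

print(dispatch("Scroll down", 0.93))   # scrolled down
print(dispatch("scroll down", 0.42))   # None (too uncertain)
```

The confidence threshold illustrates the accuracy trade-off discussed above: lowering it makes the system more responsive but lets recognition errors through.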

3.3 Webcam

A webcam [66] is a video camera that feeds its images in real time to a computer or computer network, often via USB, Ethernet, or Wi-Fi. Webcams are used to establish video links, permitting computers to act as videophones or videoconference stations. The webcam took its name from its common use as a video camera for the World Wide Web. Other popular uses include security surveillance, computer vision [45], video broadcasting, and the recording of social videos.

We know that the webcam is strictly linked to Computer Vision technology. An important task within computer vision is object tracking: the ability of a computer to recognize predetermined objects (this technique is used in automation, for example, to identify defective products) or human body parts in order to perform tasks. This technology is largely used in software that allows disabled people to interact with a computer. The most popular free software tools allow the user to control the movement of the pointer on a screen by using the movements of their head; an example is Enable Viacam (eViacam) [29].


• Enable Viacam is an open-source head-tracking software. It moves the pointer on the screen as the user moves his head. It works on standard PCs and requires only a webcam. It won second prize in the "VI Premio Vodafone a la Innovación en Telecomunicaciones (2012)" [17].

Companies are also developing software and hardware that allow users to control the movement of a pointer using the movement of their own eyes. This technique is called Eye Tracking. We can see an example of Eye Tracking in a new technology called Tobii Gaze.

• Tobii Gaze is an interface, produced by the Tobii company [37], that makes it possible for users to use their eyes to point at the screen and interact with a standard computer. It was demonstrated on a Windows 8 computer, and it won a 2012 Microsoft Award.

To use Tobii Gaze, a computer requires two additional hardware components: a Tobii IS20 Eye Tracker [40] and a Tobii Gaze Interaction Device [39].

In this subsection, we introduced Enable Viacam and Tobii Gaze, two ways to interact with a computer using the object-tracking technique. They let a user move a pointer on the screen in a relatively easy way. Relatively, because for an able-bodied user it is easier to use a common mouse than to move the head all the time to move a pointer on the screen (with reduced accuracy); for a user with disabilities (e.g. a user who cannot move his own hands), it can be a very useful way to interact with a computer. In particular, the use of a virtual keyboard combined with head or eye tracking is a good way for people with disabilities to communicate with others, while for common users it is a trickier way than using a physical keyboard.


3.4 Electronic Perception Technology

A good definition of EPT is provided by the wiseGEEK website [70]: "Electronic Perception Technology (EPT) is a low-cost, single-chip imaging technology that enables electronic components to form a 3-D map of their surroundings and see what their users are doing. One of the first applications is a 'virtual keyboard', a system that projects a laser keyboard onto a table and detects which keys the user is pressing by watching their hands and sensing which spots on the table their fingers are touching. EPT systems can determine depth by sending out pulses of light and timing how long it takes for the reflection to return to the sensor. This is rather different from the way the human brain determines depth, but still effective. EPT systems can accurately determine brightness and distinguish objects from one another." Two examples of Electronic Perception Technology are the evoMouse [5] and the Magic Cube [6], two products made by a company called Celluon [8].
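The time-of-flight measurement described in the quotation reduces to one formula: the light pulse travels to the object and back, so the one-way distance is d = c·t/2. A small sketch of the principle (our own illustration, not Celluon's implementation):

```python
# Time-of-flight depth: a light pulse travels out and back, so the
# one-way distance is half the round-trip time multiplied by light speed.

C = 299_792_458.0  # speed of light in m/s

def depth_from_round_trip(seconds: float) -> float:
    """Distance in meters to the reflecting surface."""
    return C * seconds / 2.0

# A reflection returning after ~3.34 nanoseconds means the surface is
# about half a meter away -- desk distance for a projected keyboard.
print(round(depth_from_round_trip(3.34e-9), 3))  # 0.501
```

The nanosecond scale of these round trips is why time-of-flight sensing needs dedicated hardware: per-key resolution on a desk demands timing precision far beyond what ordinary software clocks provide.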

• The evoMouse aims to be the evolution of the computer mouse. It allows a user to use his fingers like a mouse to interact with a computer, without actually using a physical one. The evoMouse works on nearly any flat surface and requires little space. It tracks the user's finger movements to move a pointer on the screen and provides actions such as zooming and page scrolling. The evoMouse aims to reduce Carpal Tunnel Syndrome, one of the traumas a user can suffer with a traditional mouse.


Fig.2 Example of evoMouse.

• The Magic Cube provides a projection keyboard and a multi-touch mouse. It connects to any Bluetooth HID device [19], including the latest iPhone, iPad, and Android devices. A user can also plug and play the Magic Cube with Windows and Mac OS devices via a USB connection. The Magic Cube connects wirelessly to a mobile device and has small dimensions. It can be a solution for the use of keyboards in hospitals, where keyboards may accumulate dangerous bacteria; it is for this purpose that Celluon developed a mini version of the Magic Cube dedicated to hospital use, called the Medical Keyboard [7].

Fig.3 Example of a Magic Cube projecting a virtual keyboard.


The devices cited above aim to replace the physical mouse and keyboard with a projected keyboard and with the use of the user's hand as a mouse. From the health perspective, they can prevent human contact with the bacteria that accumulate on a physical mouse and keyboard. However, the evoMouse and the Magic Cube remain conceptually linked to the oldest input devices, the keyboard and the mouse. This may be seen as not a real innovation, but it has the advantage of not drastically changing human-computer interaction. Moreover, we think that this kind of product may remain a valid option for people who will want to stay attached to the mouse and keyboard even after they have been replaced.

3.5 More Natural User Interfaces

In recent years we have seen other types of devices, based on Electronic Perception, entering the market and aiming to achieve a better Natural User Interface [55]: a concept of human-machine interface in which the user employs his own "natural" ways to communicate, such as speech or gestures. Two successful devices are the Nintendo Wii [67] and Microsoft's Xbox Kinect [52]. Essentially, the Nintendo Wii allows the user to interact with the Wii console by waving a Wii Remote [68], while Microsoft's Xbox Kinect allows the user to interact with an Xbox without a controller. We do not want to discuss these devices further; instead, we want to introduce a device that has been presented as "The Kinect killer" [23] or, more ambitiously, as "The mouse killer" [26]. The device is called the Leap [30].


Fig.4 Example of Leap used with a common laptop.

• The Leap Motion technology uses camera sensors to map out a 3D workspace, together with touch-free motion sensing and motion-control software. As reported on the Leap official website [30]: "The Leap senses your individual hand and finger movements independently, as well as items like a pen. In fact, it's 200x more sensitive than existing touch-free products and technologies. It's the difference between sensing an arm swiping through the air and being able to create a precise digital signature with a fingertip or pen."

The simplest thing the Leap can do is simulate a touchscreen, so the user can interact with any display as if it were touch-enabled. Developers will be able to create any sort of application: some may improve remote surgery, others may allow easier navigation through complex models and data, and others might put the user in the middle of a first-person shooter. For now, the possibilities appear to be limited only by the imagination.

As with every new product, it claims to be the best one. We need only check the first rows of the official website, under the section "About", to read: "Say goodbye to your mouse and keyboard...it's more accurate than a mouse, as reliable as a keyboard and more sensitive than a touchscreen."

However, because the Leap will ship in 2013, we do not yet have enough information to judge it properly.


3.6 Brain Computer Interface

A Brain-Computer Interface (BCI) [34] is a direct interface between a brain and a computer that does not require any motor output from the user. Neural impulses in the brain are intercepted and used to control an electronic device. BCIs are often directed at assisting, augmenting, or repairing human cognitive or sensory-motor functions [42]. There are three types of Brain-Computer Interface [34]:

• Invasive BCIs are implemented by implanting chips directly into the grey matter of the brain during neurosurgery. As they rest in the grey matter, invasive devices produce the highest-quality signals of BCI devices, but are prone to scar-tissue build-up, causing the signal to become weaker or even lost as the body reacts to a foreign object in the brain.

• Partially invasive BCI devices are implanted inside the skull but rest outside the brain rather than amidst the grey matter. They produce better-resolution signals than non-invasive BCIs, where the bone tissue of the cranium deflects and deforms signals, and have a lower risk of forming scar tissue in the brain than fully invasive BCIs.

• Non-invasive devices are easy to wear but produce poor signal resolution, because the skull dampens signals, dispersing and blurring the electromagnetic waves created by the neurons.

More details about Brain-Computer Interfaces can be found in [24, 36].

Because we aim to find an alternative to the mouse and keyboard that does not require any kind of surgical operation, we focus on a recent non-invasive Brain-Computer Interface that won the 2012 Advance Global Australian Award for Information and Communication Technologies: the Emotiv EPOC, produced by the Emotiv company [16].


• The Emotiv EPOC [15] is based on the latest developments in neurotechnology. It is a high-resolution, wireless neuroheadset for neuro-signal acquisition and processing. It uses a set of sensors to tune into the electric signals produced by the brain to detect the user's thoughts, feelings, and expressions, and it connects wirelessly to most PCs.

Fig.5 Example of an Emotiv EPOC

We know of a community of researchers that is using the Emotiv EPOC to develop software that may change human-computer interaction, and not only that. An example is the project called "BrainDriver" [28], which involves AutoNOMOS Labs, part of the Artificial Intelligence Group of the Freie Universität Berlin. BrainDriver tests the possibility of maneuvering a research car, named "MadeInGermany", with the use of brain power.
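Systems like BrainDriver sit on top of a classifier: the headset's signal processing emits a discrete "detected thought" label with a confidence value, and a thin layer maps that to a control command. The following toy sketch shows only that last layer, under our own assumptions (the labels, actions, and threshold are invented; this is not Emotiv's or AutoNOMOS Labs' actual interface):

```python
# Map classified mental commands to driving actions, holding the last
# action when the classifier is uncertain -- a car must not twitch on noise.

ACTIONS = {"push": "accelerate", "pull": "brake",
           "left": "steer_left", "right": "steer_right"}

def make_controller(threshold: float = 0.7):
    state = {"current": "brake"}  # fail-safe default action

    def on_detection(label: str, confidence: float) -> str:
        if confidence >= threshold and label in ACTIONS:
            state["current"] = ACTIONS[label]
        return state["current"]  # otherwise keep the last safe action

    return on_detection

control = make_controller()
print(control("push", 0.9))   # accelerate
print(control("left", 0.4))   # accelerate (too uncertain to change)
print(control("pull", 0.95))  # brake
```

Holding the previous action on low-confidence detections is one plausible way to cope with the poor signal resolution of non-invasive headsets described above, at the cost of reaction latency.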


Discussion

We want to offer a personal consideration on a psychological aspect of the computer user's mind: the "Fear of Change". The Fear of Change can be seen, in a severe form, in people who are scared of any kind of change: from a simple change in a weekly schedule to massive life changes such as a move to another place. As Joseph Burgo says in [3]: "There's no guaranty that change will be for the better; you don't know for sure how you're going to feel when your world changes. For this reason, many people have a strong fear of change; they cling to the familiar, even if it's not especially satisfying."

We want to focus not on the Fear of Change as a real pathology, but on the kind of conservative attitude that may lead a user to choose not to replace his commonly used tools until a truly better replacement is provided (and sometimes not even then). Think about people who spend eight hours or more in front of a computer: we find it difficult to imagine all these people changing the way they work. They do not want to move their hands in the air all the time; nor do they want to move their head to point at a screen; and they certainly do not want to speak all the time to interact with a computer. A common user wants an input device that is easy to install, easy to use, that requires as little energy as possible, and that offers high performance at a low price. The mouse and the keyboard are the right compromise on every one of these aspects. Moreover, we are used to interacting with a computer using the mouse and keyboard. The only way to replace them is to develop an input device that is better in all the fields cited above, or so much better in one or more of them that the user chooses the new product; until now, we have not seen any device capable of such a thing. However, we may find that solution not in a single device or piece of software, but in a combination of them that may let a user speak with a computer, point at the screen with his own eyes, manipulate virtual objects with his own hands,


and maybe use a mouse or a keyboard for more specific tasks. We may also see a brain-computer interface that allows a user to control a computer completely, with the power of the mind alone. Nowadays this kind of technology is still young: it has a high price (e.g., the Emotiv EPOC costs $299.00) and it is not as accurate as the mouse and keyboard, but we are sure that studies in this field will lead to big changes in human-computer interaction.

Related Work

Several studies have been conducted on human-computer interaction.

Franck Dernoncourt, in the article "Replacing the computer mouse" [13], provides a brief overview of input devices that may be used to replace the mouse. He distinguishes between devices that replace the movement of the mouse cursor and devices that replace the click function of the mouse.

An interesting overview of input and output techniques is also provided by Thomas Hahn in [21]. The author focuses on multi-touch devices, video recognition, and voice recognition. He provides examples of these technologies, introducing the concept of multi-modal interfaces: the combination of several input methods to interact with the computer.

Acknowledgements

We want to thank Professor Luca Aceto from Reykjavík University for his suggestions and his really interesting course in Research Methodology. We also thank Professor Kristin Sainani from Stanford University for her suggestions in the online course "Writing in the Sciences". The combination of these two courses has allowed us to understand fundamental concepts and rules to follow when writing a scientific paper.

Moreover, we want to thank our colleague Paolo Rovelli, with whom sharing opinions has been a pleasure.


Conclusion and Future Work

We focused our research on illustrating different ways to communicate with a computer (instead of using a physical mouse and keyboard), to let the reader imagine a future in which human-computer interaction no longer happens through an intermediate device that requires practice or particular effort.

However, for some of the technologies that we introduced, we cannot establish a priori whether they may replace the mouse and keyboard: the Leap Motion, which will ship in 2013; the brain-computer interface, which is still in a development phase; and speech recognition, which still has shortcomings for general-purpose use. The evoMouse and the virtual keyboard should be compared with the physical mouse and keyboard to see which are more productive and healthier.
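A standard way to make such a comparison in HCI is Fitts' law, which models the time to point at a target through the index of difficulty ID = log2(D/W + 1), where D is the distance to the target and W is its width; throughput (ID divided by movement time, in bits per second) then gives a single figure of merit per device. The following Python sketch is our own illustration, not taken from any of the cited works, and the function name and sample measurements are hypothetical:

```python
import math

def fitts_throughput(trials):
    """Mean Fitts' law throughput (bits/s) over a list of pointing trials.

    Each trial is a tuple (distance, target_width, movement_time_seconds),
    with distance and width measured in the same length unit.
    """
    throughputs = []
    for distance, width, movement_time in trials:
        # Shannon formulation of the index of difficulty, in bits
        index_of_difficulty = math.log2(distance / width + 1)
        throughputs.append(index_of_difficulty / movement_time)
    return sum(throughputs) / len(throughputs)

# Hypothetical trials for one device: 200 px to a 20 px target in 0.8 s, etc.
mouse_trials = [(200, 20, 0.8), (400, 20, 1.1), (400, 40, 0.9)]
print(round(fitts_throughput(mouse_trials), 2))  # → 4.05
```

Running the same target set with, say, the Leap or an eye tracker and comparing the resulting throughputs would turn the comparison suggested above into a quantitative one; mice are commonly reported at roughly 4 bits/s.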

In the end, we are sure that all the technologies we have introduced will contribute to changing our human-computer interaction paradigm, which until a few years ago saw the mouse and keyboard as the leading input devices. We think that, for common use, the physical mouse and keyboard will tend to disappear. They will remain only for specific purposes: game playing (at least at the beginning), interaction with machines that do not support the new technologies or that would require too much time or money to upgrade, and use by people who prefer the older devices for the pleasure of using them or out of a "fear of change".

Finally, we hope to have the opportunity to try at least some of the cited devices, such as the Leap and the Emotiv EPOC, to verify the state of the art ourselves by comparing their performance with that of the mouse and keyboard. We want to discover whether they really may lead human-computer interaction to a higher level.


References

[1] Chris Adams. What Is Tendonitis? About.com Guide.

http://ergonomics.about.com/od/tendonitis/a/tendonitis.htm.

[2] Chris Adams. What is Bursitis? About.com Guide.

http://ergonomics.about.com/od/carpaltunnelsyndrome/a/bursitis.htm.

[3] Joseph Burgo. The Fear of Change. http://www.afterpsychotherapy.com/the-fear-of-change/.

[4] Catherine Cook, Robin Burgess-Limerick, Sungwon Chang. The prevalence of neck and upper extremity musculoskeletal symptoms in computer mouse users. International Journal of Industrial Ergonomics, 2000.

[5] Celluon. evoMouse. http://celluon.com/products_em_overview.php.

[6] Celluon. Magic Cube. http://celluon.com/products.php.

[7] Celluon. Medical Keyboard. http://celluon.com/products_medic_overview.php.

[8] Celluon. Official website. http://celluon.com/.

[9] Shanika Chapman. Definition of a Computer Microphone. http://www.ehow.com/facts_5062680_definition-computer-microphone.html.

[10] Clifton Forlines, Daniel Wigdor, Chia Shen, Ravin Balakrishnan. Direct-Touch vs. Mouse Input for Tabletop Displays, 2007.

[11] Australia’s Global Community. Advance Global Australian Award.

http://advance.org/articles/advance-global-australian-awards-winners-2012.

[12] Smith, Sharit, Czaja. Aging, Motor Control, and the Performance of Computer Mouse Tasks, 1999.

[13] F. Dernoncourt. Replacing the computer mouse, 2012.

[14] Emotiv. Emotiv Epoc. http://www.emotiv.com/index.php.

[15] Emotiv. Emotiv EPOC. http://www.emotiv.com/apps/epoc/299/.

[16] Emotiv. Official website. http://emotiv.com/.

[17] Fundación Vodafone España. VI Premio Vodafone a la Innovación en Telecomunicaciones. http://fundacion.vodafone.es/fundacion/es/conocenos/difusion/v-premio-vodafone-a-la-innovacion-en/.

[18] M. Fogleman, G. Brogmus. Computer mouse use and cumulative trauma disorders of the upper extremities, 1995.


[19] Bluetooth Special Interest Group. Human Interface Device Profile (HID).

http://www.bluetooth.org/Building/HowTechnologyWorks/ProfilesAndProtocols/HID.htm.

[20] Aasa U., Jensen B.R., Sandfeld J., Lyskov E., Richter H., Crenshaw A.G. The impact of computer mouse work with different size objects on subjective perception of fatigue and performance, 2007.

[21] Thomas Hahn. Future Human Computer Interaction with special focus on input and

output techniques, 2010.

[22] PubMed Health. Carpal tunnel syndrome. http://www.ncbi.nlm.nih.gov/pubmedhealth/PMH0001469/,

2007.

[23] Danish Technological Institute. The Kinect killer? http://robot.dti.dk/en/news/the-kinect-killer.aspx.

[24] Ivan S. Kotchetkov, Brian Y. Hwang, Geoffrey Appelboom, Christopher P. Kellner, E. Sander Connolly Jr. Brain-computer interfaces: military, neurosurgical, and ethical perspective.

[25] Jensen C., Borg V., Finsen L., Hansen K., Juul-Kristensen B., Christensen H. Job demands, muscle activity and musculoskeletal symptoms in relation to work with the computer mouse. Scand J Work Environ Health, 1998.

[26] Alex Kahney. The Leap 3d Motion Sensor Is Not A Kinect Killer, It’s Going To Kill

Your Mouse Instead [Exclusive Hands-On].

[27] Karat C.M., Halverson C., Horn D., Karat J. Patterns of entry and correction in large vocabulary continuous speech recognition systems. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems: the CHI is the limit (pp. 568-575), 1999.

[28] AutoNOMOS Labs. BrainDriver. http://autonomos.inf.fu-berlin.de/.

[29] Cesar Mauri Loba. Enable Viacam. http://eviacam.sourceforge.net/index.php.

[30] Leap Motion. Leap. https://leapmotion.com/about.

[31] BBC News. Keyboards 'dirtier than a toilet'. http://news.bbc.co.uk/2/hi/7377002.stm.

[32] Nuance. Dragon NaturallySpeaking. http://www.nuance.com/dragon/index.htm.

[33] Linda Dailey Paulson. Does Your Workstation Make You Sick? IEEE computer

magazine (vol. 35 no.7), July 2002.

[34] Santosh Kumar Ram. Brain Computer Interface. 2006.


[35] Margaret Rouse. brain-computer interface (BCI).

http://whatis.techtarget.com/definition/brain-computer-interface-BCI.

[36] Thomas Navin Lal, Thilo Hinterberger, Guido Widman, Michael Schröder, Jeremy Hill, Wolfgang Rosenstiel, Christian E. Elger, Bernhard Schölkopf, Niels Birbaumer. Methods Towards Invasive Human Brain Computer Interfaces, 2005.

[37] Tobii. Official website. http://www.tobii.com.

[38] Tobii. Tobii Gaze. http://www.tobii.com/en/group/news-and-events/press-releases/tobii-gaze-interface-for-windows-8/.

[39] Tobii. Tobii Gaze Interaction Device. http://www.tobii.com/en/gaze-interaction/global/products-services/hardware-dev-kits/tobii-gaze-interaction-device/.

[40] Tobii. Tobii IS20 Eye Tracker. http://www.tobii.com/en/gaze-interaction/global/products-services/hardware-dev-kits/tobii-is20-eye-tracker/.

[41] TopTenREVIEWS. Dragon NaturallySpeaking Premium 11 Edition. http://voice-recognition-software-review.toptenreviews.com/dragon-naturally-speaking-review.html.

[42] Wikipedia. Brain-computer interface. http://en.wikipedia.org/wiki/Brain-computer_interface.

[43] Wikipedia. Computer keyboard. http://en.wikipedia.org/wiki/Computer_keyboard.
[44] Wikipedia. Computer terminal. http://en.wikipedia.org/wiki/Computer_terminal.
[45] Wikipedia. Computer vision. http://en.wikipedia.org/wiki/Computer_vision.
[46] Wikipedia. Digital dictation. http://en.wikipedia.org/wiki/Digital_dictation.
[47] Wikipedia. Eye tracking. http://en.wikipedia.org/wiki/Eye_tracking.
[48] Wikipedia. Graphical user interface. http://en.wikipedia.org/wiki/Graphical_user_interface.

[49] Wikipedia. Human-computer interaction. http://en.wikipedia.org/wiki/Human-computer_interaction.
[50] Wikipedia. Input device. http://en.wikipedia.org/wiki/Input_device.

[51] Wikipedia. Keypunch. http://en.wikipedia.org/wiki/Keypunch.

[52] Wikipedia. Kinect. http://en.wikipedia.org/wiki/Kinect.

[53] Wikipedia. Microphone. http://en.wikipedia.org/wiki/Microphone.

[54] Wikipedia. Mouse (computing). http://en.wikipedia.org/wiki/Mouse_(computing).
[55] Wikipedia. Natural User Interface. http://en.wikipedia.org/wiki/Natural_user_interface.


[56] Wikipedia. Sensor. http://en.wikipedia.org/wiki/Sensor.

[57] Wikipedia. Smartphone. http://en.wikipedia.org/wiki/Smartphone.

[58] Wikipedia. Speaker recognition. http://en.wikipedia.org/wiki/Speaker_recognition.
[59] Wikipedia. Speech recognition. http://en.wikipedia.org/wiki/Speech_recognition.
[60] Wikipedia. Tablet. http://en.wikipedia.org/wiki/Tablet_computer.

[61] Wikipedia. Teleconferencing. http://en.wikipedia.org/wiki/Teleconferencing.

[62] Wikipedia. Teleprinter. http://en.wikipedia.org/wiki/Teleprinter.

[63] Wikipedia. Touchscreen. http://en.wikipedia.org/wiki/Touch_screen.

[64] Wikipedia. Transducer. http://en.wikipedia.org/wiki/Transducer.

[65] Wikipedia. Video conferencing. http://en.wikipedia.org/wiki/Video_conferencing.

[66] Wikipedia. Webcam. http://en.wikipedia.org/wiki/Webcam.

[67] Wikipedia. Wii. http://en.wikipedia.org/wiki/Wii.

[68] Wikipedia. Wii Remote. http://en.wikipedia.org/wiki/Wii_Remote.
[69] Wikipedia. Windows 8. http://en.wikipedia.org/wiki/Windows_8.

[70] Wisegeek.com. What is Electronic Perception Technology (EPT)?

http://www.wisegeek.com/what-is-electronic-perception-technology.htm.
