
DOCTORAL THESIS

Luleå University of Technology
Department of Computer Science and Electrical Engineering
Media Technology Research Group

2006:60

Interaction Aspects of Wearable Computing for Human Communication

Mikael Drugge


Interaction Aspects of Wearable Computing for Human Communication

Mikael Drugge

Media Technology Research Group
Department of Computer Science and Electrical Engineering

Luleå University of Technology
SE–971 87 Luleå

Sweden

December 2006

Supervisor

Ph.D. Peter Parnes, Luleå University of Technology


Abstract

This thesis presents the use of wearable computers for aiding human communication over a distance, focusing on interaction aspects that need to be resolved in order to realize this goal. As wearable computers by definition are highly mobile, always on, and always accessible, the ability to communicate becomes independent of place, time and situation. This also imposes new requirements on the user interface of the wearable computer, calling for natural and unobtrusive interaction with the user.

One of the key challenges in wearable computing today is to streamline the user's interaction so that it is tailored to the situation at hand. A user interface that takes too much effort to use, interrupts, or requires more than a minimum of attention will inevitably hamper the user's ability to perform tasks in real life. At the same time, human communication itself involves effort, interruptions, and attention, so the key is to find a balance where wearable computers can aid human communication without being intrusive. To design user interfaces supporting this, we need to know what roles different aspects of interaction have in the field of wearable computing. In this thesis, the use of wearable computing for aiding human communication is explored around three aspects of interaction.

The first aspect deals with how information can be conveyed by the wearable computer user, allowing a user to retrieve advice and guidance from experts, and remote persons to share experiences over a distance. The thesis presents findings of using wearable computing for sharing knowledge and experience, both for informal exchange among work colleagues and for enabling more efficient communication among health-care personnel. The second aspect is based on findings from these trials and concerns how the wearable computer interacts with the user. As the user performs tasks in the real world, it is important to determine how different methods of notifying the user affect her attention and performance, in order to design interfaces that are efficient yet pleasant to use. The thesis presents user studies examining the impact of different methods of interruption, and provides guidelines for how to make notifications less intrusive. The third and final aspect considers how the user's physical interaction with the wearable computer can be improved. The thesis presents rapid prototyping of systems employing user-centric design. Furthermore, a framework for ubiquitous multimedia communication is presented, enabling wearable computers to be dynamically configurable and to utilize resources in the environment to supplement the user's equipment.

All in all, the thesis presents how wearable communications systems can be developed and deployed, how their human-computer interaction should be designed for unobtrusive operation, and how they can come to practical use in real world situations.


Contents

Abstract iii

Preface xi

Publications xiii

Acknowledgments xv

1 Thesis Introduction 1

1.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3

1.2 Thesis Organization . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3

1.3 Background and Motivation . . . . . . . . . . . . . . . . . . . . . . . . . . 4

1.3.1 Wearable Computing . . . . . . . . . . . . . . . . . . . . . . . . . . 4

1.3.2 Ubiquitous and Pervasive Computing . . . . . . . . . . . . . . . . . 6

1.3.3 Video Conferencing and E-meetings . . . . . . . . . . . . . . . . . . 7

1.3.4 Mobile E-meetings . . . . . . . . . . . . . . . . . . . . . . . . . . . 8

1.3.5 Motivation of Thesis . . . . . . . . . . . . . . . . . . . . . . . . . . 11

1.4 Research Questions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11

1.5 Scope and Delimitation of the Thesis . . . . . . . . . . . . . . . . . . . . . . 14

1.6 Research Methodology . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14

1.7 Summary of Included Publications . . . . . . . . . . . . . . . . . . . . . . . 16

1.8 Wearable Computing for Human Communication . . . . . . . . . . . . . . . 18

1.8.1 Mobile E-Meetings through Wearable Computing . . . . . . . . . . . 19

1.8.2 Managing Interruptions and Notifications . . . . . . . . . . . . . . . 22

1.8.3 Prototyping and Deploying Mobile E-Meeting Systems . . . . . . . . 24

1.9 Discussion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 28

1.9.1 Future Research Directions . . . . . . . . . . . . . . . . . . . . . . . 31


1.9.2 Conclusions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 31

1.10 Personal Contribution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 32

2 Sharing Experience and Knowledge with Wearable Computers 35

2.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 37

2.1.1 Environment for Testing . . . . . . . . . . . . . . . . . . . . . . . . 38

2.2 Related Work . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 38

2.3 The Mobile User . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 38

2.3.1 Hardware Equipment . . . . . . . . . . . . . . . . . . . . . . . . . . 39

2.3.2 Software Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . 40

2.4 Beyond Communication . . . . . . . . . . . . . . . . . . . . . . . . . . . . 41

2.4.1 Becoming a Knowledgeable User . . . . . . . . . . . . . . . . . . . 41

2.4.2 Involving External People in Meetings . . . . . . . . . . . . . . . . . 42

2.4.3 When Wearable Computer Users Meet . . . . . . . . . . . . . . . . . 43

2.5 Evaluation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 44

2.5.1 The Importance of Text . . . . . . . . . . . . . . . . . . . . . . . . . 44

2.5.2 Camera and Video . . . . . . . . . . . . . . . . . . . . . . . . . . . 46

2.5.3 Microphone and Audio . . . . . . . . . . . . . . . . . . . . . . . . . 46

2.5.4 Transmission of Knowledge . . . . . . . . . . . . . . . . . . . . . . 46

2.6 Conclusions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 47

2.6.1 Future Work . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 47

2.7 Acknowledgements . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 47

3 Experiences of Using Wearable Computers for Ambient Telepresence and Remote Interaction 49

3.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 51

3.1.1 Related Work . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 52

3.2 Everyday Telepresence . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 54

3.3 Wearable Computers . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 56

3.4 Experiences of Telepresence . . . . . . . . . . . . . . . . . . . . . . . . . . 58

3.4.1 User Interface Problems . . . . . . . . . . . . . . . . . . . . . . . . 59

3.4.2 Choice of Media for Communicating . . . . . . . . . . . . . . . . . 61

3.5 Evaluation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 62

3.5.1 Time for Setup and Use . . . . . . . . . . . . . . . . . . . . . . . . 62

3.5.2 Different Levels of Immersion . . . . . . . . . . . . . . . . . . . . . 63


3.5.3 Appearance and Aesthetics . . . . . . . . . . . . . . . . . . . . . . . 66

3.5.4 Remote Interactions made Possible . . . . . . . . . . . . . . . . . . 68

3.5.5 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 68

3.6 Conclusions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 69

3.6.1 Future Work . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 69

3.7 Acknowledgments . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 70

4 Methods for Interrupting a Wearable Computer User 71

4.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 73

4.1.1 Related Work . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 74

4.2 Experiment . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 75

4.2.1 Real World Task . . . . . . . . . . . . . . . . . . . . . . . . . . . . 75

4.2.2 Interruption Task . . . . . . . . . . . . . . . . . . . . . . . . . . . . 76

4.2.3 Combining the Tasks . . . . . . . . . . . . . . . . . . . . . . . . . . 76

4.2.4 Treatments . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 77

4.3 User Study . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 79

4.3.1 Test Session . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 79

4.3.2 Apparatus . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 80

4.4 Results . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 82

4.4.1 Comparison with Base Cases . . . . . . . . . . . . . . . . . . . . . . 83

4.4.2 Pairwise Comparison of Treatments . . . . . . . . . . . . . . . . . . 84

4.4.3 Comparison with Original Study . . . . . . . . . . . . . . . . . . . . 85

4.4.4 Subjective Comments . . . . . . . . . . . . . . . . . . . . . . . . . 85

4.5 Conclusions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 86

4.5.1 Future Work . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 86

4.6 Acknowledgments . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 86

5 Using the "HotWire" to Study Interruptions in Wearable Computing Primary Tasks 87

5.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 89

5.1.1 Motivation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 89

5.1.2 Outline . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 90

5.2 Related Work . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 90

5.3 Experiment . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 91

5.3.1 Primary Task . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 91


5.3.2 Interruption Task . . . . . . . . . . . . . . . . . . . . . . . . . . . . 92

5.3.3 Methods for Handling Interruptions . . . . . . . . . . . . . . . . . . 92

5.4 User Study . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 93

5.4.1 Apparatus . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 94

5.5 Results . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 96

5.5.1 Time . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 98

5.5.2 Contacts . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 99

5.5.3 Error rate . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 101

5.5.4 Average age . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 101

5.6 Evaluating the apparatus . . . . . . . . . . . . . . . . . . . . . . . . . . . . 101

5.7 Conclusions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 102

5.7.1 Future Work . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 103

5.8 Acknowledgments . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 103

6 Wearable Systems in Nursing Home Care: Prototyping Experience 105

6.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 107

6.2 Scoping the Project . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 108

6.3 Paper Prototyping . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 109

6.3.1 Paper, Pen, and Plastic . . . . . . . . . . . . . . . . . . . . . . . . . 109

6.3.2 Paper Prototyping Benefits . . . . . . . . . . . . . . . . . . . . . . . 110

6.4 Moving to Multimodal Devices . . . . . . . . . . . . . . . . . . . . . . . . . 111

6.4.1 Wearable Prototype . . . . . . . . . . . . . . . . . . . . . . . . . . . 111

6.4.2 Communication Application . . . . . . . . . . . . . . . . . . . . . . 111

6.4.3 Wizard of Oz Testing . . . . . . . . . . . . . . . . . . . . . . . . . . 112

6.4.4 Feedback From the Nurses . . . . . . . . . . . . . . . . . . . . . . . 113

6.5 Final Remarks . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 114

6.6 Acknowledgments . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 115

7 Enabling Multimedia Communication using a Dynamic Wearable Computer in Ubiquitous Environments 117

7.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 120

7.2 Background and Related Work . . . . . . . . . . . . . . . . . . . . . . . . . 121

7.3 The Ubiquitous Communication Management Framework . . . . . . . . . . 122

7.3.1 Information Repositories . . . . . . . . . . . . . . . . . . . . . . . . 124

7.3.2 Personal Communication Management Agent . . . . . . . . . . . . . 127


7.3.3 Remote Control User Interface . . . . . . . . . . . . . . . . . . . . . 128

7.3.4 Mobility Manager . . . . . . . . . . . . . . . . . . . . . . . . . . . 129

7.4 Evaluation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 131

7.4.1 Framework Implementation . . . . . . . . . . . . . . . . . . . . . . 132

7.4.2 Message Complexity . . . . . . . . . . . . . . . . . . . . . . . . . . 135

7.4.3 Bandwidth Overhead . . . . . . . . . . . . . . . . . . . . . . . . . . 136

7.4.4 Time Complexity . . . . . . . . . . . . . . . . . . . . . . . . . . . . 137

7.4.5 Proof of Concept . . . . . . . . . . . . . . . . . . . . . . . . . . . . 138

7.4.6 Scenario . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 138

7.4.7 Prototype Implementation . . . . . . . . . . . . . . . . . . . . . . . 139

7.4.8 Hardware used in the Scenario . . . . . . . . . . . . . . . . . . . . . 140

7.4.9 Evaluation by End Users . . . . . . . . . . . . . . . . . . . . . . . . 141

7.5 Discussion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 143

7.6 Acknowledgements . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 145

Bibliography 147


Preface

The work presented in this thesis has been conducted at Luleå University of Technology (LTU) between the years 2002 and 2006. I started in a project called RadioSphere with the Centre for Distance-spanning Technology (CDT), where the ultimate goal was to proliferate the mobile Internet by providing ubiquitous network access to mobile computers. Among the work needed to help this vision come true was research in human-computer interaction for highly mobile and portable devices. This brought me in contact with the field of wearable computing, where I, together with my colleague Marcus Nilsson, became the local pioneers in exploring this research topic at our university.

Much of my early work was to build a foundation of knowledge on how wearable computers could be used, creating prototypes which would provide first-hand experience and the essential know-how about wearable computing. As my research group had a long history of research in multimedia communication and online e-meetings between people, my research soon followed along with the goal of enabling and facilitating such e-meetings through wearable computing. This resulted in a number of publications where the concept of using wearable computers for mobile e-meetings was explored.

Realizing that wearable computing was in fact a very broad and highly interdisciplinary field of research, containing topics ranging from software to hardware and human-computer interaction, and crossing over into fields such as psychology, ergonomics, and even fashion design, I tried to focus my work more on the human-computer interaction aspect. The reason for this choice was that one of the major problems I found when using our wearable computers in real-life settings was that the user interface was very difficult to get right for a computer meant to be used in mobile and physically challenging environments, and this was detrimental to the entire concept of mobile e-meetings. One of the inherent properties of a meeting in the real world is that the persons involved interact with and interrupt each other, and when meetings are mediated through a computer this happens even more frequently as social cues are lost in the process. Therefore, I did an experiment aimed at finding out how to manage interruptions properly. Because of the depth of this research question, this would turn out to lead to a series of experiments and publications that would continue throughout the years.

After my licentiate thesis in late 2004, I got involved in projects run by the Centre for Distance-spanning Healthcare (CDH). Noticing the need for better communication and the ability to bridge distances between medical workers in the rural parts of northern Sweden, such as enabling a nurse to remotely communicate with a doctor when examining a patient, my research became focused on providing mobile e-meetings for such purposes. Because of the precarious situation of deploying novel computing solutions for people who normally deal with humans rather than computers, my research still maintained the ever important goal of making interaction easy and disruption free. With access to a nursing home in which prototypes could be deployed and experiments conducted, this work led to a number of field tests with proof-of-concept solutions.

In the following autumn and winter of 2005, I was given the opportunity to stay as an exchange student at the Technologie-Zentrum Informatik (TZI) at the University of Bremen in Germany. As I was the only researcher in wearable computing back home at LTU, working together with the many members of the TZI wearable computing research group proved to be a highly educational and valuable time. Besides gaining new insights into research and research methodologies related to wearable computing, we also initiated a collaboration around my interruption studies, as interaction was a common research question of ours.


Publications

This doctoral thesis consists of an introduction and six papers. The introductory chapter provides a discussion of all papers and their relationship with each other, together with ideas for future work in the area of research. All papers except one have been published at international peer-reviewed conferences, journals, and workshops. I am the main author of four papers and co-author of two papers.

Paper 1 Marcus Nilsson, Mikael Drugge, and Peter Parnes, "Sharing Experience and Knowledge with Wearable Computers", In Proceedings of Pervasive 2004 Workshop on Memory and Sharing of Experiences, Vienna, Austria, April 2004.

Paper 2 Mikael Drugge, Marcus Nilsson, Roland Parviainen, and Peter Parnes, "Experiences of Using Wearable Computers for Ambient Telepresence and Remote Interaction", In Proceedings of the 2004 ACM SIGMM Workshop on Effective Telepresence, New York, USA, October 2004.

Paper 3 Mikael Drugge, Marcus Nilsson, Urban Liljedahl, Kåre Synnes, and Peter Parnes, "Methods for Interrupting a Wearable Computer User", In Proceedings of the 8th IEEE International Symposium on Wearable Computers (ISWC'04), Washington DC, USA, November 2004.

Paper 4 Mikael Drugge, Hendrik Witt, Peter Parnes, and Kåre Synnes, "Using the "HotWire" to Study Interruptions in Wearable Computing Primary Tasks", In Proceedings of the 10th IEEE International Symposium on Wearable Computers (ISWC'06), Montreux, Switzerland, October 2006.

Paper 5 Mikael Drugge, Josef Hallberg, Peter Parnes, and Kåre Synnes, "Wearable Systems in Nursing Home Care: Prototyping Experience", In IEEE Pervasive Computing, vol. 5, no. 1, pages 86–91, January–March 2006.

Paper 6 Johan Kristiansson, Mikael Drugge, Josef Hallberg, Peter Parnes, and Kåre Synnes, "Enabling Multimedia Communication using a Dynamic Wearable Computer in Ubiquitous Environments", Under review.


The following publications were intentionally left out from the thesis, either because results have been superseded or made redundant by more recent findings included herein, or because their focus does not lie entirely within the scope of the thesis.

• Hendrik Witt and Mikael Drugge, "HotWire: An Apparatus for Simulating Primary Tasks in Wearable Computing", In ACM International Conference on Human Factors in Computing Systems (CHI'06), extended abstracts, Montréal, Canada, April 2006.

• Mikael Drugge, Josef Hallberg, Kåre Synnes, and Peter Parnes, "Relieving the Medical Workers' Daily Work Through Wearable and Pervasive Computing", In 11th International Conference on Concurrent Enterprising (ICE 2005), Munich, Germany, June 2005.

• Marcus Nilsson, Mikael Drugge, Urban Liljedahl, Kåre Synnes, and Peter Parnes, "A Study on Users' Preference on Interruption When Using Wearable Computers and Head Mounted Displays", In Proceedings of the 3rd IEEE International Conference on Pervasive Computing and Communications (PerCom'05), Kauai, USA, March 2005.

• Mikael Drugge, Marcus Nilsson, Kåre Synnes, and Peter Parnes, "Eventcasting with a Wearable Computer", In Proceedings of the 4th International Workshop on Smart Appliances and Wearable Computing (IWSAWC'04), Tokyo, Japan, March 2004.


Acknowledgments

First, I would like to thank my supervisor Dr. Peter Parnes for all your guidance, support and encouragement to always strive for excellence. I would also like to thank my secondary advisor Dr. Kåre Synnes for your valuable comments, discussions and advice. A posthumous thanks goes to the late Dr. Dick Schefström for his grand visions that served as inspiration when I first started working here.

Most of my research has been funded by projects run by CDH and CDT, by the VINNOVA RadioSphere project, and by the VITAL project supported by the Objective 1 Norra Norrland EU structural fund programme. Further funding has been received from the European Commission through the IST Project wearIT@work (No. IP 004216-2004).

A big thanks goes to all my colleagues in the Media Technology research group and at LTU and CDH/CDT. In particular, I would like to express my gratitude to my fellow graduate students Josef Hallberg, Johan Kristiansson, Marcus Nilsson, Roland Parviainen, Jeremiah Scholl, and Sara Svensson, with whom I have spent the most time over the years. Thank you all for making this a great place to work and conduct research, and for countless discussions concerning all aspects of life inside and outside the world of research. Without your wits, wisdom and friendship, it would never have been as rewarding to work here.

I would also like to thank the people at TZI at the University of Bremen for welcoming me as a guest researcher. Being involved in your wearable computing research group provided me with valuable insights into the field and research in general. My stay at TZI also led to subsequent collaboration with Hendrik Witt, who shared similar research interests and with whom I had several interesting discussions and conducted experiments.

Furthermore, some people have always helped remind me that there is a life outside of research. This includes the fellow buyû in my training group; there can be few better companions than you when venturing the way of the warrior.

A very special thanks goes to the precious persons who are known as friends; I won't mention any names, but I am quite certain you know who you are.

Finally, I would like to thank my parents and sister for always supporting me in whatever endeavour I have undertaken.

Luleå, November 2006

Mikael Drugge


Part 1

Thesis Introduction


1.1 Introduction

Throughout history, communication has constituted a major part of the evolution of mankind. Advances in technology have eased how communication can be conveyed, ranging from the use of primitive writing tools for clay and stone, to pens and pencils for writing on paper. The invention of the printing press and photography enabled an easier way to disseminate information, while telegraphs and telephones, and, in recent decades, computer networks, made it easier to communicate over a distance. The Internet of today allows audio, video, commentary and illustrations to be shared in real time, with little or no regard to the physical distance between people. At the same time, the emergence of wireless networks has enabled communication regardless of the physical location of people, allowing communication through mobile phones, laptops, and handheld computers. The next step in making people more mobile and free from constraints is the concept of wearable computing — providing unobtrusive assistance and service by bringing the computer so close to the user that it is no longer noticeable. How the user interacts with the wearable computer, or any technology, is essential for how well it can be used for communicating with other people. The less focus that needs to be given to the underlying technology the better, as it allows a person to pay more attention to the contents of the communication. That is, after all, what remains important regardless of any changes in technology.

This doctoral thesis presents research on how to enable mobile e-meetings through wearable computing, with focus on making the user's interaction streamlined and unobtrusive. The overall vision is to have a wearable computing platform that enables its user to communicate with remote people on demand, while at the same time not being in the way nor impeding the user. As wearable computing is a highly multidisciplinary research topic, the goal of the thesis is not to provide a complete system in terms of software and hardware as a functional product, but rather to point out and provide solutions to the design issues related to human-computer interaction. These issues include the use of computer supported collaborative work applications in mobile settings, the importance of designing interaction properly so as not to distract or interrupt users, and the question of how to prototype user interfaces and make them easy to deploy.

1.2 Thesis Organization

The thesis consists of seven parts. This introduction belongs to the first part, while the remaining six parts each contain a paper that has either been previously published or is currently submitted for review at the time of writing. The published papers are reproduced in original form and have not been modified since the time of publication, with the following exceptions.

• The formatting of the papers has been unified so that they all share a common style and appearance.

• Figures have been resized and repositioned so as to fit aesthetically in the common layout used.


• Figures, tables and sections have been renumbered to fit into the numbering scheme used throughout the thesis.

• Bibliographical entries and citations have been renumbered, and all references have been moved into a common bibliography at the end of the thesis.

• Editorial changes of grammar and spelling have been made to correct a few minor and obvious errors.

The remainder of this chapter contains background information about wearable computing and e-meetings, as well as a discussion on how these two areas can be combined. Here, the motivation for the research presented in this thesis is also explained. After that, a number of relevant research questions are presented, followed by a discussion of the research methodology used to address them. Then follows a brief introduction to the papers included in this thesis, and a discussion on how the research questions have been addressed and answered. Finally, this chapter is concluded by pointing out potential future research directions in this field.

1.3 Background and Motivation

In this section, background information regarding the concepts of wearable computing and mobile e-meetings will be presented. The concepts will be explained separately and put in relation to other areas of research, such as ubiquitous and pervasive computing, as well as traditional video conferencing and online e-meetings. This is followed by a discussion on how the concepts are combined in this thesis, and the motivation for the work and research contained herein.

1.3.1 Wearable Computing

Wearable computing is a paradigm which has evolved in line with three different factors: reduced size of computers, increased mobility of people, and additional personalization of devices. Ever since the advent of computers, the trend has been to fit more computing power into less space. The size of computers has gone from occupying entire rooms, to slightly smaller mainframe computers, and further on to personal computers stationed at the user's desktop. As people are mobile and need access to their computers from locations other than their desktop, this has proliferated the idea of mobile computing through laptops and handheld computers. Designed to be lightweight and small in size, they are easy for the user to bring along to other places, while still providing the user with a personal and consistent working environment and user interface regardless of where the user is located. This leads to the idea of personalization of devices. The desire for personalization has become very apparent e.g. in mobile phones, which today are highly customizable and can be tailored to the user's desires. Although this customization still mainly applies at a superficial level, e.g. changing ring tones and background images, it points out the desire for people to have their own device adapted to suit themselves. A related example of this is the Personal Digital Assistant (PDA), which in addition to providing basic computing tools also serves as a general calendar and organizer for its user over the entire day. In a sense, a PDA becomes more involved in the user's personal life, serving as an assistant for the user's everyday tasks in the real world.

All of these factors combined lead naturally to the paradigm of wearable computing. A wearable computer is a lightweight computer meant to be worn by the user, providing access to computational power from any place and at any time. With more and more functionality being added to mobile phones and handheld computers, it can be difficult to discern what separates a wearable computer from a non-wearable computer, and depending on the definition used, the line that separates the two fields is not always very clear. In this thesis, the definition will therefore be that the key element that makes a computer wearable is how the user's interaction with it is managed.

In terms of interaction, there are several differences between wearable and non-wearable computers. The list below summarizes the most important ones to give the reader an idea of what kind of interaction is required in wearable computing.

• Mobility: The first difference is that the user uses a wearable computer in a highly mobile setting, e.g. while standing up or walking around, as opposed to sitting down in front of a desktop computer. This alone calls for new kinds of interaction devices, as neither the traditional mouse nor the keyboard used with a desktop computer is suitable in a more mobile setting.

• Assistance: The second difference is that a wearable computer is aimed more at assisting the user with a real world task, rather than the user using it to perform some dedicated task in the virtual world inside the computer. PDAs and mobile phones come closer to a wearable computer in this sense, although they mostly require the user's constant attention when being used. Whether controlled via a stylus pen, touch-sensitive display, or miniature keyboard, all of these interaction methods require the user to focus on the computer rather than the real world while performing the task. Also, the task itself is often related to the device, rather than to the task currently performed in the real world.

• Unobtrusiveness: The third and final difference is that a wearable computer should be unobtrusive to use. The user's physical interaction with it should not impose excessive attention demands, nor should it restrict or encumber the user's interaction capabilities with the real world. Furthermore, wearable computers are typically dedicated to a single task, to avoid overwhelming the user with distracting information and nuisances. In comparison, ordinary desktop computers typically present the user with numerous hassles that impede the user's performance: ranging from mundane dialog boxes blocking all interaction until they are responded to, to incoming mails or chat messages that interrupt and cause the user to perform numerous mental context switches. The severity of all these problems becomes magnified when a wearable computer is used to assist its user in a real world task, and thus calls for more suitable user interfaces being employed.
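To illustrate the kind of interface behaviour this last difference calls for, the Python sketch below shows one possible deferral policy that holds back non-urgent notifications while the wearer's primary task demands attention, instead of interrupting immediately as a desktop dialog box would. It is only a conceptual illustration; the class name, priority levels and threshold are hypothetical and are not the notification methods or guidelines studied later in the thesis.

    from dataclasses import dataclass, field
    from typing import List, Tuple

    @dataclass
    class NotificationPolicy:
        """Hypothetical policy: hold back low-priority notifications while the wearer is busy."""
        busy_threshold: float = 0.7                      # assumed load level above which we defer
        pending: List[Tuple[int, str]] = field(default_factory=list)

        def notify(self, message: str, priority: int, task_load: float) -> None:
            # Deliver at once only if the message is urgent or the user is not heavily engaged;
            # otherwise queue it instead of interrupting the real world task.
            if priority >= 2 or task_load < self.busy_threshold:
                self.deliver(message)
            else:
                self.pending.append((priority, message))

        def on_task_idle(self) -> None:
            # When the primary task eases off, release queued notifications, highest priority first.
            for _, message in sorted(self.pending, reverse=True):
                self.deliver(message)
            self.pending.clear()

        @staticmethod
        def deliver(message: str) -> None:
            print("Notification:", message)

    # Usage: two messages arrive while the wearer is fully occupied with a manual task.
    policy = NotificationPolicy()
    policy.notify("New chat message from a colleague", priority=1, task_load=0.9)  # deferred
    policy.notify("Expert: stop, wrong cable!", priority=2, task_load=0.9)         # shown at once
    policy.on_task_idle()                                                          # deferred message shown now

Such a policy only postpones the question of when and how to interrupt; the user studies in Chapters 4 and 5 examine how different interruption methods actually affect the user.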

Wearable computers thus differentiate themselves on many aspects from traditional desktop computers, and define themselves primarily based on how the user uses and interacts with the computer. This paradigm shift of what a computer is and what it can be used for can also be seen in two neighbouring areas of research, both of which will be briefly introduced in the next section.

1.3.2 Ubiquitous and Pervasive Computing

The terms ubiquitous computing and pervasive computing denote areas of research which are closely related, both to each other and to the field of wearable computing. The idea of bringing computational power away from the desktop and out into the real world is paramount in all three research areas, and the difference lies mainly in the goals and means by which this can be achieved. The two terms ubiquitous and pervasive are sometimes used interchangeably, but there are some inherent differences in their meaning which should be clarified.

Ubiquitous computing refers to the vision introduced by Mark Weiser in his seminal article [82], where he states that "The most profound technologies are those that disappear. They weave themselves into the fabric of everyday life until they are indistinguishable from it." Thus, ubiquitous computing is the idea of having access to computers everywhere — not necessarily as distinct or dedicated machines per se, but rather embedded in everyday objects and accessible throughout a person's physical environment. Examples of this include the ParcTab [81] ubiquitously available computers for the office, and the MediaCup [23] that augments an ordinary coffee mug with sensors providing context awareness.

Pervasive computing is similar to ubiquitous computing, but refers to the vision of making computers integrated into the environment and their usage completely transparent to the user. Whereas in ubiquitous computing a user would still interact directly with certain everyday objects containing embedded computers, pervasive computing would have those objects disappear and become invisible, so that the user does not even know they are there. Examples of this include radio frequency identification (RFID) technology and applications [54], as well as more concrete applications like the ActiveBadge [80] location system.

Wearable computing thus falls within the realm of pervasive computing, as the idea is to have the computer disappear and assist the user while not being noticed. In practice, at the time of writing, most wearable computers only partially belong to this realm, as further research is still needed to make them less obtrusive and the interaction more streamlined.

In certain application domains, the use of a wearable computer requires infrastructure or services provided by ubiquitous or pervasive computing. Indoor positioning systems, for instance, are a prime example of a pervasive computing service that a wearable computing application may utilize. Another example is the use of ubiquitous computing to extend the capabilities of a wearable computer, e.g. to be able to delegate computations to more powerful devices, or to utilize terminals and input/output devices in the surrounding environment for easier interaction. In other scenarios, a pervasive computing system relies on each user having a wearable computer, e.g. for the purpose of storing private data and avoiding security concerns on the user's part. All in all, these three areas of research co-exist, each with certain benefits and drawbacks compared to the others, and as discussed in [68] the best choice is sometimes to combine them.


1.3.3 Video Conferencing and E-meetings

Video conferencing is the idea of enabling people to meet over a distance. This can be achieved by conveying media such as audio and video from one place to another, so that the people involved get an experience of being together even though physically separated. Early video conferencing facilities were taken into use by certain companies and institutions, where dedicated hardware and communication channels were used to connect meeting rooms at different locations with each other. This enabled group meetings of a larger scale, but still required large investments in expensive infrastructure. These problems became easier to overcome as the Internet started to permeate society, and Internet Protocol (IP) based communication channels could be used more easily to convey the video and audio data. With increased capabilities of personal computers in terms of graphics and sound, and enough computational power to process multimedia content in real time, it finally became possible to achieve video conferencing through ordinary computers. This helped broaden the concept of video conferencing to include other purposes than formal meetings, enabling people to meet informally in groups and communicate in shorter or longer sessions from the comfort of their own desktop.

The term e-meeting denotes such an online group conferencing session, which can include video, audio and chat among other media. Rather than requiring a dedicated meeting room equipped with expensive video conferencing hardware, e-meetings can take place from the user's desktop computer and be used for either formal or informal communication. In recent years e-meetings have become more commonplace and available to the populace, with programs such as Skype (http://www.skype.com/), ICQ (http://www.icq.com/), and MSN Messenger (http://messenger.msn.com/) being widely used for both leisure and work related communication.

Within the Media Technology research group at Luleå University of Technology, there is a long history of conducting research in collaborative work applications for enabling e-meetings. Early research concerned the development of the mStar [58] architecture, which was used to explore real-time communication between distributed clients and participants. The research in mStar later resulted in a spin-off company, Marratech AB, which sells, develops, and distributes the commercial Marratech e-meeting software derived from this research.

Marratech can be used in several application domains, ranging from general computer supported collaborative work to distance learning. Within our research group, Marratech is used for holding formal e-meetings as an alternative and complement to physical meetings, but also as a way of providing all members with a continuous sense of presence of each other throughout the day. This latter case is known as the e-corridor — a virtual office landscape in which the group members can interact, communicate, and keep in touch with each other. The e-corridor offers the social benefits of an office landscape, while still allowing each person to decide for themselves to what degree they wish to partake.

Figure 1.1 shows an illustration of what a user's desktop can look like when using Marratech. To the right, the top window shows continuously updated video thumbnails of all participants, while the bottom window shows the person currently in focus, e.g. a person who is currently speaking or with whom the user is otherwise communicating. The large window to the left shows the shared whiteboard, which can be edited by any participant and which is commonly used to manage planning or illustrate various ideas or concepts during discussions and meetings.

Figure 1.1: A snapshot of the e-corridor as it looks on a user's desktop.

1.3.4 Mobile E-meetings

A mobile e-meeting is an extension of the concept of an ordinary e-meeting, in which one or more users are mobile while participating in the meeting. In this thesis, the structure of mobile e-meetings follows the idea of having a single mobile user of a wearable computer performing certain tasks "out in the field", while having one or more users or experts seated at their desktops participating in the same e-meeting session as the mobile user. Figure 1.2 provides an illustration of this. The idea is that the mobile user will be able to receive advice and guidance originating from the experts through the wearable computer, while simultaneously conveying video, audio, and possibly other media back to the experts so they can follow the progress of the task being performed. In this thesis, this concept has been dubbed the knowledgeable user, denoting that the mobile user can perform the tasks with the combined knowledge of the experts at hand.


Figure 1.2: The structure of the mobile e-meetings addressed in this thesis (remote experts connected over a network to the wearable computer user).
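To make this structure concrete, the sketch below models the data flow of Figure 1.2 as a single in-process Python session: the wearable computer user publishes media frames while remote experts send guidance back. All names (MeetingSession, publish, advise) are hypothetical stand-ins for illustration only; they are a toy model, not the e-meeting software actually used in the thesis.

    from collections import deque
    from dataclasses import dataclass, field
    from typing import Deque, List, Optional, Tuple

    @dataclass
    class MeetingSession:
        """Hypothetical in-process model of one mobile e-meeting session."""
        media_log: List[Tuple[str, str]] = field(default_factory=list)
        advice: Deque[str] = field(default_factory=deque)

        def publish(self, medium: str, payload: str) -> None:
            # The wearable computer user conveys video, audio, or other media to the experts.
            self.media_log.append((medium, payload))

        def advise(self, expert: str, message: str) -> None:
            # A remote expert sends guidance back towards the wearable computer user.
            self.advice.append(f"{expert}: {message}")

        def next_advice(self) -> Optional[str]:
            # The wearable computer user picks up pending guidance when attention allows.
            return self.advice.popleft() if self.advice else None

    # Usage: the "knowledgeable user" loop in miniature.
    session = MeetingSession()
    for frame_no in range(1, 4):
        session.publish("video", f"head-mounted camera frame {frame_no}")
        if frame_no == 2:
            session.advise("remote expert", "zoom in on the left connector")
        tip = session.next_advice()
        if tip:
            print("Guidance shown on head-mounted display:", tip)

The point of the model is the asymmetry it captures: media flows continuously from the mobile user to the experts, while guidance flows back only occasionally and is consumed when the mobile user can spare the attention.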

To exemplify some typical situations in which the knowledgeable user concept is applicable, three scenarios are given below. All of these are based on discussions with industrial project partners and people working in the respective professions, and are as such based on real needs identified in the real world. The first example represents a typical "field worker and remote expert" scenario in general. The second example represents a more specialized scenario of a more critical nature, both in terms of time and in terms of the safety and security of users. The third and final example represents a less critical but more social scenario, where an e-meeting is used to give workers more time and reduce their workload.

Scenario 1: Electricians working at remote installations. An electrician repairing remote installations in rural areas may sometimes face problems if the installation site is of a highly specialized or unknown nature that the person has not encountered before. As such, it may be the case that only certain expert electricians with more experience know how to perform the necessary repairs. Because of the rural areas and long distances involved in commuting back and forth, it would be beneficial if the electrician who is already at the site could still perform the repair, instead of spending costly time on transporting an expert to and from the site to aid or replace the electrician. In this situation, a mobile e-meeting becomes a useful way for the expert to convey his knowledge to the electrician, guiding him or her on how to perform the repair properly. Thus, the electrician contacts the main office to get in touch with an expert, and starts an e-meeting with that person. As the electrician will likely need to use both hands and work with small components, it is vital that the e-meeting is unobtrusive for that person, and that video can be continuously conveyed to the expert when guiding the person. The use of wearable computing in this situation helps make the process of conveying information back and forth less obtrusive and more natural for the electrician. Through the use of a head-mounted camera, the expert can follow the work through the electrician's eyes, so to speak, while a head-mounted display can provide the electrician with illustrations and annotated blueprints. Thereby, the electrician can now perform the repair properly thanks to the guidance provided.

Scenario 2: Firefighters in need of remote guidance. Firefighters working with fire extinguishing at an emergency scene often face highly critical situations in terms of time and the security and safety of the people involved. When working at extinguishing a fire inside a building, its structure and architectural layout is often unknown at first, forcing the firefighters to build a mental model of it while performing the operation. With heavy smoke and prevailing darkness, physical maps and similar material for assistance often become impossible for the firefighters to use. This can however be accomplished through the use of wearable computing, and in particular through head-mounted displays mounted inside the firefighters' protective helmets and face masks [5]. With such equipment available, the firefighters are able to look at maps of the building presented before their eyes, as well as be notified about important status information regarding their self-contained breathing apparatus. In addition, guidance can then also be provided by fire engineers and experts outside the building, who may have a better overview of the scene and can annotate maps and help the firefighters navigate inside the building.

Scenario 3: Medical workers performing routine examinations. Medical workers and nurses often perform a multitude of routine examinations of patients in their daily work. Some of these examinations are trivial, while others call for the specific competence of a certain profession or individual, such as a physiotherapist, a chiropractor, a medical doctor, or a fellow medical worker with previous experience of that particular patient. In an ideal world, there would be enough resources available in terms of time and money to allow these experts to visit the patient in person, but in practice that is not always the case — causing problems such as stress and discomfort for medical workers and patients alike. Rather than not having access to the expert at all, a compromise would be to make use of that person's expertise and knowledge even without him or her physically being there in person. With medical workers equipped with unobtrusive wearable computers, they would be able to contact an available expert on demand through an e-meeting session, in order to let him or her guide the examination remotely over a distance. A specific example of such a situation is a patient with a sore arm that is in the process of healing. Here, a remote physiotherapist can guide a nurse in instructing the patient on how to move the arm, while watching how the patient manages it and thereby making a remote diagnosis of the healing process. Employing wearable computing in situations like these [14] is motivated by the need for medical workers to interact naturally with the patient, rather than focusing on a separate stand-alone computer to facilitate the e-meeting.

Conclusions from the scenarios. What the three scenarios above point out is the importance of the role that interaction plays between a user and a wearable computer. In the electrician's scenario, the field worker should be allowed to interact with the technology in a natural and intuitive fashion, so that he or she can concentrate fully on performing the necessary repair. In the firefighting scenario, the wearable computer should not cause additional stress when it notifies the user, e.g. when remote personnel provide advice or guidance in time critical situations. In the health-care scenario, the wearable computer must be unobtrusive enough so as not to disrupt the meeting with a patient, yet still allow for communication and advice regarding medical information being passed back and forth.


1.3.5 Motivation of Thesis

The main motivation for this thesis is to enable mobile e-meetings through wearable computing, with focus on making the user's interaction as streamlined and unobtrusive as possible. This concerns both the interaction between regular e-meeting participants and the user of the wearable computer, as well as the user's interaction with the wearable computer itself. Both of these concerns affect how the user is able to interact with the world surrounding her. As a mobile e-meeting is intended to help the user in performing tasks in the real world, these concerns therefore need to be addressed.

One of the main problems we have experienced in mobile e-meetings is that they can become too immersive for the user of a wearable computer, thereby distancing the user from the interactions in the real world. At the same time, this immersion serves to offer a rich experience of being in contact with remote participants; the user can sense them as being there, assisting and communicating with them in the virtual world. The key to efficient communication, in both the real and virtual world, is to find a proper balance between these two aspects. To succeed in that, the interaction between the user and the wearable computer itself needs to be highly streamlined, natural, and intuitive. In certain application domains, such as those involving many other people in the real world with whom the user needs to interact, this becomes even more important.

The primary application domain of choice for the latter part of the thesis has been that of institutionalized health-care, particularly that taking place in nursing homes where nurses attend mainly elderly patients. The rationale for this choice is twofold. First of all, with an elderly generation growing in size, health-care is an important area for research in order to deal with a larger number of elderly in the future. Second of all, nursing homes offer a confined and relatively isolated setting in which research can be conducted under more controlled forms. At the same time, they are not as constrained by the very stringent safety and security concerns of e.g. hospitals and emergency clinics, but allow new technologies to be tested and studied in real life scenarios while still maintaining the safety of all people involved. Thus, nursing homes can be highly suitable for conducting applied research, with real end users and patients ensuring that the research is properly directed at real world problems, while at the same time having the potential for deploying prototypes bringing immediate benefits for the people working there. Furthermore, as the end users in a nursing home tend to demand that their computer systems are easy and unobtrusive to operate, any results and solutions deemed suitable here can be expected to be just as applicable in other, more general, application domains.

1.4 Research Questions

The objective of this thesis is to make mobile e-meetings through wearable computing easier for the user. To achieve this, some of the problems that appear in this context need to be investigated further, so that they can be mitigated or solved once a real system for such e-meetings is to be deployed and taken into use. Primarily, these problems relate to human-computer interaction issues that occur in the interaction between the user and the wearable computer. These can be further divided into three specific problem statements. The first deals with how information can be conveyed by the wearable computer user, allowing a user to retrieve advice and guidance from experts, and remote persons to share experiences over a distance. The second concerns how the wearable computer directly interacts with the user, as it is important to determine how different methods of notifying the user affect her attention and performance, in order to design interfaces that are efficient yet pleasant to use. The third considers how the user's physical interaction with the wearable computer can be improved, by prototyping entire communications systems for use and deployment in real world situations. These three general statements can in turn be classified into more specific research questions, which will be described and discussed further in this section.

1. By what means can communication take place in mobile e-meetings through wearable computing, and what media are relevant to focus on for this purpose? Mobile e-meetings in which a wearable computer user participates can differ vastly from traditional non-mobile e-meetings. This affects what media are useful and needed by the mobile user. Video in e-meetings is typically used to provide all participants with a sense of awareness of each other, whereas for the wearable computer user this awareness can be undesirable as the user may need to focus on the real world around her. Audio differs in a similar manner; because the mobile user is in a more dynamic and less controlled environment, the audio channel may not always be the most appropriate means to convey information. For example, chat or whiteboard drawings may be better suited to convey an idea than voice communication, which requires a person's direct and continuous attention. In order to construct mobile e-meeting systems, the use of different media needs to be explored further to determine their relevance and usefulness for wearable computing scenarios.

2. How can mobile e-meetings be seamlessly used and employed in real life scenarios? In the kind of mobile e-meetings focused on in this thesis there are two sides: the mobile user of the wearable computer, and the remaining stationary participants. This research question concerns the stationary participants' experience of the e-meeting. Can and will those participants find the e-meeting useful, and what aspects of the mobile user's interaction must be improved to provide an experience that is good enough? Furthermore, seamlessness from the stationary participants' side is important for providing a good experience for them, so that they can enter the e-meeting session and still grasp the situation and task at hand. Thus, this is one question that needs to be addressed when creating a wearable system for mobile e-meetings.

3. Given a number of methods to interrupt a user, how should these be used so as not to increase the user's cognitive workload more than needed? A driving idea of wearable computing is that the computer should assist its user in performing real world tasks. By definition, the concept of wearable computing therefore expects the user to focus primarily on the real world, rather than on the computer itself, as is often the case in traditional desktop computing. Thus, in mobile e-meetings, the idea is for advice and support to reach the user of the wearable computer in an unobtrusive manner, so that the user is not interrupted more than necessary. This is important as scenarios encountered in the real world can be of a critical nature, where the user may be in a difficult situation while still requiring support through the wearable computer. In such situations, the proper handling of interruption can be vital for the safety and security of the user, and also for providing the user with an efficient and streamlined interaction with the wearable computer in general.

4. How can a typical wearable computing scenario from real life be modeled as an experimental setup, in order to evaluate wearable user interfaces in a reliable and valid manner? With the goal of designing streamlined and unobtrusive user interfaces for wearable computers, it becomes important to have suitable means for assessing the effect that the interface will have on its user. In the real world, there are a number of nuisance factors that cannot be controlled for, leading to the risk of experiments becoming unreliable if they are performed solely in that domain. For this reason, it is important to find an apparatus that allows an arbitrary user interface to be evaluated in a reliable and reproducible manner, while retaining the properties of a typical wearable computing scenario to make the experiment valid.

5. What methodologies are useful when prototyping easy-to-interact-with wearable computing e-meeting systems and engaging end users in the process? In order for the user's interaction with the wearable computer to become unobtrusive and accepted, great care needs to be taken with regard to the end users' work situation and their idea of what constitutes proper design. Involving the end users in the design process is one way to ensure that the resulting wearable computing system will be useful, as they are the ones with the expertise to decide what features are needed and what should and should not be part of the solution. This question involves both the physical appearance of the wearable computer and the functionality and interaction means provided in terms of hardware and software.

6. What functionality is needed to allow users to automatically combine and switch between resources available in the wearable computer and in the surrounding environment? Just as there is no single program that fits all purposes on a desktop computer, there is no single wearable computing design that fits all purposes in real life. This can be the case even for smaller and more constrained application domains, where there is still a need to dynamically tailor the wearable computer for the task at hand. With a wearable computer meant to be deployed and used in a real world scenario, the ability for the end users themselves to perform this tailoring becomes critical for the long term acceptance of such a system. This question concerns how a wearable computer can be dynamically configured by combining and switching between resources useful for an e-meeting. Such resources can include, for example, head-mounted displays, external displays, television sets, and on-body or stationary office cameras and microphones. Allowing the user to combine these resources automatically and on demand would mean that she can decide what resources and means for interaction are needed for the task at hand, and subsequently perform the task more easily without being hampered by needless equipment or missing vital functionality. Naturally, this calls for an easy way for the end users to perform this configuration, without delving into technical details and interfacing problems.


1.5 Scope and Delimitation of the Thesis

It should be acknowledged that creating a mobile e-meeting system that suits all kinds of application domains is not feasible within the scope of this thesis. Each application domain contains unique situations, and the needs in each situation can vary depending on the context of the user. The application domain for this thesis has therefore primarily been constrained to that of institutionalized health-care, where nurses, doctors, and medical workers need to keep in contact with each other. Even with this constraint applied, the situations encountered within health-care can be very heterogeneous. The thesis addresses a subset of the situations which occur commonly, in order to provide a solution that in further research can be adapted to handle other kinds of situations as well.

Human-computer interaction is an interdisciplinary research topic, and even when applied in the narrower field of wearable computing, it still covers a large number of aspects that can and need to be dealt with. In this thesis, emphasis has been given to how the wearable computer can notify and interrupt its user in an unobtrusive manner, as this becomes relevant e.g. when presenting information to the user in an e-meeting. Furthermore, emphasis has also been given to the way the user interacts with the wearable computer, both in terms of hardware and of input/output devices on the computer or in the surrounding environment.

Certain assumptions have also been made regarding the network over which an e-meeting session is conveyed. It is assumed that in the situations where mobile e-meetings will be employed, there is access to a suitable IP based network, typically an IEEE 802.11b wireless network, to which the wearable computer can be connected for receiving and transmitting media streams. This is a realistic assumption, as the health-care facilities and scenarios which this thesis has focused on have all had such wireless networks available for use. In the future, it is also expected that more and more facilities will be equipped with wireless networks, further enabling mobile e-meetings in such locations. It should be pointed out that the thesis does not focus on research issues in computer networking, nor on the issues of encoding multimedia data to be sent over wireless networks.

During the development of the prototypes, ergonomic constraints and physiological considerations of the wearable computers have been addressed to the extent permitted by the available budget and equipment. That is, prototypes have been built so as to be usable for proof of concept tests and shorter user studies, and in certain cases also for longer term studies covering several weeks. However, for wearable computers to be deployed for actual use as functional products, ergonomic constraints must be taken further into account to make them easy to wear, as must long term stability of operation and durability of the hardware used. The publications contained in this thesis discuss a number of these issues encountered in real world tests, to serve as initial guidelines for further development of such products.

1.6 Research Methodology

The research presented in this thesis has been conducted with the ambition to solve actual problems found in the real world. This has been accomplished by working in research projects with representatives from industry as well as academia, where the former often have the goal of seeing a return on investment and creating products based on the research results. In turn, this has called for a research methodology that is both applied and practical in nature, resulting in prototypes and artifacts which can be taken into use in real life and demonstrate the proposed solutions. Though the prototypes by themselves may not always yield scientific results suitable for publication, they can still facilitate research conducted through their use and deployment.

A problem of applied research conducted in the real world, with real users and real problems, is that experiments can become dependent on the specific prototypes used, thereby making the results difficult to generalize and reproduce. On the other hand, experiments conducted in the real world have a benefit that is often missed in more controlled laboratory studies, namely that the experiment becomes exploratory in nature, and while conducting the experiment new factors are often revealed that were not conceived of before. Research based on prototypes can mean that the development will require large amounts of time and effort, with little or no scientific data resulting from such work. However, once a prototype is available, new findings can often be made which would otherwise have been neglected or not uncovered by other means.

Within this thesis, the Marratech e-meeting system has been used extensively as a basis for the prototypes. As Marratech is the result of early research in my research group, access to its source code has been granted so that necessary modifications and further development have been possible. This has had the advantage of prototyping wearable platforms using software that is stable and sold as a commercial product, thereby avoiding many of the bugs and minor nuisances which can otherwise distract a user. In order to keep the research independent of the actual software, care has been taken not to investigate the Marratech application per se, but rather what can be achieved by the use of such software. In all prototypes, it would therefore have been possible to utilize other software, either free and open source programs such as Vic/Vat [49], or another commercial product.

Within the experimental sciences, the terms validity and reliability are often used when discussing the usefulness and trustworthiness of research and its results. Validity means that the experiment measures what is intended to be measured, while reliability means that the results gained from the experiment will remain consistent over repeated measurements. In practice, it can be very difficult to make experiments conducted outside of a laboratory setting valid and reliable. As the real world outside a laboratory is dynamic and changing, and contains a vast number of nuisance factors that cannot be controlled for, validity can be hard to ensure because it is not known what factors contribute to the result. Because of this, the results may also become unreliable, as the uncontrolled factors cause different results for repeated experiments.

As the validity and reliability of experiments conducted in a real world setting can be hard to ensure, simulations are often used to ensure that all factors can be controlled for, while allowing an easy way to reproduce and repeat experiments with consistent results. A simulator facilitates setting up an experiment in a highly controlled environment, with all or most factors accounted for before, during, and after the simulation. The advantage of simulations is clear in some disciplines, such as computer networking and algorithm theory, as the experiments as well as the resulting prototypes and products will mainly run inside a similarly controlled environment where nuisance factors are not really an issue. For the field of wearable computing, the use of simulations other than for very specific aspects is difficult. Wearable computers are by definition meant to be used by people in an immense number of different real world scenarios. The wearable computer hardware used, the means provided for interaction, and the support it provides to the user are all factors which can be varied ad infinitum. Wearable computing being a relatively young research discipline at the time of writing, the characteristics of these factors are not well known, thereby making proper simulators for wearable computers and their use a grand challenge.

The research and results presented in this thesis are based primarily on real world studies and laboratory experiments, and to a lesser degree on simulations of certain aspects of what has been studied. Initially, the research was exploratory in nature, taking prototypes into the real world to see how they could be used and to determine what problems were the most relevant to solve within the area of interaction with wearable computers. Because of the novelty of using wearable computers for human communication, this step was necessary in order to create a foundation of knowledge regarding the topic. Later on, laboratory studies were used to examine how different methods for interrupting the user affected that person's performance. At first, these were highly controlled and simplified to ensure high validity of the experimental results, for example by studying subsets of a wearable computer such as the head-mounted display, the primary visual means for interacting with it. The first experiments also partially used simulations to represent the primary physical task performed by the wearable computer user, in order to ensure that all factors could be controlled for in the given experiment. Later, these experiments resulted in an evaluation apparatus that represents a physical primary task, allowing laboratory experiments to simulate real world use of a wearable computer more realistically, while maintaining a high level of validity and reliability.

1.7 Summary of Included Publications

In this section, each publication included in the thesis is briefly presented. The papers in parts 2 and 3 present early findings related to the field of human communication through wearable computing, giving an overview of what can be done with such technology as well as of problems that need to be addressed, such as creating unobtrusive user interfaces. The papers in parts 4 and 5 present user studies aimed at finding proper ways to interrupt and notify the user of a wearable computer, while also presenting an apparatus derived for evaluating wearable user interfaces. The papers in parts 6 and 7 present prototyping of wearable computing e-meeting systems for use in the real world, and a software framework that allows the input and output capabilities of a wearable computer to be extended into a ubiquitous computing environment. A schematic overview of the topics covered by the papers is shown in figure 1.3. In the following, each paper is described in further detail, with the research problems focused on in this thesis highlighted.

Figure 1.3: Overview of the topics covered by the papers.

Paper 1: Sharing Experience and Knowledge with Wearable Computers. The first paper addresses the use of a wearable computer for sharing experiences and conveying knowledge between people, introducing the concept of the Knowledgeable User. Emphasis is laid on how the user of a wearable computer can represent the combined knowledge of a group by acting as a mediator of the bits of information that each member contributes. Real life studies at different events, fairs, and exhibitions have been performed to explore and evaluate a prototype communication system enabling this.

Paper 2: Experiences of Using Wearable Computers for Ambient Telepresence and Remote Interaction. The second paper continues the exploration of communication based on the telepresence aspect of wearable computing. Focus is laid on how to enable remote participants to virtually accompany a person equipped with a wearable computer, allowing them to experience a remote location and gain knowledge from other people being there. In contrast to the first paper, which deals with information originating from a remote group, this paper examines the issue from the opposite perspective: information being conveyed to the group. The current wearable computer is evaluated in terms of advantages and drawbacks, with the resulting effects brought forward and described together with recommendations for improving the platform's usability.


Paper 3: Methods for Interrupting a Wearable Computer User. The third paper presents a user study of different methods for interrupting the user of a wearable computer, as this was found to be a problematic issue in the studies and field trials discussed in the first two papers. Knowing how users can be notified without increasing their cognitive workload is important, and this becomes especially evident in communication systems such as the aforementioned wearable platform. The results from the study suggest suitable methods by which to interrupt the user, which can thereby help make a wearable computer less obtrusive and more natural to use.

Paper 4: Using the "HotWire" to Study Interruptions in Wearable Computing Primary Tasks. The fourth paper presents a follow-up user study regarding the appropriate management of interruptions. The experiment from the third paper is extended and brought into a typical wearable computing scenario involving a physical primary task. The results both confirm and complement earlier findings, highlighting the relevant issues to consider when designing user interfaces for wearable computers. Furthermore, an evaluation apparatus dubbed the HotWire is introduced, which can simulate typical wearable computing scenarios in laboratory experiments, embodying the characteristics of wearable computers being used in mobile, physical, and practical tasks demanding the user's attention.

Paper 5: Wearable Systems in Nursing Home Care: Prototyping Experience. The fifth paper presents the prototyping of a wearable computing system for providing communication among nurses in a nursing home. The nurses' needs are uncovered through an ethnographical study, revealing that improved communication among personnel and remote medical workers is highly desirable. This is followed by participatory design events in which a wearable system is prototyped in terms of functionality and interaction. Paper prototyping as well as Wizard of Oz prototyping are employed to involve the end users in the design process.

Paper 6: Enabling Multimedia Communication using a Dynamic Wearable Computer in Ubiquitous Environments. The sixth paper presents an underlying framework for enabling ubiquitous multimedia communication. This complements the design and prototyping discussed in the fifth paper, as the framework allows for further customization by the end user, also during run-time. The driving concept of the framework is that of the Dynamic Wearable Computer, where the user only needs to wear the interaction devices needed for the task at hand, and is able to utilize external media resources found in the environment to extend the wearable computer's capabilities.

1.8 Wearable Computing for Human Communication

In this section, the topic of wearable computing for human communication is presented in further detail. Based on the papers summarized in the previous section, three different aspects are discussed: how mobile e-meetings can be conducted through wearable computing, how interruptions and notifications should be managed to make interruption unobtrusive, and how prototyping and deployment of systems which are easy to interact with can be achieved.

1.8.1 Mobile E-Meetings through Wearable Computing

To conduct research on how mobile e-meetings and wearable computing can be used to facilitate human communication, a prototype of a wearable computer was assembled for initial studies in this area. These studies are presented in more detail in parts 2 and 3, while this section mainly discusses the rationale for the prototype used in these studies. The prototype consists entirely of commercially available consumer products off the shelf, without any specialized or custom built components. The reasons for favouring this approach, rather than using commercial wearable computers available on the market, are twofold. Experience has shown that wearable computing products frequently tend to be discontinued, making reliance on a particular product somewhat hazardous. While this is not a major problem for the sole purpose of conducting research, it becomes more significant to take into account when doing more applied research. This is because it is desirable to enable funding partners, companies and customers to reproduce the platform with ease, using the same or similar products as those used in the research prototype. That would not be possible with a custom built prototype, which in turn relies on either a certain vendor or a certain hardware configuration to function as desired.

In general, the continuous advances in technology serve to make wearable computer equipment smaller and more powerful every day. Even though the current hardware used in the prototype may be a bit cumbersome to wear on an everyday basis, these advances will in time allow for truly wearable equipment that is neither obtrusive nor noticeable. For example, progress is being made on miniaturizing HMDs, and there are today certain models that unobtrusively fit on a pair of ordinary glasses.4 As of writing, some HMDs are planned to be sold as accessories for watching videos on the currently popular iPod5 from Apple Computer Inc., and as their popularity increases they will eventually reach the consumer market on a larger scale, with falling prices as a result. It should be stressed that the research issues addressed in this thesis are not focused on the actual hardware itself, but rather on the underlying interaction aspects that remain regardless of the technology used to implement the system.

For these reasons, the wearable computer prototype has at its core had either an ordinary Dell Latitude C400 laptop or, alternatively, a Compaq TabletPC. This part can easily be replaced with other, more powerful, smaller and less power consuming computers as they become available. In recent research, a Sony Vaio U70P has been used, providing a more portable and easier to wear form factor. The same reasoning about ease of replacement and substitution lies behind the use of ordinary audio and camera equipment from the PC consumer market. For the interaction in this first prototype, a Twiddler hand-held chording keyboard was used. The HMD was originally an M2 Personal Viewer and later an SV-6 with a standard VGA connector, powered by an external battery. The displays can function directly as secondary displays under Windows XP, and therefore no specific device drivers or APIs need to be used to make them work. This means that the displays can easily be replaced with similar HMDs which need no software modifications to work.

4 See for example the head-mounted display products and solutions available from MicroOptical (http://www.microoptical.net/), Lumus Vision (http://www.lumusvision.com/), and eMagin (http://www.emagin.com/).

5 http://www.apple.com/ipod/

The choice of Windows XP as the underlying operating system is mainly because it is the de facto standard that new computers ship with, meaning no modifications or extra development are needed before research applications can run. The majority of the software running on the platform is written in Java, thereby allowing easier porting to other operating systems in the future.

To enable the user of the wearable computer to communicate with other people, the Marratech collaborative work application has been used. Specifically, in many of the initial field trials of the wearable computer, this was connected to the aforementioned e-corridor in which all members of our research group participate. Since the members use it at their desktops all the time, using the same software suite in wearable scenarios provides us with a testbed that is accessible at all times. This is another reason for using an existing product rather than developing a wearable e-meeting application from scratch: it allows for easy testing with people who are well experienced in the field, and thereby able to provide valuable comments regarding the field trials.

In figure 1.4, the author is shown using different versions of the wearable computer prototypes developed over the years. In figure 1.4(a), a laptop is placed in a backpack for ease of carrying, together with a USB hub connecting the head-mounted web camera and the Twiddler keyboard and mouse. The user has audio input and output via a wired headset connected to the computer. The HMD (an M2 Personal Viewer) is connected through a standard VGA cable to the computer's monitor port. An external battery placed in the backpack can supply the HMD with enough power for about 6–8 hours of use. The batteries for the computer usually last around 3 hours, meaning one or two extra batteries allow for a full workday's use. A power supply for the computer can also be connected and stored in the backpack, allowing the user to reach the power cable and recharge the batteries with relative ease, e.g. when seated next to a power outlet. When not in use, all equipment can be packed into the backpack, allowing for easy storage and transportation.

In figure 1.4(b), a later version is shown, based on the Sony Vaio U70P and the M2 Personal Viewer. Batteries for the HMD and the necessary cables are placed inside a small bag worn around the user's hip near the lower back, although this part is obscured by the user in the picture. Reducing the size of the equipment made it significantly easier to wear and use compared to the earlier backpack-based prototype.

In figure 1.4(c), one of the newest versions, intended for deployment at a nursing home, is shown. This wearable computer provides a body-stabilized camera for filming patients, with the possibility to detach the camera from its mounting to examine e.g. wounds in detail. On the user's right hip hangs an echo-cancelling microphone and loudspeaker, through which remote experts can communicate directly with a patient. It is also possible to connect an electronic stethoscope to this device, in order to send e.g. the sound of heartbeats and breathing to a remote expert able to make diagnoses based on the sound. The Sony Vaio U70P hangs on the user's left hip so that it can provide the patient with an image of the remote expert speaking, for the purpose of increasing the feeling of the expert "being there" with the patient. In the SV-6 HMD, mounted on a pair of sunglasses for purely aesthetic reasons, the user can see what the camera is currently capturing and align it to suit the remote expert's requirements, as well as receive advice and guidance from other experts in the form of text and video.

Figure 1.4: Different wearable computer prototypes worn by the author: (a) a backpack-based prototype; (b) a smaller and less obtrusive prototype; (c) a prototype intended for deployment at a nursing home.

The prototypes have made it possible to easily perform field trials in various situations, in order to explore the field and build a foundation of "know-how" regarding the topic of mobile e-meetings. The results from the field trials indicate that human communication through wearable computing is a viable concept. However, there are problems with excessive immersion during the user's interaction with the wearable computer, which distances the user from real world tasks and is detrimental to the user's performance. Among the factors causing undesirable immersion is the lack of proper management of interruptions, such as when incoming chat messages or other forms of notifications appear in the wearable computer. For this reason, the proper management of interruptions and notifications is an important research topic.


1.8.2 Managing Interruptions and Notifications

So far, the use of mobile e-meetings through wearable computing has been discussed in general terms, with further details given in the actual publications appearing in parts 2 and 3. In the following, two other aspects needed for efficient mobile e-meetings are presented, namely unobtrusive interaction with the wearable computer in terms of interruptions and notifications.

One way to make interaction with a wearable computer less obtrusive is to make sure that messages and notifications are presented to the user in the most suitable manner possible. What is deemed suitable may depend on the user's current situation. For example, when talking with someone face to face in real life while being involved in a mobile e-meeting, text-based messages that queue up may be preferable to direct voice messages. On the other hand, the opposite may be true if the user is involved in tasks demanding visual attention. Because wearable computers are closely coupled to the user and can exhibit context-aware functionality, they can aid by converting incoming messages from one medium to another, e.g. voice to text or text to voice. A prototype performing this media conversion has been built6 and tested in initial pilot studies, demonstrating how the user of a wearable computer can receive incoming communication through the proper medium in a given situation. This is discussed in more detail in part 3, and represents one contribution towards making communication between humans more streamlined in wearable computing scenarios.
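To make the idea of context-dependent media conversion more concrete, the following is a minimal sketch in Java; it is not the actual prototype code, the class, method, and context names are assumptions made for illustration, and the text-to-speech and speech-to-text components are left as placeholders.

// Minimal sketch (hypothetical names) of context-dependent media conversion
// for incoming messages, as described above.
public class MediaConverter {

    public enum Context { VISUAL_TASK, FACE_TO_FACE_CONVERSATION, IDLE }

    // Deliver an incoming text message through the modality that is assumed
    // to interfere least with the user's current situation.
    public void deliverText(String message, Context context) {
        if (context == Context.VISUAL_TASK) {
            // The user's eyes are occupied; read the message aloud instead.
            speak(message);
        } else {
            // Queue it as text so it can be read when convenient.
            queueAsText(message);
        }
    }

    // Deliver an incoming voice message analogously.
    public void deliverVoice(byte[] audio, Context context) {
        if (context == Context.FACE_TO_FACE_CONVERSATION) {
            // Speech would compete with the ongoing conversation;
            // transcribe it and queue it as text instead.
            queueAsText(transcribe(audio));
        } else {
            play(audio);
        }
    }

    // The methods below stand in for real text-to-speech, speech-to-text
    // and presentation components, which are not shown here.
    private void speak(String text) { /* text-to-speech output */ }
    private void play(byte[] audio) { /* audio playback */ }
    private String transcribe(byte[] audio) { return "(transcribed text)"; }
    private void queueAsText(String text) { /* append to on-screen message queue */ }
}

In a real system, the context value would presumably be derived from the wearable computer's context-aware functionality rather than passed in explicitly as above.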

Another way of improving the communication is to make sure that incoming messages do not interrupt the user, or more specifically that they do not increase her cognitive workload more than absolutely necessary. A user study of different methods to perform this notification is presented in part 4, investigating the effects they have on tasks performed in the real world and in the wearable computing domain. The results from this study expose the advantages and drawbacks of the methods tested, contributing knowledge on when a certain method is preferable for notification and when it is not. This knowledge can in turn be used in conjunction with the aforementioned media conversion facility, to further reduce the intrusiveness of using a wearable computer. This in turn benefits the sharing of knowledge and experiences, which should be natural and not hindered by technology that is in the way.

The initial user study, presented in part 4 with more subjective and qualitative data appearing in [55], utilized a highly controlled experimental apparatus which mainly tested the effect of wearing a head-mounted display. This was done in order to be able to compare the outcome of the studies with previous interruption studies, specifically those conducted by McFarlane in [51, 53] focusing on desktop computing. In wearable computing, however, there are many factors that make it inherently different from traditional desktop computing. User interfaces for wearable computers also differ as a result, as the traditional WIMP7 user interface and desktop metaphor is difficult to employ in wearable computing [67]. As such, it was decided that a follow-up study was needed, in which more of the properties of wearable computing could be embodied, while still keeping the experiment highly controlled and still being able to evaluate an arbitrary wearable user interface.

6 This prototype was originally developed by Marcus Fransson as part of his Master's Thesis work.

7 Windows, Icons, Menus, Pointer.


Figure 1.5: The HotWire apparatus.

Finding an apparatus that allowed this turned out to be a major challenge, however, as proper evaluation apparatuses for wearable user interfaces are few and next to nonexistent. Typically, wearable user interfaces are evaluated in one of two ways. Either the user walks around while interacting with the user interface, which runs the risk of producing invalid results because simply walking around does not accurately represent the intended application domain where the user interface is to be used. Alternatively, the real world task is instrumented with sensors and monitoring equipment to assess how the task is performed while interacting with the wearable user interface. While this results in accurate data and performance measurements, the required instrumentation makes it difficult and time consuming to apply the experiment to another task within the same or another application domain, and because of intricacies in the specific task being done, results may not always be possible to generalize due to the existence of more or less prominent nuisance factors. Hence, finding an apparatus that allows for easy modeling of a real world task is highly desirable.

For the user study presented in part 5, the HotWire was therefore introduced in [83] as a suitable apparatus. The HotWire was conceived, designed and created by myself and my colleague Hendrik Witt, in order to have an easy to use apparatus for conducting laboratory experiments while retaining the inherent properties of wearable computing. The apparatus fulfills three requirements: it abstracts a physical task performed in the real world, it is easy to learn so that arbitrary subjects can be tested, and it can be adapted to different scenarios and application domains.

An example of a HotWire apparatus is shown in figure 1.5. In essence, the user has to move a metallic ring as accurately as possible along a wire bent in different shapes, thereby requiring the user's focus and constant attention, similar to what an ordinary task would require in real life. Furthermore, the shape of the wire can also force the user to move around when following it, as well as move in and out of different body postures such as kneeling or bending over. This further emphasizes the mobility aspect of wearable computing, as is often the case in the application domains of manufacturing and inspection tasks [4]. The HotWire is connected via an RS-232 serial interface to monitoring software running on a computer, so that the user's performance of the real task can be measured and logged. In turn, an arbitrary wearable user interface can be tested by letting the user operate it while performing the task. By coupling the user interface to the monitoring software, the user's interaction with the interface can also be measured and logged at the same time. This makes it possible to assess how the user performs the interaction with the real world as well as with the wearable user interface, and hence conclusions can be drawn on whether the wearable computer helps or hinders the user.
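As an illustration of how such combined logging could be organized, the sketch below is a simplified Java version that assumes the opened serial connection is exposed as a plain InputStream and that each ring-to-wire contact arrives as one line of text; the event format and class names are illustrative assumptions, not the actual monitoring software.

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.util.ArrayList;
import java.util.List;

// Minimal sketch of the monitoring side of the HotWire setup: events from the
// physical task and from the wearable user interface under test end up in the
// same time-stamped log, so that task and interaction performance can be
// analyzed together afterwards.
public class HotWireMonitor {

    public static class Event {
        public final long timestampMillis;
        public final String type; // e.g. "CONTACT" from the apparatus, "UI:..." from the interface
        public Event(long timestampMillis, String type) {
            this.timestampMillis = timestampMillis;
            this.type = type;
        }
    }

    private final List<Event> log = new ArrayList<>();

    // Called by the wearable user interface under test, so that interaction
    // events are recorded alongside the physical task events.
    public synchronized void logUiEvent(String description) {
        log.add(new Event(System.currentTimeMillis(), "UI:" + description));
    }

    // Reads contact events from the apparatus until the stream is closed.
    public void monitor(InputStream serialPort) throws IOException {
        BufferedReader reader = new BufferedReader(new InputStreamReader(serialPort));
        String line;
        while ((line = reader.readLine()) != null) {
            synchronized (this) {
                log.add(new Event(System.currentTimeMillis(), line.trim()));
            }
        }
    }

    public synchronized List<Event> getLog() {
        return new ArrayList<>(log);
    }
}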

In the subsequent user study regarding interruptions, detailed in part 5, the HotWire was used to simulate a physical primary task while testing different methods for interrupting the user. In addition to the interruption methods studied in part 4, the full set of methods proposed by McFarlane in [53] was now also tested. The means for interacting with the wearable computer also changed, as the previously used keyboard was replaced with a wearable interaction device in the form of a data-glove using tilt sensors [84]. The results from the study show that there are mainly similarities, but also some inherent differences, in which interruption methods are useful when comparing more realistic wearable computing with desktop or stationary computing. The intricacies of these results are discussed in further detail in parts 4 and 5.

Perhaps more important, however, was the finding that the HotWire apparatus helped reveal some of the problems in wearable computing. The foremost example of this was that the negotiated interruption method, where the user is more in control, turned out to cause the user to perform badly in many respects. Upon closer examination of the results, this was not caused by the interruption method per se, but rather by the interaction device and modality employed in this method. This finding was unexpected, as the device used was conceptually simple to operate, yet when taken into use in a more realistic wearable computing scenario it turned out to be significantly more difficult to operate correctly.

Concluding the topic of interruptions, the results in parts 4 and 5 are both valid in their own sense, especially when considering the context and design of each study. The first study used an idealized setup where few wearable computing properties were present apart from the typical HMD, and thereby showed what effect interruptions will have on a user once the interaction with the wearable computer is fully streamlined. The second study used a more realistic setup in which contemporary wearable computing applications are used, and thus showed what effect can be expected when using current interaction devices such as a data-glove.

1.8.3 Prototyping and Deploying Mobile E-Meeting Systems

In order to move outside of laboratory experiments and allow actual end users to test mobile e-meeting systems in real life situations, prototypes are needed which can be deployed and taken into use at the intended facility. In part 6, user centric prototyping is presented as a way to incorporate the intended end users in the design process of a wearable computer, in particular with consideration for how they will need to use and interact with such a system. In part 7, this prototyping is extended by the proposed Ubiquitous Communication Management Framework, which allows such a wearable computer to have additional input and output capabilities, all under the control of the end user through a simple to use abstraction of a user interface.

The investigation of different ways of prototyping wearable computing, as discussed in part 6, was performed to gain input from the end users themselves and their idea of what constitutes useful interaction means. The nurses' needs were first investigated in field studies, which revealed that the current communication possibilities between the personnel and remote medical workers were insufficient. In discussions with the nurses, it was deemed desirable to improve their ability to communicate over a distance. During an ethnographical field study, the nurses' daily tasks were observed and analyzed, to provide us with insight into what their profession entails and what situations they encounter. This was followed by participatory design events in which a wearable system was prototyped in terms of functionality and interaction. To involve the end users in the design process, methods such as paper prototyping [65] and Wizard of Oz prototyping [12] were applied.

However, although involving end users in the design process can result in a wearable computer they feel comfortable using, the design is still constrained by the need to construct a complete hardware system suitable for all situations. For example, certain tasks involving visual guidance by a remote expert definitely require an HMD, while for other tasks it is sufficient to communicate over audio. In the latter case, the HMD serves no purpose, but as it is still part of the wearable computing system it may or may not be easily removable on demand, depending on how the hardware system is designed. This leads to the undesirable situation of always having to carry around a complete wearable system suitable for all kinds of tasks, increasing the weight of the system and making it more obtrusive as a result. This can become a significant problem if users stop using the equipment altogether as a direct consequence.

Using a modular approach was identified as a possible way to relieve the aforementioned problem. The ideas for this approach originate in part from the Borderland vision [56] briefly introduced in parts 2 and 3, and were also influenced by research in ubiquitous communication [36]. With this approach, a user should be able to customize the wearable computer on demand by simply adding components as needed. With a broad definition of what constitutes a component, these can be everything from wearable devices to devices found in the surrounding environment. In order to realize this approach and make the wearable computer more modular, a software framework was designed to provide the necessary signalling between the different components.

The framework provides the ability to control different media resources, i.e. devices for presenting or capturing e.g. audio and video, so that media streams can be redirected and transferred between them arbitrarily. An example scenario illustrates how the framework can be used in a real life situation. Assume a nurse at a nursing home needs the advice of a physician when attending a patient, and gets this via a wearable computer equipped with an HMD in which she can see the physician's instructions. Upon entering the patient's room, the framework detects this and reacts by notifying the wearable computer that there are additional media resources in the room. The nurse is notified about this by the wearable user interface, and decides to redirect the video stream to the nearby television screen, so that the patient can view and follow the physician's instructions directly. The driving concept is that of the Dynamic Wearable Computer, where the user only needs to wear the interaction devices needed for the task at hand, and is able to utilize external resources found in the environment as an extension of the wearable computer. For nurses working in a nursing home who had previously tried different wearable computing systems for communication, as discussed in part 6, this was desirable so that they could choose what interaction capabilities they needed based on the task at hand.

Figure 1.6: Overview of the Ubiquitous Communication Management Framework.

In figure 1.6, the different components of the proposed framework are shown. Each component has a dedicated task, making up a complete distributed system that is easy to deploy and manage. The Information Repository is a distributed database containing information about the system and the users. Each repository represents either a user, an environment, or a media resource. The repositories are all linked, similar to how the World Wide Web is linked together. For example, a user repository can be linked to an environment repository representing a room, which in turn is linked to several media resource repositories representing different devices. The information stored in the repositories originates from sensors, which publish their data into their associated repository. For example, a sensor could detect when a user enters a room, and then link the user's repository to the environment repository representing the room.
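The linked repository structure can be illustrated with a small Java sketch; the class and method names below are illustrative assumptions and do not correspond to the actual framework implementation.

import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Minimal sketch of the linked repository idea: repositories hold key/value
// information published by sensors, plus links to other repositories.
public class InformationRepository {

    private final String name; // e.g. a user, a room, or a media resource
    private final Map<String, String> data = new HashMap<>();
    private final List<InformationRepository> links = new ArrayList<>();

    public InformationRepository(String name) {
        this.name = name;
    }

    // Sensors publish their readings into the repository they are associated with.
    public void publish(String key, String value) {
        data.put(key, value);
    }

    // A sensor that detects a user entering a room links the two repositories.
    public void linkTo(InformationRepository other) {
        if (!links.contains(other)) {
            links.add(other);
        }
    }

    public List<InformationRepository> getLinks() {
        return new ArrayList<>(links);
    }

    public String getName() {
        return name;
    }
}

A location sensor could then, for example, call userRepository.linkTo(roomRepository) when the user enters a room, making the room's media resource repositories reachable from the user's repository.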

The Personal Communication Management Agent in turn listens for state changes in the information repositories, which occur when sensors publish data that the agent subscribes to. When, for example, a new media resource becomes available as the result of such a state change, the agent notifies the Remote Control User Interface. This component is an abstraction of a user interface which informs the user about available media resources and allows the user to select whether or not to use any of them. The implementation of this component is specific to the device on which it runs, meaning that a wearable computer can have an arbitrarily chosen wearable user interface, while a PDA can have an ordinary user interface suitable for such hand-held devices. In the implementation of such an interface, the results from parts 4 and 5 can be taken into account, to avoid distracting the user more than necessary. As the user selects or deselects media resources, the remote control user interface sends events back to the agent.

Figure 1.7: A nurse using a wearable computer with the video stream redirected to a TV.

In turn, the agent communicates with the Mobility Manager, which is responsible for configuring the media resources. This manager can be implemented either as a gateway or integrated with the communication software running on the media resources. Implementing it as a gateway means that the communication software can be kept unmodified, with the gateway performing the necessary redirection and multiplexing of media streams between the devices. Implementing it in the communication software itself has the advantage of avoiding a single point of failure such as a gateway, but the drawback is that the software needs to be adapted, which is not always possible with proprietary software.
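The following sketch illustrates, in Java, how the agent could tie the remote control user interface and a gateway-style mobility manager together; the interfaces and method names are assumptions made for illustration, not the framework's actual API.

import java.util.List;

// Minimal sketch of how the agent mediates between repository state changes,
// the remote control user interface, and the mobility manager.
public class ManagementAgentSketch {

    // Abstraction of the Remote Control User Interface; each device type
    // (wearable computer, PDA, desktop) provides its own implementation.
    public interface RemoteControlUi {
        void showAvailableResources(List<String> resourceNames);
    }

    // Abstraction of the Mobility Manager, here assumed to be implemented as
    // a gateway that redirects media streams between devices on request.
    public interface MobilityManager {
        void redirectVideo(String fromResource, String toResource);
    }

    private final RemoteControlUi ui;
    private final MobilityManager mobilityManager;

    public ManagementAgentSketch(RemoteControlUi ui, MobilityManager mobilityManager) {
        this.ui = ui;
        this.mobilityManager = mobilityManager;
    }

    // Invoked when a subscribed-to repository reports a state change,
    // e.g. new media resources after the user has entered a room.
    public void onResourcesChanged(List<String> resourceNames) {
        ui.showAvailableResources(resourceNames);
    }

    // Invoked by the remote control user interface when the user chooses to
    // move a media stream, e.g. from the HMD to a television screen.
    public void onUserRedirectedVideo(String fromResource, String toResource) {
        mobilityManager.redirectVideo(fromResource, toResource);
    }
}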

The signalling infrastructure provided by the framework means that it can be used in conjunction with arbitrary software for conveying media streams, as long as that software can be modified to respond to such signalling or have its media streams redirected via a gateway. In part 7, a proof of concept system using the Marratech software to implement the e-meeting system is presented, where the framework controls the application and is able to redirect video streams between arbitrary displays on the user's demand. Figure 1.7 shows a nurse using the system, having just transferred the video stream appearing on her wearable computer to a media resource in the form of a television screen in the room.


Taking the publications in parts 6 and 7 together, the results allow for easier prototyping of wearable communication systems, with ease of interaction as a common goal in terms of both hardware and software.

1.9 Discussion

This thesis has proposed the novel concept of mobile e-meetings conducted through wearable computers. User studies have been performed investigating how interaction can be improved in terms of interruptions and notifications, and field trials have shown how such systems can be prototyped for ease of use and deployment. This section discusses how the research questions listed in section 1.4 have been addressed in this thesis. Furthermore, it gives an outlook on potential future research directions on this topic, and thereafter concludes the introductory part of this thesis by highlighting the scientific contributions.

1. By what means can communication take place in mobile e-meetings through wearable computing, and what media are relevant to focus on for this purpose? The first paper has explored the use of mobile e-meetings by using a wearable computer prototype in different real life situations. The concept of the knowledgeable user has been introduced, denoting the situation where a field worker retrieves assistance from remote experts. Based on the situations encountered, many different media have been found to be more or less useful. As it all depends on the situation in which the wearable computer is used, a definite classification of the media requirements is not possible to make, although some guidelines can be given. From the experts' point of view, receiving video and audio of the field worker and the task currently being performed is a primary concern in most situations. From the field worker's point of view, audio and chat messages may be the most useful media for conveying information and advice, whereas video and still images are only needed occasionally.

As no quantitative studies were made and all field trials took place in uncontrolled real life situations, the validity and reliability of these findings cannot be guaranteed for all possible situations. However, as using wearable computers for mobile e-meetings was a novel concept, an exploratory approach to this field was deemed useful and desirable, in order to gain insight into the area for the purpose of the coming research.

2. How can mobile e-meetings be seamlessly used and employed in real life scenarios? The second paper further explores the use of mobile e-meetings, with focus on how the remote experts can be given a seamless and useful view of the tasks performed by the field worker. To achieve seamlessness, different interfaces for conveying historical or live information can be used, allowing experts to enter the e-meeting at different points in time and by different means. This gives them a recap of the task performed, so that they are better able to assist the field worker on demand. While it is possible to build mobile e-meeting systems which can be used in real life, two primary issues were identified. The first issue is that the user's wearable computer itself needs to be unobtrusive, both for the sake of easy handling and so as not to interfere with the people the user meets while performing the task. The second issue is that the wearable user interface needs to be highly efficient to use, so that the user can focus on performing the task as unhindered as possible. The ability to switch freely between text and audio was also identified as potentially useful, as it would increase the seamlessness of using the system for experts and field workers alike.

Similar to the first paper, this is also an exploratory paper which delves deeper into how mobile e-meetings are used, as well as what issues need to be addressed to improve them further. In particular, unobtrusive interaction with the wearable computer was identified as necessary for an efficient e-meeting to be conducted.

3. Given a number of methods to interrupt a user, how should these be used so as not to increase the user's cognitive workload more than needed? The third and fourth papers provide an overview of different methods for interrupting the user while trying to minimize the increase in the user's cognitive workload. In the third paper, where the user's interaction with the wearable computer was highly streamlined, the results indicate that a scheduled method, where the interrupting tasks are clustered, gives the best results, although with the drawback of a considerably higher average age before the tasks are handled. The negotiated treatment, where the user could decide when to handle the interruptions, is more useful considering the overall performance of the user. This method yields a much shorter average task age with only slightly worse performance compared to the scheduled treatment. In the fourth paper, where a more realistic wearable computing scenario was studied, similar results were observed, along with some important and unexpected findings. The primary finding was that the negotiated method now exhibited worse results than the scheduled, immediate, and mediated methods, because the users experienced major problems with a conceptually simple interaction step. This points out that streamlined interaction with the wearable computer is still a paramount topic of research.

In conclusion, a user's performance is shown to be affected by what methods for interruption are used, warranting research in this field. Furthermore, it was suggested that the type of notification used in the negotiated method may have an additional impact, so this also becomes a factor to take into consideration. A prerequisite for properly employing the methods is also that the user's physical interaction with the wearable computer's input devices is not flawed. These findings are all based on user studies performed through controlled experiments, and can be considered valid as they are likely to correctly measure the intended effect of interruptions. The results are also reliable as they concur with earlier studies regarding both desktop and wearable computing.
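
To make the four coordination methods concrete, the following is a minimal sketch of how an interruption manager could implement the immediate, negotiated, scheduled, and mediated treatments discussed above. It is an illustration only; the class, its callbacks, and the thresholds are invented for this example and do not correspond to the software used in the actual experiments.

import time
from collections import deque

class InterruptionManager:
    """Sketch of four interruption-handling policies (illustrative only).

    immediate  - deliver a notification as soon as it arrives
    negotiated - announce it and let the user decide when to take it
    scheduled  - hold notifications and deliver them in clusters
    mediated   - deliver only when an estimate of the workload is low
    """

    def __init__(self, policy, deliver, announce, workload, interval=30.0):
        self.policy = policy        # one of the four policy names above
        self.deliver = deliver      # callback that actually presents the task
        self.announce = announce    # callback giving a subtle cue (negotiated)
        self.workload = workload    # callback returning a workload estimate 0..1
        self.interval = interval    # clustering interval for the scheduled policy
        self.pending = deque()
        self.last_flush = time.monotonic()

    def notify(self, task):
        """Called whenever a new interruption task arrives."""
        if self.policy == "immediate":
            self.deliver(task)
        elif self.policy == "negotiated":
            self.pending.append(task)
            self.announce(task)     # the user later calls accept_next()
        else:                       # scheduled or mediated
            self.pending.append(task)

    def accept_next(self):
        """User-initiated delivery under the negotiated policy."""
        if self.pending:
            self.deliver(self.pending.popleft())

    def tick(self):
        """Called periodically by the application's main loop."""
        now = time.monotonic()
        if self.policy == "scheduled" and now - self.last_flush >= self.interval:
            while self.pending:     # flush the whole cluster at once
                self.deliver(self.pending.popleft())
            self.last_flush = now
        elif self.policy == "mediated" and self.pending and self.workload() < 0.3:
            self.deliver(self.pending.popleft())

Under the scheduled policy, tasks accumulate until the next flush, which is why their average age grows; under the negotiated policy, the age instead depends on how promptly the user chooses to accept the pending tasks.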

4. How can a typical wearable computing scenario from real life be modeled as an experimental setup, in order to evaluate wearable user interfaces in a reliable and valid manner? The fourth paper presented, as part of the user study regarding interruptions, an apparatus called the HotWire. The apparatus can be used to model real-world tasks where mobility and attention are the key factors characterizing the task. These factors can be tuned and embodied in the experimental setup by altering the shape, length, and difficulty of the track. Although the fourth paper only presents one user study, the apparatus allows an arbitrary wearable user interface to be tested.

Results gained from using the HotWire can be deemed valid, as the apparatus in its construction embodies many of the characteristics of how wearable computing primary tasks are performed. The advantage of having an apparatus for evaluating wearable user interfaces in this manner is that early use of the apparatus can guide a designer of a wearable user interface in choosing the proper interaction means. This makes it easier to determine from the start whether a certain modality or interaction device will be useful or not. When proper interaction means have been identified and selected, more detailed data can be extracted using the very same apparatus at a later stage. The lack of reproducible and easy-to-use evaluation apparatuses in the field of wearable computing also makes the HotWire apparatus itself one of the most important contributions of this thesis.
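
As an illustration of the kind of data such an apparatus can yield, the following is a minimal sketch of logging wire contacts during a run and computing a few simple performance measures. The class and event names are invented for the example and do not describe the actual HotWire software used in the study.

import time

class HotWireLog:
    """Illustrative log of a single HotWire run (not the actual study software)."""

    def __init__(self):
        self.start = time.monotonic()
        self.contacts = []   # timestamps (seconds) of wire contacts, i.e. errors
        self.end = None

    def contact(self):
        """Record that the hand-held ring touched the wire."""
        self.contacts.append(time.monotonic() - self.start)

    def finish(self):
        """Mark the end of the run, i.e. the ring reached the end of the track."""
        self.end = time.monotonic()

    def summary(self):
        duration = (self.end or time.monotonic()) - self.start
        return {
            "completion_time_s": round(duration, 2),
            "error_count": len(self.contacts),
            "errors_per_minute": round(60 * len(self.contacts) / duration, 2),
        }

# Example: a run with two wire contacts.
log = HotWireLog()
log.contact()
log.contact()
log.finish()
print(log.summary())

Measures of this kind can then be compared across the different user interfaces or interruption treatments evaluated with the apparatus.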

5. What methodologies are useful when prototyping easy-to-interact-with wearable computing e-meeting systems and engaging end users in the process? The fifth paper presents the use of three methods of prototyping. Paper prototyping makes it easy to involve end users in the design process, because its simplicity requires no technical knowledge while still allowing them to explore the design space. Wizard of Oz prototyping allows end users to try out different means for interaction at an early stage, without needing to go through tedious and perhaps misdirected research and development. Prototyping based on existing consumer products can aid certain aspects of the design process, such as giving the users a sense of what is realistic to achieve with today's technology. Using existing software to provide the actual communication gives users a feeling for how useful such communication can be in their work. On the other hand, limitations in software and hardware can also have a negative impact, as they tend to restrict the end users' creativity, making them focus more on the current technology than on the envisioned functionality.

When researching interaction aspects of wearable computing for human communication, it is important to include the end users in the process of designing proper interaction. The contribution of this fifth paper is therefore the insight that traditional prototyping methods for desktop computing can also be applied to wearable computing.

6. What functionality is needed to allow users to automatically combine and switch between resources available in the wearable computer and in the surrounding environment? The sixth paper presents a framework that provides this functionality. A proof-of-concept prototype implementation of the framework shows that it can make a wearable computer more dynamic, in the sense that incoming and outgoing media streams can be redirected to arbitrary output and input devices either worn or found in the environment. The prototype has been used in a local nursing home, allowing nurses to communicate with medical workers and make use of external displays and cameras as needed. This enables the end users to switch freely between different devices, in order to suit their current interaction needs for the task at hand.

Although no long-term deployment of the framework has been made, the findings can still be considered valid as actual end users were involved in trying out the prototype system. Combined with the findings regarding prototyping in the fifth paper, this framework can thus aid in making the prototypes more customizable to the different tasks performed by the end users.
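
As a rough illustration of the kind of functionality such a framework offers, the following is a minimal sketch of a device registry through which media streams can be redirected between worn and environmental devices. The class and method names are invented for this example and do not reflect the actual implementation described in the sixth paper.

class Device:
    """A media source or sink, either worn by the user or found in the environment."""

    def __init__(self, name, kind, location):
        self.name = name          # e.g. "hmd" or "room-display"
        self.kind = kind          # "display", "camera", "speaker", ...
        self.location = location  # "worn" or "environment"

class MediaRouter:
    """Keeps track of available devices and redirects media streams between them."""

    def __init__(self):
        self.devices = {}
        self.routes = {}          # stream name -> device name

    def register(self, device):
        """Called when a worn or environmental device announces itself."""
        self.devices[device.name] = device

    def unregister(self, name):
        """Called when a device disappears; fall back to a worn device."""
        self.devices.pop(name, None)
        for stream, target in list(self.routes.items()):
            if target == name:
                self.routes[stream] = self.default_worn()

    def redirect(self, stream, target_name):
        """Redirect an incoming or outgoing media stream to another device."""
        if target_name not in self.devices:
            raise ValueError("unknown device: " + target_name)
        self.routes[stream] = target_name

    def default_worn(self):
        worn = [d.name for d in self.devices.values() if d.location == "worn"]
        return worn[0] if worn else None

# Example: incoming video is moved from the head-mounted display to a larger
# display in the room, and falls back to the worn display if the room display
# is switched off.
router = MediaRouter()
router.register(Device("hmd", "display", "worn"))
router.register(Device("room-display", "display", "environment"))
router.redirect("incoming-video", "room-display")
router.unregister("room-display")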

1.9.1 Future Research Directions

Wearable computing by definition means that the interaction is managed in a way that assists rather than impedes its user. This is applicable when the wearable computer is used for human communication, and equally applicable when it is used for other purposes. Research into improving the means for interaction will therefore have an impact on many different application domains.

The HotWire is in its current incarnation able to reveal intricacies of wearable computing not uncovered by earlier methods and studies, and I see several possibilities where further studies of various wearable user interfaces can be conducted using the apparatus. Such studies will have the benefit of pointing out how the HotWire itself can evolve, while the HotWire in turn evaluates and sheds new light on the topic of wearable interaction. As this is one of the first reproducible and easy-to-use evaluation apparatuses, I foresee that much research can be conducted around the HotWire, and that this will bring many scientific benefits to the study of wearable interaction.

Naturally, user studies also need to be complemented with real-world prototyping and development, in order to bring the theoretical insights into practical use in the real world. Deploying a wearable computer for human communication at a nursing home for very long term use is something which would provide very valuable insight into how this technology is being used. For such long term deployment, however, the wearable computers used must be extremely durable and not prone to errors, and this requires much more effort in terms of construction and implementation than creating a research prototype for short term use.

1.9.2 Conclusions

This thesis has studied different interaction aspects of wearable computing for human communication, and proposed solutions and guidelines for how the user's interaction can become more streamlined and easy to use. The main scientific contributions of this thesis can be summarized as follows.

• A concept called the knowledgeable user denoting the use of wearable computing for human communication, with exploratory field trials using hardware and software prototypes to survey this research topic.

• Guidelines for properly managing the interruption and notification of the user of a wearable computer, both in ideal settings when the user's interaction is already streamlined, as well as in more demanding situations of a physical and practical nature.

• The HotWire apparatus which can be used to evaluate arbitrary wearable user interfaces in a controlled and reproducible manner, while at the same time being easy to adapt to different kinds of application domains where mobility and attention are the key characteristics.

• The concept of a dynamic wearable computer realized through a framework that enables ubiquitous communication, facilitating the use of arbitrary input and output devices in the surrounding environment to suit the user's interaction demands.

Based on results from field trials, user studies, laboratory experiments, and theoretical analysis of these, the thesis suggests that wearable computing is a feasible way to enable human communication over a distance. This has the potential to increase a user's task performance, and thereby save time and money. In health-care and nursing homes in particular, lack of time is often a major problem for the personnel, thereby making such facilities suitable and feasible for the deployment of this kind of application. However, deploying an application has no purpose unless it is also taken into use. To make a wearable computer useful, great care needs to be taken to make it easy to operate, which can be achieved by providing a streamlined and unobtrusive user interface. The results presented in this thesis show that it is possible to involve end users in the design process, and to make a flexible and dynamic system which will be accepted and taken into actual use at the facility. The results also show that interruption is an important issue to consider, and that by managing interruptions and notifications properly, task performance can be further enhanced while reducing their negative impact.

Broadening the view and looking at wearable computing for human communication in general, the research presented in this thesis contains guidelines on many aspects of interaction. These guidelines are foremost applicable to the domain of wearable computing, but also to research on mobile telephony and other wearable technology. In our current society, we can see the emergence of increasingly powerful smart phones and PDAs, together with higher bandwidth and larger coverage of wireless and cellular networks. This means that more sophisticated ways of communicating through different media are now becoming available to the general public. With this, users' demands on unobtrusive and streamlined interaction will become increasingly important. As the concept of wearable computing by definition conforms, ideally, to these demands, the results in this thesis can be expected to have an impact on how interaction is designed for the mobile phone or wearable computer of the future.

Even though prototypes have been built using specific hardware and software components, the results leading up to these guidelines have primarily originated from the underlying interaction demands of human beings. As human nature and the desire to communicate are not expected to change radically in the foreseeable future, the results will be applicable to present and future technologies alike. With this, I conclude the thesis by stating that it points towards a leap forward in human communication, and that the results contained herein provide yet another step in that direction.

1.10 Personal Contribution

The remainder of this thesis consists of six publications. This section describes my own contribution to each of the papers.

Paper 1: Marcus Nilsson is the main author of this paper. I contributed to major parts of the sections about the mobile user and what lies beyond communication, and to the evaluation and conclusions, both in writing as well as through discussions. The tests and experiments which the paper is based on were conducted over an extended period of time, and are the result of joint work between myself and Marcus Nilsson.

Paper 2: I am the main author of this paper and wrote most of the text. The tests and experiments which the paper is based on were conducted over an extended period of time, and are the result of joint work between myself and Marcus Nilsson. Roland Parviainen contributed with his history and web interface tools for Marratech Pro.

Paper 3: I am the main author of this paper and wrote most of the text, with comments from my co-authors. Marcus Nilsson, Urban Liljedahl and I were equally responsible for the setup and execution of the user study. The simulation software used is derived from Dr. Daniel C. McFarlane's original source code, and was adapted by myself and Marcus Nilsson to suit our experiment. The discussion and conclusions are the result of work mainly involving myself and Marcus Nilsson.

Paper 4: I am the main author of this paper; I wrote half the introduction and related work, and most of the experiment and user study sections. The experimental setup was primarily designed by myself in collaboration with Hendrik Witt. The HotWire experimental apparatus was conceived in discussions between myself and Hendrik Witt, and we were both equally involved in its creation, overall design, and usage. I prepared the HotWire simulation software while Hendrik Witt constructed the hardware used in the study. The DataGlove used in the study was constructed within Hendrik Witt's research group, who also wrote the necessary drivers for interfacing with the simulation program. We were both equally involved in executing the user study and performing the subsequent analysis. The discussion and conclusions are the result of work involving myself and Hendrik Witt.

Paper 5: I am the main author of this paper and wrote most of the text, except the section about the Wizard of Oz method which was written by Josef Hallberg. The work leading up to this paper, such as interviews, ethnographical studies, tests and prototyping with the elderly-care personnel, was shared equally between myself and Josef Hallberg.

Paper 6: Johan Kristiansson is the main author of this paper. Johan Kristiansson and I contributed most to the paper, while Josef Hallberg contributed with the text about the information repositories and the personal management agent. My contribution was the introduction, related work, the remote control user interface, and the proof-of-concept prototype. The framework was implemented mainly by Johan Kristiansson, while I implemented the modules dealing with the remote user interface handling and presentation. The design and subsequent analysis of the framework was conducted by myself, Johan Kristiansson, and Josef Hallberg.

Part 2

Sharing Experience and Knowledge with Wearable Computers

Sharing Experience and Knowledge with Wearable Computers

Marcus Nilsson, Mikael Drugge, Peter Parnes
Division of Media Technology

Department of Computer Science and Electrical Engineering
Luleå University of Technology

SE–971 87 Luleå, Sweden
{marcus.nilsson, mikael.drugge, peter.parnes}@ltu.se

April, 2004

Abstract

Wearable computers have mostly been studied when used in isolation, but a wearable computer with an Internet connection is a good tool for communication and for sharing knowledge and experience with other people. The unobtrusiveness of this type of equipment makes it easy to communicate in most types of locations and contexts. The wearable computer makes it easy to be a mediator of other people's knowledge and to become a knowledgeable user. This paper describes the experience gained from testing the wearable computer as a communication tool, and being the knowledgeable user, at different fairs.

2.1 Introduction

Wearable computers can today be made from off-the-shelf equipment, and are becoming more commonly used in some areas such as construction, health care, etc. Researchers in the wearable computer area believe that wearable computers will become equipment for everyone, aiding the user all day. This aid is in areas where computers are more suited than humans, for example memory tasks. Wearable computer research has been focusing on the usage of wearable computers in isolation [19].

It is believed in the Media Technology group at Luleå University of Technology that a major use of the wearable computer will be the connections it can make possible, both with people and with the surrounding environment. Research on this is being conducted in what we call Borderland [56], which is about wearable computers and the tools for them to communicate with people and technology. A wearable computer with a network connection makes it possible to communicate with people at distant locations, independent of the user's current location. This is of course possible today with mobile phones etc., but a significant difference with the wearable computer is the possibility of using a broader range of media and the unobtrusiveness of using a wearable computer.

One of the goals for wearable computers is that the user can operate them without diminishing his presence in the real world [7]. This, together with the wearable computer as a tool for rich1 communication, makes new ways of communicating possible. A wearable computer user could become a beacon of several people's knowledge and experience, a knowledgeable user. The wearable computer would then not just be a tool for receiving expert help [34], but a tool to give other people the impression that the user has the knowledge himself.

The research questions this brings forward include by what means communication can take place, and what types of media are important for this type of communication.

There is also the question of how this way of communicating will affect the participants involved, and what advantages and disadvantages there are with this form of communication.

In this paper we present experiences gained from using wearable computers as a tool to communicate knowledge and experience, from both the user and other participants, over the network or locally.

2.1.1 Environment for Testing

The usage of wearable computers for communication was tested at different fairs that the Media Technology group attended. The wearable computer was part of the group's exhibition and was used to communicate with the immobile part of the exhibition. Communication was also established with remote persons from the group who were not attending the fairs. Both the immobile and remote participants could communicate with the wearable computer through video, audio and text.

The fairs ranged from small fairs local to the university, for attracting new students, to bigger fairs where research was presented to investors and other interested parties.

2.2 Related Work

Collaborative work using wearable computers has been discussed in several publications [2, 3, 73]. The work has focused on how several wearable computers and/or computer users can collaborate. Not much work has been done on how the wearable computer user can be a mediator for the knowledge and experience of other people. Lyons and Starner's work on capturing the experience of the wearable computer user [41] is interesting, and some of that work can be used for sharing knowledge and experience in real time. But it is also important to consider the other direction, where people share with the wearable computer user.

As pointed out in [19], wearable computers tend to be most often used in isolation. We believe it is important to study how communication with other people can be enabled and enhanced by using this kind of platform.

2.3 The Mobile User

We see the mobile user as one using a wearable computer that is seamlessly connected to the Internet throughout the day, regardless of where the user is currently situated. In Borderland we currently have two different platforms which both enable this; one is based on a laptop and the other is based on a PDA. In this section we discuss our current hardware and software solution used for the laptop-based prototype. This prototype is also the one used throughout the remainder of this paper, unless explicitly stated otherwise.

1With rich we mean that several different media are used, such as audio, video, text, etc.

Figure 2.1: The Borderland laptop-based wearable computer.

2.3.1 Hardware Equipment

The wearable computer prototype consists of a Dell Latitude C400 laptop with a Pentium III 1.2 GHz processor, 1 GB of main memory and built-in IEEE 802.11b. Connected to the laptop is a semi-transparent head-mounted display by TekGear called the M2 Personal Viewer, which provides the user with a monocular full color view of the regular laptop display in 800x600 resolution. Fitted onto the head-mounted display is a Nogatech NV3000N web camera that is used to capture video of what the user is currently looking or aiming his head at. A small wired headset with an earplug and microphone provides audio capabilities. User input is received through a PS/2-based Twiddler2 providing a mouse and chording keyboard via a USB adapter. The laptop, together with a USB hub and a battery for the head-mounted display, is placed in a backpack for convenience of carrying everything. A battery for the laptop lasts about 3 hours, while the head-mounted display can run for about 6 hours before recharging is needed. What the equipment looks like when being worn by a user is shown in figure 2.1.

Figure 2.2: The Borderland PDA-based wearable computer.

Note that the hardware consists only of standard consumer components. While it would be possible to make the wearable computer less physically obtrusive by using more specialized custom-made hardware, this is not a goal in itself at this time. We do, however, try to reduce its size as new consumer components become available.

There is work being done on a PDA-based wearable computer, which can be seen in figure 2.2. The goal is that it will be much more useful outside the Media Technology group at Luleå University of Technology, thereby making it possible to do real-life tests of the knowledgeable user concept.

2.3.2 Software Solution

The commercial collaborative work application Marratech Pro2 running under Windows XP provides the user with the ability to send and receive video, audio and text to and from other participants using either IP-multicast or unicast. In addition to this there is also a shared whiteboard and a shared web browser. An example of what the user may see in his head-mounted display is shown in figure 2.3.

2http://www.marratech.com

Figure 2.3: The collaborative work application Marratech Pro as seen in the head-mounted display.
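
As mentioned above, the media streams are distributed using either IP-multicast or unicast. The following is a minimal, generic sketch of how a sender and receivers can share a multicast group using the standard socket API; the group address and port are made up for the example, and the sketch is not based on the internals of Marratech Pro.

import socket
import struct

GROUP = "239.1.2.3"   # illustrative multicast group address
PORT = 5004           # illustrative port number

def make_sender():
    """Create a UDP socket that can send datagrams to the multicast group."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, 8)
    return sock

def make_receiver():
    """Create a UDP socket that has joined the multicast group."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", PORT))
    membership = struct.pack("4sl", socket.inet_aton(GROUP), socket.INADDR_ANY)
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, membership)
    return sock

# A sender transmits a media packet once and every joined receiver gets a copy,
# which is what makes multicast attractive for group e-meetings.
sender = make_sender()
sender.sendto(b"example media packet", (GROUP, PORT))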

2.4 Beyond Communication

With a wearable computer, several novel uses emerge as a side effect of the communication ability that the platform allows. In this section we will focus on how knowledge and experiences can be conveyed between users and remote participants. Examples will be given of how this sharing of information can be applied in real-world scenarios.

2.4.1 Becoming a Knowledgeable User

One of the key findings at the different fairs was how easily a single person could represent the entire research group, provided he was mobile and could communicate with them. When meeting someone, the wearable computer user could ask questions and provide answers that may in fact have originated from someone else at the division. As long as the remote information, e.g. questions, answers, comments and advice, was presented to our user in a non-intrusive manner, it provided an excellent way to make the flow of information as smooth as possible.

For example, if a person asked what a certain course or program was like at our university, the participants at the division would hear the question as it was asked and could respond with what they knew. The wearable computer user then just had to summarize those bits of information in order to provide a very informative and professional answer.

This ability can be further extended and generalized as in the following scenario. Imagine a person who is very charismatic, who is excellent at holding speeches and can present information to an audience in a convincing manner. However, lacking technical knowledge, such a person would not be very credible when it comes to explaining actual technical details that may be brought up. If such a person is equipped with a wearable computer, he will be able to receive information from an expert group of people and should thus be able to answer any question. In effect, that person will now know everything and be able to present it all in a credible manner, hopefully for the benefit of all people involved.

Further studies are needed to find out whether and how this scenario would work in real life — can an external person convey the entire knowledge of, for example, a research group, and can this be done without the opposite party noticing it? From a technical standpoint this transmission of knowledge is possible to do with Borderland today, but would an audience socially accept it, or would they feel they are being deceived?

Another, perhaps more important, use for this way of conveying knowledge is in health-care. In rural areas there may be a long way from the hospital to patients' homes, and resources in terms of time and money may be too sparse to let a medical doctor visit all the patients in person. However, a nurse who is attending a patient in his home can use a wearable computer to keep in contact with the doctor, who may be at a central location. The doctor can then help make diagnoses and advise the nurse on what to do. He can also ask questions and hear the patient answer in his own words, thereby eliminating risks of misinterpretation and misunderstanding. This allows the doctor to virtually visit more patients than would have been possible using conventional means, and it serves as an example of how the knowledge of a single person can be distributed and shared over a distance.

2.4.2 Involving External People in Meetings

When in an online meeting, it is sometimes desirable for an ordinary user to be able to jump into the discussion and say a few words. Maybe a friend of yours comes by your office while you are in a conversation with some other people, and you invite him to participate for some reason; maybe he knows a few of them and just wants to have a quick chat. While this is trivial to achieve when at a desktop — you just turn over your camera and hand a microphone to your friend — this is not so easily done with a wearable computer, for practical reasons.

Even though this situation may not be common enough to deserve any real attention, we have noticed an interesting trait of mobile users participating in this kind of meeting. The more people you meet when you are mobile, the bigger the chance that some remote participant will know someone among those people, and thus the desire for him to communicate with that person becomes more prevalent. For this reason, it has suddenly become much more important to be able to involve ordinary users — those you just happen to meet — in the meeting without any time to prepare the other person for it.

A common occurrence at the different fairs was that the wearable computer user met or saw a few persons whom some participant turned out to know and wanted to speak with. Lacking any way besides the headset to hear what the remote participants said, the only way to convey information was for our user to act as a voice buffer, repeating the spoken words in the headset to the other person. Obviously, it would have been much easier to hand over the headset, but several people seemed intimidated by it. They would all try on the head-mounted display, but were very reluctant to speak into the headset.3

To alleviate this problem, we found it would likely be very useful to have a small speaker as part of the wearable computer, through which the persons you meet could hear the participants. That way, the happenstance meeting can take place immediately and the wearable computer user need not even take part in any way; he just acts as a walking beacon through which people can communicate. Of course, a side effect of this novel way of communicating may well be that the user gets to know the other person as well and thus, in the end, builds a larger contact network of his own.

We believe that with a mobile participant, this kind of unplanned meeting will happen even more frequently. Imagine, for example, all the people you meet when walking down a street or entering a local store. Being able to involve such persons in a meeting the way it has been described here may be very socially beneficial in the long run.

2.4.3 When Wearable Computer Users Meet

Besides being able to involve external persons as discussed in the section before, there is also the special case of inviting other wearable computer users to participate in a meeting. This is something that can be done using the Session Initiation Protocol (SIP) [28].
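
To illustrate what such an invitation involves at the protocol level, the following is a minimal sketch of a SIP INVITE request sent over UDP. The addresses, tags and identifiers are made up for the example, and a real deployment would use a complete SIP stack and include an SDP body describing the media streams, rather than hand-built messages.

import socket

# Hypothetical addresses of two wearable computer users.
CALLER = "sip:engineer@wearable-a.example.org"
CALLEE = "sip:firefighter@wearable-b.example.org"
CALLEE_ADDR = ("192.0.2.10", 5060)   # placeholder IP and default SIP port

# A minimal SIP INVITE request; in practice an SDP body describing the
# audio and video streams would follow the headers.
invite = (
    "INVITE " + CALLEE + " SIP/2.0\r\n"
    "Via: SIP/2.0/UDP wearable-a.example.org:5060;branch=z9hG4bK776asdhds\r\n"
    "Max-Forwards: 70\r\n"
    "From: <" + CALLER + ">;tag=1928301774\r\n"
    "To: <" + CALLEE + ">\r\n"
    "Call-ID: a84b4c76e66710@wearable-a.example.org\r\n"
    "CSeq: 1 INVITE\r\n"
    "Contact: <" + CALLER + ">\r\n"
    "Content-Length: 0\r\n"
    "\r\n"
)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(invite.encode("ascii"), CALLEE_ADDR)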

A scenario that exemplifies when meetings between several wearable computer users at different locations would be highly useful is in the area of fire-fighting.4 When a fire breaks out, the first team of firefighters arrives at the scene to assess the nature of the fire and proceed with further actions. Often a fire engineer with expert knowledge arrives at the scene some time after the initial team in order to assist them. Upon arrival he is briefed on the situation and can then provide advice on how to best extinguish the fire. The briefing itself is usually done in front of a shared whiteboard on the side of one of the fire-fighting vehicles. Considering the amount of time the fire engineer spends while being transported to the scene, it would be highly beneficial if the briefing could start immediately instead of waiting until he arrives.

By equipping the fire engineer and some of the firefighters with wearable computers, they would be able to start communicating early on, upon the first team's arrival. Not only does this allow the fire engineer to be briefed on the situation in advance, but he can also get a first person perspective of the scene and assess the whole situation better. Just as in Kraut's work [35], the fire engineer as an expert can assist the less knowledgeable before reaching the destination. As the briefing is usually done with the help of a shared whiteboard — which also exists in the collaborative work application in Borderland — there would be no conceptual change to their work procedures other than the change from a physical whiteboard to an electronic one. This is important to stress — the platform does not force people to change their existing work behavior, but rather allows the same work procedures to be applied in the virtual domain when that is beneficial. In this case the benefit lies in the briefing being done remotely, thereby saving valuable time. It may even be so that the fire engineer no longer needs to travel physically to the scene, but can provide all guidance remotely and serve multiple scenes at once. In a catastrophe scenario, this ability for a single person to share his knowledge and convey it to people at remote locations may well help in saving lives.

3Another exhibitor of a voice-based application mentioned they had the same problem when requesting people to try it out; in general, people seemed very uncomfortable speaking into unknown devices.

4This scenario is based on discussions with a person involved in fire fighting methods and procedures in Sweden.

2.5 Evaluation

Our findings are based on experiences from the fairs and exhibitions we have attended so far, as well as from pilot studies done in different situations at our university.

The communication that the platform enables allows a user to receive information from remote participants and convey this to local peers. As participants can get a highly realistic feeling of "being there" when experiencing the world from the wearable computer user's perspective, the distance between those who possess knowledge and the user who needs it appears to shrink. Thus, not only is the gap of physical distance bridged by the platform, but so is the gap of context and situation.

While a similar feeling of presence might be achieved through the use of an ordinary video camera that a person carries around together with a microphone, there are a number of points that dramatically set the wearable computer user apart from such a setup.

• The user will eventually become more and more used to the wearable computer, thus making the task of capturing information and conveying this to other participants more of a subconscious task. This means that the user can still be an active contributing participant, and not just someone who goes around recording.

• As the head-mounted display aims in the same direction as the user's head, a more realistic feeling of presence is conveyed: subtle glances, deliberate stares, seeking looks and other kinds of unconscious behavior come across. The camera movement and what is captured on video thus become more natural in this sense.

• The participants could interact with the user and tell him to do something or go somewhere. While this is possible even without a wearable computer, this interaction in combination with the feeling of presence that already existed gave a boost to it all. Not only did they experience the world as seen through the user's eyes, but they were now able to remotely "control" that user.

2.5.1 The Importance of Text

Even though audio may be well suited for communicating with people, there are occasions where textual chat is preferable. The main advantage of text, as we see it, is that unlike audio, the processing of the information can be postponed for later. This has three consequences, all of which are very beneficial for the user.

1. The user can choose when to process the information, unlike a voice that requires immediate attention. This also means processing can be done in a more arbitrary, non-sequential, order compared to audio.

2. The user may be in a crowded place and/or talk to other people while the information is received. In such environments, it may be easier to have the information presented as text rather than in an audible form, as the former would interfere less with the user's normal task.

3. The text remains accessible for a longer period of time, meaning the user does not need to memorize the information at the pace it is given. For things such as URLs, telephone numbers, mathematical formulas and the like, a textual representation is likely to be of more use than the same spoken information.

While there was no problem in using voice when talking with the other participants, on several occasions the need to get information as text rather than voice became apparent. Most of the time, the reason was that while in a live conversation with someone, the interruption and increased cognitive workload placed upon the user became too difficult to deal with. In our case, the user often turned off the audio while in a conversation so as not to be disturbed. The downside of this was that the rest of the participants in the meeting no longer had any way of interacting or providing useful information during the conversation.5

There may also be privacy concerns that apply; a user standing in a crowd or attending a formal meeting may need to communicate in private with someone. In such situations, sending textual messages may be the only choice. This means that the user of a wearable computer must not only be able to receive text; he must also be able to send it. We can even imagine a meeting with only wearable computer participants, to make it clear that sending text will definitely remain an important need.

Hand-held chord keyboards such as the Twiddler have been shown to give good results for typing [42]. But these types of devices still take time to learn, and for those who seldom need to use them, the motivation to learn to type efficiently may never come. Other alternatives that provide a regular keyboard setup, such as the Canesta Keyboard™ Perception Chipset™ that uses IR to track the user's fingers on a projected keyboard, also exist and may well be a viable option to use. Virtual keyboards shown on the display may be another alternative and can be used with a touch-sensitive screen or eye-tracking software in the case of a head-mounted display. Voice recognition systems translating voice to text may be of some use, although these will not work in situations where privacy or quietness is of concern. It would, of course, also be possible for the user to carry a regular keyboard with him, but that can hardly be classified as convenient enough to be truly wearable.

There is one final advantage of text compared to audio, namely the lower bandwidth requirements of the former. On some occasions there may simply not be enough bandwidth, or the bandwidth may be too expensive, for communicating by other means than text.

5This was our first public test of the platform in an uncontrolled environment, so none of the participants were sure of what was the best thing to do in the hectic and more or less chaotic world that emerged. Still, much was learnt thanks to exactly that.

2.5.2 Camera and Video

Opinions about the placement of the camera on the user's body varied among the participants. Most of them liked having the camera always pointing in the same direction as the user's head, although there were reports of becoming disoriented when the user turned his head too frequently.

Some participants wanted the camera to be more body-stabilized, e.g. mounted on the shoulder, in order to avoid this kind of problem. While this placement would give a more stable image, it may reduce the feeling of presence as well as obscure the hints of what catches the user's attention. In fact, some participants expressed a desire to be given an even more detailed view of what the user was looking at by tracking his eye movements, as that is something which cannot be conveyed merely by having the camera mounted on the user's head. As Fussell points out [20], there are problems that have to be identified with head-mounted cameras. Some of these problems may be solved by changing the placement of the camera on the body. However, further studies are needed to draw any real conclusions about the effects of the different choices when used in this kind of situation.

Some participants reported a feeling of motion sickness with a higher framerate (about 5 Hz), and for that reason preferred a lower framerate (about 1 Hz), providing almost a slideshow of still images. However, those who had no tendency for motion sickness preferred as high a framerate as possible, because otherwise it became difficult to keep track of the direction when the user moved or looked around suddenly.

In [1] it is stated that a high framerate (15 Hz) is desirable in immersive environments to avoid motion sickness. This suggests our notion of a high framerate was still too low, and increasing it further might have helped eliminate this kind of problem.

2.5.3 Microphone and Audio

Audio was deemed very important. Through the headset microphone the participants would hear much of the random noise from the remote location, as well as discussions with persons the user met, thereby enhancing the feeling of "being there" tremendously.

Of course, there are also situations in which participants are only interested in hearing the user when he speaks, thereby pointing out the need for good silence suppression to reduce any background noise.

2.5.4 Transmission of Knowledge

Conveying knowledge to a user at a remote location seems in our experience to be highly useful. So far, text and audio have most of the time been enough to provide a user with the information needed, but we have also experienced a few situations calling for visual aids such as images or video.

2.6 Conclusions

We have presented our prototype of a mobile platform in the form of a wearable computer that allows its user to communicate with others. We have discussed how remote participants can provide a single user with information in order to represent a larger group, and also how a single expert user can share the knowledge he possesses in order to assist multiple persons at a distance. The benefits of this sharing have been exemplified with scenarios taken from health-care and fire-fighting situations. The platform serves as a proof of concept that this form of communication is possible today.

Based on experiences from fairs and exhibitions, we have found and identified a number of areas that need further refinement in order to make this form of communication more convenient for everyone involved. The importance of text and the configuration and placement of video have been discussed.

The equipment used in these trials is not very specialized and can be bought and built by anyone. The big challenges in wearable computing today concern its usage, and in this paper a usage of the wearable computer as a tool for sharing knowledge and experience was presented.

2.6.1 Future Work

We currently lack quantitative measures for our evaluation. For this, a wearable computer that ordinary people will accept to use in their everyday life is needed. It is believed that the PDA-based wearable mentioned earlier in this paper is that kind of wearable computer, and the plan is to do user tests for some of the scenarios mentioned earlier in the paper.

There are also plans to extend the prototype with more tools for improving the sharing of experience and knowledge. One thing being worked on now is to incorporate a telepointer over the video, so that distant participants can show the wearable computer user what they are talking about or what has their attention at the moment.

2.7 Acknowledgements

This work was sponsored by the Centre for Distance-spanning Technology (CDT) and Mäkitalo Research Centre (MRC) under the VINNOVA RadioSphere and VITAL project, and by the Centre for Distance-spanning Health care (CDH).

Part 3

Experiences of Using Wearable Computers for Ambient Telepresence and Remote Interaction

Experiences of Using Wearable Computers for Ambient Telepresence and Remote Interaction

Mikael Drugge, Marcus Nilsson, Roland Parviainen, Peter Parnes
Division of Media Technology

Department of Computer Science and Electrical Engineering
Luleå University of Technology

SE–971 87 Luleå, Sweden
{mikael.drugge, marcus.nilsson, roland.parviainen, peter.parnes}@ltu.se

October, 2004

Abstract

We present our experiences of using wearable computers for providing an ambient form of telepresence to members of an e-meeting. Using a continuously running e-meeting session as a testbed for formal and informal studies and observations, this form of telepresence can be investigated from the perspective of remote and local participants alike. Based on actual experiences in real-life scenarios, we point out the key issues that prohibit the remote interaction from being entirely seamless, and follow up with suggestions on how those problems can be resolved or alleviated. Furthermore, we evaluate our system with respect to overall usability and the different means for an end-user to experience the remote world.

3.1 Introduction

Wearable computing offers a novel platform for telepresence in general, capable of providing a highly immersive and subjective experience of remote events. By use of video, audio and personal annotations and observations, the user of a wearable computer can convey a feeling of "being there" even to those people who are not. The platform also enables a level of interaction between remote and local participants, allowing information to flow back and forth, passing through the wearable computer user acting as a mediator. All in all, wearable computers emerge as a promising platform for providing telepresence, yet this statement also brings forward the following research questions:

• What form of telepresence can be provided using today's wearable computing technology?

• How can the telepresence provided be seamlessly used and employed in real-life scenarios?

• What is required to further improve the experience and simplify its deployment in everyday life?

In the Media Technology research group, collaborative work applications are used on a daily basis, providing each group member with an e-meeting facility from their regular desktop computer. In addition to holding more formal e-meetings as a complement to physical meetings, the applications also provide group members with a sense of presence of each other throughout the day. This latter case is referred to as the "e-corridor" — a virtual office landscape in which group members can interact, communicate and keep in touch with each other. As the e-corridor allows fellow co-workers to be together regardless of their physical whereabouts, it has become a natural and integrated part of our work environment.

As part of our ongoing research in wearable computing, we have had the wearable computer user join the e-corridor whenever possible; for example at research exhibitions, marketing events and student recruitment fairs. Since the members of our research group are already used to interacting with each other through their desktop computers, we can build on our existing knowledge about e-meetings to study the interaction that takes place with a wearable computer user. This gives us a rather unique opportunity for studying the real-life situations that such a user is exposed to, and for deriving the strengths and weaknesses of this form of telepresence.

The key contribution of this paper is our experiences and observations of the current problems with remote interaction through wearable computing, and what obstacles must be overcome to make it more seamless. Furthermore, we propose solutions for how these shortcomings can be alleviated or resolved, and how that in turn opens up further research in this area.

The organization of the paper is as follows: In section 3.2 we give a thorough introduction to our use of the e-corridor, serving as the basis for many of our observations and experiments. This is followed by section 3.3, in which we introduce our wearable computing research and discuss how a wearable computer user can partake in the e-corridor. Section 3.4 continues by presenting our experiences of this form of telepresence, focusing on the shortcomings of the interaction from both a technical and a social standpoint. The issues identified are subsequently addressed, followed by an overall evaluation of the system in section 3.5. Finally, section 3.6 concludes the paper together with a discussion of future work.

3.1.1 Related Work

Telepresence using wearable computers has been studied in a number of different settings. Early work by Steve Mann et al. explored using wearable computers for personal imaging [45, 47], as well as composing images by the natural process of looking around [46]. Mann has also extensively used the "Wearable Wireless Webcam"1 — a wearable computing setup for publishing images onto the Internet, allowing people to see his current view as captured by the camera. Our work is similar to this in that we use wearable computers to provide telepresence, yet it differentiates itself by instead conveying the experience into an e-meeting session.

1http://wearcam.org/

In computer supported cooperative work (CSCW), telepresence by wearable computers has often been used to aid service technicians in a certain task. Examples of this include [73] by Siegel et al., who present an empirical study of aircraft maintenance workers. This paper addresses telepresence that is not as goal-oriented as typical CSCW applications — instead, emphasis is placed on the ways in which the everyday presence of each other can be conveyed, without any specific tasks or goals in mind. Roussel's work on the Well [70] is a good example of the kind of informal, everyday communication our research enables.

A related example is the research done by Ganapathy et al. on tele-collaboration [21] in both the real and virtual world. This has similarities to our work, yet differs in that we attempt to diminish the importance of the virtual world, focusing more on bringing the audience to the real world conveyed by a remote user. The audience should experience a feeling of "being there", while the remote user should similarly have a feeling of them "being with him" — but not necessarily becoming immersed in their worlds.

In [24], Goldberg et al. present the "Tele-Actor", which can be either a robot or a wearable-computer-equipped human at some remote location, allowing the audience to vote on where it should go and what it should do. A more thorough description of the "Tele-Actor", and the voting mechanism in particular, can be found in [25]. The function of the "Tele-Actor" is similar to what is enabled by our wearable computing prototypes, but our paper focuses on providing that control through natural human-to-human interaction, rather than employing a voting mechanism.

As a contrast to using a human actor, an advanced surrogate robot for telepresence is presented by Jouppi in [32]. The robot is meant to provide a user with a sense of being at a remote business meeting, as well as give the audience there the feeling of having that person visiting them. The surrogate robot offers a highly immersive experience for the person in control, with advanced abilities to provide high quality video via HDTV or projectors, as well as accurately recreating the remote sound field. Besides our use of a human being rather than a robot, and not focusing on business meetings in particular, we investigate this area from the opposite standpoint: given today's technology with e-meetings from the user's desktop, what kind of telepresence experience can be offered by a human user, and is that experience "good enough"?

In [2], a spatial conferencing space is presented where the user is immersed in the wearable computing world, communicating with other participants. Another highly immersive experience is presented in [77], where Tang et al. demonstrate a way for two users to share and exchange viewpoints generated and explored using head motions. In contrast, our paper does not strive to immerse the user in the wearable computer, but rather to provide the experience of an ambient, non-intrusive presence of the participants. The motivation for this choice is that we want the participants to experience telepresence, and for that reason the remote user is required to remain focused on the real world — not immersed in a virtual world.

In [41], Lyons and Starner investigate the interaction between the user, his wearable computer and the external context as perceived by the user, for the purpose of performing usability studies more easily. Our paper reaches similar conclusions on how such a system should be built, but differentiates itself through our focus on telepresence rather than usability studies.

In [57], we present our experiences from sharing experience and knowledge through the use of wearable computers. We call this the Knowledgeable User concept, focusing on how information, knowledge and advice can be conveyed from the participants to the user of a wearable computer. In this paper, we instead discuss how this information can be conveyed in the other direction — from the remote side and back to a group of participants. Furthermore, we elaborate on this concept by discussing the current problems in this setup, our solutions to these, and how the end result allows us to achieve a more streamlined experience.

3.2 Everyday Telepresence

In the Media Technology research group, collaborative work applications are used on a daily basis. Not only are regular e-meetings held from the user's desktop as a complement to physical meetings, but the applications run 24 hours a day in order to provide the group members with a continuous sense of presence of each other at all times. In this section, we will discuss how we use this so-called "e-corridor" to provide everyday telepresence.

The collaborative work application that we use for the e-corridor is called Marratech Pro, a commercial product from Marratech AB2 based on earlier research [58] in our research group. Marratech Pro runs on an ordinary desktop computer and allows all the traditional ways of multimodal communication through use of video, audio and text. In addition, it provides a shared web browser and a whiteboard serving as a shared workspace, as well as application sharing between participants. Figure 3.1 shows the e-corridor as a typical example of a Marratech Pro session.

The members of our research group join a dedicated meeting session, the e-corridor, leaving the Marratech Pro client running in the background throughout the day. By allowing those in the group to see and interact with each other, this provides the members with a sense of presence of each other. Normally, each member works from their regular office at the university, using the client for general discussions and questions that may arise. Even though most members have their offices in the same physical corridor, the client is often preferred as it is less intrusive than a physical meeting. For example, a general question might get responses from multiple members, rather than just the single answer which a physical visit to someone’s office may have yielded. Similarly, each member can decide whether to partake in a discussion or not, based on available time and how much they have to contribute. The ambient presence provided by running the client throughout the day allows members to assess their fellows’ workload, see at a glance who is present, and in general provides a feeling of being together as a group.

However, providing presence for people who are still physically close to each other is not everything; the true advantage of using the e-corridor becomes more apparent when group members are situated at remote locations. The following examples illustrate how the e-corridor has been used to provide a sense of telepresence for its members.

Working from home. Sometimes, a person needs to work from home for some reason; maybe their child has caught a cold, or the weather is too bad to warrant a long commuting distance.

2http://www.marratech.com/


Figure 3.1: A snapshot of a typical Marratech Pro session.

In such situations, rather than becoming isolated and relying only on phone or email to keep in touch with the outside world, the e-corridor is used to get a sense of “being at work” together with their fellow co-workers.

Living in other places. In our research group, some members have for a period of time been living in another city or country, and thus been unable to commute to their regular office on a daily, weekly or even monthly basis. For example, one doctoral student worked as an exchange student in another country for several months, while another person lived for over a year in a city hundreds of miles away. By using the e-corridor, the feeling of separation became significantly diminished; as testified by both the remote person and the remaining local members, it was sometimes difficult to realize that they were physically separated at all.

Attending conferences. As members of the research group travel to national or international conferences, they have become accustomed to enjoying their fellow co-workers’ company regardless of time or place. For example, during long and tedious hours of waiting at the airport, members often join the e-corridor to perform some work, discuss some issue, or simply to chat with people in general. When attending the conference, the remote member can transmit speeches with live video and audio to the e-corridor, allowing people who are interested in the topic to listen, follow the discussion, and even ask questions themselves through that person. If the remote person is holding a presentation, it has often been the case that the entire research group has been able to follow it; encouraging, listening to, and providing support, comments and feedback to the presenter. In a sense, this allows the entire research group to “be there” at the conference itself.


Figure 3.2: The wearable computer prototype being worn by one of the authors.

It also allows the remote person to experience a similar feeling of having the group with him.

The seemingly trivial level of presence provided in ways like those described above should not be underestimated; even with simple means, this form of ambient everyday telepresence can have a strong influence on people and their work. Another testimony of the importance of this form of subtle, ambient presence can be found e.g. in [61], where Paulos mentions similar awareness techniques for attaining user satisfaction.

Consequently, by enabling a wearable computer user to join the e-corridor, the participants should be able to experience an encompassing form of telepresence. The remote user should similarly be able to feel the participants as “being with him”, but not necessarily become immersed in the same way as they are.

3.3 Wearable Computers

In this section our wearable computer prototypes are presented, focusing on the hardware and software used to allow the prototypes to function as a platform for telepresence.


Figure 3.3: The Marratech Pro client as seen through the user’s head-mounted display.

In terms of hardware, the wearable computer prototypes we build are based entirely on standard consumer components that can be readily assembled. The reason for favouring this approach, rather than building customized or specialized hardware, is that it allows for easy replication of the prototypes. For example, other researchers or associated companies who wish to deploy a wearable computing solution of their own can easily build a similar platform.

The current prototype consists of a backpack containing a Dell Latitude C400 laptop with built-in IEEE 802.11b wireless network support. The laptop is connected to an M2 Personal Viewer head-mounted display, with a web camera mounted on one side providing a view of what the user sees. Interaction with the computer is done through a Twiddler2 hand-held keyboard and mouse, and a headset is provided for audio communication. Figure 3.2 shows the prototype when being worn by one of the authors. This setup allows the user of the wearable computer to interface with a regular Windows XP desktop, permitting easy deployment, testing and studying of applications for mobility.

To perform studies on remote interaction and telepresence, the platform needs suitable software — in our case, we have chosen to run the Marratech Pro client. Figure 3.3 shows the user’s view of the application as seen through the head-mounted display.

There are both advantages and disadvantages with using an existing e-meeting application, such as Marratech Pro, for the prototype. The main advantage is that it provides a complete, fully working product that our research group already uses on a daily basis. This is, naturally, preferable to “reinventing the wheel” by developing an application for mobile communication from scratch. It should be noted that as the product is a spin-off from previous research, we have access to the source code and can make modifications if needed, adapting it gradually for use in wearable computing scenarios. The second, perhaps most important, advantage is that the client allows us to participate in the e-corridor. This makes studies, observations and experiments on wearable computing telepresence easy to deploy and set up.

The disadvantage that we have found lies in the user interface which, albeit suitable for ordinary desktop computing, can become very cumbersome to use in the context of wearable computing. This observation holds true for most traditional WIMP3 user interfaces, for that matter; as noted e.g. by Rhodes in [67] and Clark in [10], the common user interfaces employed for desktop computing become severely flawed for wearable computing purposes. Although the user interface is not streamlined for wearable computing, it remains usable enough to allow a person to walk around while taking part in e-meetings. Furthermore, the problems that emerge actually serve to point out which functions are required for wearable computing telepresence, allowing research effort to go into solving those exact issues. In this way, the focus is not on developing the perfect wearable user interface from scratch, as that risks emphasizing functionality that will perhaps not be frequently used in the end. Rather, by taking a working desktop application, the most critical flaws can be addressed as they appear, all while having a fully functional e-meeting application during the entire research and development cycle.

3.4 Experiences of Telepresence

In this section, the experiences of using a wearable computer for telepresence in the e-corridor will be discussed. The problems that arose during those experiences will be brought forward, together with proposals and evaluations on how those issues can be resolved.

The wearable computer prototype has mainly been tested at different fairs and events, providing a telepresence experience for people within our research group as well as for visitors and students. The fairs have ranged from small-scale student recruitment happenings, through medium-sized demonstrations and presentations for researchers and visitors, to large-scale research exhibitions for companies and funding partners. The prototype has been used in the local university campus area, as well as in more uncontrolled environments — e.g. in exhibition halls in other cities. In the former case, the necessary wireless network infrastructure has been under our direct control, allowing for a predictable level of service as the user roams the area covered by the network. However, in the latter case, the network behaviour is often more difficult to predict, occasionally restricting how and where the user can walk, and what quality of the network to expect. Both these cases, and especially the latter, serve as valuable examples of the shifting conditions that a wearable computer user will, eventually, be exposed to in a real-world setting. We believe it is hard or impossible to estimate many of these conditions in a lab environment, warranting that these kinds of studies be made in actual real-life settings.

When using a wearable computer for telepresence, unexpected problems frequently arise at the remote user’s side — problems that are at times both counter-intuitive and hard to predict.

3Windows, Icons, Menus, Pointer.


These need to be resolved in order to provide a seamless experience to the audience, or else the feeling of “being there” risks being spoiled. Below follow the primary issues identified during the course of our studies.

3.4.1 User Interface Problems

As mentioned previously, the common WIMP user interfaces employed on the desktop do not work well in wearable computing. The primary reason for this is that the graphical user interface requires too much attention and too fine-grained a level of control, thereby causing interference with the user’s interaction with the real world. What may not be entirely apparent, however, is that these problems in turn can have severe social implications for the user, and those in turn interfere with and interrupt the experience given to the audience.

As an example, consider the seemingly trivial task of muting incoming audio. This observation was initially made at a large, quite formal fair arranged by funding partners and companies, but we have experienced it on other occasions as well. In order to mute audio, the collaborative work application offers a small button, easily accessible through the graphical user interface with a click of the mouse. Normally, the remote user received incoming audio in order to hear comments from the audience while walking around at the fair. However, upon being approached by another person, the user quickly wanted to mute this audio so as to be able to focus entirely on that person. It was at this point that several unforeseen difficulties arose.

The social conventions when meeting someone typically involve making eye-contact, shaking hands while presenting yourself, and memorizing the other person’s name and affiliation. The deceptively simple task of muting incoming audio involves looking in the head-mounted display (preventing eye-contact), using the hand-held mouse to move the pointer to the correct button (preventing you from shaking hands), and trying to mute the incoming audio (preventing you, for the moment, from hearing what the other person says). These conflicts either made it necessary to ignore the person approaching you until you were ready, or to try to do it all at once, which was bound to fail. The third alternative, physically removing the headset from the ear, was often the most pragmatic solution we chose to use in these situations.

Although this episode may sound somewhat humorous, which it in fact also was at the time, there are some serious conclusions that must be drawn from experiences like this. If such a simple task as muting audio can be so difficult, there must surely be a number of similar tasks, more or less complex, that can pose similar problems in this kind of setting. Something as trivial as carrying the Twiddler mouse and keyboard in the user’s hand can effectively prevent a person from shaking hands with someone, or at least make it more inconvenient. As the risk of breaking social conventions like this will affect the experience for everyone involved — the remote user, the person approaching, and the audience taking part — care must be taken to avoid this type of problem.

The specific situation above has been encountered in other, more general forms on several occasions. The wearable computer allows the remote user to work even while conveying live audio and video back to participants. An example of when this situation occurs is when the remote user attends a lecture. The topic may not be of immediate interest to the remote user, thereby allowing her to perform some other work with the wearable computer in the meantime. However, those persons on the other side who are following the lecture may find it interesting, perhaps interesting enough to ask a question through the remote user. In this case, that user may quickly need to bring up the e-meeting application, allowing her to serve as an efficient mediator between the lecturer and the other persons. In our experience, this context switch can be difficult with any kind of interface, as the work tasks need to be hidden and replaced with the e-meeting application in a ready state. The cost in time and effort of doing context switches like this effectively prevents a fully seamless remote interaction.

With the goal of providing a seamless and unhindered experience of telepresence, the user interface for the remote user clearly needs to be improved in general. Rather than trying to design the ideal user interface — a grand endeavour that falls outside the scope of this paper — we propose three easy-to-implement solutions to the type of problems related to the user interface of a wearable telepresence system.

• Utilize a “Wizard of Oz” approach [12]. It is not unreasonable to let a team member help control the user interface of the remote user, especially since there is already a group of people immersed in the remote world. We have done some preliminary experiments on using VNC [69], allowing a person sitting at his local desktop to assist the user of the wearable computer by having full control of her remote desktop. For example, typing in long URLs can be difficult if one is not accustomed to typing on a Twiddler keyboard, but through VNC the assistant can type them on a keyboard on demand from the remote user. In a similar experiment, one person followed the remote user around, using a PDA running a VNC client that allowed him to give assistance. It should be noted that this solution still offers some form of telepresence for the assistant, as that person can still see, via the remote desktop, a similar view as would have been seen otherwise.

• Automatically switch between the real and virtual world. Even a trivial solution such as swapping between two different desktops — one suitable for the real world (i.e. the e-meeting application for telepresence), and the other suitable for work in the virtual domain (i.e. any other applications for work or leisure that the remote user may be running) — would make life simpler. By letting the switch be coupled to natural actions performed, e.g. sitting down, standing up, holding or releasing the Twiddler, the user is relieved of the burden of having to actively switch between two applications (a sketch of this idea follows after the list). The advantage may be small, but it can still be significant for efficiently moving between the real and virtual worlds.

• Reduce the need for having a user interface at all.4 Plain and simple, the less the remote user has to interact with the computer, the more he can focus on conveying the remote location to the audience. The hard part here is to find a proper balance, so that the remote user can still maintain the feeling of having his group present and following him.

4If a user interface is still required for some reason, our research in the Borderland architecture [56] intends to provide ubiquitous access to the tools needed.
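To make the second proposal above concrete, the following sketch illustrates how desktop switching could be coupled to natural physical actions. It is a minimal sketch under our own assumptions: the sensor events, the Desktop enumeration and the DesktopSwitcher interface are hypothetical placeholders rather than parts of the prototype described in this paper.

// A minimal sketch (not the original prototype) of coupling desktop switching
// to natural physical actions. UserAction, Desktop and DesktopSwitcher are
// hypothetical; on the Windows XP platform used here, show() could for
// instance raise either the e-meeting window or the work applications.
import java.util.EnumMap;
import java.util.Map;

enum UserAction { SAT_DOWN, STOOD_UP, GRABBED_TWIDDLER, RELEASED_TWIDDLER }
enum Desktop { REAL_WORLD, VIRTUAL_WORK }

interface DesktopSwitcher {
    void show(Desktop desktop);   // assumed to bring the chosen desktop forward
}

class ActionCoupledSwitcher {
    private final Map<UserAction, Desktop> mapping = new EnumMap<>(UserAction.class);
    private final DesktopSwitcher switcher;
    private Desktop current = Desktop.REAL_WORLD;

    ActionCoupledSwitcher(DesktopSwitcher switcher) {
        this.switcher = switcher;
        // Sitting down or grabbing the Twiddler suggests virtual work;
        // standing up or releasing it suggests attention to the real world.
        mapping.put(UserAction.SAT_DOWN, Desktop.VIRTUAL_WORK);
        mapping.put(UserAction.GRABBED_TWIDDLER, Desktop.VIRTUAL_WORK);
        mapping.put(UserAction.STOOD_UP, Desktop.REAL_WORLD);
        mapping.put(UserAction.RELEASED_TWIDDLER, Desktop.REAL_WORLD);
    }

    void onAction(UserAction action) {
        Desktop target = mapping.get(action);
        if (target != null && target != current) {
            current = target;
            switcher.show(target);   // switch only when the context actually changes
        }
    }
}

The point of the mapping is that implicit cues the user already produces decide which desktop is in front, so no explicit command is required.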


3.4.2 Choice of Media for Communicating

For verbal communication, Marratech Pro offers both audio and text. Access to either medium is important on certain occasions, as evidenced by our experiences described in [57]. As the wearable computer user is exposed to a number of different scenarios, being able to change between these media is a prerequisite for the communication to remain continuous and free from interruptions. For example, in the case discussed above, the remote participants’ spoken comments interfered with the user’s real world spoken dialogue. Rather than muting audio, a better solution would have been if the participants had instead switched over to sending their comments as text. This is something that is relatively simple to enforce by pure social protocols; as the participants are already immersed in the world that the user presents, they will be able to determine for themselves when it is appropriate to speak or not. However, although users can switch media at their own choice, this is not an ideal solution for seamless communication. For example, it requires participants to consciously care about which medium to use, and does not take into account that they in turn may prefer one medium over another for some reason.

To alleviate the problem of having all participants agree on using the same medium, we have developed a prototype group communication system in Java that can arbitrarily convert between voice and text. Running the prototype, a user can choose to send using one medium, while the receiver gets it converted to the other. For example, a wearable computer user can choose to receive everything as text, while the other participants communicate by either spoken or written words. As speech recognition and voice synthesis techniques are well researched areas, the prototype is built using standard consumer products offering such functionality; currently the Microsoft Speech SDK 5.15 is used.

The architecture of the system can be seen in figure 3.4. The system accepts incoming streams of audio or text entering through the network, which are then optionally converted using speech recognition or voice synthesis, before they are presented to the user. Similarly, outgoing streams can be converted before they reach the network and are transmitted to the other participants. In practice, the implementation cannot perform speech recognition at the receiving side, nor voice synthesis at the sending side, due to limitations in the speech SDK currently used. Both of these conversions are, however, fully supported at the opposite sides.

The prototype allows the choice of conversions being made to be controlled both locally and remotely. This means that participants can choose in which medium communication from the remote user should be conveyed. For example, the remote user may lack any means for entering text, forcing her to rely solely on sending and receiving audio for communication. The participants, on the other hand, may prefer to communicate via text only. E.g. for a person attending a formal meeting, the only way to communicate with the outside world may be sending and receiving text through a laptop. The person in the meeting may therefore request the remote prototype to convert all outgoing communication to text. Similarly, the remote user has his prototype synthesize incoming text into voice. In this way, a group of people can communicate with each other, with each person doing it through their preferred medium.

5http://www.microsoft.com/speech/


Figure 3.4: Architecture of the voice/text converter prototype, enabling communication across different media.
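The conversion step at the core of figure 3.4 can be summarized in code. The sketch below is our own illustration of that pipeline under assumed interfaces; the actual prototype wraps the Microsoft Speech SDK, and the class and method names used here (SpeechRecognizer, VoiceSynthesizer, MediaConverter) are hypothetical.

// A minimal sketch of the conversion pipeline in figure 3.4, assuming abstract
// speech recognition (SR) and voice synthesis (TTS) back-ends. This is an
// illustration only, not the prototype's actual code.
interface SpeechRecognizer { String recognize(byte[] audio); }   // SR: voice -> text
interface VoiceSynthesizer { byte[] synthesize(String text); }   // TTS: text -> voice

enum Medium { VOICE, TEXT }

final class Message {
    final Medium medium;
    final String text;     // set when medium == TEXT
    final byte[] audio;    // set when medium == VOICE
    Message(Medium medium, String text, byte[] audio) {
        this.medium = medium; this.text = text; this.audio = audio;
    }
}

class MediaConverter {
    private final SpeechRecognizer sr;
    private final VoiceSynthesizer tts;

    MediaConverter(SpeechRecognizer sr, VoiceSynthesizer tts) {
        this.sr = sr; this.tts = tts;
    }

    /** Convert an incoming or outgoing message to the medium the receiver prefers. */
    Message convert(Message in, Medium preferred) {
        if (in.medium == preferred) {
            return in;                                   // no conversion needed
        }
        if (in.medium == Medium.VOICE && preferred == Medium.TEXT) {
            return new Message(Medium.TEXT, sr.recognize(in.audio), null);
        }
        return new Message(Medium.VOICE, null, tts.synthesize(in.text));
    }
}

With a converter like this at both the sending and the receiving side, each participant can state a preferred medium and have every message transformed on its way in or out, which is the behaviour described above.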

The prototype runs under Windows XP, serving as a proof of concept. Initial experiments have been performed using it for communication across different media. In the experiment, three persons held a discussion with each other, with each person using a certain medium or changing between them arbitrarily. The results of these experiments indicate that this is a viable way of enabling seamless communication. Naturally, there are still flaws in the speech recognition, and background noise may cause interference with the speaker’s voice. Nevertheless, as further progress is made in research on speech recognition, we believe a system like this will be able to provide a more streamlined experience of telepresence.

3.5 Evaluation

In this section we give an overall evaluation of our wearable system for telepresence. Emphasis is placed on its overall usability and the different means by which an end-user can experience and interact with the remote world.

3.5.1 Time for Setup and Use

The time to set up the system for delivering an experience depends on how quickly participants and wearable computer users can get ready. The strength of our approach of utilizing Marratech Pro and the e-corridor is that the software is used throughout the day by all participants. This means that in all experiments we have performed, we have never had any requirements for persons to e.g. move to a dedicated meeting room, start any specific application, or dedicate a certain timeslot to follow the experience. For them, the telepresence becomes an ambient experience that can be enjoyed as much or as little as desired, all from the comfort of their own desktop.


As for the user equipped with a wearable computer, the setup time is often much longer due to the reasons listed below.

• The backpack, head-mounted display, headset and Twiddler are surprisingly cumbersome to put on and remove. Even though everything is relatively unobtrusive once fully worn, the time to actually prepare it is too long; for example, the head-mounted display needs to be arranged properly on the user’s head, and cables become intertwined more often than not. All this makes the wearable computer less likely to be used in situations that call for it at short notice.

• The batteries for the laptop and the head-mounted display need to be charged and ready for use. As this cannot always be done with just a few hours’ worth of notice, this effectively prevents rapid deployment of the wearable computer to capture a certain event.

• The time for the laptop to start, together with gaining a network connection and launching the e-meeting application, is about 5 minutes in total — this is too long to be acceptable.

These are relatively minor problems, yet resolving them would make the wearable computer more easily used for telepresence experiences than it is today. We consider this to be a prerequisite before it will be commonly accepted outside of the research area as a viable tool for telepresence. Therefore, in order to overcome these limitations, the next generation wearable system we design shall exhibit the properties listed below.

• By using a vest instead of a backpack to contain the wearable computer, the head-mounted display, headset and Twiddler can be kept in pockets. This way, they remain hidden until the vest is fully worn and the user can produce them more easily.

• By using an ordinary coat hanger for the vest, a “docking station” can easily be constructed that allows battery connectors to be plugged in for recharging. This also makes using the vest-based wearable computer more natural, and thus also more easily used and accepted by the general public.

• By having the wearable computer always on, or in a hibernated state when not worn and used, the e-meeting can be restored easily so that anyone can wear and operate it at short notice.

These properties will serve to make the wearable computer easier to wear and use, thereby making it possible for anyone to wear it in order to deliver an experience of telepresence.

3.5.2 Different Levels of Immersion

The e-corridor normally delivers a live stream of information (e.g. video, audio, chat, etc.) in which the participants can choose to immerse themselves. Typically, this is also the most common way of utilizing e-meeting applications like this.


Figure 3.5: A screenshot of the Marratech Pro web interface, allowing access to e-meetings via web browsers.

However, previous research in our group has added other ways of being part of an e-meeting; the first is a web interface [60], while the second is a history tool [59]. This gives us three distinct levels for how the telepresence can be experienced.

Marratech Pro. Using the e-meeting application, a live stream of video and audio allows the participants to get a first-hand experience of the event. The participants can deliver comments and instructions for the remote user, giving them a feeling of “being there” and allowing some degree of control of that user. Similarly, the remote user can deliver annotations and comments from the event, increasing the participants’ experience further. What they say, do, and desire all have an immediate effect on the whole group, making the immersion very encompassing.

Web interface. Occasionally, persons are in locations where the network traffic to the e-meeting application is blocked by firewalls, or where the network is too weak to deliver live audio and video streams. To deal with such occasions, research was done on a web interface [60] that provides a snapshot of the current video, together with the full history of the text-based chat. A screenshot of the web interface can be seen in figure 3.5. Accessing this interface through the web, participants can get a sense of what is going on at the moment. Although they are not able to get a live, streaming experience, the web interface has proven to work well enough to allow participants to control and follow the wearable computer user around.

For example, on one occasion, a person used to doing demonstrations of the wearable computer was attending an international conference the same day as a large exhibition was to take place at his university back home. As he was away, another person had to take on his role of performing the demonstration.


Figure 3.6: A screenshot of the Marratech Pro history tool, archiving events of interest.

Due to problems in the network preventing the regular e-meeting client from running properly, the web interface was the only possible way of joining the e-corridor. Nevertheless, this allowed him to follow that remote user during the demonstration — offering advice and guidance, and even being able to talk (through the remote user) to persons he could identify in the video snapshots. For this person, the web interface allowed him to “be” at the demonstration, while he in fact was in another country, and another time zone for that matter, waiting for the conference presentations to commence. This example serves to illustrate that very modest means suffice to perform effective telepresence, and also how a user can seamlessly switch between different levels of immersion and still have a fruitful experience.

History tool. The history tool [59] is a research prototype that captures and archives events from an e-meeting session. A screenshot of the tool can be found in figure 3.6. The tool allows people to search for comments or video events, as well as browse them in chronological order to see what has happened during the last hours, days or weeks (e.g. to see whether a meeting has taken place or not). Snapshots of the video for a particular user are recorded whenever that user enters a chat message, together with the text message itself and the time when it was written. Using motion detection techniques, snapshots are also taken whenever something happens in the video stream. E.g. when a person enters or leaves their office, video frames from a few seconds before and after the triggering event will be recorded, making it possible to see whether that person is actually entering or leaving the room. Naturally, this is mainly suitable for and used with clients equipped with a stationary camera, because a head-mounted camera tends to move around a lot, causing most of the video to be recorded. Furthermore, events related to a single person can be filtered out in order to follow that particular person during the course of a day, for example.
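The motion-triggered snapshots can be driven by very simple image processing. The following sketch shows one frame-differencing approach of the kind commonly used for such triggers; it is our own illustration rather than the history tool’s actual implementation, and the threshold values and class name are assumptions.

// A minimal frame-differencing sketch of the kind of motion trigger described
// above; an illustration only, not the history tool's actual code.
import java.awt.image.BufferedImage;

class MotionDetector {
    private static final int PIXEL_THRESHOLD = 32;     // per-pixel luminance difference
    private static final double AREA_THRESHOLD = 0.02; // fraction of changed pixels

    private int[] previousLuma;

    /** Returns true when the new frame differs enough from the previous one. */
    boolean motionDetected(BufferedImage frame) {
        int w = frame.getWidth(), h = frame.getHeight();
        int[] luma = new int[w * h];
        for (int y = 0; y < h; y++) {
            for (int x = 0; x < w; x++) {
                int rgb = frame.getRGB(x, y);
                int r = (rgb >> 16) & 0xff, g = (rgb >> 8) & 0xff, b = rgb & 0xff;
                luma[y * w + x] = (r + g + b) / 3;      // cheap luminance estimate
            }
        }
        boolean motion = false;
        if (previousLuma != null && previousLuma.length == luma.length) {
            int changed = 0;
            for (int i = 0; i < luma.length; i++) {
                if (Math.abs(luma[i] - previousLuma[i]) > PIXEL_THRESHOLD) {
                    changed++;
                }
            }
            motion = changed > AREA_THRESHOLD * luma.length;
        }
        previousLuma = luma;   // keep the current frame for the next comparison
        return motion;
    }
}

A trigger like this fires when a sufficiently large fraction of pixels change between consecutive frames, which also explains why a constantly moving head-mounted camera would cause most of the video to be recorded.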


In terms of telepresence, the tool is, as the name suggests, a history tool and as such does not offer any means for interacting with the persons6. However, it serves as a valuable starting point for someone who has missed the beginning of e.g. the coverage of a certain exhibition, and who wants a summary and recap of the course of events so far. This may be done in order to prepare the user for becoming more immersed when following the rest of the coverage live, something which can be done more easily having first received the summary information as a primer.

The advantage of using the history tool, rather than letting the user watch a complete recording of the events so far, is that the tool often manages to capture the events that are of key interest. For example, as something is seen by or through the wearable computer user, the amount of chat and conversation often rises, thereby capturing a large amount of video as well as audio clips around that point in time. In this way, the history tool serves as an efficient summary mechanism that implicitly captures events of interest; the more interest, the more conversations and actions, and the more will be archived and subsequently reviewed. After having gone through the history tool, the user can easily switch to more live coverage via the client or web interface. Thus, the history tool serves to make the transition from the real world to immersion in the remote world more seamless.

3.5.3 Appearance and Aesthetics

We have found that the appearance and looks of the wearable computer can dramatically influence the audience’s experience of telepresence. What we mean by this statement is that the user of a wearable computer tends to stand out in a crowd, often drawing a lot of attention and causing more people to approach the person out of curiosity — more so than would have been the case without the computer. Sometimes, people even become intimidated by being confronted with a “living computer” — again, causing people to react in ways they would not normally do. Although the effects are not always negative7, it is important to be aware of the fact that they do exist and that they will, invariably, affect how the remote location is perceived. This becomes even more important to bear in mind considering that the audience may have no idea that this takes place, thereby being given a flawed or at least skewed perception of the remote location.

As telepresence should, in our opinion, offer the participants a representation of a remote location that is as true and realistic as possible, measures need to be taken to ensure that the wearable computer will blend in with its user and the surrounding environment. For this reason, our next generation wearable computer will be smaller and designed to hide the technology as much as possible, according to the following criteria.

• A head-mounted display is difficult to hide and, due to its novelty, draws a lot of attention. With a smaller display, optionally mounted on a pair of glasses, it will be less noticed and easier to hide. At the same time, it becomes easier to motivate its use when people ask questions — motivating the use of a large, bulky display does not tend to sound credible to most people we have met.

6For any interaction, either the Marratech Pro client or the web interface can be used.
7On the contrary, wearable computers often generate much attention and numerous socially beneficial interactions with people.


The less focus that is placed on the technology enabling telepresence, the more effective it will be.

• Eye-contact is very important; our experiences have shown that for efficient social interaction, both parties need to see both of each other’s eyes. A semi-transparent head-mounted display allows the remote user to get eye-contact, yet one eye remains obscured from the other person’s viewpoint. In this respect, the choice of a semi-transparent or opaque display has little impact on telepresence — the primary requirement is that it allows for eye-contact so that the experience delivered is not hindered.

• The camera is very important as it conveys video to the other participants. As discussed in [20], there are benefits and drawbacks with different placements, so a definite answer is hard to give for the case of providing good telepresence. Also, from a socio-technical standpoint, the question is whether the camera should be hidden well enough not to disturb the scene it captures, or if it should remain visible to let people know their actions are being conveyed to other people watching. For the time being, the camera for our wearable computer will remain head-mounted and visible to the spectators, since this allows us to effectively convey the scene with a relatively modest level of disturbance.

Referring to the previous discussion regarding eye-contact: in terms of allowing the audience to “meet” a remote person seen through the wearable computer, they must be given the impression of eye-contact with that person. In [9], Chen presents a study of how accurately persons can perceive eye-contact. The results can be interpreted as suggesting the upper part of the head, rather than the lower part of the head or the shoulder area, as the proper position for a head-mounted camera. Such placement, e.g. on top of the user’s head or at the sides (as in our current setup), should provide a feeling of eye-contact for the audience, without drawing too much attention from the user. However, a more formal user study is required to validate this hypothesis of proper placement for eye-contact with a wearable camera.

• The Twiddler mouse and keyboard is currently a prerequisite for interacting with the wearable computer, yet as discussed in section 3.4.1, it also interferes with the user’s interactions in the real world. However, for the sole purpose of providing telepresence, the only interaction that is actually required on behalf of the remote user is when comments need to be entered as text. This observation means that if the participants can cope without such feedback, it will free the remote user’s hands and allow for a more effective interaction with the remote environment. This, in turn, should make for a better experience that is not interrupted by the technology behind it. Of course, there is still the question whether this benefit outweighs the lack of textual comments, but that is likely to vary depending on the event that is covered. There may also be other types of keyboards which are less likely to cause this kind of problem, although we have only utilized the Twiddler so far in our experiments.

• Using a vest rather than a backpack to hold the computing equipment will enable the user to move around, and especially sit down, much more comfortably. With a backpack, the user lacks support for his back when sitting or leaning against objects, while at the same time the added weight of the batteries and laptop causes fatigue in the shoulders and neck. This fatigue tends to reduce the physical movement of the remote user after long hours of covering an event, which is detrimental for the audience and serves to reduce their motivation for following the event. Also, to allow for an immersive telepresence, the remote user should be able to partake in social activities — especially something as simple as sitting down to discuss with someone over a cup of coffee. Using a vest, the weight and computing equipment is distributed over a larger part of the user’s body, thereby making it less obtrusive and permitting more freedom of movement and a wider range of positions.

The above list constitutes our observations of using wearable computers in telepresence. Many of the problems are commonly known in the field of wearable computing, yet their actual implications for telepresence have not been emphasized. Motivated by the need for the experience to be as effective and unbiased as possible, our conclusion is that the appearance and aesthetics of a wearable computer must be taken into consideration when planning to use such a platform for telepresence.

3.5.4 Remote Interactions made Possible

The remote interactions that the system allows are currently limited mainly to unidirectional communication, coming from the persons at the remote side to the local participants who receive it. The people at the remote location currently have no way of seeing the participants, as the remote user is “opaque” in that sense. Participants who wish to speak with remote persons must do so through the user of the wearable computer, who serves as a mediator for the communication. This is further described in [57], where we utilize this opacity in the Knowledgeable User concept, where the remote user effectively becomes a representative for the shared knowledge of the other participants. Except for the option of adding a speaker to the wearable computer, thus allowing participants to speak directly with remote persons, we do not have any plans to allow for bidirectional interaction. Rather, we remain focused on providing an ambient sense of presence to the remote user as well as the participants.

3.5.5 Summary

We will summarize this evaluation of our wearable telepresence system in three statements, serving as advice for those who wish to reproduce and deploy a similar system.

• The time to prepare, set up and use the system will influence how much it will be used in everyday situations, warranting the design of a streamlined system if an investment in such technology is to be made.

• A participant can easily shift between different levels of immersion, and even with relatively unsophisticated means get a good experience and interact with the remote environment.


• The aesthetic appearance of the wearable computing equipment should not be neglected, as it may otherwise influence the people at the remote location for better or for worse.

3.6 Conclusions

We have presented our experiences of using a wearable computer as a platform for telepresence, conveying the presence of remote locations to the participants of a continuously running e-meeting session. Experiences in real-life scenarios such as fairs, events and everyday situations have allowed us to identify shortcomings and subsequently address them to improve the platform. We have evaluated the platform in terms of overall usability, and motivated what is of importance for the audience’s experience to be as seamless as possible. In the introduction, we posed three research questions which we will now summarize our answers to.

• The form of telepresence that can be provided using today’s wearable computing technology can be very encompassing; even with an ordinary e-meeting application at the user’s desktop, a fruitful experience can be delivered. For users who are already accustomed to enjoying the everyday presence of their fellow co-workers at their desktops, the step into mobile telepresence is a small one to take in order to extend its reach even further.

• To deliver a seamless experience of telepresence, the remote user must be able to freely interact with his environment, without social or technical obstacles that are not part of what should be conveyed. From a participant’s point of view, having access to multiple interfaces (i.e. live, via the web, or via historical accounts) through which an event can be experienced can be desirable in order to provide a seamless experience regardless of place and time.

• To simplify the deployment of wearable telepresence in everyday life, the remote user’s equipment needs to be unobtrusive to handle and less noticeable, in order not to interfere with the remote environment. The user interface of the remote user must for this reason be highly efficient, while for participants an ordinary e-meeting application can serve to provide an experience that is good enough.

3.6.1 Future Work

We will redesign our current wearable computer prototype and fully incorporate the solutions suggested in this paper, in order to streamline the user’s interaction with the wearable computer and the surrounding environment. The long-term goal is to make remote interaction more efficient in general, allowing knowledge to pass back and forth between local and remote participants, either directly through the wearable technology itself or through the user of it acting as a mediator.


3.7 Acknowledgments

This work was funded by the Centre for Distance-spanning Technology (CDT) under the VITAL Mål-1 project, and by the Centre for Distance-spanning Health care (CDH).


Part 4

Methods for Interrupting a Wearable Computer User


Methods for Interrupting a Wearable Computer User

Mikael Drugge1, Marcus Nilsson1, Urban Liljedahl2, Kåre Synnes1, Peter Parnes1

1Division of Media Technology, 2Division of Computer Science and Networking, Department of Computer Science & Electrical Engineering

Luleå University of Technology, SE–971 87 Luleå, Sweden

{mikael.drugge, marcus.nilsson, urban.liljedahl, kare.synnes, peter.parnes}@ltu.se

November, 2004

Abstract

A wearable computer equipped with a head-mounted display allows its user to receive notifications and advice that is readily visible in her field of view. While needless interruption of the user should be avoided, there are times when the information is of such importance that it must demand the user’s attention. As the user is mobile and likely interacts with the real world when these situations occur, it is important to know in what way the user can be notified without increasing her cognitive workload more than necessary. To investigate ways of presenting information without increasing the cognitive workload of the recipient, an experiment was performed testing different approaches. The experiment described in this paper is based on an existing study of interruption of people in human-computer interaction, but our focus is instead on finding out how this applies to wearable computer users engaged in real world tasks.

4.1 Introduction

As time goes by, wearable computers can be made smaller, increasingly powerful and more convenient to carry. When such a computer is network enabled within a pervasive computing environment, its user is able to access a wide range of information while at the same time allowing herself to be notified over the network. Such notification can either be expected, like in a conversation, or it can come unexpectedly, in which case the recipient has no way of anticipating the information — neither its content nor its time of arrival. While interrupting the user needlessly should be avoided in general, this latter kind of notification can be exemplified by emergency situations in which the user must be notified about an issue and resolve it, yet still be able to continue performing real world tasks.

For example, a medical doctor at an emergency site or a fire fighter in a disaster area may need to perform their normal work in the real world, but at the same time they must also be kept informed about the progress of other workers and possibly assist with guidance through a wearable computer. Since both of these tasks are viewed as important by the user, it is vital to assess how the virtual task can be presented to the user while minimizing interference with her real world task.

Furthermore, since the wearable computer is meant to act as an assistant for its user in everyday life (e.g. as exemplified by the remembrance agent [66] and the shopping jacket [63]), it is important to increase our knowledge of how interruption of users should be done. As wearable computers become more common, it is important to develop tools to capture data for usability studies [41]. This should be done so that the future design of wearable computers can go from building complex and specialized hardware to developing user interfaces that support the interaction with the user.

The research question this brings forward is how to interrupt the user of a wearable computer without increasing her cognitive workload more than is absolutely necessary. Considering a wearable computer built out of standard consumer products with basic video and audio capabilities, what ways are there to present information to the user? In what ways can a user be notified that new information exists and needs to be dealt with, and which is the most preferable method for doing so?

Our main hypothesis is that the type of notification will have a disparate impact on the user’s workload, and that the performance will be affected differently depending on how the user is allowed to handle the interruptions.

The organization of the paper is as follows. Section 4.2 presents the experiment with the tasks and treatments used. Section 4.3 discusses the method used for conducting the experiments, and section 4.4 presents the results. Finally, section 4.5 concludes the paper together with a discussion of future work.

4.1.1 Related Work

In [53], McFarlane presents the first empirical study of all four known approaches to the problem of how to coordinate user interruption in human-computer interaction and multiple tasks. His study is done with respect to how to interrupt the user within the context of doing computer work without increasing that person’s cognitive workload. A more detailed description of this study is given in [52].

The study presented in our paper repeats the experiment done in [53], but focuses instead on the interruption of a wearable computer user involved in real world tasks. We are thus able to compare the results from both studies to see whether they differ and how the user is affected by performing the tasks in a wearable computing scenario.

In [30], the use of sensors in order to determine human interruptibility is presented. While this is most certainly useful and would be highly valuable to have in a wearable computer environment, our study instead focuses on when the interruption is of such importance that it cannot be postponed. That is, regardless of how involved the person is in real world tasks, the interruption must still take place even if that would be intrusive and may affect performance negatively. As an example of when this would occur, imagine having two tasks of equal importance, where one task cannot be put on hold for a very long time at the expense of the other.


In [11] an experiment is presented where a person asks questions of a user playing a game, thereby interrupting him and forcing him to respond before continuing playing. The study shows what happens if the asker is given clues about the user’s workload, as that should allow him to ask questions at more appropriate times and withhold them during critical periods in the game. In a wearable computer environment, this information could be conveyed by sending live video and audio streams from the wearable computer user to a person at a remote location. However, there are privacy concerns with this approach, and it may also be the case that the interruption is not initiated by a person able to assess the situation — it may be machine initiated or triggered by events beyond human control. For such occasions, we believe interruption will still occur even during critical periods of time, and thus it is still desirable to know what methods of interruption will disturb the user the least.

A related study is Maglio’s study of peripheral information [43], where the user’s cognitive workload is measured when working on one task while getting unrelated peripheral information. The study does not consider the use of wearable computers but is interesting as the use of peripheral information could be a good way to notify users of such computers. In contrast to our study, the users did not act on the notification given.

The study made by Brewster [6] shows that sound is important in single tasks when the visual capabilities of the device are restricted. Our study also investigates the effect of sound, but in a scenario with dual tasks.

4.2 Experiment

The experiment addresses how different methods of interrupting the user of a wearable computer will affect that person’s cognitive workload. The interruption in this case originates from the wearable computer and calls for the user to interact and then carry on with the real world task as before. In order to measure the user’s performance in both types of tasks, these must be represented in an experimental model. This section describes the general idea of each task and how they are combined in the experiment; the setup is based on that used in [53].

4.2.1 Real World Task

The experiment has a real world task represented as a trivial yet challenging computer game1 which the user plays on a laptop computer. The objective of the game is to bounce jumping diplomats on a stretcher three times so that each diplomat lands safely in a truck. A screenshot from the game can be seen in figure 4.1.

For simplicity, each diplomat jumps and bounces in an identical trajectory, so that the stretcher only needs to be placed in any of three fixed positions. If the user misses a diplomat, that person is lost and cannot be saved. The number of saved and lost diplomats is recorded during the game in order to get statistics about user performance.

The total number of jumping diplomats in a game is held constant, and they appear randomly throughout the game.

1Original code by Dr. Daniel C. McFarlane.


Figure 4.1: The bouncing diplomats game.

As the time for each game is kept constant as well, this randomness means that at times there may be few or no diplomats, while at other times there may be several of them that need to be saved. Thus, the user gets a varied task that requires attention and is difficult to perform automatically.

4.2.2 Interruption Task

The interruption task consists of a matching task2 shown in the user’s semi-transparent head-mounted display. When the task appears, the user is presented with three objects of varied colour and shape, as shown in the example screenshot in figure 4.2. The top object is used as reference and the user is informed by a text in the middle of the screen to match this object with one of the two objects at the base. The matching can be either by colour or by shape, and only a single object will match the reference object.

As the colour and shape are determined at random, the user should not be able to learn any specific pattern or order in which they will appear. No feedback is given to the user after selecting an object, regardless of whether the matching is correct or wrong, in order to avoid additional stress and distraction for the user.
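As an illustration of how such tasks could be generated so that exactly one candidate matches the reference by the chosen rule, a minimal Python sketch follows. The shape and colour palettes are taken from the HotWire variant described in section 5.3.2; the function and field names are our own and not part of the original experiment code.

    import random

    SHAPES = ["square", "circle", "triangle"]
    COLOURS = ["red", "yellow", "cyan", "green", "blue", "purple"]

    def make_matching_task(rng=random):
        """Create one matching task: a reference object, a matching rule
        ('colour' or 'shape'), and two candidates of which exactly one matches."""
        rule = rng.choice(["colour", "shape"])
        reference = {"shape": rng.choice(SHAPES), "colour": rng.choice(COLOURS)}

        # The correct candidate shares the relevant attribute with the reference.
        correct = {
            "shape": reference["shape"] if rule == "shape" else rng.choice(SHAPES),
            "colour": reference["colour"] if rule == "colour" else rng.choice(COLOURS),
        }
        # The distractor differs from the reference in the relevant attribute.
        pool = SHAPES if rule == "shape" else COLOURS
        wrong = rng.choice([v for v in pool if v != reference[rule]])
        distractor = {
            "shape": wrong if rule == "shape" else rng.choice(SHAPES),
            "colour": wrong if rule == "colour" else rng.choice(COLOURS),
        }

        candidates = [correct, distractor]
        rng.shuffle(candidates)
        return {"rule": rule, "reference": reference,
                "candidates": candidates, "correct_index": candidates.index(correct)}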

4.2.3 Combining the Tasks

While the user is playing the bouncing diplomats game, he will be interrupted by matching tasks appearing at random intervals. The tasks are either presented without user intervention or announced by a visual or audible notification.

2 Original code by Dr. Daniel C. McFarlane.



Figure 4.2: The matching task.

For the announced tasks, the user negotiates and decides when to present them. When a task is shown, the user may choose to respond to it by selecting an object or ignore it while continuing with the game. If the task is not handled fast enough, new matching tasks will be added to a queue (hidden from the user) which must eventually be taken care of.
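A minimal Python sketch of such a hidden queue with negotiated show/hide behaviour is given below; the class and method names are our own, and the notify callback stands in for the visual flash or bell sound described in the treatments of section 4.2.4.

    from collections import deque
    import time

    class InterruptionQueue:
        """Hidden queue of pending matching tasks (a sketch; not the original code)."""

        def __init__(self, notify):
            self._pending = deque()
            self._visible = None
            self._notify = notify          # e.g. flash the HMD or play a sound

        def add(self, task):
            """A new matching task arrives; announce it but keep it hidden."""
            self._pending.append((time.time(), task))
            self._notify()

        def show_next(self):
            """Negotiated: the user asks to present the oldest pending task."""
            if self._visible is None and self._pending:
                self._visible = self._pending.popleft()
            return self._visible

        def hide(self):
            """Negotiated: the user hides the task again, e.g. when game workload rises."""
            if self._visible is not None:
                self._pending.appendleft(self._visible)
                self._visible = None

        def answer(self, choice):
            """The user responds; the task's age can be logged for the match age metric."""
            if self._visible is None:
                return None
            created, task = self._visible
            self._visible = None
            return time.time() - created, choice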

To prevent the user from deliberately ignoring the interruption task throughout the entire game, the user is informed in advance that both tasks are of equal importance from an experimental standpoint. Although personal opinions about the importance of tasks may differ — e.g. saving the jumping diplomats may be perceived as being more important than matching objects — pilot testing did not reveal any such bias in our case.

4.2.4 Treatments

In order to investigate the different methods of interrupting the user, five different treatments were used where each of them tests a certain aspect of the interruption.

1. Game only Control case where only the bouncing diplomats game is played for a given period of time. The user will never be interrupted in this treatment.

2. Match only Control case where only the matching task appears at random during a given period of time, whose length is identical to that for Game only. The user will not be presented with the bouncing diplomats game during this time.



3. Negotiated visual User plays the bouncing diplomats game. Matching tasks are announced visually by flashing a blank matching task for 150 ms in the head-mounted display. The user can choose when to present and respond to it, and also to hide it again, e.g. in case of a sudden increase in workload in the game.

4. Negotiated audible Identical to Negotiated visual but the matching tasks are announced audibly by playing a bell-like sound for about half a second each time a new matching task is added.

5. Scheduled User plays the bouncing diplomats game. Matching tasks are accumulated over a period of time and the entire queue is presented at regular intervals. The user cannot negotiate when the matching tasks are presented, and neither can they be hidden once they have appeared. The only way for the user not to have the tasks presented is to respond to every task in the queue; after that there will be no interruption until the next interval round.

It should be noted that in [53], six different treatments were used; in addition to the two control cases (Game only and Match only) and the Scheduled treatment, there were also Immediate, Negotiated and Mediated. Due to the nature of what this study tests, those treatments were abandoned or modified for the following reasons:

• Immediate presents the matching task immediately when it appears, forcing the user to respond to it as the game is replaced with the matching task. However, as the user is involved in real world tasks there is no such enforcement as he can simply choose to ignore the matching task while continuing in the real world. Thus, the treatment is reduced to a variant of Negotiated, and therefore it was abandoned.

• Negotiated was extended so that an audible announcement was added in addition to the visual announcement, thus splitting up the treatment into the two separate treatments Negotiated visual and Negotiated audible. These treatments are identical to the original Negotiated treatment, with the exception that the game is still playable even when a matching task is present. Since some wearable computers can only notify the user through audio [71], it is important to see if there exists a difference between audio and visual notifications when considering the user's cognitive workload.

• Mediated measured the workload based on the number of diplomats currently being bounced. For real world tasks the workload may depend on numerous factors which can be difficult to take into account outside of a lab environment, so a better approach is then to monitor the user's response to the workload. Since a wearable computer is used, biometric data (e.g. heart and eye blink rate) can be retrieved to derive the user's focus and stress level. However, this is in itself a complex study outside the scope of this paper, and therefore the treatment was abandoned.

The two control cases, Game only and Match only, provide a baseline for the performance of the user. The remaining treatments, Negotiated visual, Negotiated audible and Scheduled, will all interrupt the user and may thereby affect the performance.



4.3 User Study

A total of 20 subjects were recruited among students and a larger testbed called "Testbed Botnia" (http://www.testplats.com) where the user study was announced together with a set of questions. Individuals wishing to partake in the study responded to the questions to express their interest. Based on their answers, a heterogeneous group of 16 males and 4 females aged between 12 and 39 years were selected for participation. As members of the testbed the participants receive points for each study they partake in and can later exchange those points for merchandise. Due to the test session's length of 90 minutes, they were also given a cinema ticket as compensation for their participation in the study. They were also informed they would receive this ticket unconditionally even if not completing the full study for some reason.

Upon arrival, each subject was informed by a test leader about the purpose of the study and how it would be performed. Each treatment was described in general terms, much like the description in section 4.2.4, but the exact number of diplomats or matching tasks was not disclosed. The instructions for a specific treatment were also repeated in the pause preceding each of them. Pilot studies indicated this repetition was useful as it served to remind the subject of what to expect before proceeding. It also seemed to help in making the atmosphere in the lab environment less strict and not as tense, thereby making the subjects feel more comfortable and willing to comment on the experiment.

Before the test, the subject was asked to fill in a questionnaire with general questions about their computer skill and ability to work under stress. Demographic questions about their age, gender, education and whether they were colour blind were also given; the latter being relevant since the matching task depends on being able to match corresponding colours. Two colour blind subjects participated in the study, but they had no problems differentiating between the colours used in the matching task.

Just before the experiment was started the subject put on the head-mounted display. As the display is rather sensitive to the viewing angle, a sample image was shown in the display to help the subject align it properly. The same image was also shown in each pause in the test session so as to give the subject a chance to adjust it further if needed.

After the test, the subject filled in another questionnaire with questions about the test, e.g. how they had experienced the treatments and their rating of them in order of preference. They were also given highly subjective questions, such as which treatment (excluding the control cases) was the least complex one to perform, even though the number of matching tasks and jumping diplomats were kept constant in all treatments.

4.3.1 Test Session

The test is a within-subjects design with the single factor of different treatments used as independent variable. The participants were randomly divided into 5 groups; in each group, the order in which the treatments were presented differed to avoid bias and learning effects. The order of the treatments in the different groups was chosen to comply with a Latin square distribution.
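The thesis does not state which particular Latin square was used; the Python sketch below shows one simple cyclic construction that satisfies the basic property that every treatment appears once in every ordinal position across the groups.

    def latin_square_orders(treatments):
        """Cyclic Latin square: row k is the treatment list rotated by k steps,
        so each treatment occurs exactly once in every position."""
        n = len(treatments)
        return [[treatments[(row + col) % n] for col in range(n)] for row in range(n)]

    orders = latin_square_orders(["Game only", "Match only", "Negotiated visual",
                                  "Negotiated audible", "Scheduled"])
    for group, order in enumerate(orders, start=1):
        print(f"Group {group}: {order}")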



The test session consists of two rounds of treatments: one practice round and one experimental round. During the first round the subject is given a chance to learn about the five treatments — the data from this round is not included in the final results. At the end of the practice round, each subject is sufficiently trained for the experimental round; here the five treatments are done once more, but this time the data will be included in the final results.

Session Length. Pilot studies indicated that subject learning had stabilized after about 4.5 minutes, so during the first round each treatment was done only once. Even though learning stabilized early, the subjects were still required to practice each of the five treatments in order to learn them in detail. The total effective length of a treatment is 4.5 minutes; when including the pause, the actual length becomes about 5 minutes. The practice round with five treatments thus takes 25 minutes to complete; adding 5 more minutes for questions makes the practice round take about 30 minutes in total.

In the experimental round, each treatment is done twice so as to get enough statistically valid data. Each treatment is divided in two with a short pause in between to give the user time to relax and get rid of fatigue. Thus, each treatment takes 2 * 4.5 = 9 minutes to complete; with pauses included the time is about 10 minutes in total. The experimental round will thus take 50 minutes to complete all five treatments. Adding 10 minutes for the subject to be instructed and fill in the questionnaires before and after the test makes the entire session take about 90 minutes to complete.

Number of Diplomats and Matching Tasks. During the practice round a total of 38 jumping diplomats and 40 matching tasks were used per treatment. In the experimental round, these numbers were raised to 59 diplomats and 80 matching tasks per treatment. The numbers were chosen to be the same as in [53] to allow for direct comparisons between the studies. None of the subjects expressed any negative opinion about this increase; on the contrary it seemed the added difficulty served as extra motivation.

4.3.2 Apparatus

The apparatus used in the experiment consists of a Dell Latitude C400 laptop with a 12.1" screen, Intel Pentium III 1.2 GHz processor and 1 GB of main memory. Connected to the laptop is a semi-transparent head-mounted display by TekGear called the M2 Personal Viewer, providing the user with a monocular full colour view in 800x600 resolution. In effect, this head-mounted display gives the appearance of a 14" screen floating about a meter in front of the user's eye. As the display is semi-transparent the user can normally look right through it without problems, but when the interruption task is presented the view with that eye is more or less obscured.

The bouncing diplomats game is shown on the laptop's 12.1" screen in 800x600 resolution, while the matching task is shown in the head-mounted display, also in 800x600 resolution. The actual screen space taken up by the game and matching task is 640x480 pixels; the rest of the area is coloured black.



User input is received through an external keyboard connected to the laptop. In the game, the user moves the stretcher left and right by pressing the left and right arrow keys, respectively. The matching task is controlled by pressing the "Delete" key to select the left object, and "Page Down" to select the right object. In the Negotiated treatments, pressing the up arrow presents a matching task provided the queue is not empty, while pressing the down arrow hides any matching task currently presented. As shown in figure 4.3, the natural mapping of keys as they appear on an ordinary keyboard should make control fairly intuitive for the user.

[Figure 4.3 shows the key labels: Move left, Move right, Show, Hide, Left object, Right object.]

Figure 4.3: Keys for controlling the tasks.
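As a compact restatement of the mapping in figure 4.3, the following Python sketch dispatches key presses to task actions; the key names and the handler are our own illustration, not the original experiment code by Dr. McFarlane.

    KEY_ACTIONS = {
        "Left":      "move stretcher left",
        "Right":     "move stretcher right",
        "Delete":    "select left object",
        "Page Down": "select right object",
        "Up":        "show pending matching task",
        "Down":      "hide current matching task",
    }

    def handle_key(key):
        """Look up and report the action bound to a key press."""
        action = KEY_ACTIONS.get(key)
        if action is not None:
            print(f"{key!r} -> {action}")
        return action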

The laptop was elevated 20 cm over the table so that the subject when sitting down faces it approximately straight ahead. By elevating the laptop the head-mounted display was also more naturally aligned so that the laptop's screen would be covered; this was done intentionally in order to try and force the user to look through the head-mounted display at all times. Although an option is to let the head-mounted display be positioned below or above the user's normal gaze, the enforcement of looking through it was chosen because such situations are assumed to occur in real life with this kind of display. Our pilot studies also indicated the chair and external keyboard allowed the subject to sit comfortably and control the tasks without strain. Figure 4.4 shows the complete setup.

Figure 4.4: User study setup.



4.4 Results

The measurements chosen were the same as in [53], in order to allow for an easy comparison between the two sets of results. The graphs in figure 4.5 show the average value, together with one standard error, of the measurements below.

Diplomats saved. Number of jumping diplomats saved.

Matched wrong. Number of matching tasks answered wrong.

Percent done wrong. Percentage of the matching tasks done that were answered wrong.

Matches not done. Number of matching tasks not answered before treatment ended.

Average match age. Time from the onset of a matching task until it was responded to.

The original study also measured the number of times the subject changed between game and matching task. However, as the user in our study can switch mentally between tasks without using the keyboard, this measurement is not valid unless other equipment (e.g. gaze tracking) is used.

(a) Diplomats saved. (b) Matched wrong. (c) Percent done wrong.

(d) Matches not done. (e) Average match age.

Figure 4.5: Average measurements.



When doing measurements on the same variables and the same subject under different conditions it is important to account for this in the analysis. A repeated measures ANOVA was therefore used on the data to see if any significant differences were present between the treatments. The results of these tests can be seen in table 4.1, indicating that the means for the measurements are not all equal.

Table 4.1: Repeated measures ANOVA.

Measurement          P-value

Diplomats saved      <0.0001
Matched wrong        0.0022
Percent done wrong   0.0014
Matches not done     0.0003
Average match age    <0.0001
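As an illustration of the analysis described above, the following Python sketch runs a repeated measures ANOVA with statsmodels; the file and column names are hypothetical, since the raw data and the actual analysis scripts are not part of this text.

    import pandas as pd
    from statsmodels.stats.anova import AnovaRM

    # Long-format data: one row per subject and treatment.
    data = pd.read_csv("results.csv")   # columns: subject, treatment, diplomats_saved, ...

    res = AnovaRM(data, depvar="diplomats_saved",
                  subject="subject", within=["treatment"]).fit()
    print(res)   # F statistic and p-value for the treatment factor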

4.4.1 Comparison with Base Cases

When performing a post-hoc statistical paired samples t-test comparing the two base case treatments, Game only and Match only, with the remaining three treatments, a number of significant differences were shown to exist. This supports the assumption that interrupting the user will have a detrimental effect on that person's performance. In table 4.2, a summary of these comparisons is shown, indicating whether there is a significant difference between the base cases and treatments. To accommodate for multiple comparisons, a Bonferroni adjusted alpha value of 0.008 (0.05/6) is used when testing for significance.

Table 4.2: T-tests of base cases vs. treatments.

Measurement          Base case   Vis.      Aud.      Sched.

Diplomats saved      Game        <0.0001   0.0013    0.0012
Matched wrong        Match       0.0021    0.0014    0.0671
Percent done wrong   Match       0.0011    0.0013    0.0406
Matches not done     Match       0.1408    0.4189    0.0072
Average match age    Match       0.0074    0.0020    <0.0001

The only measurements which were not significantly different from the base case were "Matches not done" for the two Negotiated treatments, and "Matched wrong" together with "Percent done wrong" for the Scheduled treatment. The reason for the former is that the subjects often completed roughly the same number of matching tasks as in the base case treatment. This suggests that allowing subjects to negotiate when to present the matching task does not cause it to be omitted more than what would have been the case had the matching task been the only task present. The latter indicates that in Scheduled, the subject can better concentrate on the matching tasks. The significant difference for "Matches not done" compared to the Scheduled treatment is most likely caused by matching tasks being queued but not presented before the treatment is over.



4.4.2 Pairwise Comparison of Treatments

The three treatments Negotiated visual, Negotiated audible and Scheduled were compared to each other using a paired samples t-test. Table 4.3 shows a summary of this, indicating whether a significant difference exists between each pair of treatments. A Bonferroni corrected alpha value of 0.008 is used when testing for significance.

Table 4.3: Pairwise t-tests of treatments.

Measurement          Vis./Aud.   Aud./Sched.   Sched./Vis.

Diplomats saved      0.2152      0.4131        0.1952
Matched wrong        0.1256      0.2315        0.0286
Percent done wrong   0.0959      0.3575        0.0464
Matches not done     0.0471      0.0002        <0.0001
Average match age    0.1258      <0.0001       <0.0001

As shown in table 4.3, there were no significant differences in terms of diplomats saved or matching tasks answered wrong. This means that our test was not sensitive enough to uncover any differences, if such exist, between the treatments for these measurements. However, the "Average match age" measurement is significantly different between the Scheduled and the two Negotiated treatments. Between the two Negotiated treatments, the difference is not significant (p = 0.1258). Nevertheless, the performance of certain subjects together with their comments indicates that there may still be an underlying difference that was not fully uncovered by our study. Relating this to the graph in figure 4.5(e), the average age of a matching task is less for Negotiated audible than for Negotiated visual. Thus, the use of sound may be a stronger reminder that there are matching tasks to perform, compared to using a visual signal. Furthermore, the graph in figure 4.5(d) shows that the number of tasks not done is also less for Negotiated audible than for Negotiated visual. While it is not marked as significant in the table (p = 0.0471), it still suggests that a difference may exist. This strengthens the indication that sound can have a higher impact on subjects when it comes to reminding them to perform the matching tasks.

As an audible announcement seems to be stronger than a visual one, it is of interest to know how this affects the number of diplomats saved. Referring to the graph in figure 4.5(a), there is a minor advantage of audio over visual with nearly one more diplomat saved in Negotiated audible, but this difference is not significant (p = 0.2152), so no conclusions can be drawn from it. Also, referring to the graphs in figures 4.5(b) and 4.5(c) shows an advantage of audio over visual when it comes to reducing the number and percentage of matching tasks answered wrong, but these are also not significant (p = 0.1256, p = 0.0959). Further studies are needed to see whether the advantage of audible over visual announcements has a positive effect also for these.

The Scheduled treatment left significantly more matching tasks undone at the end of a treatment compared to the negotiated treatments. The reason is that when tasks are presented just before the end of the treatment, a large number of them may be in the queue and are not answered before the time runs out. The other measurements were, however, better in Scheduled than in the negotiated treatments. This suggests that our decision to skip the Immediate condition was erroneous, and that it is likely to have exhibited the benefits of Scheduled without the drawback of a high average age.




4.4.3 Comparison with Original Study

In general, the subjects in our study scored better results than in the original study [53]. This is most likely caused by the different setting in which our study was done; as the two tasks could run simultaneously without the matching task blocking input for the game, the user could quickly switch mentally back and forth between them. The user could answer the matching tasks while still seeing the game in the background, and could thus more easily detect when the game task needed attention. The number of diplomats saved was around 10% higher for Game only, and one third higher for our two Negotiated treatments. This did not, however, affect the matching task negatively; the number of tasks answered wrong was around 45–55% less, suggesting our setup was less prone to leave subjects making wrong decisions. The number of tasks not done was 40–72% less for both negotiated treatments in our study, while in Scheduled it was 56% higher. Likely the subjects in the original study were more cautious about switching to the matching task, while in Scheduled they had to finish answering them before proceeding with the game. In our study they could switch freely between the dual tasks, explaining this difference. Our average match age was 2 seconds higher for Match only, 5 seconds higher for Negotiated visual, yet only 1 second higher for Negotiated audible. For Scheduled, the average age was 26 seconds higher since the subjects could still play the game while the queue of tasks was present.

Audio notification was not used in the original study, but it appeared to give a slightly better result than visual notification in our study, suggesting that the type of notification can be significant.

4.4.4 Subjective Comments

In addition to the quantitative data presented in the previous sections, there is also some qualitative data of interest. This data was given either by word of mouth or as written comments in the questionnaires the subjects filled in.

Three subjects reported that the use of sound in Negotiated audible lost its meaning when it was played at the same time as a diplomat was bounced. The sound was merely interpreted as a "bouncing sound" and not as an indication that there was a new matching task to perform, even though participants were fully aware of the actual meaning of the sound. This suggests that for certain tasks, care must be taken not to let the sound coincide and relate to the task — especially if the two tasks are meant to be disjoint.

Two subjects reported that hearing a sound was more difficult to relate to in a temporal sense compared to seeing a visual flash. At times the subjects made an attempt to show the matching tasks, only to realize that no new tasks had been added. Apparently the chronological order of when a sound is played can be more difficult to determine compared to when a visual flash is shown, at least when the task to be informed about is also done in the visual domain. Whether the same situation would occur for a task in the audible domain remains an open question.



4.5 Conclusions

We have presented a study investigating the interruption of a wearable computer user, some of the methods to achieve this and what effects they will have on the user. The results indicate that the scheduled treatment gave the best results, with the drawback of a considerably higher average age before tasks were answered. The negotiated treatments, where the user could decide when to handle the interruptions, were more useful when considering the overall performance of the user; they had a much shorter average task age with only slightly worse performance compared to the scheduled treatment. It was suggested that an audible notification increased the performance of the matching tasks, while at the same time not affecting the game task negatively compared to the visual treatment. However, a more detailed study is required to establish the significance of this observation. All in all, this indicates that both hypotheses posed in the introduction are true; a user's performance is affected by how interruptions are allowed to be handled, and the type of notification used will have a further impact.

4.5.1 Future Work

As the user has no direct feedback about the number of interruption tasks currently in the queue, it may be interesting to investigate how such feedback would affect the results. Would the user appreciate seeing this number to plan ahead, or would it merely have a detrimental effect?

In the experimental setup, the subjects were forced to look through the head-mounted display. An alternative is to have the display placed to either side, above or below the subject's normal gaze. By not obscuring the game it should be easier to selectively focus on either task, but on the other hand that may make one task easier to ignore.

4.6 Acknowledgments

This work was funded by the Centre for Distance-spanning Technology (CDT) under the VINNOVA RadioSphere and VITAL Mål-1 project, and by the Centre for Distance-spanning Health care (CDH). Original code by Dr. Daniel C. McFarlane ([email protected]), developed at the Naval Research Laboratory's Navy Center for Applied Research in Artificial Intelligence (http://www.aic.nrl.navy.mil), Washington DC, under sponsorship from Dr. James Ballas ([email protected]). We thank Dr. McFarlane for providing us with the source code for the game and matching task and giving us permission to modify them for our study. We thank Dr. David Carr as well as the anonymous reviewers for insightful comments and advice given. The authors finally wish to thank all the volunteers who participated in our study.


Part 5

Using the "HotWire" to StudyInterruptions in WearableComputing Primary Tasks



Using the "HotWire" to Study Interruptions in Wearable Computing Primary Tasks 89

Using the “HotWire” to Study Interruptions in Wearable Computing Primary Tasks

Mikael Drugge1, Hendrik Witt2, Peter Parnes1, Kåre Synnes1

1 Media Technology, Luleå University of Technology, SE-97187 Luleå, Sweden
2 TZI, Wearable Computing Lab., University of Bremen, D-28359 Bremen, Germany

[email protected], [email protected], [email protected], [email protected]

October, 2006

Abstract

As users of wearable computers are often involved in real-world tasks of critical nature, the management and handling of interruptions is crucial for efficient interaction and task performance. We present a study about the impact that different methods for interruption have on those users, to determine how interruptions should be handled. The study is performed using an apparatus called "HotWire" for simulating primary tasks in a laboratory experiment, while retaining the properties of wearable computers being used in mobile, physical, and practical tasks.

5.1 Introduction

In stationary computing users concentrate mainly on one task to be performed with the computer. Wearable computing, however, typically expects users to accomplish two different tasks. A primary task involves real world physical actions, while the secondary task is often dedicated to interacting with a wearable computer. As these two tasks often interfere, studying interruption aspects in wearable computing is of major interest in order to build wearable user interfaces that support users during work with minimized cognitive load.

5.1.1 Motivation

Limitations of human attention have been widely studied over decades by psychological science. What we commonly understand as attention consists of several different but interrelated abilities [40]. In wearable computing we are particularly interested in divided attention, i.e. the ability of humans to allocate attention to different simultaneously occurring tasks. It is already known that divided attention is affected by different factors such as task similarity, task difference, and practice [18]. The question of when to interrupt a user can be decided by estimating human interruptability [33], while the question of how depends on the methods used. Although studying divided attention has already provided detailed findings, applying and validating them for wearable computing is still a challenging issue.



Once approved, they can be used in wearable user interface design to adapt the interface to the wearer's environment and task. Furthermore, being able to measure such attention enables the specification of heuristics that can help to design the interface towards maximal performance and minimal investment in attention [75]. Here, however, a major problem is the simulation of typical real world primary tasks under laboratory conditions. Such simulation is needed to analyze coherence between attention on a primary task and user performance in different interaction styles.

In this paper we present a study of different ways to interrupt a user performing a physical task. We will investigate the correlations between cognitive engagement, interruption type, and overall performance of the users.

5.1.2 Outline

The remainder of the paper is structured as follows: Section 5.2 reviews related work to the presented interruption study. Then, in section 5.3 we describe the experiment conducted, including the different interruption methods tested. Section 5.4 explains the user study itself and the apparatus used for primary task simulation. The results are discussed in section 5.5, while the apparatus itself is evaluated in section 5.6. Finally, section 5.7 concludes the paper.

5.2 Related Work

In [53], McFarlane presents the first empirical study of all four known approaches to coordinate user interruption in human-computer interaction with multiple tasks. The study concerns how to interrupt users within the context of doing computer work without increasing their cognitive load. The method applied in the laboratory experiments was based on a simple computer game that requires constant user attention, while being randomly interrupted by a color and shape matching task. As a continuation of McFarlane's original interruption study for the scope of wearable computing, in [15] a head-mounted display (HMD) was used to display the matching tasks. It was found that the scheduled approach gave the best performance, while using notifications came second although with shorter response time. As wearable computers are closely connected to the user, performance is not the only factor to be considered — the user's preferences on interruption also need to be taken into account. In [55] it was found that audio notification appeared to give slightly better performance although users considered it more stressful, compared to visual signals that on the other hand were more distracting for the primary task. Although the mentioned work was able to relate human-computer interaction findings to wearable computing, the conducted laboratory experiments only use virtual primary tasks in the form of computer games. This does not entirely encompass the properties of wearable computers being used in mobile and physical tasks, indicating that a follow-up study is needed to complement the earlier studies.


Using the "HotWire" to Study Interruptions in Wearable Computing Primary Tasks 91

Figure 5.1: The HotWire apparatus used.

5.3 Experiment

The experiment addresses how different methods of interrupting the user of a wearable computer affect that person's cognitive workload. The scenario involves the user performing a primary task in the real world, while interruptions originate from the wearable computer and call for the user to handle them. By observing the user's performance in the primary task and in the interruption task, conclusions can be drawn on what methods for handling interruptions are appropriate to use. In order to measure the user's performance in both types of tasks, these must be represented in an experimental model. This section describes each task and how they are combined in the experiment.

5.3.1 Primary Task

The primary task needs to be one that represents the typical scenarios in which wearable computers are being used. Primary tasks in wearable computing are often physical tasks, i.e. tasks that require users to work with their hands on real world objects while being mobile (e.g. assembly or inspection tasks). For the purpose of our study, the task has to be easy to learn by novice users to reduce errors in the experiment caused by misunderstandings or lack of proficiency. The time to make the user proficient and fully trained should also be short enough to make a practice period just before the actual experiment sufficient, so that the user's performance will then remain on the same level throughout the experiment. To simulate such a task in a controlled laboratory environment, we decided to use the "HotWire" experimental setup [83].



Figure 5.2: Matching task presented in HMD.

The HotWire apparatus was developed for simulating primary tasks that satisfy the requirements discussed above. It is based on a children's game commonly known as "The Hot Wire". It consists of a metallic wire bent in different shapes that is mounted on both ends to a base plate, plus a special tool with a grip and a metallic ring. The idea of the game is that a person has to pass the ring from one end of the wire to the other end without touching the wire itself. If the wire is touched with the ring while being on the track, an acoustic feedback indicates an error. For our apparatus, shown in figure 5.1, we constructed the bent metallic wire out of differently shaped smaller segments, each connected via windings to another segment. This allows the difficulty or characteristic of the primary task to be varied by replacing or changing the sequence of connected segments.

5.3.2 Interruption Task

The secondary task consists of matching tasks presented in the user's HMD. An example of this is shown in figure 5.2. Three figures are shown of random shapes and colors, and the user must match the figure on top with either the left or the right figure at the bottom of the display. A text instructs the user to match either by color or by shape, making the task always require some mental effort to answer correctly. There are 3 possible shapes (square, circle, triangle) and 6 colors (red, yellow, cyan, green, blue, purple), allowing for a large number of combinations. Tasks are created at random so that on average a new task appears every five seconds, and if the user is unable to handle them soon enough they will be added to a queue of pending tasks.

5.3.3 Methods for Handling Interruptions

The methods used for managing the interruptions are based on the four approaches described in McFarlane's taxonomy in [53]. During all of these methods, the user performs the HotWire primary task while being subject to interruption. The methods used are as follows:

• Immediate: Matching tasks are created at random and presented for the user in the instant they are created.


Using the "HotWire" to Study Interruptions in Wearable Computing Primary Tasks 93

• Negotiated: When a matching task is randomly created, the user is notified by either a visual or audible signal, and can then decide when to present the task and handle it.

• Scheduled: Matching tasks are created at random but presented for the user only at specific time intervals of 25 seconds; typically this causes the matching tasks to queue up and cluster.

• Mediated: The presentation of matching tasks is withheld during times when the user appears to be in a difficult section of the HotWire. The algorithm used is very simple; based on the time when a contact was last made with the wire, there is a time window of 5 seconds during which no matching task will be presented. The idea is that when a lot of errors are made, the user is likely in a difficult section, so no interruption should take place until the situation is better (a sketch of this rule is given at the end of this section).

In addition to these methods, there are also two base cases included serving as reference. These are as follows:

• HotWire only: The user performs only the HotWire primary task without any interruptions, allowing for a theoretical best case performance of this task.

• Match only: The user performs only the matching tasks for 90 seconds, approximately the same period of time it takes to complete a HotWire game. This allows for a theoretical best case performance.

Taken together, and having two variants — audio and visual notification — for the negotiated method, there are seven methods that will be tested in the study.
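The withholding rule of the mediated method can be expressed in a few lines; the Python sketch below encodes the 5-second window described above (class and method names are our own, not the original experiment code).

    import time

    class MediatedGate:
        """Withhold queued matching tasks while the user appears to be in a
        difficult HotWire section, i.e. within a fixed window after the last
        contact between ring and wire."""

        def __init__(self, window_s=5.0):
            self.window_s = window_s
            self.last_contact = None

        def on_wire_contact(self):
            """Call whenever the ring touches the wire."""
            self.last_contact = time.time()

        def may_interrupt(self):
            """True if no contact was made within the last window_s seconds."""
            if self.last_contact is None:
                return True
            return (time.time() - self.last_contact) >= self.window_s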

5.4 User Study

A total of 21 subjects were selected for participation from students and staff at the local university — 13 males and 8 females aged between 22–67 years (mean 30.8). The study uses a within-subjects design with the method as the single independent variable, meaning that all subjects will test every method. To avoid bias and learning effects, the subjects are divided into counterbalanced groups where the order of methods differs. As there are seven methods to test, a Latin square of the same order was used to distribute the 21 participants evenly into 7 groups with 3 subjects in each.

A single test session consists of one practice round where the subject gets to practice the HotWire and matching tasks, followed by one experimental round during which data is collected for analysis. The time to complete a HotWire game naturally varies depending on how quick the subject is, but on average pilot studies indicated it will take around 90–120 seconds for one single run over the wire. With 7 methods of interruption to test with short breaks between each, one practice and one experimental round, plus time for questions and instructions, the total time required for a session is around 40–45 minutes.



Figure 5.3: Experiment performed by a user.

5.4.1 Apparatus

The apparatus used in the study is depicted in figure 5.3, where the HotWire is shown together with a user holding the ring tool and wearing a HMD. The HotWire is mounted around a table and is approximately 4 meters in length. To avoid vibrations because of its length, the wire was stabilized with electrically isolated screws in the table. An opening in the ring allowed the subject to move the ring past the screws while still staying on track. To follow the wire with the tool, the user needs to move around the table over the course of the experiment. The user may also need to kneel down or reach upwards to follow the wire, further emphasizing the mobile manner in which wearable computers are used. Figure 5.4 illustrates the variety of body positions observed during the study.

In the current setup, the user is not wearing a wearable computer per se, as the HMD and tool are connected to a stationary computer running the experiment. However, as the wires and cabling for the HMD and tool are still coupled to the user to avoid tangling, this should not influence the outcome compared to if a truly wearable computer had been used. In particular, we also used a special textile vest the users had to wear during the experiment, designed and tailored to unobtrusively carry a wearable computer as well as all needed cabling for a HMD without affecting the wearer's freedom of movement. To make the situation even more realistic, we put an OQO micro computer in the vest to also simulate the weight that wearable computer equipment would have outside the laboratory environment.


Using the "HotWire" to Study Interruptions in Wearable Computing Primary Tasks 95

The matching tasks are presented in a non-transparent SV-6 monocular HMD from MicroOptical. A data-glove used in earlier research [4] is worn on the user's left hand, serving as the interface to control the matching tasks. To ensure maximum freedom in movement of the user, the data-glove uses a Bluetooth interface for communication with the computer. By tapping index finger and thumb together, an event is triggered through a magnetic switch sensor based on the position of the user's hand at the time. Using a tilt sensor with earth gravity as reference, the glove can sense the hand being held with the thumb pointing left, right or upwards. When the hand is held in a neutral position with the thumb up, the first of any pending matching tasks in the queue is presented to the user in the HMD. When the hand is turned to the left or to the right, the corresponding object is chosen in the matching task. For the negotiated methods, the user taps once to bring the new matching tasks up, and subsequently turns the hand to the left or right and taps to answer them. For the immediate and mediated methods where matching tasks appear without notification, the user need only turn left or right and tap. Because of the novelty of the interface, feedback is required to let the user know when an action has been performed. In general, any feedback will risk interfering with the experiment and notifications used, but in the current setup an audio signal is used as it was deemed to be the least invasive. In order not to confound the user, the same audio signal was used regardless of whether the user answered correctly or not.
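The glove-to-task mapping described above can be summarized as follows; this Python sketch is our own illustration of the dispatch logic, not the actual data-glove driver, whose API is not part of this text.

    def handle_glove_tap(orientation, pending, visible):
        """Dispatch one thumb-to-index tap from the data-glove.

        orientation: 'up', 'left' or 'right' (from the tilt sensor at tap time)
        pending:     list of queued matching tasks, oldest first
        visible:     the task currently shown in the HMD, or None
        Returns (new_visible, answer) where answer is 0 (left), 1 (right) or None.
        """
        if orientation == "up":
            # Neutral position: bring up the first pending matching task, if any.
            if visible is None and pending:
                visible = pending.pop(0)
            return visible, None
        if orientation in ("left", "right") and visible is not None:
            # Turning the hand left or right and tapping selects that object.
            return None, 0 if orientation == "left" else 1
        return visible, None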

(a) Standing (b) Kneeling (c) Bending

Figure 5.4: Different body positions observed.



5.5 Results

After all data had been collected in the user study, the data was analyzed to study which effect different methods had on user performance. For this analysis, the following metrics were used:

• Time: The time required for the subject to complete the HotWire track from start to end.

• Contacts: The number of contacts the subject made between the ring and the wire.

• Error rate: The percentage of matching tasks the subject answered wrong.

• Average age: The average time from when a matching task was created until the subject answered it, i.e. its average age.

The graphs in figure 5.5 summarize the overall user performance by showing the averages of the metrics together with one standard error.

[Figure 5.5 consists of four bar charts showing the averages with one standard error: (a) Time (milliseconds) and (b) Contacts, each for HotWire only, Vis., Aud., Sch., Imm. and Med.; (c) Error rate and (d) Average age (milliseconds), each for Match only, Vis., Aud., Sch., Imm. and Med.]

Figure 5.5: Averages of user performance.


Using the "HotWire" to Study Interruptions in Wearable Computing Primary Tasks 97

A statistical repeated measures ANOVA was performed to see whether there existed any significant differences among the methods used. The results are shown in table 5.1. For all metrics except the error rate, strong significance (p<0.001) was found, indicating that differences do exist.

Table 5.1: Repeated measures ANOVA.

Metric        P-value

Time          <0.001
Contacts      <0.001
Error rate    0.973
Average age   <0.001

To investigate these differences in more detail, paired samples t-tests were performed comparing the two base cases (HotWire only and Match only) to each of the five interruption methods. The results are shown in table 5.2. To accommodate for multiple comparisons, a Bonferroni corrected alpha value of 0.003 (0.05/15) was used when testing for significance.

Table 5.2: Base case comparison t-tests.

Metric        Vis.      Aud.      Sch.      Imm.      Med.

Time          <0.0001   <0.0001   <0.0001   0.0002    0.0003
Contacts      <0.0001   <0.0001   0.0022    <0.0001   0.0004
Error rate    0.7035    0.1108    0.0668    0.8973    0.4979
Average age   0.0012    0.0001    <0.0001   0.0194    0.0046

All of these differences are expected; the completion time will be longer when there are matching tasks to do at the same time, and the error rate is likely to increase for the same reason. Also, the average age is expected to be longer than for the base case since the user is involved with the HotWire when matching tasks appear, and both the scheduled and mediated methods will by definition cause matching tasks to queue up with increased age as a result. That no significant differences in the matching tasks' error rate were found was unexpected; intuitively there should be more mistakes made when the subject is involved in a primary task. However, when looking at the data collected, most subjects answered the tasks as well in the interruption methods as they did in the base case of match only. Since there was nothing in the primary task that "forced" the subjects to make mistakes, as e.g. imposing a short time limit on the tasks would certainly have done, the subjects mainly gave accurate rather than quick and erroneous answers. All in all, this comparison of methods with base cases shows that, in general, adding interruptions and a dual task scenario with a physical and mobile primary task makes it more difficult for the subject to carry out the tasks successfully.

Next, the five interruption methods were compared to each other using a paired samples t-test, the results of which are shown in table 5.3. As can be seen, a number of significant differences were found between the interruption methods. We will now analyze each of the metrics in turn to learn more about the characteristics of each method.



Table 5.3: Pairwise t-tests of methods.

Time          Vis.      Aud.      Sch.      Imm.      Med.
Vis.          -         0.6859    <0.0001   0.0001    <0.0001
Aud.          0.6859    -         0.0003    <0.0001   <0.0001
Sch.          <0.0001   0.0003    -         0.9773    0.8157
Imm.          0.0001    <0.0001   0.9773    -         0.7988
Med.          <0.0001   <0.0001   0.8157    0.7988    -

Contacts      Vis.      Aud.      Sch.      Imm.      Med.
Vis.          -         0.9434    0.0002    0.1508    0.0006
Aud.          0.9434    -         <0.0001   0.0240    0.0002
Sch.          0.0002    <0.0001   -         0.0038    0.4217
Imm.          0.1508    0.0240    0.0038    -         0.0031
Med.          0.0006    0.0002    0.4217    0.0031    -

Error rate    Vis.      Aud.      Sch.      Imm.      Med.
Vis.          -         0.2744    0.4335    0.9041    0.8153
Aud.          0.2744    -         0.5258    0.3356    0.1039
Sch.          0.4335    0.5258    -         0.5852    0.6118
Imm.          0.9041    0.3356    0.5852    -         0.7668
Med.          0.8153    0.1039    0.6118    0.7668    -

Average age   Vis.      Aud.      Sch.      Imm.      Med.
Vis.          -         0.5758    0.0001    0.0470    0.2180
Aud.          0.5758    -         <0.0001   0.0170    0.1411
Sch.          0.0001    <0.0001   -         <0.0001   0.3256
Imm.          0.0470    0.0170    <0.0001   -         0.0061
Med.          0.2180    0.1411    0.3256    0.0061    -

5.5.1 Time

With regards to the completion time, the interruption methods can be divided into two groups; one for the two negotiated methods (visual and audio), and one for the remaining three methods (scheduled, immediate and mediated). There are strong significant differences between the two groups, but not between the methods in the same group. The reason for the higher completion time of the negotiated methods is the extra effort required by the user to present matching tasks. As this additional interaction required to bring the tasks up is likely to slow the user down, this result was expected. An important finding was, however, that the overhead (24.8 seconds higher, an increase of 26%) was much larger than expected. A lower overhead was expected, considering the relative ease — in theory — of holding the thumb upwards and tapping thumb and finger together to present the matching tasks, but in practice the subjects found this to be difficult when doing it simultaneously with the HotWire primary task. The data-glove itself accurately recognizes the desired gestures when done right, but the problem is that the subjects lose their sense of direction when doing the physical task, something we noticed when watching videos of the subjects in retrospect. Relating to our findings in [15], where the primary task was less physical as the user sat in front of a computer and interacted using a keyboard, we see that even seemingly simple ways to interact can have a much higher impact when used in wearable computing scenarios.


Using the "HotWire" to Study Interruptions in Wearable Computing Primary Tasks 99

Therefore, we argue that using a more physical primary task can increase the validity of user studies in wearable computing.

5.5.2 Contacts

Looking at the number of contacts between the ring and the wire, i.e. the number of physical errors the subject made in this primary task, we can discern three groups for the methods. The two negotiated methods form one group, where the additional interaction required to present matching tasks also causes more contacts with the wire. The scheduled and mediated methods form a second group with the lowest number of HotWire contacts. The immediate method lies in between, and significant differences for this method were only found against the scheduled and mediated methods. It is of interest to know what causes these differences, whether it is interference with the subject's motor skills because of the dual tasks, or some other underlying factor.

As can be seen, there is a correlation between the completion time and error rate, which can be interpreted as indicating that the number of contacts made depends mainly on the time spent on the HotWire track, and is not affected by the different interruption methods per se. To analyze this further, the rate r of contacts over time was examined.

r = contacts / time

When comparing this rate between all interruption methods, no significant differences were found. This can be expected because of the correlation between time and contacts made. However, since there are both easy and more difficult sections of the HotWire, such a naive way of computing the overall contact rate risks nullifying these changes in track difficulty. To examine the error rate in detail and take the HotWire track itself into account, assuming the user moved the ring with a constant speed on average, we divided the track into 20 segments (see figure 5.6(a)) and compared the rate ri per segment i between the methods1. However, no significant differences could be found here either. This suggests that our experiment was unable to uncover the impact of the interruption method as a whole, if such an effect exists, on the number of contacts made in the HotWire.
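The overall and per-segment contact rates can be computed as in the following Python sketch, assuming contact timestamps relative to the start of a run and the equal-time segmentation described above (the function names are our own illustration).

    def overall_contact_rate(contact_times, total_time):
        """Overall rate r = contacts / time for one run."""
        return len(contact_times) / total_time

    def contact_rates_per_segment(contact_times, total_time, n_segments=20):
        """Split a run into n_segments equal time slices (assuming roughly
        constant ring speed) and return the contact rate for each slice."""
        seg_len = total_time / n_segments
        counts = [0] * n_segments
        for t in contact_times:
            idx = min(int(t / seg_len), n_segments - 1)
            counts[idx] += 1
        return [c / seg_len for c in counts]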

Assuming that solely the appearance of matching tasks in the HMD causes more contacts to be made, we decided to test this hypothesis. The contact rates were divided in two categories; r0 indicated the rate of contacts over time when no matching task was present in the HMD, while r1 indicated the rate of contacts over time with a matching task visible (see figure 5.6(b)). The rates r0 and r1 then underwent a paired samples t-test for each of the interruption methods, to see whether the means of these two kinds of rates differed. According to the hypothesis, having a matching task present in the HMD should increase the contact rate r1 compared to the rate r0 when no matching task is present. Surprisingly, no significant difference was found. This can be taken as an indication that either no difference exists, or more likely, that the number of contacts made by our HotWire apparatus is too random so that the smaller underlying effects of having a matching task present become lost in this noise.

1 To get a more accurate segmentation, the ring's position on the track would need to be monitored over time, something our current apparatus does not yet support.



[Figure 5.6 illustrates two ways of segmenting a run: (a) Fixed-length segments r1 ... r20, and (b) Interruption-based segments alternating between r0 (no matching task visible) and r1 (matching task visible).]

Figure 5.6: Segmenting the track for analysis.

As our initial version of the HotWire apparatus [83] could reveal these differences with stronger significance in pilot studies, it suggests the version used in this larger study simply became too difficult. Since the user now needed to walk around the track and change into different body positions, this would cause more random contacts to be made than with a version where the user stands still, thereby causing so large a variance in the data collected that small differences caused by the matching task or interruption method cannot be found.

To determine whether the methods influence the subject overall and make him or her more prone to make errors, we compared first the rate r1 between different methods, and then r0 in the same manner. For r1, when there was a matching task shown, the mediated interruption method had the lowest contact rate (0.38) while immediate had the highest rate (0.69), yet with p=0.04 this is not significant enough to state with certainty when Bonferroni correction is applied. For r0, however, the mediated interruption method still had the lowest contact rate (0.33), while the two negotiated methods had the highest (both 0.48), and this difference was observed with significance p<0.003, confirming the hypothesis that the mediated method will help reduce this number. This finding shows that the algorithm we used for the mediated method can make the user perform the primary task slightly better in between interruptions, compared to letting her negotiate and decide for herself when to present the matching tasks.


5.5.3 Error rate

The error rate for the matching tasks exhibited no significant differences regardless of method. One likely reason is that a majority of the subjects answered all matching tasks correctly (the median was zero for all methods except negotiated), while four subjects had consistently high error rates (20–70%) across all methods, including the base case, which contributed to a high variance. In other words, the matching task may be a bit too easy for most people, while some find it very difficult to perform.

A difference compared to [15] is that the error rates for negotiated audio and visual have swapped places, so that audio, rather than visual, now exhibits the worse performance. Although this cannot be stated with statistical certainty in either case, it may indicate that differences do exist between subjects and their preferences, and likely also depend on the kind of primary task being performed.

5.5.4 Average age

Naturally, the average age is expected to be highest for the scheduled method, since the matching tasks are by definition queued for an expected 12.5 seconds on average. This was also found with strong statistical significance (p<0.0001) against all methods but mediated. With a measured average age of 13.5 seconds and an expected age of 12.5 seconds, the user spends on average only 1 second responding to the queued matching tasks. Compared to the immediate (4.1 s) and negotiated (6.5 and 7.1 s) methods, this is significantly (p≤0.0002) faster, likely because the clustering reduces the need to mentally switch between the primary and the matching task.

The mediated method, on the other hand, exhibited such a high variance in its data, about an order of magnitude larger than for the other methods, that no significant differences could be shown. The reason for this high variance is that the mediated algorithm was based on a fixed time window, and for some users who made errors very frequently this time window was simply too large, so that the queued matching tasks showed up very seldom.
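The mediation rule itself can be sketched in a few lines, assuming (as described above) that queued matching tasks are released only after a fixed time window without any wire contact; the window length, function names and data structures below are illustrative assumptions, not the exact implementation used in the study.

```python
# Minimal sketch of a fixed-time-window mediation rule (assumed behaviour).
from collections import deque

WINDOW_SECONDS = 5.0      # hypothetical "quiet" period required before interrupting

pending_tasks = deque()   # matching tasks waiting to be shown to the user
last_contact_time = 0.0   # timestamp of the most recent HotWire contact

def on_contact(now: float) -> None:
    """Record that the ring touched the wire."""
    global last_contact_time
    last_contact_time = now

def on_new_task(task, now: float) -> None:
    """Queue an arriving matching task instead of showing it immediately."""
    pending_tasks.append((task, now))

def maybe_present(now: float):
    """Release a queued task only if no contact occurred within the window."""
    if pending_tasks and now - last_contact_time >= WINDOW_SECONDS:
        task, created = pending_tasks.popleft()
        age = now - created   # contributes to the 'average age' metric
        return task, age
    return None
```

A subject who makes contacts very frequently keeps resetting the quiet period, which is consistent with the observation above that queued tasks then appear very seldom.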

5.6 Evaluating the apparatus

Since the HotWire is an apparatus for evaluating wearable user interfaces, it is important to determine how suitable it is compared to other laboratory setups. In [15] a computer game and keyboard were used in a non-mobile setting where the user sat still during the course of the study, and we use this as the reference setup for the comparison.

The matching task was the same in both studies, with minor differences in the frequency of appearance, the HMD used to present the tasks, and the physical means to interact with them. As can be seen, the metrics that are comparable across the studies — the error rate and the average age — showed better significance in the former study. This would indicate that our current setup is less likely to uncover differences, if such exist, compared to the former non-mobile setup.


Reasons may be that our study used a shorter time span for each method and that a novel interaction method was used, thereby increasing the variance of the data collected and diminishing the significance with which differences can be observed.

The primary task cannot easily be compared across studies; in the former study the number of errors was bounded and time was kept constant, whereas in our new study both errors and completion time are variable and unbounded. The former study thus had errors as the only metric, whereas the HotWire offers both errors and time as performance metrics. However, in the former study no significant differences could be found for the error metric between methods, whereas with the HotWire strong significant differences were observed in a majority of the tests for both the error and time metrics. This shows that differences do indeed exist between the interruption methods, and that they can more easily be uncovered by the apparatus we used. Therefore, as the HotWire apparatus is more mobile, more physical, and more realistically represents a wearable computing scenario, we argue that using it instead of the stationary setup is better for evaluating and studying wearable user interfaces.

Considering that very few significant differences could be observed when looking at the errors over time in closer detail, as discussed in section 5.5.2, this indicates that there are more factors that need to be taken into account in research on wearable interaction. Ease of interaction, mobility, walking, changing body position, using both hands to handle the dual tasks — all of these factors cause errors in the primary task, while the effects of the interruption and the modality used have less impact. Thus, we argue that the HotWire can help focus on the problems most relevant to wearable computing interaction, as details of lesser importance are clearly not revealed until the important problems are dealt with. In our study, we used a data-glove that is conceptually simple to operate — the user can select left, right, or up — yet even this proved too difficult when operated in a more realistic wearable computing scenario.

5.7 Conclusions

The recommendation when implementing efficient interruption handling in wearable computing scenarios is to examine the needs of the primary and secondary task, and to choose the method which best adheres to these, as each method has specific advantages and drawbacks. The HotWire study both confirms and complements the findings in [15] and [55] applied in a wearable computing scenario. Overall, the scheduled, immediate, and mediated methods result in fewer errors than the negotiated methods. Scheduled and mediated methods cause a slower response to the matching tasks, whereas immediate allows for a quicker response at the cost of more errors in the primary task. The algorithm used in the mediated method was, despite its simplicity, able to reduce the error rate in the primary task in between the matching tasks compared to the negotiated method. Therefore, it can in certain situations be better to utilize context awareness and take the primary task into account, rather than explicitly allowing the user to decide when matching tasks should be presented. The new metric of completion time indicates that a significant overhead on the primary task is imposed when subjects get to negotiate and decide when to present the matching tasks, which also results in a larger number of errors being made.


The cause of this was unforeseen difficulties in the interaction, even though a conceptually simple data-glove was used to control the matching task. This suggests that efforts should primarily be focused on improving the interaction style and ease of use, while the actual methods used for interruption are of secondary importance.

The architectural implications of the different methods will still be relevant to consider in any case. Assuming the wearable computer is part of a larger system where interruptions originate elsewhere, the immediate and negotiated methods both require continuous network access so that the task to handle can be forwarded to the user immediately. On the other hand, the clustering of tasks that results from the scheduled and mediated methods may only require sporadic access, e.g. at wireless hot-spots or in certain areas of the workplace with adequate network coverage.

The HotWire apparatus itself demonstrated that many findings from non-mobile interruption studies could be confirmed, while also pointing out that there are inherent differences in wearable computing due to mobility and physical primary tasks. These differences cause some findings to stand out more strongly than others, and as the apparatus more accurately resembles a realistic wearable computing scenario, this will better help guide research in wearable interaction to the areas where focus is most needed in the first stages of development. Since this represents a compelling (and worst-case) scenario involving very high cognitive and physical workload, the results are likely applicable to application domains with more relaxed constraints, such as business and consumer use.

5.7.1 Future Work

For a more accurate and in-depth analysis of the data collected from the HotWire, the user's position along the track would need to be monitored to know where contacts are being made and what causes them. This would show whether the contacts are primarily caused by difficult sections of the track, or by the interruption task or interaction device used. Furthermore, the algorithm in the mediated method was able to demonstrate benefits despite being trivial. It would therefore be interesting to evaluate different algorithms for this kind of context awareness that, through very simple means, can be applied in real-life scenarios and still have a positive effect.

5.8 Acknowledgments

This work has been partly funded by the European Commission through the IST Project wearIT@work (No. IP 004216-2004) and also partly funded by the Centre for Distance-spanning Healthcare and the Centre for Distance-spanning Technology at Luleå University of Technology. We thank Dr. McFarlane for providing us with the source code to his original experiments and giving us permission to modify it for our studies.


Part 6

Wearable Systems in Nursing Home Care: Prototyping Experience


Wearable Systems in Nursing Home Care: Prototyping Experience

Mikael Drugge, Josef Hallberg, Peter Parnes, Kåre Synnes
Department of Computer Science and Electrical Engineering

Luleå University of Technology
SE–971 87 Luleå, Sweden

{mikael.drugge, josef.hallberg, peter.parnes, kare.synnes}@ltu.se

January–March, 2006

6.1 Introduction

Medical workers at nursing homes spend much time on communication to get the right information to the right person at the right time. This communication is a prerequisite for proper patient care. Delays cause stress, discomfort, and dissatisfaction among caretakers and patients, as well as possible detrimental health consequences for patients. We believe pervasive computing technologies can improve this situation by speeding communication and documenting care more effectively.

In fact, pervasive computing is a promising solution to many problems that medical workers face, and today it's increasingly practical. Yet actual deployment is still in its infancy. Deploying prototypes that solve specific problems can help medical staff see its benefits. Involving them early in the design process also helps ensure that the right problems are being solved and the solutions will be accepted.

We decided to try rapid prototyping in a real nursing home. We set a tight four-week deadline for ourselves and began work to build a testable prototype with two half-time developers. This gave us one person-month to develop a useful prototype to investigate research questions including:

• What methodologies are useful for prototyping pervasive computing systems?

• How do we engage end users in prototype design and interactions?

• Can we rapidly deploy prototypes built from existing technology in real settings?

• How can we translate conceptual solutions to functional prototypes?

We worked with the Lighthouse, a local nursing home that provides short-term residential care in apartments. Self-sufficient elderly who've been set back by accidents or illnesses can receive the support they need to recuperate. With up to 40 guests, the Lighthouse sees a continuous stream of patients. Although busy, it's small enough for us to easily deploy and study prototypes in real situations.


6.2 Scoping the Project

We first had to identify actual problems nurses face in everyday work, determine which could be solved within our research's scope, and identify which would bring the most gain. Although we had reports on typical problems, we decided a field study would give us first-hand experience with their work. This was accomplished by a "quick and dirty" ethnographical study [31], where we accompanied a group of four nurses for a day and observed the professional tasks they perform.

During this study, we observed scenarios such as drawing blood samples and administering pain medication. A consistent theme for many tasks was the difficulty in getting necessary patient information at the point of care. The patient charts contain the most important information, but they are only accessible from the nurses' office computers. Nurses typically must walk hundreds of meters and change floors to attend patients in their apartments. Going back to the office computers takes time, so the nurses need a system which supports retrieving such information in advance and updating the patient charts upon returning to the office. The nurses currently keep updates in short-term memory or handwritten notes.

When nurses need more informal information, they must be able to contact the person who previously cared for the patient. Typically, the Lighthouse nurses used mobile phones for this purpose. When the phone calls reached the previous caretaker at all, and often they didn't, they usually interrupted the recipient. Likewise, other people calling the Lighthouse nurses interrupted their patient care, increasing stress and discomfort for both the nurse and patient.

We also found mobile phones lacking in multimodal features, supporting only voice, while the situation itself required video to convey certain information. For example, rather than having a patient point out pains directly to a physiotherapist, the nurses must relay this information over the phone, thereby losing the subtle details that body language can reveal.

At the end of the day, we summarized the problems we encountered, both in scenario form and as a list of specific items, including

• communication,

• information dissemination,

• access to patient charts, and

• organizational issues.

A few days later, we went back to the Lighthouse for a meeting with the nurses to validate our findings and ensure that the identified problems were real. This helped us decide which were the most important.

Access to patient charts involves strict privacy and security considerations, and remote access requires detailed analysis and evaluation of the security infrastructure. So we simulated this information for our prototype. Organizational issues such as staffing, budgets, and work schedules are economic and political issues that we won't discuss further.


We could address the communication and information dissemination issues among the personnel within the limited scope of the project. These are closely related, since communication is a way of having the right information for the right person at the right time. We discussed these issues with the nurses and came to a joint conclusion regarding the research prototype's focus. The consensus was that it should support easier communication among the personnel and also function as a documentation tool for informal notes. It should be mobile and allow for access from anywhere in the building, be less intrusive than a mobile phone, and employ a highly streamlined interface to avoid taking focus from the patients.

6.3 Paper Prototyping

Hardware inventions to embody and run pervasive applications are often wearable or highly portable. Research concepts often rely heavily on unique hardware and require a working prototype to test and illustrate the operational concepts. Producing these prototypes can be prohibitively expensive and time consuming. You can simplify hardware prototyping by using modular approaches – for example, the Smart-Its project (www.smart-its.org) operates this way. Yet such prototyping remains focused on hardware technology, which runs the risk of distracting from function and usability.

Traditional HCI researchers have used paper prototyping to good effect [65]. This simply involves drawing user interface components on paper, making it easy to alter designs and fix flaws early. Not everyone can use design software for prototyping user interfaces, but everyone knows how to draw sketches with a pen. So paper prototyping allows end users to become part of the design process early. Because paper prototyping is so obviously artificial, it can remove the focus from technology; instead, participants can focus on the product's underlying concept and usability. This works, however, because of the fixed and rigid desktop paradigm, whose graphical presentation space can be adequately represented on a sheet of paper.

Paper prototyping in pervasive computing applications involves additional challenges. The multimodal interaction in such environments supports more complex scenarios than desktop GUIs accommodate. This freedom can mean the paper itself imposes restrictions on what can be done – for example, not having access to large sheets of paper can prevent participants from envisioning whiteboards, while lacking small sheets can discourage them from thinking of handheld devices.

6.3.1 Paper, Pen, and Plastic

Following our requirements study with the nurses, we arranged a meeting to try our technique. We informed them that we needed their input and feedback to design a prototype that would be useful to them and that we would employ paper prototyping. We prepared scripted scenarios from the data we'd collected on typical nursing tasks, such as visiting and examining patients, informing physiotherapists, taking blood samples, and making rounds with physicians. We let them role-play the scenarios and discuss various solutions to problems they encountered. One of the nurses played herself in work situations, while two others played patient roles. Each patient player received a brief description of the scenario.


In addition to the scenario, we had text cards representing typical phone calls. We emulated a context-aware service that could determine whether the nurse was currently occupied and could then choose between presenting phone calls directly or taking a message. For calls that weren't urgent, we displayed a simulated text message on a transparent piece of plastic. We then either set it in front of the nurse to view in private or placed it on the device the nurse carried. Presenting these messages randomly during the role play allowed the nurses to decide what visualization form they preferred.

After completing the scenario, we talked with the nurses about their experience. They appreciated being able to avoid interruptions from non-acute calls through simple mechanisms such as screening incoming messages. They liked having text messages presented because it let them read without interrupting what they were doing. They found it useful to have a patient's information available in advance or upon contact because it let them better prepare for encounters with special patients. For example, if a patient posed difficulties in taking blood samples, the nurses could add extra test tubes to their medical kits. The nurses also liked having the live video to directly show, for example, a patient's shoulder fracture. They thought this would save them time that they could spend on more important tasks. In general, the paper prototyping session made the prototype's benefits clear without having to deploy fully functional hardware and software.

6.3.2 Paper Prototyping Benefits

The benefits of early paper prototyping over an online, functional software-hardware prototype became clear at the end of the meeting. We brought some hardware just to show the nurses what is available with today's technology. Immediately, we noticed their focus shifting away from usefulness to technical details regarding the software and hardware. They questioned font sizes ("too small for me to read"), commented on video quality ("better than the one we saw on another project"), and expressed astonishment over features ("so I could even write my emails on this"). As their focus scattered, they often addressed details irrelevant to the tasks they would need to perform.

Importantly, while interacting with the computer, we noticed the nurses started to think in terms of traditional user interface widgets, such as buttons, menus, and keyboards, and to restrict themselves to the kind of interactions typical of desktop PC applications. They also appeared more dejected and hesitant to suggest improvements, and they expressed slightly negative comments and questions such as, "We'll need to take courses to understand this. You'll arrange that for us, right?" In general, the nurses shifted their entire focus, assuming now that only minor changes in technical details were possible. Clearly, such restrictions in the design space should not dominate the early stages of prototyping.

We concluded that paper prototyping in the initial development stages makes participants focus more on a product's concept and actual usefulness, rather than letting technology constrain their thinking and dictate what is allowed. Furthermore, unlike technology, paper does not restrict the design space, allowing participants to think beyond the inherent limitations of current software and hardware. We see the same benefits for pervasive computing research that it exhibits for traditional HCI in desktop computing. Taking merely one week for preparation, execution, and analysis, we deem this time well spent.


6.4 Moving to Multimodal Devices

With the newfound design considerations from the paper prototyping session in mind, we went on to see what research technologies could help in realizing a multimodal prototype. Our research group's background is based in real-time audio and video communication over the Internet, and we have extensive experience in desktop e-meeting tools. We've also ventured into the pervasive computing field, investigating mobile and ubiquitous applications of the technology. Beyond technical barriers, we've found that the end users' concerns and preconceptions often determine whether new systems are adopted.

For this application, the prototype had to be mobile and non-intrusive, which fits well within the field of wearable computing. We also saw healthcare applications requiring special considerations, as illustrated by one nurse's opinion: "It was not to sit in front of computers I chose this profession". The prototype had to avoid the negative emotions that time-stealing and crash-prone desktop computers currently cause. An important way to hide technology and streamline the interaction is to use context and situation awareness, automating the information presented to the nurses according to the current activity and workload.

Some of these concerns are major research problems in themselves, which meant that our online prototype couldn't realize all concepts within a reasonable time frame. However, by employing a Wizard of Oz approach to the user interface [12], we could still demonstrate the functionality and get the nurses' opinions. Knowing how, or whether, the nurses used certain functions let us select what areas to put the most effort on in future prototypes.

6.4.1 Wearable Prototype

The next step was to build an online prototype with live software and hardware. We wanted the prototype ready within a week from the paper prototyping session. This gave us no time to build customized hardware, meaning we had to assemble our prototype from off-the-shelf products.

As the wearable computer, we chose a Sony Vaio U70P, a notebook computer with a 1-GHz Pentium mobile processor and 512 Mbytes of memory. With dimensions of 16.7 x 11 x 2.8 cm and a weight of 550 g, it can easily be worn by strapping it to a belt, which the nurses deemed suitable during our last meeting.

Because the nurses liked viewing information in private, we added a head-mounted display as an alternative to looking at the U70P. We chose the semitransparent monocular M2 Personal Viewer with full-color graphics in 800 x 600 resolution, even though it requires a half-kilogram battery. Since the prototype demonstrates concepts rather than a final product, we deemed the quality of graphics to be more important than the additional weight at this stage. Figure 6.1 shows the wearable computer and display.

6.4.2 Communication Application

We chose the Marratech (www.marratech.com) e-meeting application software because it fits the communication needs of the envisioned prototype.


Figure 6.1: Wearable computer outfit: battery pack and VGA converter for the head-mounted display (left), Sony U70P computer (lower center), Bluetooth headset (upper center), and M2 head-mounted display (right).

Marratech is a commercial product based on earlier research in our group [58]. It allows for audio and video group communication, together with text chat and a shared whiteboard. Connecting the wearable computer to an e-meeting over a wireless network lets the nurse instantly contact other participants and become aware of their locations. The application also lets a nurse make phone calls, which can become part of the e-meeting. This allows persons not yet using the software to be included, easing its deployment.

Figure 6.2 shows the wearable computer running the Marratech Pro application, with video streams provided by Web cameras. The nurse can use the camera to convey live video to physiotherapists or others to aid diagnoses.

6.4.3 Wizard of Oz Testing

We wanted to show a simple and automatic system to the nurses, so we decided on a Wizard of Oz experiment to simulate the context-aware and situation-aware system components. The wizard retrieved information, processed interrupting phone calls, simplified communication with others, and minimized the interaction needed with the wearable computer.

Each morning the nurses make several calls for information regarding their patients that day. In our prototype, the wizard collects this information and displays it on the shared whiteboard, thus shortening the time nurses need to set their daily schedule.


Figure 6.2: The Sony Vaio U70P running the Marratech Pro e-meeting application.

With access to patients' charts, historical notes, and other nurses' locations and schedules, the wizard can post appropriate information throughout the day.

If a nurse is attending to a patient or is otherwise busy, the wizard intercepts all phone calls. It sends only urgent calls directly to the nurse and either records the rest or posts them as chat room messages to read when there is time. This substantially reduces interruptions during patient encounters. When nurses wish to reach someone, the wizard can invite that person into the e-meeting session, thus allowing richer communication than normal phones provide.
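The wizard's call screening follows a simple rule that can be sketched as code; the types and names below are hypothetical and only mirror the behaviour described above.

```python
# Illustrative sketch of the wizard's call-screening rule (names are hypothetical).
from dataclasses import dataclass

@dataclass
class Call:
    caller: str
    urgent: bool
    message: str = ""

def handle_call(call: Call, nurse_busy: bool) -> str:
    """Forward urgent calls; defer the rest so patient care is not interrupted."""
    if call.urgent or not nurse_busy:
        return f"forward call from {call.caller} to the nurse"
    if call.message:
        # Non-urgent calls with a message are posted to the e-meeting chat,
        # so the nurse can read them when there is time.
        return f"post message from {call.caller} to chat: {call.message}"
    return f"record voice message from {call.caller}"

# Example: a non-urgent call arrives while the nurse attends a patient.
print(handle_call(Call("physiotherapist", urgent=False, message="call back about room 12"),
                  nurse_busy=True))
```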

To further simplify the nurses' work, we let the wizard act as a speech recognizer. Nurses can take informal notes about certain patients for insertion into their chart. They can also request information to be read out loud, which further simplifies the interaction with the wearable computer. In addition, the nurses can use voice to insert new tasks into their schedules.

Most of the wizard's functions can be realized with today's technology, such as RFID tags, sensors, and speech recognition capabilities.

6.4.4 Feedback From the Nurses

We brought our wearable prototype to the Lighthouse to test it for a day, starting with audio-only communication. We equipped one nurse with the wearable computer and a Bluetooth headset, while another nurse in another room used a laptop. We connected both devices to the same e-meeting, which effectively mimicked mobile phones but reduced interruptions and offered higher audio quality. We also allowed review of a fictitious patient history and chart to demonstrate the capability.


Figure 6.3: A nurse wearing a computer and head-mounted display while attending a patient.

Next, we introduced video communication. As one e-meeting participant walked the corridors, the nurses expressed their fascination with instantly seeing where their colleagues were. This increased group awareness seemed beneficial, especially for finding the right person at the right time. Initially, the nurse sent video via a handheld camera and used the notebook's display to view other participants. Because these tasks encumber the nurse's hands, we also let them test the head-mounted display and camera. After a brief time, about 15 minutes, they accustomed themselves to the novel display and learned to focus on the display or the real world as needed. Soon the nurses could easily perform routine patient examinations while conveying video to other medical workers. This aided patient diagnosis and treatment deliberations. Figure 6.3 shows a patient encounter. Although noting the added weight, the nurses remained focused on the design concept being demonstrated and its benefits.

Thus, on the basis of results from paper prototyping and a wearable prototype built from off-the-shelf components, we successfully demonstrated the envisioned benefits and gained user acceptance for an assistive device in this environment. By starting with the basic functions and gradually adding more, we avoided intimidating the users with the amount of hardware and cables.

6.5 Final Remarks

About a month after the online study, we revisited the Lighthouse to discuss the prototyping in hindsight. Despite the time that had passed, the nurses still deemed the prototype useful and appropriate.


We see this as confirming that the process yielded usable results, and that ethnographical studies and paper prototyping can be effective in pervasive computing research.

Ethnographical study provides valuable first-hand insight into how work is performed and gains the confidence of the user community. Paper prototyping offers a cheap and easy way to get quick feedback on what constitutes a good or bad solution. Furthermore, people outside the pervasive computing research community still consider much of what is possible with today's technology to be science fiction. We found it difficult to get those people to envision what's possible, and we had to find ways of freeing them from traditional PC interface ideas while not imposing our own ideas on how things should be done. Paper prototyping can aid this to some degree, and joint discussions of the paper results encourage fuller exploration of the design space. It also moves the participants' focus from technology to usability and function.

One major challenge we found concerning paper prototyping is to communicate what is realizable without constraining participants from exploring the whole design space. We think the best solution is to introduce concepts in a brainstorming session before the paper prototyping. You can add the wildest ideas to the hardware research agenda and use the ideas that can be realized with current hardware technology in the prototype. This should allow for more freedom of thought in the subsequent paper-prototyping stage, while constraining the overall process to prototypes that are possible given the available time, budget, and technology.

Realizing the paper prototype in functional hardware reveals differences between reality and vision that can require compromises. The Wizard of Oz approach allows for emulation of functionality that is not immediately realizable. However, if the prototype is meant to be deployed for longer-term studies, the researcher should be sure this functionality is realizable within the envelope of current technology.

Finally, rapid prototyping with end-user involvement from the start has a value in itself. The nurses we worked with spontaneously expressed enthusiasm for our project due to its fast pace. They felt that something was happening and that their input was valued. As opposed to other projects where meetings are half a year apart, rapid prototyping let them see progress from week to week.

6.6 Acknowledgments

We wish to acknowledge funding by the Centre for Distance-spanning Healthcare at Luleå University of Technology. We also express our thanks to the nurses and medical workers at the Lighthouse (in Swedish, "Fyrens korttidsboende") in Luleå, Sweden, for their participation. We also thank the reviewers for valuable guidance.


Part 7

Enabling Multimedia Communication using a Dynamic Wearable Computer in Ubiquitous Environments


Enabling Multimedia Communication using a Dynamic Wearable Computer in Ubiquitous Environments

Johan Kristiansson, Mikael Drugge, Josef Hallberg, Peter Parnes, Kåre Synnes
Department of Computer Science and Electrical Engineering

Luleå University of Technology
SE–971 87 Luleå, Sweden

{johan.kristiansson, mikael.drugge, josef.hallberg, peter.parnes, kare.synnes}@ltu.se

June, 2006

Abstract

The paradigm of wearable computing aims at providing unobtrusive communication for users involved in real-world tasks. At the same time, current research trends in networking and multimedia envision ubiquitous multimedia communication where users can seamlessly meet and communicate anytime, anywhere, and via any device. Combining wearable computing and ubiquitous multimedia communication can achieve this vision and enable richer communication via a more lightweight wearable computer. When implementing such a system, it is important to minimize the configuration effort required of the users, to avoid disrupting the user's primary task, which is often physically and cognitively demanding.

This paper presents a framework which enables users of wearable computers to communicate using the most beneficial communication resources while reducing the configuration effort. The framework consists of four components: the Information Repositories, which form a distributed database; the Personal Communication Management Agent, which makes configuration decisions; the Remote Control User Interface, a protocol which enables customizing user interfaces for specific devices; and the Mobility Manager, which switches between resources. Combined, these components make possible a dynamic wearable computer which can be tailored to the current communication tasks. The paper presents an analysis of the number of messages required for these components to interact with each other, based on the number of users and resources available. The paper also analyses the bandwidth requirements of the current implementation.

As a proof of concept, a working prototype has been built by integrating the framework with a commercially available e-meeting application. This prototype demonstrates how to use the framework to improve multimedia communication on a wearable computer in a real-world scenario. A user study presented in this paper shows that the prototype simplifies the work for nurses at a local nursing home, ultimately saving time which can be better spent on their patients. The result is a system enabling nurses to choose a lightweight alternative to a previously used wearable computer with the same purpose.


7.1 Introduction

The proliferation of mobile and wearable computing devices over the last decade has led to an increasingly nomadic computing lifestyle. At the same time, research has been conducted on allowing users to seamlessly utilize equipment and communication software in smart rooms or in the surrounding environment to improve communication. Therefore, it is natural to envision wearable computers which can use resources in the environment to adapt to user needs, and which enable supportive communication while primary tasks are performed in the real world.

Current research trends aim at supporting users and enabling seamless ubiquitous communication anytime, anywhere, and via any device. As an extension to these ideas, a user should preferably be able to utilize equipment in the environment to create a dynamic wearable computer. The purpose of such a system would be to provide unobtrusive support, which in combination with rich communication could improve the user experience, save time and increase efficiency. For example, the user would have access to information specifically related to the current task, and be able to discuss this with experts at remote locations. The diversity of tasks requires the wearable computer to be modular and easy to adapt to different situations, both in terms of the functionality provided and the resources used. The user would be empowered1 with the choice of which resources to use, picking up only those needed for the task at hand and avoiding cumbersome or unnecessary equipment.

The work presented in this paper aims at combining ubiquitous computing with wearable computing to create a dynamic wearable computer. The dynamic wearable computer is dynamic in the sense that it can use resources in the environment, as well as resources that the user decides to bring along. This could include typical wearable devices such as head-mounted displays, as well as stationary devices such as monitors and cameras found in the environment. Ultimately, this allows communication services to become more adaptable to user needs in terms of functionality, quality, and cost. For example, the user can talk on a mobile phone and add a shared whiteboard to the communication session when entering a room where a computer running e-meeting2 software is available.

Switching between resources should be made as transparent and unobtrusive as possible, as the user is involved in primary tasks in the real world. The user should not have to manually reconfigure the wearable computer to suit a different task. Instead, switching should be handled simply and intuitively to relieve the user from unnecessary distractions. In addition, the user should be able to move around even if the system itself is distributed, and the information about the user, the resources and the environment should accompany the user and be integrated into new environments. To make this possible and implement a dynamic wearable computer, several research problems need to be solved.

• What functionality is needed to transparently combine and switch between resources carried by the user and those available in the environment?

1The term "empowerment" denotes having the right to make one's own choices and having the ability to act on them.

2The term "e-meeting" denotes a group web conferencing session which can include video, audio and chat among other media. Rather than requiring a dedicated meeting room, e-meetings can take place from the user's desktop and be used for either formal or informal communication.


• What functionality is needed to automatically configure resources to be used in the dynamic wearable computer?

• How can the distributed information storage infrastructure be designed to provide easy access to information and support the decision making process?

This paper presents a framework which addresses the aforementioned problems, and builds on ideas from previous research [16, 27, 36]. The purpose of the framework is primarily to assist designers in developing ubiquitous communication systems. As an example, the paper presents a proof-of-concept prototype which was deployed in a nursing home to support nurses in their work.

The rest of this paper is organized as follows. Section 7.2 gives a brief introduction to previous work related to ubiquitous multimedia communication. Section 7.3 presents the framework, followed by Section 7.4 with an evaluation of the implementation and the proof-of-concept prototype tested in a real-world scenario. Finally, in Section 7.5 the paper is concluded with a discussion and pointers to future work.

7.2 Background and Related Work

Over the last decade, wearable computing has been used to augment human abilities in various domains, for example in healthcare [39], inspection tasks [4], and military operations [85]. While wearable computing can be used for solving a wide range of problems, this paper concerns mediated human communication, such as having a remote expert provide guidance to field workers solving a real-world problem. Previous work by the authors [16] exemplifies how e-meetings through a wearable computer can be applied to share knowledge and increase the capability of the field worker, as well as how such e-meeting systems can be prototyped for a health care setting [14]. This paper develops the concept further by allowing the field worker to utilize resources in the surrounding environment, providing richer communication through new media.

Although much research, e.g. [29, 38, 44], has been conducted on isolated parts of e-meeting systems, little has been done to combine all these parts into a coherent and functional framework which can be used to create a dynamic wearable computer. For example, early work by [62] presents a framework for mobile and ubiquitous access to multimedia services. Although the idea is very similar to the one proposed in this paper, it lacks several important functions needed to create a working system. This includes functionality to automatically configure resources, such as information management or components for making decisions on which resources to use based on the needs of the user. Moreover, it focuses mainly on general multimedia services, whereas this paper focuses on mediated human communication.

Context-awareness is a widespread approach to customize applications and provide users with increased service value. Pioneering work in this field includes the Active Badge system [80], which improves channel selection through autonomous routing of phone calls to the phone nearest the user.


Aura [22] is a general-purpose architecture which tries to utilize resources in the user's environment to give the user access to desired tools, while simplifying the configuration tasks needed by the user. However, even though Aura is an interesting idea, it relies on advanced artificial intelligence, which makes it hard to deploy in reality. Recent research in this area focuses on specific scenarios, such as the EasyMeeting system [8], which tries to provide relevant services to speakers and the audience in an auditorium, for example by automatically handling an overhead projector during a presentation. Similar to the EasyMeeting architecture, the framework presented in this paper employs context-awareness and ontologies to improve communication, although it does not restrict itself to one isolated smart space but allows users to utilize resources as they move around.

Regarding switching between resources, several mobility management architectures [44, 79] have been proposed in the literature. For example, the Mobile People Architecture [44] performs person-level routing through a proxy to let users communicate from any device, network, or application. Similar functionality can be implemented using the Session Initiation Protocol [28]. In general, these architectures require user interaction to change device. Moreover, they cannot handle the aggregation of multiple devices, for example transparently combining a mobile phone with an e-meeting.

The major contribution of this paper is a framework that combines mobility management with context-awareness to create a dynamic wearable computer for mediated human communication. Another contribution is a formal evaluation of the implementation and a proof-of-concept prototype used in a field test. All parts of this paper are unpublished, except for the media resource selection algorithm mentioned in Section 7.3.2. The next section describes the framework further and the functionality needed to build a dynamic wearable computer.

7.3 The Ubiquitous Communication Management Framework

To be able to conceptually handle the aggregation of multiple communication tools, a model for ubiquitous multimedia communication was introduced in [36]. An essential part of this model is the pair of terms media source and media sink, which are used to encapsulate media devices of the same type and make comparisons, thus increasing flexibility by widening the user's selection of media resources. In short, a media source is an abstraction of a media-capturing hardware device, such as a camera or microphone, together with the software that generates the media stream. Similarly, a media sink is an abstraction of a media-rendering hardware device, such as a loudspeaker or a display, together with its accompanying software, which the device uses to receive the media stream. The term media resource is henceforth used instead of the term communication tool to denote either a media sink or a media source.
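The source/sink vocabulary can be expressed as simple types; the attribute names and codec values in the sketch below are hypothetical and only mirror the abstractions described above.

```python
# Illustrative types for the media resource model (attribute names are hypothetical).
from dataclasses import dataclass

@dataclass
class MediaResource:
    resource_id: str
    media_type: str            # e.g. "audio" or "video"
    location: str              # environment the resource belongs to

@dataclass
class MediaSource(MediaResource):
    """Capturing device plus the software that generates the media stream."""
    codec: str = "unspecified"

@dataclass
class MediaSink(MediaResource):
    """Rendering device plus the software that receives the media stream."""
    codec: str = "unspecified"

# Example: a head-mounted display acts as a video sink, a room camera as a video source.
hmd = MediaSink("hmd-1", "video", "corridor", codec="H.263")
cam = MediaSource("cam-office-2", "video", "office-2", codec="H.263")
```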

Figure 7.1 gives an overview of the framework, which consists of four components needed to configure and manage media resources: the Information Repositories, the Personal Communication Management Agent, the Remote Control User Interface, and the Mobility Manager. The Information Repository component is a distributed database containing information about the system and the users. This information is collected by several sensors which publish information to the information repositories.


Figure 7.1: Overview of the framework.

The Personal Communication Management Agent, henceforth denoted the agent, has the purpose of finding and selecting media resources that satisfy the user's needs. This is achieved by traversing the information in the information repositories and observing when the state of the system changes, for example when a user moves to another location. When a new media resource is available, the user is notified through a Remote Control User Interface, which is a user interface that can easily be customized for specific devices. If the user decides to use the new media resource, the agent configures the Mobility Manager, which then migrates media streams between media resources so that these can be used together in a dynamic wearable computer.

For example, if the user is talking on a mobile phone and enters an office with an e-meeting system running on a desktop computer, a location sensor publishes the user's current position to the information repository component. The information repository component then notifies the agent, which starts searching for potential media resources registered in the information repositories. If it finds a better media resource, i.e. the media resources provided by the e-meeting system, it notifies the user via the remote control user interface, allowing the user to accept the switch to that media resource. Assuming the user wants to switch, the mobility manager then redirects the audio stream from the mobile phone to the e-meeting system, or integrates the mobile phone with the resources provided by the e-meeting system in case the user wishes to keep the mobile phone for communication but take advantage of the other media available in the e-meeting.
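This walk-through maps onto a simple event-driven routine. The sketch below only mirrors the order of interactions described above; the component interfaces and method names are hypothetical assumptions, not the actual implementation.

```python
# Illustrative event flow between the framework components (interfaces are hypothetical).
def on_location_update(user, new_location, repositories, remote_ui, mobility_manager):
    # 1. A location sensor has published the user's new position; the information
    #    repository component notifies the agent of the change.
    candidates = repositories.find_media_resources(location=new_location)

    # 2. The agent compares the candidates with the resources currently in use,
    #    e.g. preferring a room e-meeting system over the mobile phone's audio.
    better = [r for r in candidates
              if repositories.is_better(r, user.current_resources)]
    if not better:
        return

    # 3. The user is asked through the remote control user interface,
    #    so a switch is never forced upon the primary task.
    accepted = remote_ui.ask_user(user, better)

    # 4. The mobility manager migrates or aggregates the media streams.
    for resource in accepted:
        mobility_manager.migrate_stream(user, to_resource=resource)
```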

The remainder of this section presents the framework components in more detail, starting with the information repositories. This is followed by a description of the personal communication management agent, the remote control user interface, and the mobility manager.


7.3.1 Information Repositories

In order for the agent to search for information and select which media resource to use, the information needs to be obtained from several sensors and organized so that it is interpretable by the agent. This means that information needs to be represented with a limited set of variable types and quantified where applicable, so that it can be utilized by simple rules. For example, by using predefined variable types and structuring the information repositories internally in a well-defined manner, it is possible to create rules based on information about the user's needs, the available resources and their capabilities, and the environment. One way of managing this information is to use ontologies3.

Ontologies

Over the years, several standardized ontologies have been proposed to make it easier to implement management functions and integrate independent systems. For example, the IETF Management Information Base [50] provides a set of specifications defining how a device or service can be managed. Another standard, the Standard Upper Ontology [74], provides a general ontology that can be used to construct more specific domain ontologies (medical, financial, engineering, etc.).

The ontology used in this paper divides information into three different domains, each of which contains the relevant information for its domain. These domains are the Media Resource Information Repository, the Environment Information Repository, and the User Information Repository. To be able to process the stored information, there is a specified list of attributes and variables which can be stored and accessed; each domain has its own list.

The first domain, the media resource information repository, contains relevant contexts about a single media resource. The list of attributes and values in this domain is specific to each resource type, e.g. video source and sink, audio source and sink, and input device. For example, an audio source might include information about the sampling rate and the audio codec used, and a video source could include the resolution or other relevant information about the video codec.
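To make this concrete, the tuple-oriented content of a media resource information repository can be pictured as small sets of key/value pairs; the attribute names and values below are hypothetical examples of the kind of contexts mentioned above.

```python
# Hypothetical examples of media resource information repository entries.
audio_source_entry = {
    "resource_type": "audio_source",
    "device": "bluetooth_headset_mic",
    "sampling_rate_hz": 16000,
    "audio_codec": "G.711",
    "environment": "office-2",
}

video_source_entry = {
    "resource_type": "video_source",
    "device": "room_camera",
    "resolution": "352x288",      # CIF
    "video_codec": "H.263",
    "environment": "conference_room",
}
```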

The second domain, the environment information repository, contains links to the information repositories of all media resources which are located in the environment. It also contains contexts about the environment which might help describe the situation a user is currently in. This could include information about other people in the environment, about the purpose of the environment (e.g. office, conference room, bedroom, bus transportation, etc.), and other aspects of the environment itself, such as temperature and noise level. Finally, if the environment is part of a larger one, or if it has sub-environments with their own information repositories, there would be links to these as well.

The third domain, the user information repository, contains contexts about a user, the user's preferences, and user-defined rules. It also contains links to the information repositories of the environments where the user is currently active, and to the information repositories of the media resources in use.

3An ontology is a formal description of a specific domain that enables computers to process information related to the domain. It is a network of relationships that are self-describing and used to track how items or words are related to one another. It consists of concepts, relationships, and axioms in relation to the specific domain.


There are also links to the respondents' user information repositories, which can be used to see what media resource types a respondent is using, and what other media resource types are available in the respondent's environment. If any of the media resources used by the respondent pose a privacy risk to the user, the user is notified. However, the respondent's information repository does not necessarily provide access to other contexts stored about the respondent, such as whether the respondent is busy, the respondent's location, etc. Such contexts about the user are primarily used to determine the user's communication needs, but can of course be used for other services if approved by the user.

The information stored in the information repositories consists of tuples containing raw data and an associated key. As an information repository can contain a large number of different tuples, it is important to make abstractions, both to make it easier for users to configure the system and to provide the means for computers to compare media resources of the same type with each other. This was done in [36] by using three abstractions: cost, privacy and quality. As the list of all permitted contexts for each media resource type is known by the system, it is possible to create an algorithm which weighs the contexts for a media resource together and creates indexes for each abstraction. Once the quantified abstractions are calculated it is a simple task to compare media resources of the same type with each other.
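As an illustration of this representation, the sketch below shows how repository tuples and a quantified abstraction index could be expressed. It is a minimal sketch under assumed names (MediaResourceInfo, AbstractionIndexer) and a toy quality metric; it is not the framework's actual API.

import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Minimal sketch: repository tuples and a quantified "quality" abstraction.
// All names and the scoring formula are illustrative assumptions.
class MediaResourceInfo {
    final String id;
    final Map<String, Object> tuples = new HashMap<>(); // key -> raw data

    MediaResourceInfo(String id) {
        this.id = id;
    }

    // A quantified abstraction is derived by weighing known contexts together,
    // here a toy quality score for a video source.
    double quality() {
        int width = (Integer) tuples.getOrDefault("width", 320);
        int height = (Integer) tuples.getOrDefault("height", 240);
        double frameRate = (Double) tuples.getOrDefault("frameRate", 15.0);
        return width * height * frameRate; // larger means better in this sketch
    }
}

class AbstractionIndexer {
    // Resources of the same type are compared and given indexes 1, 2, 3, ...,
    // where index 1 denotes the best resource for the given abstraction.
    static Map<String, Integer> qualityIndexes(List<MediaResourceInfo> sameType) {
        sameType.sort((a, b) -> Double.compare(b.quality(), a.quality()));
        Map<String, Integer> index = new HashMap<>();
        for (int i = 0; i < sameType.size(); i++) {
            index.put(sameType.get(i).id, i + 1);
        }
        return index;
    }
}

The same pattern would apply to the cost and privacy abstractions, each weighing its own subset of the permitted contexts.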

Data Storage

So far, ontologies for managing data have been discussed. However, storing and accessing data is just as important, which underlines the need for a context-storage infrastructure with query possibilities. This infrastructure needs to be flexible and able to handle large amounts of data; it should also be easy to administrate and manage. There are already several systems for storing and accessing data, such as centralized databases or centralized context-awareness platforms [13, 72]. Although these are generally good at handling large amounts of data, they can be hard to administrate as administrators must grant permissions to everyone contributing information. They also introduce a single point of failure, and can add delay if they are topologically far away from the access network being used. However, current research is moving towards decentralized solutions for privacy and scalability reasons [29]. Another solution is to implement the storage using a distributed hash table such as CAN [64] or Chord [76].

No matter which storage infrastructure is used, there are some crucial issues which need to be addressed. Ubiquitous services often require fine-grained access control, and ubiquitous communication is no different. This access control can be based on identity or on different contexts. For security and privacy reasons it needs to be clear to the user which information is being shared with other users, and which information is only used for access control or for the system's decision making process. The storage infrastructure should also be self-managed, meaning sensors are able to publish information to the system which causes the system to update itself; for example, links to other information repositories are updated when the user moves to another location. It is also important to be able to access the correct information repository, as it is likely that there are many different ones in the area.

One way of dealing with access control is to arrange media resources into groups, where only group members can use the media resources. Another way is the principle of locality [37]. Locality means that the user is only allowed to use devices which are nearby the user, for example in the same room.


Figure 7.2: Information Repositories. (The figure shows a user information repository, e.g. information about Mikael, linked to an environment information repository, e.g. Mikael's office, and to media resource information repositories, e.g. an HMD, a keyboard and an office phone, with sensors publishing information to the repositories.)

Locality makes sense not only for reasons of usefulness but also when considering privacy. Without the principle of locality, cameras and other media sources could be used as remote surveillance devices, thereby introducing a large risk of violating other people's privacy. Locality also ensures that media sinks are not used without the receiver's permission, thereby protecting the receiver from unwanted interruptions. However, there are two issues which need to be considered when applying the principle of locality: what is to be considered close enough, and how to verify the location of the user and the media resource. Even if the principle of locality is useful, it does not necessarily follow that a user should have access to all nearby resources. The set of relevant resources is thus further narrowed to those nearby the user which are not busy and which the user has permission to access.
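Expressed as code, this narrowing step could look roughly as follows. The sketch assumes hypothetical types and a fixed 10-metre threshold standing in for "close enough"; it is not part of the framework.

import java.util.List;
import java.util.stream.Collectors;

// Sketch of locality-based filtering of candidate media resources.
// The Resource interface and the 10 m threshold are illustrative assumptions.
interface Resource {
    double distanceTo(String userId); // metres, e.g. from a positioning system
    boolean isBusy();
    boolean permits(String userId);   // group membership or other access control
}

class LocalityFilter {
    private static final double CLOSE_ENOUGH_METRES = 10.0;

    static List<Resource> relevant(String userId, List<Resource> candidates) {
        return candidates.stream()
                .filter(r -> r.distanceTo(userId) <= CLOSE_ENOUGH_METRES) // principle of locality
                .filter(r -> !r.isBusy())                                 // not already in use
                .filter(r -> r.permits(userId))                           // user has permission
                .collect(Collectors.toList());
    }
}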

The storage infrastructure used in the proof of concept implementation (see Section 7.4) is a tree-based structure as depicted in Figure 7.2. In the case where information repositories are distributed over different computers, a global directory service is used to keep track of links to other information repositories. When a sensor has new information to publish, it is automatically directed to the correct information repository with the help of a location or an identification key which is associated with a media resource, an environment, or a user. When the agent subscribes to information in an information repository, it is automatically notified when new information is published. In case a new information repository becomes available, the agent is also notified and can access the new information to take appropriate actions.

Sensors

The information that is published to the information repositories can come from many different sources. One source could be a naming service which maps a user's current location to a new environment with an information repository; another could be knowledge that a user is actively typing on a keyboard in the office; and then there are of course the more conventional sensors, such as location sensors, activity sensors, etc.


For a sensor to be used with the information repositories it must provide at least one of the attributes or variables supported by the system. Each of these attributes or variables has its own standard format which the sensors must comply with. However, as most existing sensor systems do not support the information repositories, some method is needed to integrate existing sensors. This can be done either by modifying existing sensor implementations or by encapsulating them.
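The encapsulation approach could look roughly like the sketch below, where an existing sensor is wrapped so that it publishes one of the supported attributes. The interfaces, the attribute name and the adapter are illustrative assumptions only.

// Sketch of encapsulating an existing sensor so it publishes one of the
// attributes supported by the information repositories. All names are
// illustrative assumptions.
interface InformationRepository {
    void publish(String attribute, Object value); // add or update a tuple
}

interface LegacyNoiseSensor {
    double readDecibels(); // the existing, incompatible sensor API
}

class NoiseSensorAdapter {
    private final LegacyNoiseSensor sensor;
    private final InformationRepository environmentRepository;

    NoiseSensorAdapter(LegacyNoiseSensor sensor, InformationRepository repo) {
        this.sensor = sensor;
        this.environmentRepository = repo;
    }

    // Called periodically; converts the legacy reading into the standard
    // format assumed here for a "noiseLevel" attribute.
    void poll() {
        double dB = sensor.readDecibels();
        environmentRepository.publish("noiseLevel", Math.round(dB));
    }
}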

7.3.2 Personal Communication Management Agent

As mentioned in Section 7.3.1, information about the user's situation and needs, the available resources and their capabilities, as well as information about the environment, is used to make media resource selection decisions. This information is accessible from the different information repositories but still needs to be processed. Because the information repositories only allow certain attributes and variables, and as these are known by the agent, rules can be formed which utilize the information. This is done by the media resource selection algorithm, which decouples the selection process into three parts: abstraction, reading and processing of user preferences, and notifying the user. Each part is described in further detail in this section.

In order to make the actual resource selection, the agent first needs to make a few abstractions, such as the quantifiable variables mentioned in Section 7.3.1 (cost, privacy, and quality), and a mobility abstraction. The mobility abstraction is a new addition since the previous work presented in [36] and is a measurement which signifies how mobile a media resource is; for example, a mobile phone is portable and lets the user move around while a stationary computer is not. There is also a non-quantifiable abstraction, the scenario-detection, which is used to determine which situation the user is currently in.

The situation description gained from the scenario-detection abstraction can be quite advanced and detailed. With advanced rule sets and access to enough information, the scenario-detection would be able to determine subtle differences in which situation the user is in. This would make it possible for the user to configure the behaviour of the media resource selection in more detail, for example to specify what should happen in specific locations, or how the system should behave when the user is performing certain actions. However, as mentioned in [26], the more detailed the configuration and the higher the complexity, the more sensor data is needed, which makes the system more difficult to deploy. Therefore, at the current state, based on previous work from [27], only three different scenarios are detected: home, office and other, where other is considered to be any public place. These abstractions are very coarse and describe a location rather than a complete situation. However, with such coarse abstractions it is easy to determine the situation with few sensors, which makes the system easier to deploy. It is also easy to add rules if more detailed scenario-detection is desired.
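In its simplest form, this coarse scenario-detection amounts to a mapping from the user's current location to one of the three scenarios, as in the sketch below. The location strings and the mapping rules are assumptions for illustration.

// Sketch of the coarse scenario detection described above: only three
// scenarios are distinguished, based on the user's current location.
enum Scenario { HOME, OFFICE, OTHER }

class ScenarioDetector {
    static Scenario detect(String currentLocation) {
        if (currentLocation == null) {
            return Scenario.OTHER;          // unknown location treated as a public place
        }
        switch (currentLocation) {
            case "home":   return Scenario.HOME;
            case "office": return Scenario.OFFICE;
            default:       return Scenario.OTHER;
        }
    }
}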

Because only three different scenarios exist, it is also relatively simple to create different rules for how the quantifiable abstractions should be calculated in the different scenarios. Here the user can also add preferences in the corresponding user information repository, for example that privacy should have precedence over cost in office scenarios. The indexes which are assigned to the media resources are relative to each other, meaning the resources are assigned indexes in ascending order, starting from one, based on which resource is best for a given abstraction.


Which resource is best is determined from the information stored in the media resource information repositories. For example, for video cameras it would be appropriate to compare frame rate, image quality and placement if there is more than one video camera available.

When the abstraction creation is done, the algorithm can use the user preferences, located in the user information repository, to weigh the different abstractions together. These preferences affect which resources are suggested to the user when there are several to choose from. The scenario-detection abstraction helps decide which of the different preferences is applicable in a given situation, and from there it is a rather simple procedure to select an appropriate media resource, as the remaining variables are ordered in ascending order and a user preference exists. If desired, it is possible to create more complicated rules which take several factors into account, although this will also make it more difficult for the user to configure the system.
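A minimal version of this weighting step is sketched below, assuming per-abstraction indexes (1 = best) from the previous step and a per-scenario preference vector; the class names and the weighted-sum combination are assumptions, not the framework's exact algorithm.

import java.util.List;

// Sketch of weighing the per-abstraction indexes together using the user's
// preferences for the detected scenario. Names are illustrative assumptions.
class MediaResourceSelector {

    /** Relative importance of cost, privacy, quality and mobility in one scenario. */
    static class Preferences {
        final double cost, privacy, quality, mobility;
        Preferences(double cost, double privacy, double quality, double mobility) {
            this.cost = cost; this.privacy = privacy;
            this.quality = quality; this.mobility = mobility;
        }
    }

    /** Per-resource indexes assigned earlier: a lower index means a better resource. */
    static class Ranked {
        final String resourceId;
        final int costIndex, privacyIndex, qualityIndex, mobilityIndex;
        Ranked(String id, int c, int p, int q, int m) {
            this.resourceId = id;
            this.costIndex = c; this.privacyIndex = p;
            this.qualityIndex = q; this.mobilityIndex = m;
        }
    }

    static String select(List<Ranked> candidates, Preferences prefs) {
        String best = null;
        double bestScore = Double.MAX_VALUE;
        for (Ranked r : candidates) {
            // Weighted sum of ranks: the lowest score wins the suggestion.
            double score = prefs.cost * r.costIndex
                         + prefs.privacy * r.privacyIndex
                         + prefs.quality * r.qualityIndex
                         + prefs.mobility * r.mobilityIndex;
            if (score < bestScore) {
                bestScore = score;
                best = r.resourceId;
            }
        }
        return best; // suggested to the user, who still confirms the switch
    }
}

With such a scheme, a preference vector that weights privacy higher than cost in the office scenario lets a resource with a good privacy index win the suggestion even if it is somewhat more costly, matching the example above.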

The user needs to be notified when the algorithm has selected a new media resource. This can be done in several different ways; however, one matter which needs to be considered is that a user can utilize different resources with different capabilities, which means the way in which the user should be notified may differ as well. One way to solve this problem is to use a remote user interface, meaning the actual user interface is specified by the resource which presents it. To send new information to the remote user interface, the agent can send messages and events through a specified protocol, which is described in further detail in the following section.

7.3.3 Remote Control User Interface

The Remote Control User Interface runs as an application on the user's personal device. The purpose of the application is to present available media resources to the user, and to enable the user to select which ones should be used. The application can also receive suggestions from the agent that a certain media resource should be used, so that the user interface can guide the user in selecting the most appropriate resource. The suggestions are computed by the agent based on the user's current context, and involve taking quality, mobility, privacy concerns, and potential costs of using a resource into account. As the application only marks these suggestions in the user interface and does not automatically select them, the user is still in charge at all times and can override any faulty or improper suggestion from the agent.

When the agent finds that a media resource has been added or removed in the user's current information repository, it sends a message to the application that adds or removes its corresponding representation in the user interface. In a similar manner, when the agent has computed which media resource is the most appropriate to use given the user's current context and situation, it sends a suggestion event to the application which emphasizes this in the user interface. These add, remove, and suggest operations are all generic and kept independent of the actual implementation of a user interface, meaning that different personal devices can have different types of interfaces based on their capabilities.


In order to place as few requirements as possible on the user's personal device, the concept of UI remoting [38] is employed. This means the user interface is loosely coupled to the underlying functionality, with a minimum of signaling between them. It also means one user interface can be replaced by another user interface, yet still be able to control the same functionality. For example, a handheld computer can employ a simple graphical user interface, while a mobile phone can have an audio-only interface with speech control. An application running on a wearable computer could in turn employ a different kind of user interface, utilizing everything from an ordinary GUI to gesture control and implicit user interaction to make the selection. In Section 7.4.7 an example implementation of a remote control application is presented.
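One way to read these generic add, remove and suggest operations is as a small contract that every device-specific user interface implements in its own way. The sketch below uses assumed interface and class names and is not the framework's published API.

// Sketch of the generic operations a remote control user interface has to
// support, independently of how a particular device renders them.
interface RemoteControlUi {
    void addResource(String resourceId, String description);   // resource became available
    void removeResource(String resourceId);                     // resource disappeared
    void suggestResource(String resourceId);                    // agent's recommendation
}

// A trivial text-based implementation; a handheld GUI, an audio-only phone
// interface, or a gesture-controlled wearable UI would implement the same
// contract differently.
class ConsoleUi implements RemoteControlUi {
    public void addResource(String id, String description) {
        System.out.println("New resource available: " + description + " (" + id + ")");
    }
    public void removeResource(String id) {
        System.out.println("Resource no longer available: " + id);
    }
    public void suggestResource(String id) {
        System.out.println("Suggested resource: " + id + " (confirm to switch)");
    }
}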

Internally, when the user selects a certain media resource in the user interface, the application sends a message to the agent informing it about the request. The agent in turn notifies the mobility management gateway to configure the media resource for use. Feedback is given explicitly by the user interface to mark which resources are being used, and the user will also notice this implicitly by seeing the resources activated, e.g. a video stream moved from the user's head-mounted display onto a nearby TV screen.

7.3.4 Mobility Manager

To be able to switch to a selected media resource, a mobility management protocol is needed to redirect incoming and outgoing media streams to new media resources. In addition, media streams produced by new media resources must be integrated with the communication system being used, so that it looks like the media streams originate from the same user. For example, in a group communication system like Marratech [48], the user should only be given one identity, and be able to seamlessly switch to another media resource while communicating with another user.

Mobility management in general can be divided into several subclasses. The most researched class so far is host mobility, which refers to preserving an active communication session while switching between network interfaces. Another class, session mobility, refers to maintaining an active session while switching between devices or communication services. In contrast to host mobility, session mobility aims particularly at suspending a session and then resuming the session on another device. It can be viewed as a subproblem of personal mobility, which aims at providing access to services from anywhere, at any time, by using a personal identifier.

This paper focuses specifically on how to provide session mobility for RTP-based communication systems. Figure 7.3 shows two major approaches, which are both supported by the framework. The first approach, as depicted in Figure 7.3(a), is to implement mobility support in a proxy or gateway running in the network, similar to the Personal Proxy proposed in the Mobile People Architecture [44]. Session mobility support can then be implemented by forwarding incoming and outgoing traffic via the Personal Proxy and transcoding intercepted packets. For certain IP telephony gateways, the Megaco protocol [78] can be used for initializing and managing connections.


Figure 7.3: Two ways of implementing mobility support. (a) Mobility support in a media gateway: the Personal Communication Management Agent configures a mobility manager in a media gateway, which relays the RTP packets exchanged between media resources. (b) Mobility support in communication tools: the agent configures mobility managers embedded in each media resource, and RTP packets flow directly between the media resources.

The second approach of implementing mobility support is to add support directly to media resources, as depicted in Figure 7.3(b). One solution in this approach is to use the re-INVITE function available in the Session Initiation Protocol [28] to reestablish the communication after switching to another media resource. For example, when a user switches to another media resource, the new media resource sends a re-INVITE message to the media resource which the old media resource was connected to. Another solution is to implement an application-level router in each media resource, which redirects media streams to the different media resources used by the users.

Independently of how mobility support is implemented, the agent must be able to configure the mobility protocol used. If mobility support is implemented in media resources, the agent must be able to tell all media resources how to reestablish the communication and handle different streams. Similarly, if it is implemented in a gateway, the agent must be able to tell the gateway how to transcode different streams. In the framework, this task is performed by the mobility manager. The mobility manager maintains an internal database called the media resource table, which contains information on how to handle different media streams. The following two subsections discuss in more detail how mobility support can be implemented, and what kind of information must be stored in the media resource table.

Handling mobility support in a media gateway

In RTP-based communication systems, every media stream is assigned a unique identifier called the synchronization source identifier (SSRC). In addition, every user is assigned an identifier called the canonical name (CNAME), which is used to map several SSRCs to a particular user.


Hence, a simple solution to implement mobility support is just to use the same CNAME for every media resource belonging to a particular user, assuming the clients can select which media stream to present and block streams from inactive media resources. This will be further discussed in the next subsection. Configurable RTP translators in the gateway can be used to switch media streams without modifying the clients, if they cannot select which media stream to present. In this case, the media resource table needs to contain information about which streams to use for every user in the session.

The main advantage of implementing mobility support in a gateway is that the communication tools providing the media resources can be kept unmodified. The gateway can even transcode media streams to provide interoperability between otherwise incompatible tools. However, a potential drawback is that additional delay may be added, as the streams must be triangularly routed via the gateway instead of directly to the clients, assuming the mobility gateway is not integrated with another gateway required by the communication system.

Handling mobility support in media resources

An alternative to handling mobility in the gateway is simply to modify the graphical interface provided by the media resources and select which streams should be active for each user, for example by notifying a video sink which stream to present or deciding which video component to show. In order to avoid inconsistency, this method requires all applications to be synchronized and have a common view of how the system is configured. One way of synchronizing the applications is to have a distributed protocol and replicate configuration settings. In the framework this is done by letting each media resource have a separate copy of a shared media resource table containing mappings between user identifiers and the CNAMEs of active components or streams. As only the agent updates the media resource table, synchronization can be implemented simply by transferring a new media resource table to the mobility manager located in each media resource, or by letting the mobility managers fetch a new media resource table themselves.
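The replicated table and its use on the client side could be sketched as below: a mapping from user identifier to the CNAME of that user's active stream, replaced wholesale (or patched with a diff) whenever the agent pushes an update. The class names are assumptions, not the framework's actual classes.

import java.util.HashMap;
import java.util.Map;

// Sketch of the replicated media resource table kept by each mobility manager.
// It maps a user identifier to the CNAME of that user's currently active
// media resource, so a client knows which incoming stream to present.
class MediaResourceTable {
    private final Map<String, String> activeCnameByUser = new HashMap<>();

    void setActive(String userId, String cname) {
        activeCnameByUser.put(userId, cname);
    }

    boolean shouldPresent(String userId, String cname) {
        return cname.equals(activeCnameByUser.get(userId));
    }
}

class MobilityManager {
    private MediaResourceTable table = new MediaResourceTable();

    // Only the agent updates the table; synchronization here is done by simply
    // replacing the local copy (or, more efficiently, applying a diff).
    void onTableUpdate(MediaResourceTable newTable) {
        this.table = newTable;
    }

    boolean presentStream(String userId, String cname) {
        return table.shouldPresent(userId, cname);
    }
}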

The main advantage of implementing mobility support directly in media resources is that it does not require a central server for processing incoming and outgoing media streams. Another advantage is that it does not require special RTP translators or special codecs. However, there are two drawbacks of handling mobility in media resources that should be emphasized. The first drawback is that the communication tools containing the media resources need to be modified, which may not always be possible. The second drawback is that several messages must be exchanged between media resources in order to synchronize the media resource table. This issue will be further investigated in the next section.

7.4 Evaluation

The framework described in Section 7.3 has been implemented and can be used to assist developers in designing ubiquitous communication systems. It follows a modular design to allow more advanced systems to be developed in the future. For example, the algorithm for selecting media resources can easily be replaced by upgrading the agent.


The framework is also designed to allow different components to be loosely coupled. That is, different components can run on separate hosts and be dynamically added or removed at runtime. This is necessary to be able to support multiple users and to be able to switch between media resources residing on different hosts. This section describes how the framework is implemented, analyses its complexity and bandwidth requirements, and describes the proof of concept prototype used in the nursing home scenario.

7.4.1 Framework Implementation

The framework4 is implemented in Java to be platform independent, which makes it possible to run it on a wide range of platforms, including some mobile terminals. It provides a set of APIs which can be used to adapt multimedia communication systems so they can be used as media resources; this includes getting access to the media resource table mentioned in the previous section, and registering communication tools with information repositories so that they can be discovered by the agent. It also provides APIs to specify parameters in the media resource selection algorithm to make it more accurate. It is also possible to use the APIs to develop sensors and integrate them with the framework. In addition, APIs are provided for developing remote control user interfaces customized for a specific device.

To allow the components to communicate with each other, the framework provides an event notification service, which allows components to interact by sending and receiving event messages. In the current implementation, the event messages contain information about modified tuples, sensor data, or media resource tables, but they can also be extended to transfer other information. By registering with a specific component, a component can receive event notifications when the status of that component changes. For example, the agent can register with an information repository and be notified when a particular tuple is modified, thus making it possible for it to react to changes such as when a user moves to another location.

The event notification system is implemented using RMI (Remote Method Invocation), which is Java's approach to Remote Procedure Calls. By using RMI, a Java object can invoke methods on a remote object as if it were created locally. RMI is implemented using a mechanism in Java called object serialization, which is used to marshal and unmarshal method parameters when invoking a method on a remote object. However, as will be discussed in Section 7.4.3, object serialization can result in extra bandwidth utilization, thus limiting the scalability of the system. The rest of this section discusses the different operations provided by the framework in more detail and investigates the complexity of each operation.
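In RMI terms, the event notification boils down to remote interfaces along the lines of the sketch below, where the listener is itself a remote object so that a repository can call back into, for example, the agent when a tuple changes. The interface names and the event payload are assumptions, not the framework's actual classes.

import java.io.Serializable;
import java.rmi.Remote;
import java.rmi.RemoteException;

// Event payloads are serialized by RMI, which is where the bandwidth overhead
// discussed in Section 7.4.3 comes from.
class RepositoryEvent implements Serializable {
    final String repositoryId;
    final String modifiedKey;   // e.g. the tuple that was added or updated
    RepositoryEvent(String repositoryId, String modifiedKey) {
        this.repositoryId = repositoryId;
        this.modifiedKey = modifiedKey;
    }
}

// The listener is a remote object, so the repository can invoke it remotely.
interface RepositoryListener extends Remote {
    void onEvent(RepositoryEvent event) throws RemoteException;
}

interface RemoteInformationRepository extends Remote {
    void register(RepositoryListener listener) throws RemoteException;
    void publish(String key, Object value) throws RemoteException;
}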

Operations

The framework provides five categories of operations, which are needed to discover and switch to new media resources; a sketch gathering these operations into one interface is given after the list below.

add/remove/get information repository The first category is used to set up, dispatch, or acquire information repositories. In the current implementation, every communication tool is responsible for setting up its own media resource information repositories. Environment and user information repositories are created statically. To be able to remotely access information repositories, every information repository is assigned a unique identifier and registered with a global directory service.

4 Binaries and source code can be downloaded from http://media.csee.ltu.se/∼johank/ucmf/

Figure 7.4: Messaging between components when adding a tuple and switching to a new media resource. (The sequence runs: a sensor updates a tuple in an information repository, which generates an information repository event to the agent; the agent generates a suggestion event to the remote control UI, which returns a response event; the agent then generates a media resource event to the mobility manager, which updates the media resource table in the media resource.)

publish sensor data The second category is used by sensors to publish information to information repositories, i.e. add or update tuples in information repositories.

change location The third category is used by location sensors to link information repositories to other information repositories. This operation is similar to the publish sensor data operation, but causes the agent to search for new media resources.

update remote control user interface The fourth category is used to notify the user about discovered media resources and let the user decide if they should be used or not.

change media resource The last category is used to synchronize the media resource tables located in the mobility managers, in order to allow the user to carry out a switch to another media resource.
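Gathered into one interface, the five categories could look as follows; the method names and signatures are assumptions made for illustration and are simplified compared to the framework's actual APIs.

import java.util.List;

// Sketch of the five operation categories as a single interface.
interface FrameworkOperations {
    String addInformationRepository(String description);            // returns a repository id
    void removeInformationRepository(String repositoryId);
    Object getInformationRepository(String repositoryId);           // look up a remote reference
    void publishSensorData(String repositoryId, String key, Object value);
    void changeLocation(String userRepositoryId, String newEnvironmentRepositoryId);
    void updateRemoteControlUi(String userId, List<String> discoveredResourceIds);
    void changeMediaResource(String userId, String newResourceId);
}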

Figure 7.4 shows a sequence diagram illustrating the interactions between components when adding a tuple (publishing sensor data) causes the system to switch to a new media resource. After a sensor has added or updated a tuple in a particular information repository (e.g. a media resource information repository), an information repository event is generated, which is later received by the agent. The agent then compares the affected media resource to investigate how it performs in relation to the currently used media resource. This is done using the media resource selection algorithm described in Section 7.3.2. If the agent finds the new media resource to be better, it generates a suggestion event, which is sent to the remote control user interface to consult the user. A response event is then sent back to the agent. If the user accepted the suggestion, the agent sends a media resource event to all registered mobility managers, which makes sure all media resource tables are synchronized.


Figure 7.5: Messaging between components when changing location and switching to a new media resource. (The sequence is similar to Figure 7.4, but starts with a location sensor updating the user information repository; the agent then fetches the media resource information repository links from the new environment information repository, retrieves information about the media resources, and proceeds with a suggestion event, a response event, and a media resource event as before.)

Figure 7.5 shows a similar sequence diagram to the one just discussed, but illustrates the interactions between components when re-linking a user information repository to another environment information repository. In this case, a location sensor updates the tuple in the user information repository containing the link to the current environment information repository, which causes an information repository event to be generated. When the event is received by the agent, it accesses the new environment information repository to get a list of references to all media resource information repositories registered with the environment information repository. It then traverses all media resource information repositories to find out information about the new media resources, and uses the media resource selection algorithm to calculate which one is best to use. Given the results from the selection process, it generates a suggestion event to the remote control user interface, and updates remote media resource tables as previously described.

As the framework requires several messages to be exchanged between components, it is important to analyse how the total number of messages increases when the number of users and available media resources increases. It is also important to be aware of how much bandwidth these messages consume, as that limits the scalability of the system. For example, as will be investigated later in this section, the bandwidth overhead introduced by the event messages directly limits the total number of sensors that can be attached to an information repository. The next subsection discusses these issues in more detail and analyses the message complexity of each operation and the bandwidth overhead implications of the current implementation.


7.4.2 Message Complexity

Message complexity can be defined as how many signalling messages are needed to perform a specific operation. Table 7.1 summarizes the number of messages that need to be sent per operation and user. The first column in the table shows the number of messages that need to be sent if all components run on separate hosts. The second column shows the number of messages that need to be sent if all information repositories, the directory service, and the agent run on the same host, and the third column shows the number of messages that need to be sent if all sensors, all information repositories, and the agent run on the same host.

The complexity of the add information repository operation depends on how the directory service is implemented and whether the information repositories run on one host or are distributed over several hosts. In the current implementation three messages need to be sent. The first two messages are sent to get access to the directory service, which is also implemented using RMI. The third message registers the new information repository with the directory service. Note that zero messages need to be sent if the directory service and all information repositories run in the same virtual machine.

Table 7.1: Minimal number of messages required per operation.

Operation                        Separate hosts   Info. Rep.+Agent   All (except media resources)
Add information repository       3                0                  0
Remove information repository    3                0                  0
Get information repository       4                0                  0
Publish sensor data              2                1                  0
Change location                  4+2·m            1                  0
Update Remote Control UI         2                2                  2
Change media resource            1+u              1+u                1+u
Change media resource (GW)       1                1                  1

In the table, m is the total number of media resources attached to a specific environment information repository, and u the total number of users. As can be seen, all operations except the change location and the change media resource operations are O(1). The complexity of the change location operation depends on the number of media resources registered with an environment information repository, and is therefore O(m). This is because the agent needs to traverse all media resource information repositories in order to obtain information about them. The geographical area that an environment information repository represents may of course differ, but if the geographical area is small it is unlikely that m is significantly large. Therefore, the change location operation is in reality an inexpensive operation in terms of message complexity. The complexity of the change media resource operation, on the other hand, depends on how session mobility management is implemented. If mobility management is implemented in a media gateway, only one media resource update event needs to be sent. However, if mobility management is implemented in the media resources, media resource events need to be sent to all media resources connected to the session. This operation is O(u) as update messages must be sent to every user, and can thus be expensive if there are many connected users and the message size is significantly large. Note that sending a large number of messages may not only result in increased bandwidth overhead, but can also result in decreased user-perceived performance if it takes a long time to complete an operation.
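As an illustration (with m and u chosen arbitrarily): in a room with m = 5 registered media resources and all components on separate hosts, a change location operation requires 4 + 2·5 = 14 messages, while a change media resource operation in a session with u = 10 users requires 1 + 10 = 11 messages when mobility is handled in the media resources, but only a single message in the gateway case.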

As can be seen in Table 7.1, the total number of messages can be significantly reduced if the components run together in the same virtual machine. For example, the number of messages needed to publish sensor data can be reduced when the sensor runs together with the information repository, or the events generated by the information repository can be reduced if parts of the agent run together with the sensor. However, it may not always be possible to run all components in the same virtual machine. For example, several sensors that measure jitter in different media resources cannot run on the same machine, because the media resources normally run on different hosts. Hence, a good design rule is to try to locate an information repository close to the actual sensor that publishes data to it.

7.4.3 Bandwidth Overhead

Bandwidth consumption was measured using Ethereal [17] and includes IP and TCP headers. Table 7.2 summarizes the bandwidth required to execute different operations. The data presented in the table are average values from 30 runs using an Intel Pentium 4 (3.4 GHz) computer running a Linux 2.6.12 kernel.

Table 7.2: Bandwidth consumption per operation in kB.

Operation                          Separate hosts   Info. Rep.+Agent   All (except media resources)
Add information repository         5.0              0                  0
Remove information repository      5.0              0                  0
Get information repository         4.9              0                  0
Publish sensor data                5.3              5.3                0
Publish sensor data (keep-alive)   0.3              0.3                0
Change location                    9.3+3.8·m        5.3                0
Update Remote Control UI           8.0              8.0                8.0
Change media resource (all)        26.1+2.0·u       26.1+2.0·u         26.1+2.0·u
Change media resource (diff)       26.1+15.1·u      26.1+15.1·u        26.1+15.1·u
Change media resource (GW)         41.2             41.2               41.2

As can be seen, using RMI to implement the event notification is quite expensive in terms of bandwidth overhead. One reason is that the current implementation is not optimal, as some supplementary remote calls are executed to implement the change location and the change media resource operations. Although it was not as efficient as it could be, it was deemed appropriate to perform a bandwidth measurement of the framework as it was deployed in a real scenario.

In the table, two versions of the publish sensor data operation are presented. The difference between the publish sensor data and the publish sensor data (keep-alive) versions is that the connection to the information repository is not dropped in the keep-alive version. Using keep-alive significantly reduces the bandwidth overhead. Three versions of the change media resource operation are also presented. In the first version, change media resource (all), a separate copy of the media resource table is copied (serialized) to all clients. In the second version, change media resource (diff), only the differences between the old and the new media resource table are sent. As can be seen, only sending the difference is much more efficient in terms of bandwidth consumption compared to sending the whole table. In the third version, change media resource (GW), mobility management is implemented in a media gateway. In this case, only two messages need to be sent, which is the best solution in regard to signalling bandwidth overhead.

Table 7.3: Number of sensors on a 10Mbit network.

Update frequency   10 Hz   5 Hz   1 Hz   0.1 Hz
No keep-alive      24      48     241    2415
Keep-alive         426     853    4266   42666

Using Table 7.2, it can be calculated how many sensors can maximally be attached to an information repository. As can be seen in Table 7.3, when not using keep-alive and publishing sensor data at a frequency of 10 Hz, only 24 sensors can be used at the same time on a 10 Mbit network. In this case, all bandwidth is consumed by the sensors, leaving no bandwidth for other applications or users. The number of sensors can be increased by decreasing the update frequency and by keeping the connection to the information repository open. Needless to say, minimizing message headers is crucial when building a large scale system.
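To make the calculation explicit using the values from Table 7.2: treating 10 Mbit/s as 10 240 kbit/s, i.e. 1 280 kB/s, a sensor publishing at 10 Hz without keep-alive consumes 10 × 5.3 = 53 kB/s, giving ⌊1280/53⌋ = 24 sensors, whereas with keep-alive it consumes 10 × 0.3 = 3 kB/s, giving ⌊1280/3⌋ = 426 sensors, which matches Table 7.3.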

7.4.4 Time Complexity

Time complexity can be defined as the amount of time it takes before all media resource tables are synchronized. Theoretically, the time complexity for the change media resource operation when handling mobility in the media resources is O(u), or O(1) if multicast is used instead of unicast to distribute the events. On the other hand, if mobility is implemented in a gateway, only one message needs to be sent.

To evaluate how large this delay may be, an experiment was conducted by connecting test clients to the system and varying the number of connected clients. Each test client was started as a separate process running on the same machine as the rest of the system. The test clients only contained a mobility manager and a media resource table, i.e. they did not have any multimedia processing capabilities. As interprocess communication was done via the loopback interface, network delay was not taken into account. However, the purpose of the experiment was not to simulate a real ubiquitous environment, but rather to study the performance of the current implementation.

Figure 7.6 shows the results from the experiment. As can be seen, the delay is about one second for only ten users, which makes it hard to seamlessly switch media resources in the middle of a conversation, although this delay can be reduced with better hardware. Another way to reduce the delay is to only send the differences between media resource tables instead of the complete media resource table; sending only the changes significantly reduces the time it takes to synchronize the system.


Figure 7.6: Time before all clients are synchronized. (The plot shows synchronization time in milliseconds, from 0 to 5000 ms, against the number of users in addition to oneself, from 0 to 60, with curves for sending all media resource table entries and for sending one media resource table entry, measured on an Intel P4 at 3.2 GHz with 2 GB memory and on an AMD Athlon at 1.2 GHz with 384 MB memory.)

7.4.5 Proof of Concept

A proof of concept system was implemented to demonstrate the framework being used in a real world scenario. The setting chosen for this scenario was a local nursing home, because earlier research on the deployment and prototyping of wearable computers for communication had been conducted there [14]. The scenario concerns a nurse attending a patient, while a remote expert (e.g. a physician or physical therapist) provides guidance and offers a second opinion on the case. To enable communication with the remote expert, who sits in front of an ordinary desktop computer, the nurse utilizes wearable computing technology (e.g. head-mounted displays, headsets and cameras), through which audio and video from the meeting can be conveyed back and forth.

7.4.6 Scenario

In our scenario, the nurse utilizes a wearable computer with a head-mounted display which can be used to see the remote expert, as well as a head-mounted camera for examining a patient up close with free hands. As the nurse enters the room of a patient, a positioning system recognizes that the nurse now has access to a new information repository. In this information repository, two media resources are detected: a television screen and a web camera on top of it. The remote control user interface running on the nurse's wearable computer is notified by the agent that there are two media resources with better quality (the Q abstraction is lower) compared to the currently used media resources. The nurse now has the option of using the new media resources while remaining inside the room. When the nurse starts to examine the patient, the nurse is at first guided by the remote expert, but soon finds it better if the expert can speak with the patient directly. The nurse therefore chooses to use the TV screen instead of the head-mounted display to present the expert, so that both of them can watch the expert at the same time. As the video is now shown on a public display, the image disappears from the head-mounted display to avoid distracting the nurse. As the expert requests the patient to stand up and perform certain movements to determine the patient's physical health, the nurse switches to the web camera to get an overview of the room as the patient walks around. When the examination is over, the nurse leaves the room and thereby also the information repository, with the result that all the media streams are brought back to the wearable computer again.

7.4.7 Prototype Implementation

To realize the implementation of a functional communications system ready for deployment, the framework was integrated with a commercial e-meeting system called Marratech [48]. Marratech supports real-time audio, video, and chat communication over the Internet, in combination with application sharing and a shared whiteboard and web browser, to enable people to meet online from their ordinary desktop computers. Although Marratech was chosen for this specific implementation, other systems offering similar functionality should equally well be possible to integrate with the framework to realize the same kind of communications system.

Figure 7.7: The remote control user interface depicting two media resources.

The Marratech system consists of a central server called the Marratech Manager, together with Marratech clients connected to the server via IP multicast or unicast to enable group communication. Even though it would have been possible to modify the Marratech Manager, changes were made to the Marratech clients instead as this was simpler. This was done by modifying the user interface as described in Section 7.3.4. Each client was thereby made to function as a media resource containing one media sink (the video output on screen) and one media source (the camera-based video capture). Normally, each client presents an overview of all other clients in the e-meeting so that a user can see all the users who are participating. As each client now represents a media resource instead of a live participant, all media resource types belonging to the same user are aggregated to the same component in the user interface. For example, if the user has several video sources (cameras), these are grouped so other users only perceive one video stream for that particular user. Thus, as a user switches between different media resources in the framework, the mobility manager updates the clients to reflect this in their graphical user interfaces.

The sensor data required for detecting when new media resources become available was simulated using a Wizard of Oz method [12]. This was done for the purpose of robustness when demonstrating the proof of concept system. The authors could manually change the nurse's location in the system through a simple GUI, developed for the purpose of representing the available information repositories, users and media resources as nodes in a graph, making it possible to easily connect a user to a certain information repository.

Figure 7.8: System components used in the nursing home scenario. (The setup consists of the remote expert's computer, a Dell Latitude C400 with a 1.2 GHz Pentium III and 512 MB RAM; the wearable computer, a Sony Vaio U70P with a 1 GHz Pentium M and 512 MB RAM; and the information repository / Marratech Manager server, a Dell Latitude D600 with a 1.7 GHz Pentium M and 1 GB RAM, with a TV screen and a web camera attached; the computers communicate over an IEEE 802.11b wireless LAN.)

For showing available media resources to the user, a simple remote control user interface application was developed to run on the wearable computer. This application pops up a list of buttons depicting available media resources when the user enters a new information repository, and the nurse can thereby select on the computer's touch screen whether any external resources should be used. Figure 7.7 depicts this user interface as it appears to the user when having entered the room; the left icon represents the TV screen, while the right icon represents the web camera. The red exclamation mark over the TV informs the user that utilizing this media resource can have privacy issues because it is a public display.

7.4.8 Hardware used in the Scenario

In terms of hardware, the realization of the scenario in the nursing home required a number of components. As the medical workers' ordinary computers are locked down due to their handling of sensitive patient information, a number of laptops were brought to the site together with a wearable computer on which the framework and other necessary software was installed. Because of restrictions and security concerns regarding the in-house IEEE 802.11g wireless network, a separate WLAN was set up for the scenario using an IEEE 802.11b access point. Figure 7.8 illustrates the complete setup.

The “information repository server” runs an information repository for the current patient room. In addition, it also runs a modified Marratech client and a Marratech Manager to enable communication. The client's video is displayed on a TV inside the room through an S-video connection to the laptop. Video capture is handled by a web camera connected via USB, placed on top of the TV to get an overview of the room. The “Remote Expert's computer” runs a modified Marratech client which allows the remote expert to communicate with the nurse. The “Wearable Computer” is mounted on a vest so it can be worn by a nurse. The vest is reinforced with straps holding together the cables and stabilizing the body-worn camera, microphone and loudspeaker, providing the nurse with audio and video communication abilities. An SV-6 head-mounted display from MicroOptical can optionally be connected to the computer, to provide information directly to the nurse in private. Alternatively, the nurse can utilize the computer's display to receive visual information, and as the display is touch sensitive the nurse can also use it to operate an ordinary graphical user interface if needed. The communications software consists of a modified Marratech client integrated with the prototype.

7.4.9 Evaluation by End Users

The purpose of the evaluation was to get initial feedback on the usability of the system, and to validate whether the framework has the intended benefits. Having used an ordinary wearable computer in previous research projects run by the authors, the nurses were well aware of the concept of wearable computing and how such a system could be utilized in their daily work. They were first given an introduction to the functionality enabled by the framework and the different components, followed by the scenario being enacted. The nurses could then see how new media resources became available once they entered the patient's room, and change between the display of the wearable computer and the TV screen as they desired. They could also toggle between an overview of the room and details captured by the wearable's camera. Figure 7.9 illustrates the setup.

As the group of nurses was very small and consisted of only two persons, no statistically relevant data could be retrieved from this test. Instead, we handed out questionnaires and followed up with a discussion about the scenario in order to find answers to some of the research questions posed in the introduction. Although there were only two nurses, their input is valuable as they have used wearable computers before and are active nurses in elderly care.

With regard to the first research question listed in the introduction, the nurses were asked if the switch between media resources really was transparent. The responses to the questionnaires indicated that there were indeed no undesirable side effects; the nurses did not find the switch between the wearable computer and the TV screen distracting. During the discussion they also mentioned that they did not notice any delay associated with this action, as the switch happened instantaneously when they pushed the button to change displays.

With regard to the second research question, the nurses were asked if the dynamic wearable computer really was automatically configured, or if excess user intervention was required at some point. The responses to the questionnaires unanimously indicated that it was very easy for the nurses to select new media resources for utilization. According to the nurses, the simplicity of the remote control user interface, where only two large and easily identifiable buttons were needed, made the remote control very easy to understand.

Figure 7.9: Wearable computers used in the scenario. (a) A nurse using a statically configured wearable computer. (b) A nurse using a dynamic wearable computer while performing an examination of another nurse playing the role of patient, with the remote expert (in this case one of the authors) visible on the TV screen offering guidance. The web camera on top of the TV also provides a view of the examination for the expert.

The nurses were also asked, as an overall question regarding the purpose of this research, whether they really appreciated having this functionality of switching between devices, or if they would rather prefer a traditional wearable computer with everything readily available in their clothing. As the nurses have earlier experience of using such a wearable computer, this was a valid question to see which system they would prefer. The nurses' responses, both in the questionnaires and in the discussion, indicated that they preferred a dynamic wearable computer over a traditional wearable computer. By being able to utilize external devices in the patient's room, they could more easily select whether to bring the full wearable computer or a lightweight computer for certain purposes. Even though this freedom of choice required additional user interaction when selecting between media resources, the simplicity of the remote control user interface made it easy to comprehend.

Furthermore, there is the general question of whether changes between media resources really are handled unobtrusively. As the concept of wearable computing is being employed here, this is an important question to address to ensure that its use in real life situations will not impede or hinder the user. Currently, the agent can compute which media resource is appropriate to use and send a suggestion event to the remote control user interface. In theory, the system could automatically follow this suggestion and perform the change unobtrusively without user intervention. However, an earlier user study [27] indicated that users do not want a completely automated system unless it is 100% accurate. As such a reliable system can be very difficult to realize outside of a lab environment, the user needs to be in control and at any time be able to override erroneous suggestions from the system, at the cost of requiring an extra interaction to confirm the selection.


7.5 Discussion

This paper has presented a framework for ubiquitous multimedia communication, which allows media resources in the surrounding environment to be combined with a wearable computer carried by a user, thus realizing the concept of a dynamic wearable computer. In the introduction of this paper, the following research questions were posed.

1. What functionality is needed to transparently combine and switch between resources carried by the user and those available in the environment?

2. What functionality is needed to automatically configure resources to be used in the dynamic wearable computer?

3. How can the distributed information storage infrastructure be designed to provide easy access to information and support the decision making process?

In relation to the first research question, the paper has discussed several methods that can be used to switch between different media resources. Independently of which method is used, a new identifier is required to map each user to the different media resources they are currently using. In the framework, these mappings are stored in a database called the media resource table, which is updated by the mobility manager. The paper has shown that the number of messages can be significantly reduced if the mobility manager is added to a gateway instead of directly to the media resources. However, the drawback of adding the mobility manager to a gateway is that it may require modifications to both the clients and the gateway, if RTP translators are not added to the gateway and the clients cannot select which media streams to present. Because of this drawback, the mobility manager was added directly to the media resources in the proof of concept prototype used in the nursing home scenario. Nevertheless, the paper has shown that it may take some time to synchronize media resource tables if the mobility manager is added to the media resources, thus making it hard to transparently switch between media resources.

In relation to the second research question, the paper has proposed an agent-based system which uses an algorithm to automatically select and configure media resources. When a media resource is selected, the user is notified through a remote user interface, which is customized to a specific control device. To minimize the effects of a faulty selection, the user needs to make an active choice in selecting the media resource, although this choice is simplified by the automated suggestion. Because the user has to make an active choice, the requirements on the algorithm used for making selections are lowered, since the user always has the final say. The selection of media resources is based on user preferences, the current scenario, and the quantified abstractions cost, mobility, privacy and quality. These abstractions are based on contexts obtained from sensors and from manual input.
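
As an illustration of how such a selection could be computed, the sketch below scores each candidate resource from the four quantified abstractions, weighted by user preferences, and returns the best candidate as a suggestion. The linear scoring function and all names here are assumptions made for the example, not the algorithm actually used by the agent.

    import java.util.Comparator;
    import java.util.List;
    import java.util.Optional;

    // Scores candidate media resources from cost, mobility, privacy and quality,
    // weighted by user preferences, and suggests the highest-scoring one.
    public class ResourceSelector {

        record Candidate(String id, double cost, double mobility, double privacy, double quality) {}

        record Preferences(double wCost, double wMobility, double wPrivacy, double wQuality) {}

        static double score(Candidate c, Preferences p) {
            // Lower cost is better, so it is subtracted; the other abstractions add to the score.
            return -p.wCost() * c.cost()
                 + p.wMobility() * c.mobility()
                 + p.wPrivacy() * c.privacy()
                 + p.wQuality() * c.quality();
        }

        // Returns a suggestion; the user still has to confirm it through the remote user interface.
        static Optional<Candidate> suggest(List<Candidate> candidates, Preferences p) {
            return candidates.stream().max(Comparator.comparingDouble((Candidate c) -> score(c, p)));
        }

        public static void main(String[] args) {
            List<Candidate> candidates = List.of(
                new Candidate("hmd-display", 0.2, 0.9, 0.8, 0.4),
                new Candidate("room-tv", 0.1, 0.1, 0.3, 0.9));
            Preferences prefs = new Preferences(1.0, 0.5, 1.0, 0.8);
            suggest(candidates, prefs).ifPresent(c -> System.out.println("Suggested: " + c.id()));
        }
    }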

The information structure used to store contexts is addressed by the third research question. The paper suggests an ontology called information repositories, which divides the information structure into three different domains: the user domain, the environment domain, and the media resource domain. By dividing the information structure in this way, it is possible to link different information repositories together in a distributed and logical way to provide easy access to information. It also provides means for access control based on locality; for example, a user only gains access to information when in the vicinity. In the same manner, it would also be possible to create group information repositories, which provide access to media resources and information to group members. Another advantage of dividing the information structure this way is that the domains have their own lists of permitted contexts, which are interpreted differently based on which domain they are in. However, for the media resource information repository, there are also different permitted contexts depending on which media resource type it is for, e.g. audio sink/source, video sink/source, etc. This structure simplifies the decision-making process by providing the agent with known contexts, which can be used to make abstractions.
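
A minimal sketch of this division is given below, with each domain holding its own set of permitted contexts. The three domain names follow the paper, whereas the concrete context keys are illustrative assumptions.

    import java.util.Map;
    import java.util.Set;

    // Sketch of the three information repository domains and their permitted contexts.
    public class InformationRepositories {

        enum Domain { USER, ENVIRONMENT, MEDIA_RESOURCE }

        // Each domain has its own list of permitted contexts, interpreted
        // differently depending on which domain they belong to.
        static final Map<Domain, Set<String>> PERMITTED_CONTEXTS = Map.of(
            Domain.USER, Set.of("location", "preferences", "current-scenario"),
            Domain.ENVIRONMENT, Set.of("location", "available-resources"),
            Domain.MEDIA_RESOURCE, Set.of("location", "type", "owner", "state"));

        static boolean isPermitted(Domain domain, String context) {
            return PERMITTED_CONTEXTS.getOrDefault(domain, Set.of()).contains(context);
        }

        public static void main(String[] args) {
            System.out.println(isPermitted(Domain.MEDIA_RESOURCE, "type")); // true
            System.out.println(isPermitted(Domain.USER, "type"));           // false
        }
    }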

The paper also presents a proof of concept prototype which has been tested in a real world scenario, as well as in a laboratory setting. The results from the laboratory investigation show that reducing the bandwidth overhead caused by signalling between components is crucial, especially when building a large scale system. The current implementation uses RMI for message passing, which introduces significant overhead because of serialization. This overhead causes a problem with bandwidth usage, as the distributed system requires a large number of messages to be sent. A more efficient solution would be to reduce the message size by creating a special purpose protocol for communicating between components. Another way to make it more efficient is to reduce the number of messages sent. This can, for example, be done by locating the sensor and the media resource on the same host as the corresponding information repository.
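
The sketch below shows what such a special purpose message could look like: a handful of bytes with a type tag and two length-prefixed identifiers, instead of a serialized RMI call. The field layout is an assumption made for the example, not a protocol proposed in the paper.

    import java.io.ByteArrayOutputStream;
    import java.io.DataOutputStream;
    import java.io.IOException;
    import java.nio.charset.StandardCharsets;

    // Compact, special-purpose signalling message as an alternative to serialized RMI calls.
    public class CompactMessage {

        static final byte TYPE_SUGGESTION = 1;
        static final byte TYPE_BIND = 2;

        // Encodes a message as [type:1][userIdLen:2][userId][resourceIdLen:2][resourceId].
        static byte[] encode(byte type, String userId, String resourceId) throws IOException {
            ByteArrayOutputStream bytes = new ByteArrayOutputStream();
            DataOutputStream out = new DataOutputStream(bytes);
            out.writeByte(type);
            byte[] user = userId.getBytes(StandardCharsets.UTF_8);
            out.writeShort(user.length);
            out.write(user);
            byte[] resource = resourceId.getBytes(StandardCharsets.UTF_8);
            out.writeShort(resource.length);
            out.write(resource);
            return bytes.toByteArray();
        }

        public static void main(String[] args) throws IOException {
            byte[] msg = encode(TYPE_SUGGESTION, "nurse-1", "room-3-display");
            // A handful of bytes per message, compared to the serialized objects sent over RMI.
            System.out.println("Encoded message size: " + msg.length + " bytes");
        }
    }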

The results from the brief testing in the real world scenario indicate that the prototype has potential to increase efficiency and save time. Compared to an ordinary wearable computer unable to utilize media resources in the surrounding environment, the dynamic wearable computer was deemed better as it allowed for a more flexible and lightweight system. The scenario investigated in this paper targets a nursing home, although it can be applied to other situations and types of users. Having a working framework makes it possible to develop other dynamic wearable systems that can be used by other interest groups.

To summarize, this paper has proposed a framework for multimedia communication enabling dynamic wearable computing in ubiquitous environments. The framework offers fundamental building blocks which can be used to create prototypes for many different scenarios, thereby making it possible to deploy and evaluate future ubiquitous multimedia communication systems for providing richer communication. The proof of concept prototype, which was deployed and evaluated in this paper, has shown that it is possible to create a dynamic wearable computer in a nursing home scenario. Although the framework has been successfully tested, there are still several issues which need to be resolved, including how to minimize the bandwidth usage and the number of messages being sent. However, the paper has suggested a few solutions to these issues, which will be considered in future versions of the framework.


7.6 Acknowledgements

This work was funded by the Centre for Distance-spanning Healthcare (CDH), the Centre for Distance-spanning Technology (CDT), and the C4 project, which is supported by EU structural funds.


Bibliography

[1] A. Bierbaum and C. Just. Software tools for virtual reality application development, 1998. Applied Virtual Reality, SIGGRAPH 98 Course Notes.

[2] M. Billinghurst, J. Bowskill, M. Jessop, and J. Morphett. A wearable spatial conferencing space. In Proceedings of the 2nd International Symposium on Wearable Computers, pages 76–83, 1998.

[3] M. Billinghurst, S. Weghorst, and T. A. Furness. Wearable computers for three dimensional CSCW. In Proceedings of the International Symposium on Wearable Computers, pages 39–46, 1997.

[4] M. Boronowsky, T. Nicolai, C. Schlieder, and A. Schmidt. Winspect: A case study for wearable computing-supported inspection tasks. In 5th IEEE International Symposium on Wearable Computers (ISWC’01), 2001.

[5] N. Bretschneider, S. Brattke, and K. Rein. Head mounted displays for fire fighters. In Proceedings of the 3rd International Forum on Applied Wearable Computing, 2006.

[6] S. Brewster. Sound in the interface to a mobile computer. In HCI International’99, pages 43–47, 1999.

[7] S. Brewster, J. Lumsden, M. Bell, M. Hall, and S. Tasker. Multimodal ’eyes-free’ interaction techniques for wearable devices. In Conference on Human Factors in Computing Systems, pages 473–480, 2003.

[8] H. Chen, T. Finin, A. Joshi, L. Kagal, F. Perich, and D. Chakraborty. Intelligent agents meet the semantic web in smart spaces. IEEE Internet Computing, 8(6):69–79, 2004.

[9] M. Chen. Leveraging the asymmetric sensitivity of eye contact for videoconference. In Proceedings of the SIGCHI conference on Human factors in computing systems, pages 49–56. ACM Press, 2002.

[10] A. Clark. What do we want from a wearable user interface? In Proceedings of Workshop on Software Engineering for Wearable and Pervasive Computing, June 2000.

[11] L. Dabbish and R. Kraut. Coordinating communication: Awareness displays and interruption. In CHI 2003 Workshop: Providing Elegant Peripheral Awareness, 2003.


[12] N. Dahlbäck, A. Jönsson, and L. Ahrenberg. Wizard of oz studies: why and how. In Proceedings of the 1st international conference on Intelligent user interfaces, pages 193–200. ACM Press, 1993.

[13] A. K. Dey, D. Salber, and G. D. Abowd. A Conceptual Framework and a Toolkit for Supporting the Rapid Prototyping of Context-Aware Applications. Anchor article of a special issue on context-aware computing in the Human-Computer Interaction (HCI) Journal, 16:97–166, 2001.

[14] M. Drugge, J. Hallberg, P. Parnes, and K. Synnes. Wearable Systems in Nursing Home Care: Prototyping Experience. IEEE Pervasive Computing, 5(1):86–91, Jan-Mar 2006.

[15] M. Drugge, M. Nilsson, U. Liljedahl, K. Synnes, and P. Parnes. Methods for Interrupting a Wearable Computer User. In Proceedings of the 8th IEEE International Symposium on Wearable Computers (ISWC’04), November 2004.

[16] M. Drugge, M. Nilsson, R. Parviainen, and P. Parnes. Experiences of using wearable computers for ambient telepresence and remote interaction. In ETP ’04: Proceedings of the 2004 ACM SIGMM workshop on Effective telepresence, pages 2–11, New York, NY, USA, 2004. ACM Press.

[17] Ethereal. http://www.ethereal.com/, June 2006.

[18] M. W. Eysenck and M. T. Keane. Cognitive Psychology: A Student’s Handbook. Psychology Press (UK), 5th edition, 2005.

[19] S. Fickas, G. Kortuem, J. Schneider, Z. Segall, and J. Suruda. When cyborgs meet: Building communities of cooperating wearable agents. In Proceedings of the 3rd International Symposium on Wearable Computers, pages 124–132, October 1999.

[20] S. R. Fussell, L. D. Setlock, and R. E. Kraut. Effects of head-mounted and scene-oriented video systems on remote collaboration on physical tasks. In Proceedings of the conference on Human factors in computing systems, pages 513–520. ACM Press, 2003.

[21] S. K. Ganapathy, A. Morde, and A. Agudelo. Tele-collaboration in parallel worlds. In Proceedings of the 2003 ACM SIGMM workshop on Experiential telepresence, pages 67–69. ACM Press, 2003.

[22] D. Garlan, D. Siewiorek, A. Smailagic, and P. Steenkiste. Project aura: toward distraction-free pervasive computing. IEEE Pervasive Computing, 1(2):22–31, Apr-Jun 2002.

[23] H.-W. Gellersen, M. Beigl, and H. Krull. The mediacup: Awareness technology embedded in an everyday object. In HUC ’99: Proceedings of the 1st international symposium on Handheld and Ubiquitous Computing, pages 308–310, London, UK, 1999. Springer-Verlag.


[24] K. Goldberg, D. Song, Y. Khor, D. Pescovitz, A. Levandowski, J. Himmelstein, J. Shih, A. Ho, E. Paulos, and J. Donath. Collaborative online teleoperation with spatial dynamic voting and a human "tele-actor". In Proceedings of the IEEE International Conference on Robotics and Automation (ICRA’02), volume 2, pages 1179–1184, May 2002.

[25] K. Goldberg, D. Song, and A. Levandowski. Collaborative teleoperation using networked spatial dynamic voting. Proceedings of the IEEE, 91:430–439, March 2003.

[26] J. Hallberg. Improving everyday experiences using awareness and rich communication, June 2006. Licentiate in Engineering Thesis, ISSN 1402-1757 / ISRN LTU-LIC–06/35–SE / NR 2006:35.

[27] J. Hallberg, J. Kristiansson, P. Parnes, and K. Synnes. Supporting ubiquitous multimedia communication – user study and system design. Submitted for review.

[28] M. Handley, H. Schulzrinne, E. Schooler, and J. Rosenberg. SIP: Session initiation protocol, RFC 2543. http://www.faqs.org/rfcs/rfc2543.html, March 1999.

[29] J. I. Hong and J. A. Landay. An Architecture for Privacy-sensitive Ubiquitous Computing. In MobiSYS ’04: Proceedings of the 2nd international conference on Mobile systems, applications, and services, pages 177–189, New York, NY, USA, 2004. ACM Press.

[30] S. E. Hudson, J. Fogarty, C. G. Atkeson, D. Avrahami, J. Forlizzi, S. Kiesler, J. C. Lee, and J. Yang. Predicting human interruptibility with sensors: A wizard of oz feasibility study. In Proceedings of Conference on Human Factors in Computing Systems (CHI 2003), pages 257–264. ACM Press, 2003.

[31] J. Hughes, V. King, T. Rodden, and H. Andersen. The role of ethnography in interactive systems design. interactions, 2(2):56–65, 1995.

[32] N. P. Jouppi. First steps towards mutually-immersive mobile telepresence. In Proceedings of the 2002 ACM conference on Computer supported cooperative work, pages 354–363. ACM Press, 2002.

[33] N. Kern, S. Antifakos, B. Schiele, and A. Schwaninger. A Model for Human Interruptability: Experimental Evaluation and Automatic Estimation from Wearable Sensors. In Proceedings of the 8th IEEE International Symposium on Wearable Computing (ISWC’04), November 2004.

[34] G. Kortuem, M. Bauer, T. Heiber, and Z. Segall. Netman: The design of a collaborative wearable computer system. ACM/Baltzer Journal on Mobile Networks and Applications (MONET), 4(1), 1999.

[35] R. E. Kraut, M. D. Miller, and J. Siegel. Collaboration in performance of physical tasks: Effects on outcomes and communication. In Computer Supported Cooperative Work, 1996.


[36] J. Kristiansson, J. Hallberg, S. Svensson, K. Synnes, and P. Parnes. Supporting Automatic Media Resource Selection Using Context-Awareness. In 3rd International Conference on Advances in Mobile Multimedia (MoMM2005), pages 271–282, September 2005.

[37] M. Langheinrich. Privacy by design - principles of privacy-aware ubiquitous systems. In UbiComp ’01: Proceedings of the 3rd International Conference on Ubiquitous Computing, pages 273–291, London, UK, 2001. Springer-Verlag.

[38] C. Lee, S. Helal, and W. Lee. Universal interactions with smart spaces. IEEE Pervasive Computing, 5(1):16–21, Jan-Mar 2006.

[39] P. Lukowicz, T. Kirstein, and G. Troster. Wearable systems for health care applications. Methods of Information in Medicine, 43(3):232–238, 2004.

[40] N. Lund. Attention and Pattern Recognition. Routledge, East Sussex, UK, 2001.

[41] K. Lyons and T. Starner. Mobile capture for wearable computer usability testing. In Proceedings of IEEE International Symposium on Wearable Computing (ISWC 2001), pages 69–76, Zurich, Switzerland, October 2001.

[42] K. Lyons, T. Starner, D. Plaisted, J. Fusia, A. Lyons, A. Drew, and E. Looney. Twiddler typing: One-handed chording text entry for mobile phones. Technical report, Georgia Institute of Technology, 2003.

[43] P. P. Maglio and C. S. Campbell. Tradeoffs in displaying peripheral information. In CHI, pages 241–248, 2000.

[44] P. Maniatis, M. Roussopoulos, E. Swierk, K. Lai, G. Appenzeller, X. Zhao, and M. Baker. The Mobile People Architecture. ACM Mobile Computing and Communications Review, 3:36–42, July 1999.

[45] S. Mann. Wearable computing: A first step towards personal imaging. IEEE Computer, 30:25–32, February 1997.

[46] S. Mann. Personal imaging and lookpainting as tools for personal documentary and investigative photojournalism. ACM Mobile Networks and Applications, 4, March 1999.

[47] S. Mann and R. Picard. An historical account of the ‘wearcomp’ and ‘wearcam’ inventions developed for applications in ‘personal imaging’. In IEEE Proceedings of the First International Conference on Wearable Computing, pages 66–73, October 1997.

[48] Marratech. http://www.marratech.com/, March 2006.

[49] S. McCanne and V. Jacobson. vic: A flexible framework for packet video. In ACM Multimedia, pages 511–522, 1995.

[50] K. McCloghrie. Management Information Base for Network Management of TCP/IP-based internets: MIB-II, 1991. IETF RFC 1213.


[51] D. C. McFarlane. Interruption of people in human-computer interaction: A general unifying definition of human interruption and taxonomy. Technical report, US Naval Research Lab, Washington, DC, 1997. NRL/FR/5510-97-9870.

[52] D. C. McFarlane. Interruption of people in human-computer interaction, 1998. Doctoral Dissertation. George Washington University, Washington DC.

[53] D. C. McFarlane. Coordinating the interruption of people in human-computer interaction. In Human-Computer Interaction - INTERACT’99, pages 295–303. IOS Press, Inc., 1999.

[54] B. Nath, F. Reynolds, and R. Want. RFID Technology and Applications. IEEE Pervasive Computing, 5(1):86–91, Jan-Mar 2006.

[55] M. Nilsson, M. Drugge, U. Liljedahl, K. Synnes, and P. Parnes. A Study on Users’ Preference on Interruption When Using Wearable Computers and Head Mounted Displays. In Proceedings of the 3rd IEEE International Conference on Pervasive Computing and Communications (PerCom’05), March 2005.

[56] M. Nilsson, M. Drugge, and P. Parnes. In the borderland between wearable computers and pervasive computing. Research report, Luleå University of Technology, 2003. ISSN 1402-1528.

[57] M. Nilsson, M. Drugge, and P. Parnes. Sharing experience and knowledge with wearable computers. In Pervasive 2004: Workshop on Memory and Sharing of Experiences, April 2004.

[58] P. Parnes, K. Synnes, and D. Schefström. mStar: Enabling collaborative applications on the internet. Internet Computing, 4(5):32–39, 2000.

[59] R. Parviainen and P. Parnes. A Web Based History tool for Multicast e-Meeting Sessions. In Proceedings of the IEEE International Conference on Multimedia and Expo (ICME’2004), June 2004.

[60] R. Parviainen and P. Parnes. The MIM Web Gateway to IP Multicast E-Meetings. In Proceedings of the SPIE/ACM Multimedia Computing and Networking Conference (MMCN’04), 2004.

[61] E. Paulos. Connexus: a communal interface. In Proceedings of the 2003 conference on Designing for user experiences, pages 1–4. ACM Press, 2003.

[62] T.-L. Pham, G. Schneider, and S. Goose. A situated computing framework for mobile and ubiquitous multimedia access using small screen and composite devices. In MULTIMEDIA ’00: Proceedings of the eighth ACM international conference on Multimedia, pages 323–331, New York, NY, USA, 2000. ACM Press.

[63] C. Randell and H. Muller. The shopping jacket: Wearable computing for the consumer. Personal and Ubiquitous Computing, 4(4):241–244, 2000.


[64] S. Ratnasamy, P. Francis, M. Handley, and R. Karp. A Scalable Content-Addressable Network. In SIGCOMM, pages 161–172, 2001.

[65] M. Rettig. Prototyping for tiny fingers. Commun. ACM, 37(4):21–27, 1994.

[66] B. J. Rhodes. The wearable remembrance agent: A system for augmented memory. In Proceedings of The First International Symposium on Wearable Computers (ISWC ’97), pages 123–128, Cambridge, Mass., USA, 1997.

[67] B. J. Rhodes. WIMP interface considered fatal. In IEEE VRAIS’98: Workshop on Interfaces for Wearable Computers, March 1998.

[68] B. J. Rhodes, N. Minar, and J. Weaver. Wearable computing meets ubiquitous computing: Reaping the best of both worlds. In Proceedings of The 3rd International Symposium on Wearable Computers, pages 141–149, 1999.

[69] T. Richardson, Q. Stafford-Fraser, K. R. Wood, and A. Hopper. Virtual network computing. IEEE Internet Computing, 2(1):33–38, 1998.

[70] N. Roussel. Experiences in the design of the well, a group communication device for teleconviviality. In Proceedings of the tenth ACM international conference on Multimedia, pages 146–152. ACM Press, 2002.

[71] N. Sawhney and C. Schmandt. Nomadic radio: speech and audio interaction for contextual messaging in nomadic environments. ACM Transactions on Computer-Human Interaction, 7(3):353–383, 2000.

[72] W. N. Schilit. A system architecture for context-aware mobile computing. PhD thesis, Columbia University, 1995.

[73] J. Siegel, R. E. Kraut, B. E. John, and K. M. Carley. An empirical study of collaborative wearable computer systems. In Conference companion on Human factors in computing systems, pages 312–313. ACM Press, 1995.

[74] Standard Upper Ontology Working Group. Standard Upper Ontology. <http://suo.ieee.org/>, 2006.

[75] T. Starner. Attention, memory, and wearable interfaces. IEEE Pervasive Computing, 1(4):88–91, 2002.

[76] I. Stoica, R. Morris, D. Liben-Nowell, D. R. Karger, M. F. Kaashoek, F. Dabek, and H. Balakrishnan. Chord: A Scalable Peer-to-peer Lookup Protocol for Internet Applications. IEEE/ACM Transactions on Networking, 11:17–32, February 2003.

[77] F. Tang, C. Aimone, J. Fung, A. Marjan, and S. Mann. Seeing eye to eye: a shared mediated reality using eyetap devices and the videoorbits gyroscopic head tracker. In Proceedings of the International Symposium on Mixed and Augmented Reality (ISMAR2002), pages 267–268, Darmstadt, Germany, Sep. 1 - Oct. 1 2002.


[78] T. Taylor. Megaco/H.248: a new standard for media gateway control. IEEE Communications Magazine, 38(10):124–132, 2000.

[79] H. Wang, B. Rama, C. Chuah, R. Biswas, R. Gummadi, B. Hohlt, X. Hong, E. Kiciman, Z. Mao, J. Shih, L. Subramanian, B. Zhao, A. Joseph, and R. Katz. ICEBERG: An Internet-core Network Architecture for Integrated Communications. IEEE Personal Communications, 7:10–19, Aug 2000. Special Issue on IP-based Mobile Telecommunication Networks.

[80] R. Want, A. Hopper, V. Falcao, and J. Gibbons. The active badge location system. ACM Transactions on Information Systems, 10(1):91–102, 1992.

[81] R. Want, B. N. Schilit, N. I. Adams, R. Gold, K. Petersen, D. Goldberg, J. R. Ellis, and M. Weiser. The parctab ubiquitous computing experiment. Technical report, 1995.

[82] M. Weiser. The computer for the 21st century. Scientific American, 265(3):94–104, September 1991.

[83] H. Witt and M. Drugge. Hotwire: An apparatus for simulating primary tasks in wearable computing. In CHI ’06: Extended Abstracts on Human Factors in Computing Systems, April 2006.

[84] H. Witt, R. Leibrandt, A. Kemnade, and H. Kenn. Scipio: A miniaturized building block for wearable interaction devices. In Proceedings of International Forum on Applied Wearable Computing (IFAWC). VDE/ITG, 2006.

[85] M. J. Zieniewicz, D. C. Johnson, D. C. Wong, and J. D. Flatt. The evolution of army wearable computers. IEEE Pervasive Computing, 1(4):30–40, 2002.
