Indoor navigation support interfaces for visually impaired people



Frank J.J.M. Steyvers 1,2, Mark Balkema 3, Rik van Sommeren 3, Frits Polman 4, Else M. Havik 2 & Aart C. Kooijman 2.

Affiliations: 1: Department of Psychology, University of Groningen, The Netherlands; 2: Laboratory of Experimental Ophthalmology, University Medical Center Groningen; 3: former students Human Technology, Hanzehogeschool Groningen; 4: GuideID, Deventer, The Netherlands. Contact: [email protected]

Problem: Wayfinding is difficult for visually impaired people in complex environments, and GPS-based systems often do not work in such environments.

Solution: The University Medical Center Groningen is developing the interface of an RFID-based navigation system provided by GuideID, with the aim of making it particularly usable for visually impaired people. The system guides a person to a destination with auditory messages, available at three levels of detail. In addition, it can announce potentially relevant on-route landmarks, such as shops and toilets.

Issue: The interface of this system should offer optimal usability for visually impaired people; the interfaces of available outdoor navigation systems are insufficient.

Task Analysis: The interface should provide the following functions: enter a destination, choose the detail level for route information, choose the detail level for landmark information, repeat the last message, change system settings, go back one step, and give auditory information about key functions. The interface was implemented on a PDA with emulated functionality; the touch-screen keys were made tangible with a sheet of adhesive buttons (figure 1).
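To illustrate the task analysis, the sketch below enumerates the required functions together with the spoken announcement each key could give when first pressed. The function names, announcement texts, and the Python representation are assumptions made for illustration; they are not taken from the GuideID system or the tested prototypes.

```python
from enum import Enum, auto

class Function(Enum):
    # Functions identified in the task analysis (names are illustrative).
    INSERT_DESTINATION = auto()
    ROUTE_DETAIL_LEVEL = auto()
    LANDMARK_DETAIL_LEVEL = auto()
    REPEAT_LAST_MESSAGE = auto()
    CHANGE_SETTINGS = auto()
    GO_BACK = auto()
    KEY_HELP = auto()

# Hypothetical spoken announcement per key; in the prototype an auditory
# message of this kind was played when a key was pressed.
ANNOUNCEMENTS = {
    Function.INSERT_DESTINATION: "Destination: choose where you want to go",
    Function.ROUTE_DETAIL_LEVEL: "Route detail: set how detailed route messages are",
    Function.LANDMARK_DETAIL_LEVEL: "Landmarks: set how much landmark information you hear",
    Function.REPEAT_LAST_MESSAGE: "Repeat: hear the last message again",
    Function.CHANGE_SETTINGS: "Settings: change system settings",
    Function.GO_BACK: "Back: go one step back",
    Function.KEY_HELP: "Help: hear what each key does",
}
```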

Method: Twelve visually impaired participants tested three different interfaces: Sophis, Daisy, and Irda (figure 2). When a key was pressed, an auditory message announced the function of that key. Use of that function then had to be confirmed, either by pressing a dedicated confirmation key (Sophis and Daisy) or by pressing the selected key a second time (Irda). Destinations could be selected from a hierarchical menu that was scrolled with the arrow keys. Each key had its own function.

Tasks: select a destination (twice), repeat the last message, adjust the speaker volume, choose on-route landmark information, stop the navigation (twice), and consult the help function. The order of the interfaces was counterbalanced. The number of key-clicks used to complete each task was recorded, as was the time taken (for destination selection only). Participants were interviewed about their experiences.

Results: For each task, the number of key-clicks in excess of the minimum needed was calculated. Table 1 shows this number of extra clicks per task, averaged across participants. Table 2 gives the average times for destination selection.
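The main interaction difference between the interfaces lay in how an announced function was activated: Sophis and Daisy used a separate confirmation key, while Irda required the selected key to be pressed a second time. The sketch below illustrates that two-step interaction; the key names, scheme labels, and event model are assumptions for illustration, not the implementation of the tested prototypes.

```python
def handle_press(key, last_announced, scheme):
    """Two-step key activation (sketch): the first press of a key announces
    its function; activation then requires either the dedicated confirm key
    (Sophis, Daisy) or a second press of the same key (Irda).
    Returns (new_last_announced, action)."""
    if scheme in ("sophis", "daisy"):
        if key == "CONFIRM" and last_announced is not None:
            return None, ("activate", last_announced)  # confirm the announced function
        return key, ("announce", key)                  # speak this key's function
    if scheme == "irda":
        if key == last_announced:
            return None, ("activate", key)             # same key pressed again -> activate
        return key, ("announce", key)
    raise ValueError(f"unknown scheme: {scheme}")

# Example: with Irda, pressing "destination" twice announces and then activates it.
state, action = handle_press("destination", None, "irda")   # ('announce', 'destination')
state, action = handle_press("destination", state, "irda")  # ('activate', 'destination')
```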

Figure 2: The three tested interfaces: Sophis, Daisy, and Irda

Device   Select dest. 1   Select dest. 2   Repeat message   Adjust volume   On-route info   Stop nav. 1   Stop nav. 2   Use help
Sophis        17               13                2                4               -              4             2           2
Daisy         21               17                8                8               6              4             1           2
Irda          25               13                4               10               9              3             2           3

Table 1: Number of key-clicks above the minimum required to perform each task
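Table 1 reports clicks above the minimum needed for each task, averaged across participants. A minimal sketch of that calculation is given below; the numbers in the example are invented purely to show the arithmetic and are not data from the study.

```python
def mean_extra_clicks(observed_clicks, minimum_clicks):
    """Average number of key-clicks above the task's minimum, across participants."""
    return sum(c - minimum_clicks for c in observed_clicks) / len(observed_clicks)

# Illustrative only: four participants completing a task whose minimum is 6 clicks.
print(mean_extra_clicks([8, 10, 7, 9], minimum_clicks=6))  # 2.5
```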

Device   Select destination 1   Select destination 2
Sophis          115                     68
Daisy           129                     93
Irda            133                     87

Table 2: Time (s) required to perform the destination-selection tasks

Discussion: Overall, the Sophis interface provided the best usability, but its dedicated design cannot be transferred to more generic devices. A new interface (figure 3) was therefore designed that combines the best features of the three tested interfaces. This functional design may fit existing keyboard layouts, and could thus make the wayfinding functions that are increasingly offered on commonly available cell phones and PDAs accessible to visually impaired users. Since visually impaired people need tactile feedback, an add-on with clearly distinguishable raised keys should be developed for use on PDA touch screens.

Figure 3: Synthesis interface combining features of the three tested designs


Figure 1: Usability test with the PDA and the tactile button sheet