
Page 1

Usability Evaluation Methods

Universidade de Aveiro Departamento de Electrónica, Telecomunicações e Informática

Beatriz Sousa Santos

Page 2

• Usability is, according to ISO 9241-11:

“the extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency and satisfaction in a specified context of use”

• How can we measure it?

2

Page 3

3

(Cockton, 2013):

“Put simply, usability evaluation assesses the extent to which an interactive system is easy and pleasant to use.”

Things are not quite this simple, though:

- Usability is a measurable property of all interactive digital technologies

- Evaluation methods determine whether an interactive system or device is usable

- … and the extent of its usability, through robust, objective and reliable metrics

- Evaluation methods and metrics are thoroughly documented …

http://www.interaction-design.org/encyclopedia/usability_evaluation.html

Page 4

Evaluation Methods

• Analytical (without users):

– Heuristic Evaluation

– Cognitive Walkthrough

– Model-based methods

– Review methods

• Empirical (involving users):

– Observation (usability tests)

– Query

– Controlled Experiments

(some of these have been used in the Lab classes)

4

Page 5

Heuristic Evaluation (Nielsen and Molich 1990)

• A “discount usability engineering method” for quick, cheap, and easy evaluation of a UI design

• The most popular of the usability inspection methods

• It is a systematic inspection of a design for usability

• Meant to find the usability problems in the design so that they can be attended to as part of an iterative design process.

• Involves a small set of analysts judging the UI against a list of usability principles ("heuristics").

5

Page 6

• It is difficult for a single individual to do; one person will never find all the problems

• Involving multiple evaluators improves the effectiveness of the method significantly

• Nielsen generally recommends using three to five evaluators

• there is little additional gain from using larger numbers

6

Page 7

Example:

• Heuristic evaluation of a banking system:

– 19 evaluators

– 16 usability problems

[Figure: matrix of evaluators × usability problems; black square = problem found, white square = problem not found]

7

http://www.nngroup.com/articles/how-to-conduct-a-heuristic-evaluation/

This suggests that in general 3 to 5 evaluators may be reasonable
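The curve behind this recommendation can be approximated with the Nielsen–Landauer model, in which each evaluator independently finds a fixed proportion of the problems. A minimal sketch (the 31% detection rate is the commonly cited average; the real value depends on the UI and the evaluators):

```python
# Expected share of usability problems found as evaluators are added,
# following the Nielsen-Landauer model: Found(i) = 1 - (1 - L)^i.
def share_found(evaluators: int, detection_rate: float = 0.31) -> float:
    """Expected fraction of all problems found by `evaluators` independent evaluators."""
    return 1.0 - (1.0 - detection_rate) ** evaluators

for i in range(1, 11):
    print(f"{i:2d} evaluators -> ~{share_found(i):.0%} of problems")
# With a 31% rate, 3 evaluators find roughly 67% and 5 find roughly 84% of the
# problems, which is why adding many more evaluators yields little extra benefit.
```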

Page 8

How to select the number of evaluators for a specific case?

• Consider the following criteria:

– Complexity of the user interface

– Experience of the evaluators

– Expected costs/benefits

– How critical is the system (cost of user errors)

– …

8

Page 9

How to perform Heuristic Evaluation

Each evaluator:

• First makes a general pass to get to know the UI

• Then makes a systematic analysis with the heuristics in mind

• Takes note of each potential problem, the heuristic violated, and a severity grade

• Finally, all the potential problems are compiled

http://www.nngroup.com/articles/how-to-conduct-a-heuristic-evaluation/

9
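An illustrative way (not part of the slides) to record each finding together with the heuristic it violates and a severity grade, and then compile the findings of all evaluators; the field names are made up, and the 0–4 severity scale follows Nielsen's severity-rating articles:

```python
from dataclasses import dataclass

@dataclass
class Finding:
    evaluator: str
    location: str      # where in the UI the problem was observed
    description: str
    heuristic: str     # e.g. "Visibility of system status"
    severity: int      # 0 (not a problem) .. 4 (usability catastrophe)

def compile_findings(findings: list[Finding]) -> list[Finding]:
    """Merge all evaluators' notes and put the most severe problems first."""
    return sorted(findings, key=lambda f: f.severity, reverse=True)
```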

Page 10

• Visibility of system status

• Match between system and the real world

• User control and freedom

• Consistency and standards

• Error prevention

• Recognition rather than recall

• Flexibility and efficiency of use

• Aesthetic and minimalist design

• Help users recognize, diagnose, and recover from errors

• Help and documentation

10

Nielsen’s ten heuristics

https://www.nngroup.com/articles/ten-usability-heuristics/

Page 11

• Main advantages of heuristic evaluation:

– May produce useful results with a modest investment

– Simple to apply, even by less experienced evaluators

– May be used throughout the development process, from the early phases

• Main limitations:

– Subjective (partially overcome by using more, and more experienced, evaluators)

– Tends to find many small problems which may not be very important

– Cannot find all usability problems

-> evaluation involving users is needed!

11

Page 12

Cognitive Walkthrough (Wharton, et al., 1992)

• Usability inspection method not involving users (analytical)

• Based on the fact that users usually prefer to learn a system by using it (e.g., instead of studying a manual)

• Focused on assessing learnability (i.e., how easy it is for new users to accomplish tasks with the system)

• Applicable at early phases, before any coding

13

Page 13

How to perform a cognitive walkthrough

1 - Task analysis: the sequence of steps or actions required of a user to accomplish the task, and the corresponding system responses

2 - Designers and developers walk through those steps as a group, asking themselves a set of questions at each step

3 - Data gathering during the walkthrough: by answering the questions for each subtask, usability problems are detected

4 - Report of potential issues

5 - UI redesign to address the issues identified

14
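A hypothetical sketch of the bookkeeping in steps 2 and 3 (the question texts are abridged from the next slide; function and variable names are made up): for each step of the task analysis, the group records an answer and a note per question, and every "no" becomes a potential issue for the report.

```python
QUESTIONS = [
    "Will the user try to achieve the effect that the subtask has?",
    "Will the user notice that the correct action is available?",
    "Will the user understand that the subtask is achieved by the action?",
    "Does the user get feedback that they did the right thing?",
]

def record_step(step: str, answers: list[tuple[bool, str]]) -> list[str]:
    """Return the problems detected at this step (one per 'no' answer)."""
    return [f"{step}: {q} -- {note}"
            for q, (ok, note) in zip(QUESTIONS, answers) if not ok]

# Example: one step of the "find a phone number" task used later in these slides
issues = record_step("Find the search icon on the home page",
                     [(True, ""), (False, "icon has no visible label"), (True, ""), (True, "")])
print(issues)
```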

Page 14

CW Four questions:

• Will the user try to achieve the effect that the subtask has?

(Does the user understand this subtask is needed to reach the goal?)

• Will the user notice that the correct action is available?

(E.g. is the button visible?)

• Will the user understand that the desired subtask can be achieved by the action?

(E.g. the button is visible, but the user does not understand its label and will not click on it)

• Does the user get feedback?

Will the user know that they have done the right thing?

15

Page 15

Common issues

• The evaluator may not know the optimal way to perform the task, yet the method relies on the optimal sequence of actions

• It involves extensive analysis and documentation, and often too many potential issues are detected, making it very time-consuming

Thus:

Lighter variants of the Cognitive Walkthrough have been proposed to make it more applicable in software development companies

16

Page 16

Streamlined Cognitive Walkthrough (Spencer, 2000)

• Only two questions:

– Will the user know what to do at this step?
(this one comprises the first three questions of the CW)

– If the user does the right thing, will they know that they did the right thing, and are making progress towards their goal?

• And a set of rules to streamline the walkthrough and trade off granularity for coverage

17

Page 17

According to Spencer the method can be applied successfully if the usability specialist:

• takes care to prepare the team for the walkthrough,

• avoids design discussions during the walkthrough,

• explicitly neutralizes defensiveness among team members,

• streamlines the procedure by collapsing the first three questions into one question,

• and captures data selectively

18

Page 18

Practice the Streamlined Cognitive Walkthrough:

• Look for a phone number at the University of Aveiro Web site

user: any student from the University

• Create a PDF of a PowerPoint file using the Print option, without printing the hidden slides

user: anyone familiar with a previous version

21

Page 19

Look for a person’s phone number and email address at the University of Aveiro Web site User: any student from the University

Task analysis:

- find the icon (search);

- input part of the person’s name and search in “Pessoas“

- get the phone number

But the defined user profile (any student from the University) includes foreign students, so a preliminary action is needed:

- select the English version

For each action we need to ask the two questions and put ourselves in the shoes of the user!

22

Page 20

First action, in the Portuguese version: find the (search) icon

Q1 - Will the user know what to do at this step?
Even without a tooltip, the correct icon seems recognizable (it is “standard”).

Q2 - If the user does the right thing (selects the icon), will they know that they did the right thing, and are making progress towards their goal?
Probably yes; while it may not look like a search bar, it is adequately labeled (“Pesquisa em páginas, ...”).

Previous action, for foreign students: selecting the English version seems easy (it is done in a “standard” way, as on most sites).

23

Page 21

Second action: input part of the person’s name and search in “Pessoas“

Q1 - Will the user know what to do at this step?
Probably yes; it is easy to recognize that s/he should input the person’s name and select “Pessoas”.

24

Page 22

Q2 - If the user does the right thing (inputs the name and selects “Pessoas”), will they know that they did the right thing, and are making progress towards their goal?

Probably yes; however, some users may not recognize 24117 as a phone number

(it only has 5 digits, as it is internal, and not 9 as possibly expected)

25

In conclusion:
- it seems easy for the target users to reach the phone number and email address;
- however, the phone number may not be recognized as such

Page 23

Previous version of the site: Look for a person’s phone number at the University of Aveiro Web site user: any student from the University

Task analysis:

- look for the icon (directório);

- input part of the person’s name and search

- get the phone number

But the defined user profile (any student from the University) includes foreign students, so a preliminary action is needed:

- select the English version

For each action we need to ask the two questions and put ourselves in the shoes of the user!

26

Page 24

First action: find the icon

Q1 - Will the user know what to do at this step?
Even reading the tooltip (“directório”), the correct icon is possibly not recognizable!

Q2 - If the user does the right thing (selects the icon), will they know that they did the right thing, and are making progress towards their goal?
Probably yes; this looks like a familiar search bar and it is adequately labeled (“lista telefónica”; “pesquisar”).

27

Page 25

Second action: input part of the person’s name

Q1 - Will the user know what to do at this step?
Probably yes; the tooltip lets the user know he/she should input the person’s name and select “pesquisar”.

Q2 - If the user does the right thing (inputs the name and selects “pesquisar”), will they know that they did the right thing, and are making progress towards their goal?
Probably yes; however, some users might not recognize 24117 as a phone number (it only has 5 digits, as it is internal, and not 9 as possibly expected)

28

Page 26

30

Another example: Smart TV

How to access the Internet? (before reading the manual?)

(we only see the symbol on the screen after pressing the corresponding button on the remote control!)

Page 27

Practice the Streamlined Cognitive Walkthrough:

Analysing interactive systems/applications that should be very intuitive:

• Turn the video projector in your Lab on and off, using the remote control or directly on the projector

user: any student from the University

• Change the channel using the box of your TV service (not the remote control)

user: anyone having a TV box

31

Page 28

Limitations of Analytical Methods

– They are subjective

– They involve several usability experts

– They cannot find all usability problems

Thus, empirical methods (involving users) are needed:

– observation and query → usability tests (engineering approach)

– controlled experiments (scientific approach)

32

Page 29

Ethics in applying empirical methods

Involving users implies specific cautions:

– Asking for explicit consent

– Confidentiality

– Security (avoid any risk)

– Freedom (users may give up at any time)

– Limit stress

It’s the system that is under evaluation, not the user!

33

Page 30

Empirical evaluation modes

These methods may be performed:

– In the laboratory (more controlled)

– In the field (more realistic)

They produce complementary information;

if possible, both should be used!

34

https://www.nngroup.com/articles/field-studies/

Page 31

Observation

It has many variants, from very simple to very complex and expensive:

• Direct: the observer takes notes

• Indirect: through audio/video recording – more complex and time-consuming

• Think Aloud: users are asked to explain what they are doing

• Logging: user activity is logged by the system

• Combinations of the previous, etc.

35

https://www.usabilitybok.org/usability-testing-methods
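The logging variant above can be as simple as timestamping every user action so task times and error counts can be derived afterwards. A minimal, illustrative sketch (the event and target names are made up):

```python
import json
import time

LOG = []

def log_event(event: str, target: str) -> None:
    """Record one interaction event with a timestamp."""
    LOG.append({"t": time.time(), "event": event, "target": target})

log_event("click", "search_icon")
log_event("key_input", "name_field")
print(json.dumps(LOG, indent=2))
```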

Page 33

Query

• Two main variants:

– Questionnaire

(reach more people; less flexible)

– Interview

• Should always be carefully prepared and tested

• Collected data should be carefully analyzed

37

https://www.interaction-design.org/literature/article/how-to-conduct-user-interviews

https://www.interaction-design.org/literature/article/useful-survey-questions-for-user-feedback-surveys

Page 34

Well-known usability questionnaires

- System Usability Scale (SUS)

- Questionnaire for User Interface Satisfaction (QUIS)

• SUS provides a “quick and dirty” yet reliable tool for measuring usability

• It includes 10 questions with five response options

• QUIS is a measurement tool designed to assess a computer user's subjective satisfaction with the UI

• It is designed to be configured according to the needs of each UI analysis by including only the sections that are of interest to the user

• It includes questions with ten response options

• Both questionnaires should be completed following use of the UI in question

38

Page 35

System Usability Scale (SUS)

• Provides a “quick and dirty” yet reliable tool for measuring usability

• It includes 10 questions with five response options

• It allows evaluating a wide variety of products and services

(H/W, S/W, mobile devices, websites and applications)

• Has become an industry standard, with references in over 1300 publications

Benefits of using SUS:
• It is a very easy scale to administer to participants

• Can be used on small sample sizes with reliable results

• Is valid – it can differentiate between usable and unusable systems

39

https://www.usability.gov/how-to-and-tools/methods/system-usability-scale.html

Page 36

SUS Questions

• I think that I would like to use this system frequently.

• I found the system unnecessarily complex.

• I thought the system was easy to use.

• I think that I would need the support of a technical person to be able to use this system.

• I found the various functions in this system were well integrated.

• I thought there was too much inconsistency in this system.

• I would imagine that most people would learn to use this system very quickly.

• I found the system very cumbersome to use.

• I felt very confident using the system.

• I needed to learn a lot of things before I could get going with this system.

40

https://www.usability.gov/how-to-and-tools/resources/templates/system-usability-scale-sus.html

Page 37

Scoring SUS

Let R(n) be the answer to question n (on the 1–5 scale). Odd-numbered items contribute R(n) − 1 and even-numbered items contribute 5 − R(n):

SUS = 2.5 × Σ_{n=1..5} [ (R(2n−1) − 1) + (5 − R(2n)) ]

The score ranges from 0 to 100; SUS > 68 would be considered above average.

41
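A small sketch of this computation, assuming the ten answers are given in questionnaire order on the 1–5 scale:

```python
def sus_score(answers: list[int]) -> float:
    """SUS score (0..100) from the ten responses R(1)..R(10)."""
    assert len(answers) == 10 and all(1 <= a <= 5 for a in answers)
    odd = sum(a - 1 for a in answers[0::2])    # items 1,3,5,7,9 contribute R(n) - 1
    even = sum(5 - a for a in answers[1::2])   # items 2,4,6,8,10 contribute 5 - R(n)
    return 2.5 * (odd + even)

print(sus_score([4, 2, 5, 1, 4, 2, 5, 1, 4, 2]))  # -> 85.0, above the 68 average
```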

Page 38

QUIS - Questionnaire for User Interface Satisfaction

• The QUIS contains:

– a demographic questionnaire,

– a measure of overall system satisfaction,

– a measure of specific UI factors (e.g. screen visibility, terminology and system information, learning factors, and system capabilities)

• QUIS has paper-and-pencil and PC software versions for administration

• Uses a 10-point scale to rate 21 items relating to the system's usability

• These ratings produce data for the overall reaction to a system's usability on 6 factors.

• It is easy to use and analyse.

https://ext.eurocontrol.int/ehp/?q=node/1611

42

Page 39

Example questions of QUIS

43

Page 40

Usability tests

• Involve observation and query

• Main aspects:

– Participants

– Tasks

– Test facilities and systems

– Protocol

– Usability measures

– Data analysis

• May involve complex logistics

• Standard: Common Industry Format (CIF) for usability test reports

https://www.usability.gov/how-to-and-tools/methods/planning-usability-testing.html
https://www.interaction-design.org/literature/topics/usability-testing

Page 41

Participants

45

• The total number of participants to be tested

(a valid statistical analysis implies a sufficient number of subjects)

• Segmentation of user groups tested, if more than one

• Key characteristics and capabilities of user group (user profile: age, gender, computing experience, product experience, etc.)

• How to select participants

• Differences between the participant sample and the user population

(e.g. actual users might have training whereas test subjects were untrained)

Page 42

Tasks

46

• The task scenarios for testing

• Why these tasks were selected

(e.g. the most frequent tasks, the most troublesome tasks)

• The source of these tasks

(e.g. observation of users using similar products, product specifications)

• Any task data given to the participants

• Completion or performance criteria established for each task

(e.g. number of clicks < N, time limit)
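As an illustration of such criteria, a task attempt could be checked against hypothetical thresholds like these (the numbers are made up):

```python
def meets_criteria(clicks: int, time_s: float,
                   max_clicks: int = 10, time_limit_s: float = 120.0) -> bool:
    """True if the attempt satisfies the performance criteria set for this task."""
    return clicks <= max_clicks and time_s <= time_limit_s

print(meets_criteria(clicks=7, time_s=95.0))    # True
print(meets_criteria(clicks=14, time_s=95.0))   # False: too many clicks
```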

Page 43

Test Facilities and equipment

• The setting and type of space in which the evaluation will be done (e.g. usability lab, cubicle office, meeting room, home office, home family room, manufacturing floor, etc.)

• Any relevant features or circumstances that can affect the results (e.g. video and audio recording equipment, one-way mirrors, or automatic data collection equipment)

• Participant’s computing environment (e.g. computer configuration, including model, OS version, required libraries or settings, browser name and version; relevant plug-in, etc. )

• Display and input devices characteristics

• Any questionnaires to be used

47

Page 44

Protocol

• Procedure: the logical design of the test

• Participant general instructions and task instructions

• The usability measures to be used:

a) for effectiveness (completion rate, errors, assists)

b) for efficiency (times)

c) for satisfaction

48
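A minimal sketch of how these three kinds of measures can be computed from per-task records (the field names and data below are illustrative, not from the slides):

```python
from statistics import mean

records = [  # one record per participant x task
    {"completed": True,  "errors": 1, "assists": 0, "time_s": 95,  "satisfaction": 4},
    {"completed": True,  "errors": 0, "assists": 1, "time_s": 120, "satisfaction": 5},
    {"completed": False, "errors": 3, "assists": 2, "time_s": 240, "satisfaction": 2},
]

effectiveness = mean(1 if r["completed"] else 0 for r in records)   # completion rate
efficiency = mean(r["time_s"] for r in records if r["completed"])   # mean time on completed tasks
satisfaction = mean(r["satisfaction"] for r in records)             # e.g. post-task rating
print(effectiveness, efficiency, satisfaction)
```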

Page 45

Common Industry Format (CIF) for usability test reports ISO/IEC 25062:2006

• Specifies the format for reporting the results of a summative evaluation

• The most common type of usability evaluation is formative (i.e. designed to identify usability problems so that they can be fixed)

• A summative evaluation produces usability metrics that describe how usable a product is when used in a particular context of use

• The CIF report format and metrics are consistent with ISO 9241-11

https://www.iso.org/standard/43046.html https://www.userfocus.co.uk/articles/cif.html

49

Page 46

Software engineering -- Software product Quality Requirements and Evaluation (SQuaRE) -- Common Industry Format (CIF) for usability test reports

The format includes the following elements:

• the description of the product,

• the goals of the test,

• the test participants,

• the tasks the users were asked to perform,

• the experimental design of the test,

• the method or process by which the test was conducted,

• the usability measures and data collection methods, and

• the numerical results.

50

Page 47

Controlled experiments

• The “workhorse” of science ...

• Important issues to consider:

– Hypothesis

– Variables (input or independent; output or dependent)

– Secondary variables

– Experimental design (within groups; between groups)

– Participants (number, profile)

– Statistics

https://www.interaction-design.org/literature/book/the-encyclopedia-of-human-computer-interaction-2nd-ed/experimental-methods-in-human-computer-interaction

Page 48

Controlled experiment

• Define a hypothesis

• Define input (independent), output (dependent) and secondary variables

• Define the experimental design (within-groups / between-groups)

• Select the participants

• Prepare all the documentation:
– for the user: list of tasks (with perceived difficulty) and a final questionnaire
– for the observer: list of tasks on which to take notes

• Run a pilot test

• Take care of the logistics … and, after the experiment, analyze the data

52
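For a within-groups design, one simple way to vary the sequence of conditions across participants (to avoid order bias, as done in the study on the next slides) is to rotate the order per participant. A minimal sketch, using the condition names of that study; full Latin squares or random orders are common alternatives:

```python
CONDITIONS = ["no avatar", "realistic avatar", "translucent avatar"]

def order_for(participant: int) -> list[str]:
    """Rotate the condition order so each condition appears in each position equally often."""
    k = participant % len(CONDITIONS)
    return CONDITIONS[k:] + CONDITIONS[:k]

for p in range(6):
    print(p, order_for(p))
```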

Page 49

Examples of Controlled Experiments performed @ HCI - DETI

• Study of the Effect of Hand-Avatar in a Selection Task using a Tablet as Input Device in an Immersive Virtual Environment

• Comparing two alternative versions of Meo Go

53

Page 50

54

• Research question: How does the virtual representation of the user's hands influence the performance on a button selection task performed in a tablet-based interaction within an immersive virtual environment?

• Method: Controlled experiment

• 55 participants used three conditions: - no-hand avatar, - realistic avatar, - translucent avatar.

• Participants were faster but made slightly more errors with no-avatar

• The task was considered easier to perform with the translucent avatar

“Effect of Hand-Avatar in a Selection Task using a Tablet as Input Device in an Immersive Virtual Environment”

L. Afonso, P. Dias, C. Ferreira, B. Sousa Santos IEEE 3D UI, Los Angeles, March 2017

Page 51

Experimental Design

Null hypothesis: usability is independent of the hands’ representation

Independent (input) variable (with 3 levels): representation of the hands

Dependent (output) variable: usability (performance + satisfaction)

Within-groups design: all participants used all experimental conditions (in different sequences, to avoid bias)

Task: selecting, as fast as possible, a highlighted button from a group of twenty buttons (repeated measures)

Page 52

56

Experimental Conditions

1 - No avatar: the user only sees the virtual tablet
2 - Realistic avatar: a realistic representation of the hands’ movement is shown
3 - Translucent avatar: a translucent hand model is used (to alleviate occlusion)

[Figure: the no-avatar, realistic-avatar and translucent-avatar conditions]

Page 53

57

Experimental Set-up

• Laptop running the main application (in Unity)

• HMD (Oculus Rift DK2) providing head tracking

• Tablet (Google Nexus 7) as input device running the controller application (in Unity)

• Leap Motion (mounted on the HMD) to track the user’s hands

• Tablet camera tracking the position and orientation of an AR marker on the HMD to map tablet position in the virtual world (using Vuforia)

Page 54

58

Main Results

Selection time: on average, participants completed the button selections faster with no avatar (statistically significant)

Selection errors: participants made slightly fewer errors with an avatar – realistic or translucent – (statistically significant)

Participants’ opinion: the translucent avatar
- was more often preferred
- was considered better than the realistic avatar (statistically significant)

Page 55

59

Comparing two versions of Meo Go

Current version

New version

Altice Labs User/UX Group – DETI HCI course, April 2016

Page 56

Controlled experiment: comparing two versions of Meo Go: Current version vs Version to be deployed

• Null Hypothesis: both are equally usable

• Input (independent) variable: version

• Output (dependent) variables: performance, opinion and satisfaction

• Secondary variable: participant profile

• Participants: 66 volunteer HCI-2016 students (12 female)

• Experimental design: between-groups (one version per participant)

• Exploratory data analysis and non-parametric tests (ordinal variables)

60
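As an illustration of such a non-parametric, between-groups comparison (the ratings below are made up, not the study’s data), a Mann–Whitney U test can compare an ordinal output variable between the two independent groups:

```python
from scipy.stats import mannwhitneyu

version_1 = [4, 3, 5, 4, 4, 3, 2, 5]   # hypothetical satisfaction ratings, current version
version_2 = [5, 4, 4, 5, 3, 5, 4, 4]   # hypothetical satisfaction ratings, new version

stat, p = mannwhitneyu(version_1, version_2, alternative="two-sided")
print(f"U = {stat}, p = {p:.3f}")       # a small p would suggest a real difference
```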

Page 57

Summary of results

• Participants were satisfied with both versions; no significant difference between versions was observed

• Both versions got a good classification concerning application usage; no significant difference between versions was observed

• Both versions got a good classification concerning difficulty; no significant difference between versions was observed

• Both versions got a good classification concerning specific aspects of the application; two aspects improved significantly in version 2

• No significant difference between the two versions was observed concerning total task time

69

Page 58

Study main limitations

• Participants’ profile - students are not representative of all target users

• Tasks – simple and performed in a controlled context – limit ecological validity

• Think aloud – total task times must be considered with caution

However:

HCI students are aware of usability issues and provided valuable feedback

70

Page 59

• This data analysis was complemented by the analysis of comments and suggestions

• The combined analysis provided more insight concerning usability issues

• Allowed identifying easy-to-implement improvements to the UI, with a potential positive impact on the UX of the final version

71

Page 60

Bibliography for Usability evaluation – Books and links

- Alan Dix, Janet Finlay, Gregory Abowd, Russell Beale, Human-Computer Interaction, 3rd edition, Prentice Hall, 2004

- Jakob Nielsen, Usability Engineering, Morgan Kaufmann, 1993

- Jenny Preece, Yvonne Rogers, Helen Sharp, D. Benyon, S. Holland, T. Carey, Human-Computer Interaction, Addison Wesley, 1994

- Peter Mitchell, A Step-by-step Guide to Usability Testing, iUniverse, 2007

- Gilbert Cockton, Usability Evaluation. In: Soegaard, Mads and Dam, Rikke Friis (eds.), The Encyclopedia of Human-Computer Interaction, 2nd Ed, 2013, Aarhus, Denmark: The Interaction Design Foundation. (June, 10, 2018)

http://www.interaction-design.org/encyclopedia/usability_evaluation.html

- Norman/ Nielsen Group site - http://www.nngroup.com/articles/

- Standard ISO 9241-11 – Ergonomic requirements for office work with visual display terminals – Part 11: Guidance on usability

72

Page 61

Preparing the Exam

• Study the Slides

• Study the mandatory readings

• Answer the Exam preparation questions available at Moodle

87

Page 62

Mandatory Readings for the Exam

• Slides available at Moodle and at the course web page: http://sweet.ua.pt/bss/disciplinas/IHC-ECT/IHC-ECT-home.htm

• Alan Dix et al., Human-Computer Interaction, 3rd ed., Prentice Hall, 2004 (at the Library)
- Chapters 1 to 4 (for the topics addressed in the slides)
- Chapter 9 (for the topics addressed in the slides)
- Chapters 12, 14 and 16 (for the topics addressed in the slides)

• Randolph Bias, Deborah Mayhew, Cost-Justifying Usability, Morgan Kaufmann, 2005 (Chapter 3, pp. 42-55, available at Moodle)

• Ian Sommerville, Software Engineering, 9th ed., Addison Wesley, 2009 (Chapter 29, available at Moodle)

88