
Page 1:

HCI460: Week 1 Lecture
September 9, 2009

Page 2:

Course overview

Overview of usability evaluation methods

Heuristic evaluation

Cognitive walkthrough

Expert evaluation

Presenting results

Project 1a

Outline

Page 3:

Course Overview

Page 4:

Instructors
– Gavin Lew
– Aga Bojko

Office hours
– Wed 5pm – 5:45pm and 9pm – 9:45pm
– Lewis 1111, Loop Campus (our classroom)

Email address
– [email protected]

Course Web page
– http://www.usercentric.com/hci460-fall2009.html

Prerequisites
– HCI 440 and basic statistics

Basic Information (Course Overview)

Page 5:

This course surveys methods for evaluating the usability of a wide range of products and interfaces.

We will discuss and practice the following methods:
– Heuristic and expert evaluations
– Cognitive walkthroughs
– Usability testing (formative and summative)
– Surveys
– Eye tracking
– Contextual inquiries
– KLM-GOMS
– Focus groups

Course Summary (Course Overview)

Page 6:

To learn how to:
– Establish appropriate evaluation objectives
– Select evaluation methods that address evaluation objectives and take into account existing constraints
– Articulate advantages and disadvantages of usability evaluation methods
– Properly use various usability evaluation methods
– Present results and prepare effective reports

Course Goals (Course Overview)

Page 7:

Required Text
– Handbook of Usability Testing by Rubin
  • 1st edition: ISBN 0-471-59403-2
  • 2nd edition: ISBN 0-470-18548-1
– Task-Centered User Interface Design: A Practical Introduction by Lewis and Rieman (online text)

Optional Text
– Measuring the User Experience: Collecting, Analyzing, and Presenting Usability Metrics by Tullis and Albert (ISBN 0-123-73558-0)
– Usability Inspection Methods by Nielsen and Mack (ISBN 0-471-01877-5)
– Various papers

Textbooks (Course Overview)

Page 8:

Project 1: Expert evaluation
– Individual notes
– Report

Project 2: Formative usability study
– Test plan
– Participant screening questionnaire and moderator's guide
– Conducting a test
– Report

Project 3: Quantitative comparison study
– Test plan
– Report

Projects (Course Overview)

Page 9:

15% Project 1: Expert evaluation
25% Project 2: Formative usability study
15% Project 3: Quantitative comparison study
10% Take-home midterm quiz
25% Final exam
10% Individual contribution to projects (next slide)

Attendance?
– Not required but strongly recommended
– Projects and exams will cover both lecture and reading material

Grading (Course Overview)
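To make the weighting concrete, here is a minimal Python sketch that computes a weighted final grade. Only the weights come from the breakdown above; the component scores are hypothetical and purely illustrative.

    # Grading weights from the breakdown above; scores are hypothetical.
    weights = {
        "Project 1": 0.15,
        "Project 2": 0.25,
        "Project 3": 0.15,
        "Midterm quiz": 0.10,
        "Final exam": 0.25,
        "Individual contribution": 0.10,
    }

    scores = {  # hypothetical scores out of 100
        "Project 1": 88,
        "Project 2": 92,
        "Project 3": 85,
        "Midterm quiz": 90,
        "Final exam": 80,
        "Individual contribution": 95,
    }

    # Weighted sum; the weights add up to 1.0, i.e., 100% of the grade.
    final_grade = sum(weights[k] * scores[k] for k in weights)
    print(f"Weighted final grade: {final_grade:.2f} / 100")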

Page 10:

Grading: Individual Contribution to Projects (Course Overview)

Page 11:

Schedule (Course Overview)

Text readings will be assigned a week prior to each class.

– Sep 9: Course overview, overview of usability evaluation methods, heuristic evaluation, cognitive walkthrough, expert evaluation
– Sep 16: Expert evaluation continued, fundamentals of study design
  • Reading: Rubin ch. 1; Lewis and Rieman ch. 4 (except 4.2)
  • Due: Project 1a - Expert evaluation: individual notes
– Sep 23: Formative usability testing: How to prepare a study
  • Due: Project 1b - Expert evaluation: final report
– Sep 30: Formative usability testing: How to conduct a study
  • Due: Project 2a - Formative usability study: test plan
– Oct 7: Formative usability testing: How to analyze findings, formulate recommendations, and write reports
  • Due: Project 2b - Formative usability study: participant screening questionnaire and moderator's guide
– Oct 14: Teams will be conducting usability testing
  • Due: Project 2c - Formative usability study: conducting the test
  • Take-home midterm will be distributed
– Oct 21: Summative usability testing
  • Take-home midterm will be collected
– Oct 28: Other usability evaluation methods: eye tracking, KLM-GOMS, contextual inquiry, surveys, remote usability testing, focus groups
  • Due: Project 2d - Formative usability study: report
  • Due: Project 3a - Quantitative comparison study: test plan
– Nov 4: (nothing listed)
– Nov 11: Team presentations, overview of material for final exam
  • Due: Project 3b - Quantitative comparison study: report
– Nov 18: Final exam

Page 12:

Overview of Usability Evaluation Methods

Page 13:

Usability Evaluation Methods in Context (Overview of Usability Evaluation Methods)

[Diagram: Usability Evaluation Methods shown as a subset of User Experience (UX) Methods, with questions about where individual methods belong: Usability testing? Participatory design? Survey?]

Page 14:

Usability Evaluation Methods in Context (Overview of Usability Evaluation Methods)

[Diagram: User Experience (UX) Methods and Usability Evaluation Methods split into those that involve users and those that do not.]

Do not involve users: heuristic evaluation, cognitive walkthrough, KLM-GOMS

Involve users: formative usability testing, summative usability testing, ethnographic research, participatory design, card sorting, focus groups, eye tracking, surveys

Page 15:

UX Methods Involving Users (Overview of Usability Evaluation Methods)

[Diagram: methods plotted on two axes: Attitude vs. Behavior, and Qualitative vs. Quantitative.]

Methods shown: formative usability testing, summative usability testing, ethnographic research, participatory design, card sorting, focus groups, eye tracking, surveys

Page 16:

Usability Evaluation Methods Involving Users (Overview of Usability Evaluation Methods)

[Diagram: the same Attitude/Behavior and Qualitative/Quantitative axes, now showing only the usability evaluation methods.]

Methods shown: formative usability testing, summative usability testing, ethnographic research, focus groups, eye tracking, surveys

Page 17:

Usability Evaluation Methods Without Users (Overview of Usability Evaluation Methods)

[Diagram: methods plotted on a Qualitative vs. Quantitative axis.]

Methods shown: heuristic evaluation, cognitive walkthrough, and expert evaluation (grouped as usability inspection methods), plus KLM-GOMS

Page 18:

Involving users is expensive (time, money).

No users = "discount usability"

Usability inspection is quick, cheap, and useful.

It should be done before a usability test to "clean up" the interface of obvious issues.
– If the interface is not cleaned up, participants will be distracted by those issues and will waste time.

Why No Users? (Overview of Usability Evaluation Methods)

Page 19:

Heuristic Evaluation

Page 20:

Heuristics = guidelines, principles, rules of thumb

There are many sets of usability heuristics:
– Jakob Nielsen's Heuristics – 1994 (link)
– Tognazzini's First Principles of Interaction Design – 2003 (link)
– Jill Gerhardt-Powals' Cognitive Engineering Principles – 1996 (link)
– Research-Based Web Design and Usability Guidelines – 2004 (link)

What Are Heuristics? (Heuristic Evaluation)

Page 21:

Nielsen's Heuristics (Heuristic Evaluation)

1. Visibility of System Status

2. Match Between System and the Real World

3. User Control and Freedom

4. Consistency and Standards

5. Error Prevention

6. Recognition Rather than Recall

7. Flexibility and Efficiency of Use

8. Aesthetic and Minimalist Design

9. Error Recovery

10. Help and Documentation

Page 22:

Nielsen's Heuristics: #1 (Heuristic Evaluation)

1. Visibility of System Status
– The system should always keep users informed about what is going on, through appropriate feedback within reasonable time.

Example issues: There is no indication of location within the application. Clicking on Map History does not display anything if the history is empty.

Page 23:

Nielsen's Heuristics: #2 (Heuristic Evaluation)

2. Match Between System and the Real World
– The system should use phrases and concepts familiar to the user. Follow real-world conventions, making information appear in a natural and logical order.

Example issues: The order of the controls is incorrect; making a selection in the dropdown depends on whether or not Enable Auto Update is selected. "Processing weather data" is a system-oriented term that appears when the user clicks on "Update Weather Now."

Page 24:

Nielsen's Heuristics: #3 (Heuristic Evaluation)

3. User Control and Freedom
– Users often choose system functions by mistake and will need a clearly marked "emergency exit" to leave the unwanted state without having to go through an extended dialogue. Support undo and redo.

Example issue: Processing can take a while, and there is no way to cancel the action or move the box to the side.

Page 25:

Nielsen's Heuristics: #4 (Heuristic Evaluation)

4. Consistency and Standards
– Users should not have to wonder whether different words, situations, or actions mean the same thing. Follow platform conventions.

Example issue: Placement of the Browse button does not follow standards; the Browse button generally appears to the right of the file textbox.

Page 26:

Nielsen's Heuristics: #5 (Heuristic Evaluation)

5. Error Prevention
– Even better than good error messages is a careful design which prevents a problem from occurring in the first place.

Example issue: The text field is too long and accepts 256 digits, suggesting that the required input should be longer than five digits.

Page 27:

Nielsen's Heuristics: #6 (Heuristic Evaluation)

6. Recognition Rather than Recall
– Make objects, actions, and options visible. The user should not have to remember information from one part of the dialogue to another. Instructions for use of the system should be visible or easily retrievable whenever appropriate.

Example issue: Most of the icons are not intuitive, and they are not labeled. Users have to remember what each icon means or hover over them, which negatively impacts efficiency.

Page 28:

Nielsen's Heuristics: #7 (Heuristic Evaluation)

7. Flexibility and Efficiency of Use
– Accelerators (unseen by the novice user) may often speed up the interaction for the expert user such that the system can cater to both inexperienced and experienced users. Allow users to tailor frequent actions.

Example issues: The Close control is difficult to see, and the clickable area associated with it is very small; hitting the target area is difficult and may take a few tries. The window does not close when Alt-F4 is pressed, which is inconsistent with other Windows applications.

Page 29:

Nielsen's Heuristics: #8 (Heuristic Evaluation)

8. Aesthetic and Minimalist Design
– Dialogues should not contain information which is irrelevant or rarely needed. Every extra unit of information in a dialogue competes with the relevant units of information and diminishes their relative visibility.

Example issues: It is unnecessary to include the word "Map" on all buttons; it is clear that this is a list of maps and all the actions will be performed on maps. The tray tooltip is unnecessarily long for the amount of information it conveys.

Page 30:

Nielsen's Heuristics: #9 (Heuristic Evaluation)

9. Error Recovery
– Help users recognize, diagnose, and recover from errors. Error messages should be expressed in plain language (no codes), precisely indicate the problem, and constructively suggest a solution.

Example issue: After entering an incorrect zip code, when users click on the "Update Weather Now" icon, they see an error message. The message provides no information on what the problem is or how to fix it.

Page 31:

Nielsen's Heuristics: #10 (Heuristic Evaluation)

10. Help and Documentation
– Even though it is better if the system can be used without documentation, it may be necessary to provide help and documentation. Any such information should be easy to search, focused on the user's task, list concrete steps to be carried out, and not be too large.

Example issues: The Help document is very long and requires lengthy scrolling to access certain sections; this specifically impacts users who are unaware of the Find feature in Notepad. Additionally, the sections are not labeled with the system established in the Table of Contents (A, B, C, etc.), making the document difficult to search.

Page 32:

Form groups of 3 in class.
– Online students can do this exercise individually.

Each group will get a handout with Nielsen's 10 heuristics and screenshots of two interfaces.

With the heuristics in mind, find a few usability issues with each interface.
– Assign an appropriate heuristic to each issue.
– There may be more than one heuristic per issue.

Nielsen's Heuristics: Exercise (Heuristic Evaluation)

1. Visibility of System Status: The system should always keep users informed about what is going on, through appropriate feedback within reasonable time.

2. Match Between System and the Real World: The system should use phrases and concepts familiar to the user. Follow real-world conventions, making information appear in a natural and logical order.

3. User Control and Freedom: Users often choose system functions by mistake and will need a clearly marked "emergency exit" to leave the unwanted state without having to go through an extended dialogue. Support undo and redo.

4. Consistency and Standards: Users should not have to wonder whether different words, situations, or actions mean the same thing. Follow platform conventions.

5. Error Prevention: Even better than good error messages is a careful design which prevents a problem from occurring in the first place.

6. Recognition Rather than Recall: Make objects, actions, and options visible. The user should not have to remember information from one part of the dialogue to another. Instructions for use of the system should be visible or easily retrievable whenever appropriate.

7. Flexibility and Efficiency of Use: Accelerators (unseen by the novices) may speed up the interaction for the experts such that the system can cater to both inexperienced and experienced users. Allow users to tailor frequent actions.

8. Aesthetic and Minimalist Design: Dialogues should not contain information which is irrelevant or rarely needed. Every extra unit of information in a dialogue competes with the relevant units of information and diminishes their relative visibility.

9. Error Recovery: Help users recognize, diagnose, and recover from errors. Error messages should be expressed in plain language (no codes), precisely indicate the problem, and constructively suggest a solution.

10. Help and Documentation: Even though it is better if the system can be used without documentation, it may be necessary to provide help and documentation. Any such information should be easy to search, focused on the user's task, list concrete steps to be carried out, and not be too large.

Page 33:

Dog age calculator
– http://www.bowwow.com.au/calculator/index.asp

Nielsen's Heuristics: Exercise – Interface 1 (Heuristic Evaluation)

Page 34:

Currency converter
– http://www.oanda.com/convert/classic

Nielsen's Heuristics: Exercise – Interface 2a (Heuristic Evaluation)

Page 35:

The new converter: Have things improved?
– http://www.oanda.com/currency/converter/

Nielsen's Heuristics: Exercise – Interface 2b (Heuristic Evaluation)

Page 36:

Developed by the National Cancer Institute in the US Department of Health and Human Services
– Many contributors

Available on usability.gov and in a book

209 guidelines

Web-specific
– Mostly for informational Web sites

Each guideline has two ratings

Research-Based Guidelines (Heuristic Evaluation)

Page 37:

Relative importance
– How important is the guideline to the overall success of a Web site?
– Based on opinions of 16 experts

Strength of evidence
– A team of researchers evaluated the existing research evidence for each guideline and rated it:
  • Strong research support
  • Moderate research support
  • Weak research support
  • Strong expert opinion support
  • Weak expert opinion support

Research-Based Guidelines: Ratings (Heuristic Evaluation)

Page 38:

How to use the guidelines?

Pare them down by making a checklist of (see the sketch below):
– Top __ guidelines based on importance
– Top __ guidelines based on research evidence
– Most relevant guidelines for your application
  • E.g., 30 guidelines related to forms

Use guidelines with strength-of-evidence ratings of 4 or 5 to convince others who do not believe in usability.
– Credibility

Research-Based Guidelines: Usage (Heuristic Evaluation)
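Below is a minimal Python sketch of the "pare them down" idea, assuming each guideline is a record with a topic, a relative-importance rating, and a strength-of-evidence rating (1 – 5). The guideline entries themselves are hypothetical stand-ins, not quotes from the usability.gov set.

    # Hypothetical guideline records; real entries would come from the
    # Research-Based Web Design and Usability Guidelines (usability.gov).
    guidelines = [
        {"id": "G1", "text": "Distinguish required from optional entry fields",
         "topic": "forms", "importance": 5, "evidence": 4},
        {"id": "G2", "text": "Label data entry fields clearly",
         "topic": "forms", "importance": 4, "evidence": 5},
        {"id": "G3", "text": "Avoid unsolicited pop-up windows",
         "topic": "general", "importance": 3, "evidence": 2},
    ]

    def make_checklist(guidelines, topic=None, min_importance=4, min_evidence=4):
        """Keep only well-supported guidelines relevant to the application."""
        picked = [g for g in guidelines
                  if g["importance"] >= min_importance
                  and g["evidence"] >= min_evidence
                  and (topic is None or g["topic"] == topic)]
        # Lead with the strongest evidence: useful when credibility matters.
        return sorted(picked, key=lambda g: (g["evidence"], g["importance"]),
                      reverse=True)

    for g in make_checklist(guidelines, topic="forms"):
        print(g["id"], g["text"])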

Page 39:

One person cannot find all usability problems.

Different people find different usability problems.

Heuristic evaluation is most effective with multiple evaluators.
– Nielsen recommends 3 – 5 evaluators (the curve behind this recommendation is sketched below).

Conducting an Evaluation: # of Evaluators (Heuristic Evaluation)
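The "3 – 5 evaluators" advice comes from a diminishing-returns curve. A minimal sketch of that curve, using the commonly cited model found(i) = 1 - (1 - L)^i with an assumed per-evaluator detection rate L of about 0.3 (the real rate varies by project and evaluator expertise):

    # Share of known problems found by i independent evaluators.
    L = 0.3  # assumed average detection rate of a single evaluator

    for i in range(1, 11):
        share = 1 - (1 - L) ** i
        print(f"{i:2d} evaluator(s): ~{share:.0%} of known problems found")

    # With L = 0.3, three evaluators find roughly two-thirds of the problems,
    # five find about 83%, and each additional evaluator adds less and less.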

Page 40:

Each evaluator examines the interface independently (no communication with others).
– He/she can use tasks/scenarios.
– At least two passes:
  • 1st pass: to get a feel for the flow and scope
  • 2nd pass: to focus on specifics
– Output: a list of problems and the heuristics that they violate

Evaluators meet for a debriefing session.
– Discuss each issue and the violated heuristic.
– Agree on the final list of issues (ensure there is no redundancy).
– Brainstorm recommendations.

Each evaluator independently assigns a severity rating to each issue.

One evaluator combines all severity ratings (a minimal sketch of this step follows below).

Conducting an Evaluation: Steps (Heuristic Evaluation)
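As a concrete illustration of the last two steps, here is a minimal Python sketch that combines independent severity ratings by averaging them per issue. The issue names and ratings are made up, and averaging is one common way to combine ratings rather than a prescribed rule.

    from statistics import mean

    # Each evaluator rates every agreed-upon issue on Nielsen's 0-4 scale;
    # one evaluator then combines the ratings.
    ratings = {
        "No indication of location within the application": [3, 3, 4],
        "Browse button placed on the wrong side of the file field": [1, 2, 1],
        "Error message gives neither cause nor remedy": [4, 3, 4],
    }

    combined = {issue: mean(r) for issue, r in ratings.items()}

    # Highest combined severity first: this becomes the fix-first list.
    for issue, severity in sorted(combined.items(), key=lambda kv: kv[1],
                                  reverse=True):
        print(f"{severity:.1f}  {issue}")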

Page 41:

Severity ratings help with prioritization of issues.

Nielsen's 0 – 4 scale:
– 0: I don't agree that this is a usability problem at all
– 1: Cosmetic problem only
  • Need not be fixed unless extra time is available
– 2: Minor usability problem
  • Fixing this should be given low priority
– 3: Major usability problem
  • Important to fix, so should be given high priority
– 4: Usability catastrophe
  • Imperative to fix this

Conducting an Evaluation: Severity Ratings (Heuristic Evaluation)

Page 42:

When assigning severity ratings, consider:
– The frequency with which the problem occurs.
  • Is it common or rare?
– The impact of the problem if it occurs.
  • Will it be easy or difficult for the users to overcome?
– The persistence of the problem.
  • Is it a one-time problem that users can overcome once they know about it, or will they be repeatedly bothered by it?

One illustrative way to combine the three factors is sketched below.

Conducting an Evaluation: Severity Ratings (Heuristic Evaluation)
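Nielsen does not prescribe a formula for weighing these factors. The Python sketch below is only one illustrative mapping: each factor is scored 0 (low) to 2 (high) and the total is squeezed onto the 0 – 4 severity scale as a starting point that the evaluator should still adjust by judgment.

    def suggest_severity(frequency: int, impact: int, persistence: int) -> int:
        """Each argument is 0 (low), 1 (medium), or 2 (high)."""
        total = frequency + impact + persistence      # 0..6
        return min(4, round(total * 4 / 6))           # map onto 0..4

    # Common, hard to overcome, and recurring -> near "usability catastrophe".
    print(suggest_severity(frequency=2, impact=2, persistence=2))  # 4
    # Rare, easy to overcome, one-time -> cosmetic.
    print(suggest_severity(frequency=0, impact=1, persistence=0))  # 1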

Page 43:

Another scale:

HIGH: A high-severity issue is a major problem that prevents users from finding, recognizing, or using key features and completing critical tasks. It is critical to resolve this issue because it blocks the use of functionality.

MEDIUM: A medium-severity issue can be a common problem that has a negative impact on users' overall efficiency. It often creates inconvenience for the user. It is important to resolve this issue; otherwise user confusion and frustration will result.

LOW: A low-severity issue typically does not affect user performance but causes irritation and has an impact on the user's opinion of the product/company. Resolving the issue will generally improve the user experience.

Conducting an Evaluation: Severity Ratings (Heuristic Evaluation)

Page 44:

Cognitive Walkthrough

Page 45:

Walkthroughs are a formalized way of imagining people's thoughts and actions when they use an interface for the first time.

They are more structured than a heuristic evaluation.

Walkthroughs help you iterate on the interface; they do not validate it.

Overview (Cognitive Walkthrough)

Page 46:

Interface or its prototype
– E.g., Weather Watcher 5.4B

Task description (representative tasks)
– E.g., Set the zip code for which you would like Weather Watcher to retrieve the weather.

Complete, written list of the actions needed to complete the task
– E.g., click on the View Options menu item, click on the Active City link, type the zip code, click on the OK button…

User description
– E.g., wide range of computer users (from novices to experts)

What's Needed (Cognitive Walkthrough)

Page 47:

For each action, ask these four questions:
– Will users try to produce the effect the action has?
– Will users see the control (button, menu item, etc.) for the action?
– Will users recognize that the control produces the effect?
– After the action is taken, will users understand the feedback they get, so that they know the action took place?

A minimal sketch of how the answers can be recorded per action follows below.

Steps (Cognitive Walkthrough)
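One convenient way to record a walkthrough is one entry per correct action, with the four questions answered for each. The Python sketch below is a minimal, hypothetical example; the action wording follows the Weather Watcher example on the next slides, but the answers and notes are illustrative, not real findings.

    from dataclasses import dataclass

    @dataclass
    class ActionCheck:
        action: str
        will_try: bool              # Will users try to produce the effect?
        will_see: bool              # Will users see the control?
        will_recognize: bool        # Will users recognize it produces the effect?
        understands_feedback: bool  # Will users understand the feedback?
        notes: str = ""

    walkthrough = [
        ActionCheck("Click the View Options icon on the main screen",
                    will_try=True, will_see=False, will_recognize=False,
                    understands_feedback=True,
                    notes="Icon is unlabeled; users may not connect it to settings."),
        ActionCheck("Click Active City in the menu on the left",
                    will_try=True, will_see=True, will_recognize=True,
                    understands_feedback=True),
    ]

    # Any "no" answer flags a potential usability issue to report.
    for step in walkthrough:
        flags = [name for name, ok in [("try", step.will_try),
                                       ("see", step.will_see),
                                       ("recognize", step.will_recognize),
                                       ("feedback", step.understands_feedback)]
                 if not ok]
        print(step.action, "->", flags if flags else "no issues flagged")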

Page 48:

Example: Setting Zip Code in Weather WatcherCognitive Walkthrough

ACTIONS 1. Click on the View Options icon on the main screen.

2. Click on Active City in the menu on the left.

3. Click on the Add New Zip Code button.

Will users try to produce the effect the action has?

Will userssee the control for the action?

Will users recognize the control produces the effect?

Will users understand the feedback?

Page 49:

Example: Setting Zip Code in Weather Watcher (Cognitive Walkthrough)

ACTIONS
4. Type the zip code.
5. Click on the OK button to confirm the entered zip code.
6. Click on the "X" in the top right corner of the Options window to exit Options.

For each action, answer:
– Will users try to produce the effect the action has?
– Will users see the control for the action?
– Will users recognize that the control produces the effect?
– Will users understand the feedback?

Page 50:

You should start with a list of the correct actions needed to complete a given task.
– You can explore the interface to identify those actions, BUT that exploration is not the walkthrough.

The walkthrough shows what the user may have trouble with, not what the user will do when that trouble arises.

Keep in Mind (Cognitive Walkthrough)

Page 51:

Perform a cognitive walkthrough of a visual shopping site:
– http://www.like.com/

Users:
– Experienced Internet users
– Visual shopping novices

Task:
– Find shoes similar to the ones in the picture.

Actions:
– To be determined by the evaluators.

Exercise (Cognitive Walkthrough)

Page 52:

Will users try to produce the effect the action has?

Will users see the control (button, menu item etc.) for the action?

Will users recognize that the control produces the effect?

Will users understand the feedback they get, so that they know the action took place?

Exercise (Cognitive Walkthrough)

Page 53:

The outcome is a list of potential usability issues.

Typical recommendations:
– Make the controls more obvious (question 2).
– Use labels that users will recognize (question 3).
– Provide better feedback (question 4).

What if Question 1 was a problem?
– If the user doesn't have any reason to think that an action needs to be performed:
  • Eliminate the action. Let the system take care of it.
  • Re-order the task so that users start by doing something that they know needs to be done, and then get prompted for the action in question.

Outcome (Cognitive Walkthrough)

Page 54:

Expert Evaluation

Page 55:

Cognitive walkthrough (CW) and heuristic evaluation (HE), with all their different sets of heuristics, are just tools.

Issues with HE and CW:
– They can be unnecessarily time-consuming (you know a problem exists; now you need to match it to a heuristic).
– You will see usability issues that you will not be able to describe with heuristics or walkthrough questions.

Thus, using only these methods may not be optimal.

Solution: Consider expert evaluation.

Practically Speaking (Expert Evaluation)

Page 56:

Heuristic evaluation explicitly uses a predefined set of heuristics.
– It specifies which problem violates which heuristic.

Expert evaluation:
– Evaluators do not use a specific set of heuristics.
– Evaluators are usability experts who have internalized various usability guidelines and seen many usability testing sessions.
– They can use elements of cognitive walkthrough, but the questions are not used explicitly.

Expert Evaluation ≠ Heuristic Evaluation (Expert Evaluation)

Page 57:

Presenting Results

Page 58:

Introduction
– Objectives
– Product evaluated
– Target users
– Context of use

Findings with severity ratings
– Describe the problem and WHY this may be a problem.
– Organize by section, screen, or task rather than by heuristic (a minimal sketch of such a grouping follows below).

Recommendations
– Could accompany each finding or be presented at the end (especially if there is no clear one-to-one relationship between the findings and recommendations)

What to Include in the Report? (Presenting Results)
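As a small illustration of organizing findings by screen rather than by heuristic, here is a minimal Python sketch; the screens, issues, severities, and recommendations are invented for illustration.

    from collections import defaultdict

    # Findings grouped by screen, each with a severity and a recommendation.
    findings = [
        {"screen": "Options > Active City", "severity": "High",
         "issue": "No feedback after an invalid zip code is entered.",
         "recommendation": "Say what went wrong and how to fix it."},
        {"screen": "Main window", "severity": "Low",
         "issue": "Tray tooltip is unnecessarily long.",
         "recommendation": "Shorten the tooltip to city and temperature."},
    ]

    by_screen = defaultdict(list)
    for finding in findings:
        by_screen[finding["screen"]].append(finding)

    for screen, items in by_screen.items():
        print(screen)
        for f in items:
            print(f"  [{f['severity']}] {f['issue']} Recommendation: {f['recommendation']}")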

Page 59:

Usability Issue: The Rapid Part Add feature does not recognize part numbers if dashes are not entered. The items are added to the cart but they are not recognized.
Recommendation: The search functionality should be able to handle part numbers without dashes. If it can't, instructions should say that the part number has to include dashes. Alternatively, provide three input fields separated by dashes, and automatically advance to the next field once the maximum number of characters has been entered.

Usability Issue: The Rapid Part Add gets visually lost among all the other elements on the page and may be difficult to notice.
Recommendation: Put the Rapid Part Add in a shaded box to make it stand out.

Good Practice: Rapid Part Add allows users to quickly enter their shopping list if they know the part numbers of the items, without having to search for the items one by one. Also, users' own part numbers are accepted in the Rapid Part Add tool, which is helpful.

Usability Issue: There is no error check when adding parts through Rapid Part Add. If users make an error in the part number, they will not have an opportunity to correct it.
Recommendation: Provide users with feedback if the part number is not recognized.

[On the slide, each usability issue is tagged with a severity rating: H (high), M (medium), or L (low).]

Example (Presenting Results)

Page 60:

Project 1a

Page 61:

Evaluate Free Sticky Notes 6.0
– http://www.morun.net/www/downloads/freestickynotes.html

Expert Evaluation: Individual Part (Project 1a)

Page 62:

Scope (Project 1a)

Page 63:

Approach (Project 1a)

Approach:
– Download the software
– Familiarize yourself with the software (1st pass)
– Print each screen / window / menu
– Go through the application screen by screen and evaluate each using a combination of your expertise, HE, and CW
– Jot down potential usability issues on the screenshots (e.g., on post-its)

Next Wednesday, bring your individual evaluations (screenshots with your notes) to class with you. You will need 2 copies:
– One for you to work with during class.
– One for us (please write clearly).

Page 64:

Reading for Next Week (Project 1a)

Rubin, chapter 1
– What makes something usable
– Techniques in UX research

Lewis and Rieman, chapter 4 (except 4.2)
– Evaluating the design without users
  • Cognitive walkthrough
  • Heuristic evaluation

Page 65:

Questions? (Project 1a)