Discount Evaluation: Evaluating with experts


Page 1:

Discount Evaluation

Evaluating with experts

Page 2:

Agenda

Online collaboration tools
Heuristic Evaluation
Perform HE on each other’s prototypes
Cognitive Walkthrough
Perform CW on each other’s prototypes

Page 3:

Project part 3

See any comments in your writeups
I’ll send out grades tomorrow (I have to add the demo piece)
Perform the evaluations

Clearly inform your users what you are doing and why.

If you are audio or video recording, I prefer you use a consent form.

Pilot at least once – know how long it’s going to take.

If you need equipment, ask me

Page 4:

Discount Evaluation Techniques

Basis:
Observing users can be time-consuming and expensive
Try to predict usability rather than observing it directly
Conserve resources (quick & low cost)

Page 5:

Approach - inspections

Expert reviewers used
HCI experts interact with the system and try to find potential problems and give prescriptive feedback

Best if:
Haven’t used earlier prototype
Familiar with domain or task
Understand user perspectives

Does not require working system

Page 6:

Heuristic Evaluation

Developed by Jakob Nielsen

Several expert usability evaluators assess system based on simple and general heuristics (principles or rules of thumb)

(Web site: www.useit.com)

Page 7:

How many experts?

Nielsen found that about 5 evaluators find 75% of the problems

Above that you get more, but at decreasing efficiency
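The 75% figure and the diminishing returns are usually explained with a simple problem-discovery model (often attributed to Nielsen and Landauer): the fraction of problems found by i independent evaluators is 1 - (1 - p)^i, where p is the chance that one evaluator finds a given problem. The sketch below is only an illustration, not from the slides; the function name is made up and p = 0.24 is chosen so that five evaluators land near 75% (reported per-evaluator rates vary widely across studies).

```python
# Illustrative problem-discovery curve (assumed model, not from the slides).
# Fraction of problems found by i evaluators = 1 - (1 - p)**i;
# p = 0.24 is picked only so that 5 evaluators reach roughly 75%.

def fraction_found(i: int, p: float = 0.24) -> float:
    """Expected fraction of usability problems found by i independent evaluators."""
    return 1.0 - (1.0 - p) ** i

if __name__ == "__main__":
    for i in range(1, 11):
        print(f"{i:2d} evaluators -> {fraction_found(i):5.1%} of problems")
```

Running it gives roughly 24%, 42%, 56%, 67%, 75%, ... as evaluators are added, which is the diminishing-returns pattern described above.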

Page 8:

Evaluate System

Reviewers need a prototype
May vary from mock-ups and storyboards to a working system

Reviewers evaluate system based on high-level heuristics.

Where to get heuristics?
http://www.useit.com/papers/heuristic/
http://www.asktog.com/basics/firstPrinciples.html

Page 9:

Heuristics

use simple and natural dialog
speak the user’s language
minimize memory load
be consistent
provide feedback
provide clearly marked exits
provide shortcuts
provide good error messages
prevent errors

Page 10:

Nielsen’s Heuristics

visibility of system status
aesthetic and minimalist design
user control and freedom
consistency and standards
error prevention
recognition rather than recall
flexibility and efficiency of use
recognition, diagnosis and recovery from errors
help and documentation
match between system and real world

Page 11:

Groupware heuristics

Provide the means for intentional and appropriate verbal communication
Provide the means for intentional and appropriate gestural communication
Provide consequential communication of an individual’s embodiment
Provide consequential communication of shared artifacts (i.e. artifact feedthrough)
Provide protection
Manage the transitions between tightly and loosely-coupled collaboration
Support people with the coordination of their actions
Facilitate finding collaborators and establishing contact

Baker, Greenberg, and Gutwin, CSCW 2002

Page 12:

Ambient heuristics

Useful and relevant information
“Peripherality” of display
Match between design of ambient display and environments
Sufficient information design
Consistent and intuitive mapping
Easy transition to more in-depth information
Visibility of state
Aesthetic and pleasing design

Mankoff, et al, CHI 2003

Page 13:

Process

Perform two or more passes through the system, inspecting:
Flow from screen to screen
Each screen

Evaluate against heuristics
Find “problems”

Subjective (if you think it is, it is)
Don’t dwell on whether it is or isn’t

Page 14:

Debriefing

Organize all problems found by different reviewers
At this point, decide what are and aren’t problems
Group, structure
Document and record them

Page 15:

Severity Rating

Based on:
frequency
impact
persistence
market impact

Rating scale:
0: not a problem
1: cosmetic issue, only fixed if extra time
2: minor usability problem, low priority
3: major usability problem, high priority
4: usability catastrophe, must be fixed
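As a sketch of what the debriefing output might look like in practice (the data structure and field names below are assumptions, not part of the method), each agreed-upon problem can be recorded with the heuristic it violates and a severity on the 0–4 scale above, then sorted into a prioritized fix list.

```python
# Minimal sketch of recording heuristic-evaluation findings with the 0-4
# severity scale above. All names and fields are illustrative assumptions.
from dataclasses import dataclass

SEVERITY_LABELS = {
    0: "not a problem",
    1: "cosmetic issue, only fixed if extra time",
    2: "minor usability problem, low priority",
    3: "major usability problem, high priority",
    4: "usability catastrophe, must be fixed",
}

@dataclass
class Finding:
    screen: str        # where the problem was seen
    heuristic: str     # which heuristic it violates
    description: str   # what the reviewer observed
    severity: int      # 0-4, agreed on during debriefing
    found_by: int = 1  # how many reviewers reported it (frequency)

def fix_list(findings):
    """Sort findings so the most severe, most frequently reported come first."""
    return sorted(findings, key=lambda f: (f.severity, f.found_by), reverse=True)

findings = [
    Finding("search page", "visibility of system status",
            "no indication that the search is running", severity=3, found_by=4),
    Finding("results page", "aesthetic and minimalist design",
            "call number buried in decorative text", severity=1, found_by=2),
]
for f in fix_list(findings):
    print(f"[{f.severity}] {SEVERITY_LABELS[f.severity]}: {f.description}")
```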

Page 16:

Advantages

Few ethical issues to consider
Inexpensive, quick

Getting someone practiced in method and knowledgeable of domain is valuable

Page 17:

Challenges

Very subjective assessment of problems
Depends on expertise of reviewers

Why are these the right heuristics?
Others have been suggested

How to determine what is a true usability problem?
Some recent papers suggest that many identified “problems” really aren’t

Page 18:

Your turn:

Library – main page, finding a book

Use Nielsen’s heuristics (p. 686)
List all problems
Come up to the board and put up at least one new one
We’ll rate as a group

Page 19:

Your turn

Prepare
Determine your heuristics if you want any different ones
Demo your prototype to all of us

Evaluate
Evaluate someone other than yourself

Debrief
On your own, include in Part 3

Page 20:

Cognitive Walkthrough

More evaluation without users

Page 21:

Cognitive Walkthrough

Assess learnability and usability by simulating the way novice users explore and become familiar with an interactive system

A usability “thought experiment”
Like a code walkthrough (s/w engineering)
From Polson, Lewis, et al. at UC Boulder

Page 22:

CW: Process

Construct carefully designed tasks from system spec or screen mock-up

Walk through (cognitive & operational) activities required to go from one screen to another

Review actions needed for task, attempt to predict how users would behave and what problems they’ll encounter

Page 23:

CW: Assumptions

User has a rough plan
User explores the system, looking for actions that contribute to performing the task
User selects the action that seems best for the desired goal
User interprets the response and assesses whether progress has been made toward completing the task

Page 24:

CW: Requirements

Description of users and their backgrounds
Description of the task the user is to perform
Complete list of the actions required to complete the task
Prototype or description of the system

Page 25:

CW: Methodology

Step through the action sequence
Action 1: Response A, B, ...
Action 2: Response A, ...

For each one, ask four questions and try to construct a believable story

Page 26:

CW: Questions

1. Will users be trying to produce whatever effect the action has?

2. Will users be able to notice that the correct action is available? (is it visible)

3. Once found, will they know it’s the right one for desired effect? (is it correct)

4. Will users understand feedback after action?
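A minimal sketch of the per-action bookkeeping this implies is shown below; the method itself only prescribes the four questions, so the structure and names here are illustrative assumptions. Each action in the sequence gets an answer to each question, plus the supporting evidence or a failure story.

```python
# Sketch of per-action cognitive-walkthrough bookkeeping; the four questions
# mirror the list above. Structure and names are illustrative assumptions.
from dataclasses import dataclass, field

CW_QUESTIONS = (
    "Will users be trying to produce the effect this action has?",
    "Will users notice that the correct action is available?",
    "Will users know it is the right action for the desired effect?",
    "Will users understand the feedback after the action?",
)

@dataclass
class ActionRecord:
    action: str
    # question number -> (answered yes?, evidence or failure story)
    answers: dict = field(default_factory=dict)

    def answer(self, question_no: int, ok: bool, evidence: str) -> None:
        self.answers[question_no] = (ok, evidence)

    def failures(self):
        """Questions answered 'no' -- each needs a failure story in the report."""
        return [q for q, (ok, _) in self.answers.items() if not ok]

step = ActionRecord(action="Click on the Books, Catalog, Videos link")
step.answer(1, True, "Finding a book is part of the user's original task")
step.answer(2, False, "Link label does not obviously mean 'search the catalog'")
print("Failure stories needed for questions:", step.failures())
```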

Page 27:

CW: Answering the Questions

1. Will user be trying to produce effect?
Typical supporting evidence:
It is part of their original task
They have experience using the system
The system tells them to do it

No evidence? Construct a failure scenario
Explain, back up opinion

Page 28:

CW: Next Question

2. Will user notice action is available?
Typical supporting evidence:
Experience
Visible device, such as a button
Perceivable representation of an action, such as a menu item

Page 29:

CW: Next Question

3. Will user know it’s the right one for the effect?
Typical supporting evidence:
Experience
Interface provides a visual item (such as a prompt) to connect action to result effect
All other actions look wrong

Page 30:

CW: Next Question

4. Will user understand the feedback?
Typical supporting evidence:
Experience
Recognize a connection between a system response and what the user was trying to do

Page 31:

Let’s practice

Library web site

Users:
Student users, familiar with computers and libraries, but new to this web site

Task: find Don Norman books in the library
Click on Books, Catalog, Videos link
Search by: choose Author
Type “Norman, Don” and click Search
Click on Norman, Donald link
Click on design of everyday things link
Write down call number and what floor it’s on
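For illustration, the task above can be written down as the ordered action list a walkthrough would step through, asking the four questions for each action (the list structure is an assumption; the action strings are taken from the slide).

```python
# The library task above as an ordered action list for a walkthrough.
library_task_actions = [
    "Click on Books, Catalog, Videos link",
    "Search by: choose Author",
    'Type "Norman, Don" and click Search',
    "Click on Norman, Donald link",
    "Click on design of everyday things link",
    "Write down call number and what floor it's on",
]
for n, action in enumerate(library_task_actions, start=1):
    print(f"Action {n}: {action}")  # then ask the four questions for this action
```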

Page 32:

CW: Questions

1. Will users be trying to produce whatever effect the action has?

2. Will users be able to notice that the correct action is available? (is it visible)

3. Once found, will they know it’s the right one for desired effect? (is it correct)

4. Will users understand feedback after action?

Page 33:

CW Summary

Advantages
Explores important characteristic of learnability
Novice perspective
Detailed, careful examination
Working prototype not necessary

Disadvantages
Can be time consuming
May find problems that aren’t really problems
Narrow focus, may not evaluate entire interface

Page 34:

Your turn

Prepare:
Describe users and their experience
Determine task(s) and action lists

Evaluation:
Choose the one you didn’t evaluate last time
As a group, for each action, answer the 4 questions with evidence

Page 35:

CW: Questions

1. Will users be trying to produce whatever effect the action has?

2. Will users be able to notice that the correct action is available? (is it visible)

3. Once found, will they know it’s the right one for desired effect? (is it correct)

4. Will users understand feedback after action?