
UNIT – V

HCI - DESIGN RULES, GUIDELINES AND EVALUATION TECHNIQUES

Principles that Support Usability

• Principles are divided into three main categories:

• Learnability – the ease with which new users can begin effective interaction and achieve maximal performance.

• Flexibility – the multiplicity of ways in which the user and system exchange information.

• Robustness – the level of support provided to the user in determining successful achievement and assessment of goals.

12-Oct-17 Parag N Achaliya, SNJB's KBJ COE, Chandwad (Nashik)

Principles Affecting Learnability


Principles Affecting Flexibility


Principles Affecting Robustness


Design Guidelines

• Basic categories of Smith & Mosier guidelines are:

• Data Entry

• Data Display

• Sequence Control

• User Guidance

• Data Transmission

• Data Protection


Golden Rules and Heuristics

• Shneiderman’s 8 Golden Rules of Interface Design

• Strive for consistency in action sequences, layout, terminology, command use and so on.

• Enable frequent users to use shortcuts, such as abbreviations, special key sequences and macros, to perform regular, familiar actions more quickly.

• Offer informative feedback for every user action, at a level appropriate to the magnitude of the action.

• Design dialogs to yield closure so that the user knows when they have completed a task.

• Offer error prevention and simple error handling so that, ideally, users are prevented from making mistakes and, if they do, they are offered clear and informative instructions to enable them to recover.

• Permit easy reversal of actions in order to relieve anxiety and encourage exploration, since users know that they can always return to the previous state.

• Support internal locus of control so that the user is in control of the system, which responds to their actions.

• Reduce short-term memory load by keeping displays simple, consolidating multiple page displays and providing time for learning action sequences.
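The “easy reversal of actions” rule is often realised with an undo stack. The sketch below is a minimal illustration, not taken from the slides; the class and method names are our own:

```python
class UndoableEditor:
    """Minimal text buffer whose actions can always be reversed."""

    def __init__(self):
        self.text = ""
        self._history = []  # stack of previous states

    def type_text(self, s):
        self._history.append(self.text)  # snapshot before each action
        self.text += s

    def undo(self):
        # Fail safely when there is nothing to undo (simple error handling).
        if self._history:
            self.text = self._history.pop()

editor = UndoableEditor()
editor.type_text("hello")
editor.type_text(" world")
editor.undo()
print(editor.text)  # -> hello
```

Because every state can be restored, users are free to explore without fear of irreversible mistakes, which is exactly the anxiety-relief rationale given above.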

12-Oct-17 Parag N Achaliya, SNJB's KBJ COE, Chandwad (Nashik) 8

Golden Rules and Heuristics

• Norman’s Seven Principles for Transforming Difficult Tasks into Simple Ones

• Use both knowledge in the world and knowledge in the head

• Simplify the structure of tasks

• Make things visible

• Get the mappings right

• Exploit the power of constraints

• Design for error

• When all else fails, standardize
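“Exploit the power of constraints” and “Design for error” can be illustrated with a small sketch (the paper-size example and all names here are invented): by constraining input to a fixed set of values, invalid states become impossible to reach, and anything outside the set is rejected with an informative message.

```python
from enum import Enum

class PaperSize(Enum):
    """Constraining choices to a fixed set makes invalid input impossible."""
    A4 = "A4"
    LETTER = "Letter"
    LEGAL = "Legal"

def set_paper_size(value: str) -> PaperSize:
    # Design for error: reject anything outside the constrained set
    # with a clear message, instead of failing later in unrelated code.
    try:
        return PaperSize(value)
    except ValueError:
        raise ValueError(
            f"Unknown paper size {value!r}; choose one of "
            f"{[s.value for s in PaperSize]}"
        )

print(set_paper_size("A4"))  # -> PaperSize.A4
```

A drop-down list in a GUI applies the same constraint at the interface level: the user simply cannot type an invalid size.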


User Interface Management System (UIMS)

• Set of programming and design techniques which are supposed to add another level of services for interactive system design beyond the toolkit level

• Main concerns of a UIMS:

• Conceptual architecture for the structure of an interactive system

• Techniques for implementing a separated application

• Techniques for managing, implementing and evaluating a run-time interactive environment
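The “separated application” idea can be sketched in code. This is a minimal, illustrative decomposition, loosely following the classic three-part split into presentation, dialogue control and application; all class names are our own invention:

```python
class Application:
    """Functional core: pure application logic, no I/O or rendering."""
    def __init__(self):
        self.count = 0

    def increment(self):
        self.count += 1
        return self.count

class Presentation:
    """Presentation layer: rendering only, knows nothing about the logic."""
    def render(self, value):
        return f"Count: {value}"

class DialogueControl:
    """Maps user events to application calls and results back to the display."""
    def __init__(self, app, ui):
        self.app, self.ui = app, ui

    def handle(self, event):
        if event == "click":
            return self.ui.render(self.app.increment())

dialogue = DialogueControl(Application(), Presentation())
print(dialogue.handle("click"))  # -> Count: 1
```

Because the layers communicate only through the dialogue component, the presentation can be replaced (text, GUI, speech) without touching the application logic, which is the point of separating them.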


Goals of Evaluation

• Evaluation has three main goals:

• To assess the extent and accessibility of the system’s functionality

• To assess users’ experience of the interaction

• To identify any specific problems with the system


Evaluation through Expert Analysis

• Evaluation should occur throughout design process

• First evaluation of a system should ideally be performed before any implementation work has started

• If the design itself can be evaluated, expensive mistakes can be avoided, since the design can be altered prior to any major resource commitments

• It can be expensive to carry out user testing at regular intervals during the design process


Evaluation through Expert Analysis

• Four approaches to expert analysis:

• Cognitive Walkthrough

• Heuristic Evaluation

• Use of Models

• Use of Previous Work


Evaluation through User Participation


Styles of Evaluation

• Laboratory studies

• Field studies

Experimental Evaluation

• Participants

• Variables

• Hypotheses

• Experimental Design

• Statistical Measures

• Studies of Groups of Users
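The link between hypotheses, variables and statistical measures can be made concrete with a small worked example. This is a sketch with invented data, not material from the slides: Welch’s t statistic comparing task-completion times (the dependent variable) for two hypothetical interface designs (the independent variable).

```python
from statistics import mean, stdev

# Invented task-completion times (seconds) for two interface variants.
design_a = [34.1, 29.8, 31.5, 36.2, 30.9]
design_b = [27.4, 25.1, 29.3, 26.8, 24.9]

def welch_t(x, y):
    """Welch's t statistic for two independent samples."""
    vx, vy = stdev(x) ** 2, stdev(y) ** 2  # sample variances
    return (mean(x) - mean(y)) / ((vx / len(x) + vy / len(y)) ** 0.5)

t = welch_t(design_a, design_b)
print(f"t = {t:.2f}")
```

A large |t| suggests the difference in mean completion time between the designs is unlikely to be chance alone; in a real experiment the statistic would be compared against the t distribution to obtain a p-value.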

Observational Techniques

• Think Aloud and Cooperative Evaluation

• Protocol Analysis

• Automatic Protocol Analysis Tools

• Post-task Walkthroughs

Query Techniques

• Interviews

• Questionnaires
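Questionnaires are often scored with a fixed scheme. As an illustration (not named on the slides, and with invented example responses), here is a scoring sketch for the System Usability Scale (SUS), a widely used 10-item usability questionnaire answered on a 1–5 Likert scale:

```python
def sus_score(responses):
    """Score a 10-item System Usability Scale questionnaire (0-100).

    `responses` are the ten 1-5 Likert answers in order. Odd-numbered
    items are positively worded (score = answer - 1); even-numbered
    items are negatively worded (score = 5 - answer). The summed item
    scores are scaled by 2.5 to give a 0-100 result.
    """
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS needs ten answers, each in the range 1-5")
    total = sum(r - 1 if i % 2 == 0 else 5 - r
                for i, r in enumerate(responses))
    return total * 2.5

print(sus_score([4, 2, 4, 1, 5, 2, 4, 1, 4, 2]))  # -> 82.5
```

The single 0–100 number makes questionnaire results easy to compare across systems and across evaluation sessions, which is one reason standardized instruments are preferred over ad-hoc question sets.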

Evaluation through Monitoring Physiological Responses

• Eye Tracking for Usability Evaluation

• Physiological Measurements

Choosing an Evaluation Method

• Factors distinguishing evaluation techniques:

• The stage in the cycle at which the evaluation is carried out (design vs. implementation)

• The style of evaluation (Laboratory vs. field studies)

• The level of subjectivity or objectivity of the technique (Subjective vs. objective)

• The type of measures provided (Qualitative vs. quantitative)

• The information provided

• The immediacy of the response

• The level of interference implied

• The resources required

