IS214 Recap
• Understanding Users and Their Work
– User and task analysis
– Ethnographic methods
– Site visits: observation, interviews
– Contextual inquiry and design
– Universal usability
• Evaluating
– Usability inspection methods, including heuristics, guidelines
– Surveys, interviews, focus groups
– Usability testing
– Server log analysis
• Organizational and Managerial Issues
– Ethics; managing usability
Methods: assessing needs, evaluating
Method                                          Needs   Evaluation
User and task analysis                            x
Ethnographic methods                              x
Observation, interviews                           x         x
Contextual inquiry & design                       x
Universal usability                               x         x
Usability inspection (heuristics, guidelines)     x         x
Surveys, interviews, focus groups                 x         x
Usability testing                                           x
Server log analysis                                         x
Intro to usability and UCD
• Usability concepts
– Usability as more than the interface
– Functionality, content, and design
• User-Centered Design
– Usability begins with design
– At every stage in the design process, usability means using appropriate methods to perform user-based evaluation
– Placing users (not cool technology or…) at the center of design
– Iterative design
Understanding Users and Their Work
To inform design & evaluation
User and Task Analysis
• Can’t ask “how good is this?” without asking “for whom and for what purpose?”
• Users
– Selecting users: whom do you need to include? How many?
– Categorizing users
– Getting people’s cooperation
• Trust
• Tasks
– Identifying & describing the tasks they (currently) perform
– Technology design is work re-design
• User-task matrix (a minimal sketch follows)
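One hypothetical way to represent a user-task matrix in code, assuming invented user categories, tasks, and frequency marks (none of these come from the course material); the matrix simply records which user groups perform which tasks, and how often:

    # Hypothetical user-task matrix for an imagined library catalog system.
    # Marks: "" = never, "o" = occasionally, "x" = frequently (assumed scale).
    users = ["Undergraduate", "Faculty", "Librarian"]
    tasks = ["Search catalog", "Renew loans", "Catalog new items"]

    matrix = {
        "Undergraduate": {"Search catalog": "x", "Renew loans": "o", "Catalog new items": ""},
        "Faculty":       {"Search catalog": "x", "Renew loans": "x", "Catalog new items": ""},
        "Librarian":     {"Search catalog": "o", "Renew loans": "",  "Catalog new items": "x"},
    }

    # Print the matrix as a plain-text table.
    width = max(len(u) for u in users)
    print(" " * width, *("%-18s" % t for t in tasks))
    for u in users:
        print("%-*s" % (width, u), *("%-18s" % matrix[u][t] for t in tasks))

Even a matrix this small makes gaps visible: tasks no group performs frequently, or user groups whose tasks a design has not covered.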
Ethnographic methods
• Methods and principles of social science research are fundamental to collecting, analyzing, and interpreting data for needs and usability assessment
– Reliability
– Validity
• One set of methods: ethnographic
– Studying “users in the wild”
– Learning their understanding of their work: purposes and practices
– Seeing how they actually do their work (as opposed to formal work processes)
Site Visits
• Observing
– Seeing people doing what they do, how they do it, under the conditions in which they do it
– Asking questions as they work
– Tacit knowledge: people may not be able to articulate what they do
– Recollection: people may not think to mention things, or may not think them important
• Interviewing
– Getting users’ understandings and interpretations
– Ability to probe
– Interviewing skills!
Contextual Inquiry and Design
• A systematic, ethnographically based method for:
– Collecting, interpreting, and summarizing information about work practices and organizational factors
– Incorporating findings into design
• Structured approach to data collection, recording, and interpretation
• Complex; requires that the entire team be trained in it
Evaluating
A design, prototype, or working system
Not a clean distinction between design and evaluation
Usability inspection methods
• A variety of methods in which experts (not users) inspect (not use) a design, prototype, or system
• Including:
– Competitive evaluation
– Heuristic evaluation (a minimal sketch follows)
• Commonly used method
• Easy
• Lots of information without much investment
• Reflects short-term use; limited depth
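A minimal sketch of how heuristic-evaluation findings might be recorded, assuming Nielsen-style heuristics and his commonly cited 0–4 severity scale; the specific problems listed are invented for illustration:

    # Hypothetical heuristic-evaluation findings. The heuristics named are
    # a subset of Nielsen's ten; severity runs 0 (not a problem) to
    # 4 (usability catastrophe). All problems below are invented examples.
    findings = [
        {"heuristic": "Visibility of system status",
         "problem": "No progress indicator during long searches",
         "severity": 3},
        {"heuristic": "Error prevention",
         "problem": "Delete button sits next to Save, with no confirmation",
         "severity": 4},
        {"heuristic": "Consistency and standards",
         "problem": "'Sign in' and 'Login' used on different pages",
         "severity": 1},
    ]

    # Report the most severe problems first so the team can triage.
    for f in sorted(findings, key=lambda f: f["severity"], reverse=True):
        print(f"[{f['severity']}] {f['heuristic']}: {f['problem']}")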
Surveys
• Useful for collecting data directly from users at various stages of design and development
• Can reach a large number of users
• Standardized questions, answer formats easy to analyze (see the sketch below)
• Issues of sample composition, sample size, and validity
• Only get answers to the questions you think to ask
• Question (and answer) wording affects results
• Lack of depth and follow-up
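To illustrate why standardized answer formats are easy to analyze, here is a small sketch that summarizes hypothetical 1–5 Likert-scale responses (the question wording and the numbers are invented):

    from statistics import mean, median
    from collections import Counter

    # Hypothetical responses to "The site is easy to navigate",
    # on a 1-5 scale (1 = strongly disagree, 5 = strongly agree).
    responses = [4, 5, 3, 4, 2, 5, 4, 3, 4, 5]

    print(f"n = {len(responses)}")
    print(f"mean = {mean(responses):.2f}, median = {median(responses)}")

    # Frequency distribution: how many respondents chose each option.
    counts = Counter(responses)
    for option in range(1, 6):
        print(f"{option}: {'#' * counts[option]} ({counts[option]})")

Note that the same summary says nothing about why respondents answered as they did; that is the lack of depth and follow-up noted above.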
Usability testing
• Lab-based tests
• Usually standardized tasks observed under controlled conditions
• Good for getting performance data unsullied by variations in use conditions (see the sketch below)
• Bad for getting performance data under real conditions of use (ecological validity)
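A sketch of the kind of performance data a lab test yields, computed over hypothetical per-participant records; the participant IDs, times, and outcomes are invented:

    from statistics import mean

    # Hypothetical results for one standardized task:
    # (participant, completion time in seconds, completed successfully?)
    trials = [
        ("P1", 74.2, True),
        ("P2", 51.8, True),
        ("P3", 120.0, False),  # gave up before finishing
        ("P4", 66.5, True),
    ]

    success_times = [t for _, t, ok in trials if ok]
    success_rate = len(success_times) / len(trials)

    print(f"success rate: {success_rate:.0%}")
    print(f"mean time on successful trials: {mean(success_times):.1f} s")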
Focus groups
• Again, useful at many stages in the process
• In-depth information from users
• Interaction among users helpful (or sometimes not)
• Limits:
– Small numbers
– Limited time period
– Effects of strong personalities or a sidetrack in the conversation
• Skilled facilitator! Hard to do well, easy to mess up
Server log analysis
• Analyzes data collected automatically
• Large numbers
• Unobtrusive
• Does not rely on user cooperation or memory
• Limits to the data available
• Inferences must be justified by the data (see the sketch below)
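A minimal sketch of what the automatically collected data looks like, assuming the server writes Apache Common Log Format access lines; the sample lines are invented:

    import re
    from collections import Counter

    # Two invented access-log lines in Common Log Format.
    log_lines = [
        '10.0.0.1 - - [05/Oct/2004:13:55:36 -0700] "GET /catalog HTTP/1.0" 200 2326',
        '10.0.0.2 - - [05/Oct/2004:13:56:01 -0700] "GET /help HTTP/1.0" 404 512',
    ]

    # Pull the requested path and the status code out of each line.
    pattern = re.compile(r'"[A-Z]+ (\S+) [^"]*" (\d{3})')

    hits = Counter()
    for line in log_lines:
        m = pattern.search(line)
        if m:
            hits[(m.group(1), m.group(2))] += 1

    # Counts show only what was requested, not why or by whom;
    # any inference beyond that needs justification.
    for (path, status), n in hits.most_common():
        print(f"{path} -> {status}: {n}")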
Organizational and Managerial Issues
Analyzing and presenting results
• Lots of data that has to be summarized in a useful form
• What is the purpose of your study?
• What do you know? What do you need to know?
• What recommendations can you develop from your data?
• How do you present your findings succinctly and clearly, in a way that your audience will understand and use?
Ethics
• Do no harm to the people you are studying
• Choices of projects?
Managing usability
• How usability fits into organizations
• “We don’t get no respect”
Universal usability
• International usability
• Accessibility
– Removing unnecessary barriers
– Being aware of and designing for the variety of people’s capabilities
– Incorporating multimodal information presentation and functionality
Topic we might have covered: credibility
• Larger issue: when presenting content, not (just) functionality, we need to understand how people use and evaluate information
• Factors that affect web site credibility:
– Source:
• Institutional, personal
• Expertise; bias or interest
– Currency (how up to date the information is)
– Observable factors used as indicators of unobservable ones:
• Language, (absence of) mistakes
• Links, imprimaturs
Some final questions
• How do we understand users’ activities, needs, interpretations, & preferences?
– Especially for things that don’t yet exist
– Users and uses are varied
– People can’t always articulate what we would like to know from them
– The observer is not a perfectly objective “tool”
• How do we translate these understandings into recommendations and designs?
• How do we decide what trade-offs to make?
– Among users (including organization vs. individuals)
– Between cost of design and priority of needs