Evaluating Usability of Commercial Software Applications

DESCRIPTION
Copy of the presentation I gave at UPA 2009 in Portland, OR.

TRANSCRIPT
© 2008 The MathWorks, Inc.
Evaluating the Usability of Commercial
Software Applications
Jen Hocko
The MathWorks
About Me
Manager of the Business
Applications (“BizApps”) Usability
Group at The MathWorks
Prior lives: Web Development,
Technical Publications
B.S. in Computer Science &
Technical Writing, M.S. in
Human Factors
Enjoy replacing chaos with
something more orderly
Avid West Coast Swing dancer
About You
How many of you have been asked to weigh in on the
usability of a commercial software application?
What challenges did you face?
What This Presentation is About
Here’s the situation….
How did I decide what to try?
What it turned into: overview of the methodology
How you can do it too: details about each step
Discussion of challenges, lessons learned
Closing discussion, Q&A, and further thoughts from the audience
Here’s the situation….
Select the best Expense Reporting System
to replace our old, home-grown solution
Project already underway
Project team requested Usability help because:
Hadn’t done this type of project before
Thought the end-user point of view was critical
Wanted to use tools / templates to work effectively
Needed guidance about where to go next
How did I decide what to try?
Questions I asked myself:
What was out there?
What did other Usability people think?
Has this already been done?
Posted to many discussion groups!
What might work given our company culture?
What might work given the current state of the
project?
What I found / suggestions from others:
Jerrod Larson’s UX Magazine article on market research firms
Gartner, Forrester evaluations
SUS and SUMI Questionnaires
Nielsen’s (and other) heuristics
Various checklists
CIF report (ISO/IEC 25062:2006)
How did I decide what to try?
What it turned into: methodology overview
We recommend that project teams perform both the first and second level qualifications.
We suggest that project teams perform at least one of the third level qualifications.
What it turned into: methodology overview
First Level Qualification
Primary goal: get a sense of how versed the vendor is in
usability and user-centered design
Step 1: Questions for vendors
What questions about Usability would you ask vendors?
Step 1: Questions for vendors
Step 1: Evaluating vendor responses
“Does the system support people with disabilities by
following web accessibility guidelines?”
“The system does not support this functionality.” (Company A)
Step 1: Evaluating vendor responses
“Does the system support people with disabilities by
following web accessibility guidelines?”
“Company B’s user interface is designed in accordance to the principals of the Inductive
User Interface approach. This approach is similar to key applications used on a day-to-
day basis by both lay and performance end users. Microsoft, noted as the most
significant contributor to end user experience and design, adopts this approach in a
number of its applications such as MS Money, Hotmail, and MSN.com. Entry screens are
dynamic in nature meaning that dependent on the type selected different fields will
become visible and dependent on their configuration will be optional or mandatory. Users
fill in values in a logical sequence, using a series of pull down lists, buttons and check
boxes, without requiring screen refreshes which make other products cumbersome.”
Step 1: Evaluating vendor responses
“Does the system support people with disabilities by
following web accessibility guidelines?”
“Company C believes the application to comply with Section 508 requirements based on
the ability to navigate the application using keyboard access and to adjust text size in the
browser using standard browser functions.”
Step 1: Evaluating vendor responses
“Does the system support people with disabilities by
following web accessibility guidelines?”
“Company C believes the application to comply with Section 508 requirements based on
the ability to navigate the application using keyboard access and to adjust text size in the
browser using standard browser functions.”
“The system does not support this functionality.” (Company A)
“Company B’s user interface is designed in accordance to the principals of the Inductive
User Interface approach. This approach is similar to key applications used on a day-to-
day basis by both lay and performance end users. Microsoft, noted as the most
significant contributor to end user experience and design, adopts this approach in a
number of its applications such as MS Money, Hotmail, and MSN.com. Entry screens are
dynamic in nature meaning that dependent on the type selected different fields will
become visible and dependent on their configuration will be optional or mandatory. Users
fill in values in a logical sequence, using a series of pull down lists, buttons and check
boxes, without requiring screen refreshes which make other products cumbersome.”
Let’s talk
Q&A
Your thoughts?
Primary goals:
Define what is required of the application you are looking to buy
Set up some structure and evaluation criteria for vendor demos
Second Level Qualification
What is a use case?
Description of the user
Description of the user’s goal
The user’s current workflow
Pain points associated with the current workflow
Step 2: Use cases, requirements, & capabilities
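A use case with the four parts listed above can be captured as a simple record. A minimal sketch in Python (the field names and sample content are illustrative assumptions, not from the deck):

```python
from dataclasses import dataclass, field

@dataclass
class UseCase:
    user: str                  # description of the user (role)
    goal: str                  # description of the user's goal
    current_workflow: list     # ordered steps in today's workflow
    pain_points: list = field(default_factory=list)  # pains in that workflow

# Hypothetical example for an expense-reporting project.
uc = UseCase(
    user="Traveling sales engineer",
    goal="Get reimbursed for trip expenses quickly",
    current_workflow=[
        "Collect paper receipts",
        "Fill out spreadsheet",
        "Email spreadsheet to manager",
    ],
    pain_points=["Receipts get lost", "No status visibility after submission"],
)
print(uc.goal)
```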
How do you get the content for your use cases?
Identify user roles
Brainstorm high level use cases
Brainstorm pain points or
issues
Affinitize pains or issues
Write use cases
Throughout, keep looking for missing
user roles, use cases, or pains /
issues
Step 2: Use cases, requirements, & capabilities
How do you get the content for your use cases?
Throughout, keep looking for missing
user roles, use cases, or pains /
issues
Brainstorm pain points or issues
Affinitize pains or issues
Identify user roles
Write use cases
CARD (task analysis)
Big picture
Current workflow
Interviews
Observations
Step 2: Use cases, requirements, & capabilities
Challenge #1: Shouldn’t we be documenting the ideal
workflow?
Makes vendors think about our problems and how to solve them
Not limiting to one “ideal” solution – different ones may work well
Easier to start from what is known
Shared understanding of today is invaluable
Step 2: Use cases, requirements, & capabilities
Challenge #2: Why do we have to start with a use case?
Requirements should be traceable back to an actual user
Helps reduce scope creep and “bells and whistles”
If it can’t be tied to a use case, it’s probably not needed right now!
Step 2: Use cases, requirements, & capabilities
Challenge #3: What is the appropriate
level of detail for a use case?
Some things to consider:
Combine multiple user roles into one use case. (Note where
any variations in steps or pains occur.)
Vendors care about what we want the application to do
Project teams and users care about:
Getting the best application possible (having their needs met)
Evaluating the applications
Step 2: Use cases, requirements, & capabilities
Challenge #4: What should we be giving to vendors?
Some options:
The N most important use cases (exactly as we wrote them)
A prioritized list of requirements pulled from the use cases
Spreadsheets are fine, if cutoffs are defined – must-haves, should-haves,
nice-to-haves, etc.
The requirements, organized into high level “capabilities”
Requirements affinitized back into manageable categories
Sometimes aligns better with demos (allows for easier scoring)
Still need to decide how many to address per demo
Step 2: Use cases, requirements, & capabilities
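The "prioritized list with cutoffs" option above can be sketched as grouping uniquely numbered requirements by tier. The tier names follow the slide; the requirement IDs and wording are illustrative assumptions:

```python
# Illustrative requirements, each uniquely numbered and tagged with a tier.
requirements = [
    ("R-001", "must", "Users can submit an expense report from a web browser"),
    ("R-002", "should", "The system emails the user when a report is approved"),
    ("R-003", "nice", "Users can submit reports from a mobile device"),
]

# Cutoffs from the slide: must-haves, should-haves, nice-to-haves.
TIERS = ["must", "should", "nice"]

by_tier = {tier: [r for r in requirements if r[1] == tier] for tier in TIERS}
for tier in TIERS:
    print(tier, [req_id for req_id, _, _ in by_tier[tier]])
```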
Step 2: Use cases, requirements, & capabilities
Challenge #5: How do you write a good requirement?
Well written requirements:
1. Explain what the application should do, not how
2. Keep the readers in mind
3. Are specific, actionable statements
4. Make good use of language (spelling, grammar, etc.)
5. Are uniquely numbered
Example: related requirements organized by Capability
Step 2: Use cases, requirements, & capabilities
Example: related requirements organized by Capability
Requirements (click on Capability to view spreadsheet)
Step 2: Use cases, requirements, & capabilities
Let’s talk
Q&A
Other thoughts?
Second Level Qualification
Primary goals:
Compare applications based on how well vendors are able to
demonstrate that they meet your requirements
Get the project team talking about application strengths and
weaknesses
Step 3: Vendor demos & scorecard review
An Individual scorecard (example)
Step 3: Vendor demos & scorecard review
The Individual comments template:
Do:
Train people!
Have a parking lot
Step 3: Vendor demos & scorecard review
A Consolidated Demo Scorecard (example)
Step 3: Vendor demos & scorecard review
Count the A’s and multiply by 3
Count the B’s and multiply by 2
Count the C’s and multiply by 1
Add these together to get a “Positives” score
Count the N’s and multiply by 3 to get a
“Negatives” score
“Positives” score – “Negatives” score = Final score
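The weighting scheme above is simple enough to automate. A sketch in Python, using the letter grades from the slide (A/B/C positive, N negative; the function name is illustrative):

```python
from collections import Counter

# Weights from the slide: A=3, B=2, C=1 count toward the "Positives" score;
# each N counts 3 toward the "Negatives" score.
POSITIVE_WEIGHTS = {"A": 3, "B": 2, "C": 1}
NEGATIVE_WEIGHT = 3

def final_demo_score(grades):
    """Final score = "Positives" score - "Negatives" score."""
    counts = Counter(grades)
    positives = sum(counts[grade] * weight
                    for grade, weight in POSITIVE_WEIGHTS.items())
    negatives = counts["N"] * NEGATIVE_WEIGHT
    return positives - negatives

# Example: 2 A's, 1 B, 3 C's, 2 N's -> (6 + 2 + 3) - 6 = 5
print(final_demo_score(["A", "A", "B", "C", "C", "C", "N", "N"]))
```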
Let’s talk
Q&A
Other thoughts?
Third Level Qualification
Primary goal: Look critically at application to identify potential
usability problems (Usability Specialist)
Step 4a: Usability audit
Pros:
Provides helpful reminders of different usability principles that
could cause problems for users if not followed
Can result in useful discussions about configurability (e.g., control
of style sheets)
Cons:
Some checklist items don’t apply or are difficult to measure – need
to be consistent across audits of different applications
An isolated activity for a single Usability expert goes against our
collaborative culture and limits inter-rater reliability
Do Better Next Time
Need to look at the checklist items in the context of use cases
Let’s talk
Q&A
Other thoughts?
Third Level Qualification
Primary goal: Look critically at application to identify potential
usability problems (end-users)
Step 4c: End-user evaluation
3 measures of system usability:
Effectiveness – users can complete tasks
Efficiency – how easily users can complete tasks
Satisfaction – how users feel about completing the tasks
2/3 User Acceptance Test (UAT), 1/3 survey
In UAT, Usability plays a supporting role by ensuring:
Tasks are adapted from the use cases
All user groups are represented as participants
Evaluation documents are designed to capture effectiveness and
efficiency
Step 4c: End-user evaluation
Score Type | Effectiveness                               | Efficiency
Individual | (Number of Y's / Total Y's Possible) * 100  | (Number of Agrees / Total Agrees Possible) * 100
Overall    | Average of all Individual Scores            | Average of all Individual Scores
If there were tasks that evaluators were unable to complete
or that took unreasonable time and effort, follow up with
them to identify and document the reasons WHY.
We do this for each application being evaluated.
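The scoring table above can be expressed as a short sketch (the example counts are illustrative: Y/N task completions measure effectiveness, Agree responses to ease-of-use statements measure efficiency):

```python
def individual_score(hits, total_possible):
    """Individual score: (number achieved / total possible) * 100."""
    return (hits / total_possible) * 100

def overall_score(individual_scores):
    """Overall score: average of all individual scores."""
    return sum(individual_scores) / len(individual_scores)

# Effectiveness: one evaluator completed 9 of 10 tasks (9 Y's).
effectiveness = individual_score(9, 10)   # 90.0
# Efficiency: the same evaluator agreed with 7 of 10 statements.
efficiency = individual_score(7, 10)      # 70.0
# Overall effectiveness across three evaluators.
print(overall_score([90.0, 70.0, 80.0]))  # 80.0
```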
Step 4c: End-user evaluation
Administered in
SurveyMonkey
Initial section for
collecting demographic
information
Additional question for
overall feeling of system
(scored separately)
Users fill out one survey
per system
Let’s talk
Q&A
Other thoughts?
Final usability recommendation: system
comparison matrix
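The actual matrix appears as a screenshot in the deck, so this is only a hedged sketch of one way to assemble it: tabulate each candidate application's scores from the earlier qualification steps (the row labels, application names, and numbers are all assumptions):

```python
# Hypothetical per-application scores gathered from the earlier steps.
matrix = {
    "Vendor questions (Step 1)": {"App A": 2, "App B": 4, "App C": 3},
    "Demo scorecard (Step 3)":   {"App A": 5, "App B": 12, "App C": 9},
    "Usability audit (Step 4a)": {"App A": 60, "App B": 75, "App C": 70},
    "End-user eval (Step 4c)":   {"App A": 72, "App B": 88, "App C": 81},
}

apps = ["App A", "App B", "App C"]

# Print a simple side-by-side comparison, one row per criterion.
print(f"{'Criterion':28}" + "".join(f"{app:>8}" for app in apps))
for criterion, scores in matrix.items():
    print(f"{criterion:28}" + "".join(f"{scores[app]:>8}" for app in apps))
```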
Additional takeaways
Over-communicate: keep the team informed and aligned with
the process – don’t assume they know it
Customize as necessary: this isn’t a “one size fits all”
methodology. It’s only a starting point – get team input
Fit into the bigger picture: fit this methodology into any larger,
centralized process for software evaluations
It does make a difference: project teams think more
critically about end-users as part of the procurement decision
Final Questions