BENCHMARKING USABILITY PERFORMANCE
Jennifer Romano Bergstrom, Ph.D. UX Research Leader Fors Marsh Group
George Mason University, Dec 9, 2014
WHAT IS USER EXPERIENCE?
Usability + emotions and perceptions = UX
Usability = “the extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency, and satisfaction in a specified context of use.” ISO 9241-11
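The ISO definition above names three measurable components: effectiveness, efficiency, and satisfaction. As a rough illustration of how a benchmark study might score one task, here is a minimal Python sketch; the session records, field names, and 1-7 satisfaction scale are hypothetical, not from the talk:

```python
# Minimal sketch: scoring one task against the three ISO 9241-11
# components. The session records below are made up for illustration.
from statistics import mean

sessions = [
    {"success": True,  "seconds": 42.0, "satisfaction": 6},
    {"success": True,  "seconds": 55.5, "satisfaction": 5},
    {"success": False, "seconds": 90.0, "satisfaction": 3},
    {"success": True,  "seconds": 38.2, "satisfaction": 7},
]

# Effectiveness: share of participants who completed the task.
effectiveness = sum(s["success"] for s in sessions) / len(sessions)

# Efficiency: mean time on task, counting successful completions only.
efficiency = mean(s["seconds"] for s in sessions if s["success"])

# Satisfaction: mean post-task rating (1-7 scale here).
satisfaction = mean(s["satisfaction"] for s in sessions)

print(f"effectiveness={effectiveness:.2f}, "
      f"efficiency={efficiency:.1f}s, satisfaction={satisfaction:.2f}")
```

In a later round of testing, the same three numbers computed for the redesigned product can be compared directly against these baseline values.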
USABILITY & USER EXPERIENCE
useful
valuable
desirable
accessible
trustworthy
engaging
usable
The 5 Es to Understanding Users (W. Quesenbery): http://www.wqusability.com/articles/getting-started.html
WHEN TO TEST
Benchmark
WHY TEST
WHY BENCHMARK?
‣ Provide a framework of current website performance
‣ Compare metrics in future testing

WHY DO IT?
‣ Ensure you’re solving a problem that exists
‣ Ensure you’re building a product that is tailored to its audience
‣ Ensure that your product solution aligns to behaviors
WHERE TO TEST
LABORATORY
• Controlled environment
• All participants have the same experience
• Record and communicate from control room
• Observers watch from control room and provide additional probes (via moderator) in real time
• Incorporate physiological measures (e.g., eye tracking, EDA)

REMOTE
• Participants in their natural environments (e.g., home, work)
• Use video chat (moderated sessions) or online programs (unmoderated)
• Conduct many sessions quickly
• Recruit participants in many locations (e.g., states, countries)
• No travel costs

IN THE FIELD
• Participants tend to be more comfortable in their natural environments
• Recruit hard-to-reach populations (e.g., children, doctors)
• Moderator travels to various locations
• Bring equipment (e.g., eye tracker)
• Natural observations
HOW TO TEST
ONE-ON-ONE SESSIONS
• In-depth feedback from each participant
• No group think
• Can allow participants to take their own route and explore freely
• No interference
• Remote in participant’s environment
• Flexible scheduling
• Qualitative and quantitative

FOCUS GROUPS
• Participants may be more comfortable with others
• Interview many people quickly
• Opinions collide
• Peer review
• Qualitative

SURVEYS
• Representative
• Large sample sizes
• Collect a lot of data quickly
• No interviewer bias
• No scheduling sessions
• Quantitative analysis
WHAT TO MEASURE
Benchmark
EXAMPLE IN-LAB ONE-ON-ONE METHODS
Example Methodology

Participants:
• N = 74 | Average Age = 37
• Mix of gender, ethnicity, income
• Random assignment to diary condition (New, Old, Prototype, Bilingual)

Usability Testing session:
• Participants read a description of the study.
• The moderator gave instructions and calibrated the eye tracker.
• Participants completed Steps 1-5 in the diary at their own pace.
• End-of-session satisfaction questionnaire
• Debriefing interview

(Photos: eye tracker; control room. Moderators worked from another room.)
Slide from: Walton, L., Romano Bergstrom, J., Hawkins, D. & Pierce, C. (2014). User Experience and Eye-Tracking Study: Paper Diary Design Decisions. Paper presentation at the American Association for Public Opinion Research (AAPOR) Conference, Anaheim, CA, May 2014.
No Think Aloud in benchmark studies: we want a pure measure of performance.
PREPARATION
CREATE TASKS
‣ What are the most important things users should be able to do on this site?
  ‣ Most frequent
  ‣ Most important (e.g., registration)
‣ Tasks should be clear and unambiguous and in the user’s language (no jargon).
‣ Don’t prompt the solution.
TASK SCENARIO EXAMPLE
‣ “You want to book a romantic holiday for you and your partner for Valentine’s Day. How would you do that?”
‣ “Use this site to…” is even better. It is a task. You can measure behavior.
‣ NOT: Go to the home page of romanticholidays.com and click “sign up now” then click “Valentine’s day.”
THINGS TO AVOID: Asking participants to predict the future
‣ Asking if a participant would use something like X or might enjoy X feature is not productive
‣ Instead, ask about current behavior (do you currently do X?) or show them something and observe how they interact with it
THINGS TO AVOID: Leading people
‣ Let them make their own mistakes; that is valuable
‣ If you give the answers, you’ll never learn what you need to learn
‣ AVOID:
  ‣ Telling people what to do or explaining how it works
  ‣ “Is there anywhere else you would click?”
  ‣ “Go ahead and click on that…”
THINGS TO AVOID: Bias
‣ Try to remain neutral, even if the person is really funny or mean
‣ Use open-ended questions to understand perceptions
‣ AVOID:
  ‣ Testing friends
  ‣ Acting differently with different participants
  ‣ “Did you like it?”
  ‣ “Interesting.”
  ‣ “Now we are going to work with this awesome page.”
THINGS TO AVOID: Interrupting
‣ You don’t want to interfere with what participants would normally do on their own
‣ Wait until the end to ask follow-up questions
‣ AVOID:
  ‣ Probing mid-task
  ‣ “Why?”
THINGS TO AVOID: Explaining the purpose
‣ Your job is to pull as much information as possible
‣ Your job is not to explain how it works
‣ Ask: “What do you think it is for?” “What would you do if I was not here?”
‣ AVOID:
  ‣ Explaining how to find information
  ‣ Explaining the purpose of the product
ANALYZING RESULTS
USABILITY & UX TESTING
COMPARE TO GOALS
‣ It is a good idea to set goals (e.g., 90% of participants should be able to register in less than one minute).
‣ Keep results simple so people will use them and appreciate them.
‣ Compare performance to goals.
‣ In future iterations, compare performance to the benchmark.
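A goal like the registration example can be checked mechanically once task times are logged. A minimal Python sketch, with hypothetical timing data (the numbers and thresholds are illustrative, not from the talk):

```python
# Hypothetical check of the goal "90% of participants should be able
# to register in less than one minute." Times are made-up seconds.
times = [31, 45, 52, 58, 49, 71, 40, 55, 47, 50]

goal_rate = 0.90     # target share of participants
goal_seconds = 60    # target time per registration

# Share of participants who met the time threshold.
met = sum(t < goal_seconds for t in times) / len(times)
status = "PASS" if met >= goal_rate else "FAIL"
print(f"{met:.0%} registered in under {goal_seconds}s "
      f"(goal {goal_rate:.0%}): {status}")
```

The same computation, run on the benchmark data and then on each future iteration, gives a like-for-like comparison across rounds of testing.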
OUTPUTS
‣ Notes, data, video/audio recordings
‣ Usability labs will create full reports (doc or PPT)
‣ Unmoderated tests may provide data reports and recorded sessions.
‣ When writing research notes, remember to:
  ‣ Report good and bad findings
  ‣ Stick to what you observed in the test
‣ Use the data!
THANK YOU!
Jennifer Romano Bergstrom, Ph.D. | Fors Marsh Group | [email protected] | @romanocog
Links to more info: EdUI slides (see other slides on Slideshare too) Eye Tracking in UX Design