Thinking about Usability and the User Experience


DESCRIPTION

Discussion seminar at 2010 OSU Library In-Service.

TRANSCRIPT

Derek Poppink, September 15, 2010

Thinking about Usability and the User Experience

What is Agile?

What is User Experience?

Return on Investment

Spending 10% of a project’s budget on usability improves key performance indicators by 83% on average

Conversion rates

Traffic numbers

User performance

Target feature usage
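The 10%-of-budget / 83%-average-improvement figure can be made concrete with a little arithmetic. A minimal sketch, assuming a hypothetical $100,000 project and a 2% baseline conversion rate (both numbers are illustrative, not from the slides):

```python
# Hedged sketch of the ROI claim: spending 10% of a project's budget on
# usability improves key performance indicators by 83% on average.
# The budget and baseline conversion rate below are assumptions.

def usability_roi(project_budget, baseline_conversion,
                  usability_share=0.10, avg_kpi_lift=0.83):
    """Return (usability spend, projected conversion rate)."""
    spend = project_budget * usability_share
    projected = baseline_conversion * (1 + avg_kpi_lift)
    return spend, projected

spend, projected = usability_roi(100_000, 0.02)
print(f"Usability spend: ${spend:,.0f}")         # $10,000
print(f"Projected conversion: {projected:.2%}")  # 3.66%
```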

How Do You Improve User Experience?

Research

Contextual Inquiry – Interviewing and observing users where they do their work

Personas & Scenarios – User archetypes and the way they interact with products

Design

Mockups – Designs that are fast to create and easy to dispose of

Guidelines – Best practices for the web, the industry, and particular user groups

Evaluation

Heuristic Evaluations – Inspection by an expert using recognized principles

User Studies – Assess products by asking actual users to accomplish core tasks

How Do You Improve User Experience?

Before December, 2009

Market research and focus groups

Ad hoc user testing

Heuristic evaluations post release

What was Missing?

End user research

Common design patterns

Regular user testing

Usability Maturity

1. Hostility Towards Usability

2. Developer-Centered Usability

3. Skunkworks Usability

4. Dedicated Usability Budget (Gale)

5. Managed Usability

6. Systematic Usability Process

7. Integrated User-Centered Design

8. User-Driven Corporation

How is User Experience at Gale?

Recent Results

Evaluation of Systems and Services

16 graduate students, 4 Gale products, 24 user experience evaluations

Career Transitions, Global Resource on Energy, Environment, and Natural Resources (GREENR), Grzimek’s Animal Life, and Literature Resource Center

Summer of User Experience

2 user experience interns, 3 months, 26 user studies, 13 heuristic evaluations

Academic OneFile, Books & Authors, Business & Company Resource Center, Community Health, Course Reader, Editorial Interface, Gale Admin, Gale Virtual Reference Library, Illustrated London News, International Business, Latin America Area Studies, Opposing Viewpoints in Context, Slavery Anti-Slavery, State Papers Online

http://wiki.oh.gale.com/display/UX/Usability+Studies

The Big Honkin’ Usability Test

1-2 rounds of tests when site is nearly complete

1-2 days of tests per round

5-8 participants per round

Held on-site in room with one-way mirror

1-2 observers (product owner)

Dozens of problems prioritized by moderator

1 week to prepare report afterwards

$25,000 per round

The Agile Usability Test

1 round of tests per sprint from concept to launch

1 morning per round (tests and debriefing)

3 participants per round

Held in conference rooms and remotely with WebEx

Observed by team

Ten problems and fixes prioritized by team

Findings and fixes distributed immediately afterwards

$4,000 per round
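The per-round costs above make the trade-off easy to total up. A minimal sketch, assuming a hypothetical ten-sprint project (the sprint count is an assumption; the per-round costs come from the slides):

```python
# Comparing the two testing budgets. Per-round costs are from the slides;
# the ten-sprint project length is a hypothetical for illustration.

BIG_TEST_COST = 25_000    # per round, 1-2 rounds when site is nearly complete
AGILE_TEST_COST = 4_000   # per round, one round every sprint

sprints = 10                              # assumed project length
big_total = BIG_TEST_COST * 2             # upper end: two big rounds
agile_total = AGILE_TEST_COST * sprints   # one agile round per sprint

print(f"Big honkin' tests: ${big_total:,} for 2 rounds")     # $50,000
print(f"Agile tests: ${agile_total:,} for {sprints} rounds") # $40,000
```

For roughly the same money, the agile approach buys ten rounds of feedback spread from concept to launch instead of two rounds at the end.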

When (Frequency)

“A morning a month, that’s all we ask” – Steve Krug

Morning

3 sessions (9:00, 10:00, 11:00)

Debriefing over lunch

Allows more people to attend

Month (Sprint)

Eliminates deciding when to test

Finds enough problems to fix

Affordable, iterative, scalable

What (Materials)

Existing products

Other people’s products

Mockups (sketches)

Balsamiq Mockups

Mockingbird

Wireframes

HotGloo

Axure

Visual designs (comps)

Prototypes

Partially working pages

What (Products)

Business & Company Resource Center

International Business

Books & Authors

Community Health

Course Reader

Editorial Interface

Gale Admin

Gale Virtual Reference Library

Illustrated London News

Latin America Area Studies

Opposing Viewpoints in Context

Slavery Anti-Slavery

State Papers Online

Academic OneFile

Who (Participants)

“Recruit loosely and grade on a curve” – Steve Krug

What people?

Representative users (students, teachers, professionals, librarians)

How many? Three per sprint

Recruiting? Customers, personal networks (email, Facebook), Craigslist

Incentives? Amazon gift cards
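Three participants per sprint lines up with Jakob Nielsen's well-known problem-discovery model (Nielsen appears among the sources at the end). A minimal sketch, assuming each user uncovers about 31% of the problems present, Nielsen's classic figure rather than anything stated in these slides:

```python
# Nielsen's problem-discovery model: the share of usability problems found
# by n test users, assuming each user independently uncovers a fraction p
# of them. p = 0.31 is Nielsen's classic estimate, an assumption here.

def problems_found(n, p=0.31):
    return 1 - (1 - p) ** n

for n in (3, 5, 8):
    print(f"{n} users: {problems_found(n):.0%}")
# 3 users: 67%
# 5 users: 84%
# 8 users: 95%
```

Under this model, three users per round already surface about two-thirds of the problems, and running a round every sprint catches what each small round misses.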

Who (Moderator)

Tour guide

Give participant instructions

Keep things moving

Therapist

Get participant talking

Show you are listening

Stay neutral

Respect the participant’s privacy

Who (Observers)

“Make it a spectator sport”

Attend a session (or three)

Avoid distractions

Take notes

Notice where user is confused or can’t complete tasks

Make list

Three most serious usability problems

Attend debriefing

Free lunch!

Where (Conference Room & WebEx)

Conference room

Laptop, projector, speakerphone, snacks

Remote participant

Internet connection, phone

Benefits

Easier recruiting

Easier scheduling

No travel required

Produces [almost] the same results

Why (Debriefing)

Share problems

Three most serious per observer

Prioritize top ten

Discuss easiest fix

Tweak, don’t redesign

Take something away

Commit to changes

Email results to team

“Focus ruthlessly on a small number of the most important problems”

“When fixing problems, always do the least you can do”
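The debriefing steps above, where each observer brings their three most serious problems and the team tallies them into a top-ten list, can be sketched as a simple count. The observer lists below are hypothetical examples, not findings from the slides:

```python
# Minimal sketch of the debriefing tally: each observer contributes their
# three most serious problems; the team keeps the ten most-mentioned.
# All problem descriptions here are made-up examples.

from collections import Counter

observer_lists = [
    ["search box hidden", "jargon in labels", "no-results message unclear"],
    ["search box hidden", "login required too early", "jargon in labels"],
    ["no-results message unclear", "search box hidden", "tiny click targets"],
]

tally = Counter(problem for lst in observer_lists for problem in lst)
top_ten = tally.most_common(10)

for problem, votes in top_ten:
    print(f"{votes}x {problem}")
```

Problems several observers saw independently float to the top, which keeps the "focus ruthlessly on a small number of the most important problems" rule honest.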

Why (Results)

Remote Usability

Thank My Sources

Brian Swan, www.builder.com.au (2005)

Dave Nicolette, www.davenicolette.net (2006)

Frank Klein, www.relativitycorp.com (2010)

Jakob Nielsen, www.useit.com (2008)

Nate Bolt, Remote Research, remoteusability.com (2010)

Peter Morville, Ambient Findability (2005)

Peter Morville, semanticstudios.com (2004)

Steve Krug, Rocket Surgery Made Easy (2009)

Questions? Comments?
