
Assessing and improving research networking usability at UCSF

Anirvan Chatterjee, Katja Reuter, PhD, Brian Turner, MBA

Clinical and Translational Science Institute, University of California, San Francisco

Introduction

Launching a research networking tool is only the first step. Understanding how the site was subsequently used helped us find and address substantial barriers to use. Our team released the UCSF Profiles platform in September 2010 with profiles for about 2,400 researchers. We followed up by establishing a continuous assessment and improvement process, enhancing the product with hosted tools for web analytics, crowdsourced usability testing, and A/B testing. Since January 2011, we have used this toolkit to make three targeted design improvements, closing measurable usability gaps.

Acknowledgments

This project was supported by NIH/NCRR UCSF-CTSI Grant Number UL1 RR024131. Its contents are solely the responsibility of the authors and do not necessarily represent the official views of the NIH. We would like to thank Cynthia Piontkowski for design and implementation. Evolution silhouette images by TeeKay, used under a Creative Commons license.

Methods and Tools

Crowdsourced 5-Second Tests (UsabilityHub.com)

•  What: Strangers see a screenshot of a web page for 5 seconds, then are asked what they remember seeing.

•  Why: If untrained strangers understand a site, researchers should too. Results arrive in minutes, for rapid idea testing.

•  How: We test screenshots and mockups before/after major site changes. Consistent questions establish baselines.
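The responses we collect are free text; to compare successive rounds of testing against a consistent baseline, the answers can be tallied against a few themes. Below is a minimal sketch of that kind of tally, assuming a hypothetical keyword-based coding scheme (the theme names and keywords are illustrative, not the scheme we actually used):

from collections import Counter

# Hypothetical coding scheme: map each free-text answer to the first theme
# whose keywords it mentions, so rounds of testing can be compared over time.
THEMES = {
    "find people / networking": ["scientist", "people", "network", "directory", "person"],
    "search for research": ["research", "article", "paper", "publication"],
}

def code_response(text):
    lowered = text.lower()
    for theme, keywords in THEMES.items():
        if any(word in lowered for word in keywords):
            return theme
    return "unclear"

def summarize(responses):
    counts = Counter(code_response(r) for r in responses)
    return {theme: counts[theme] / len(responses) for theme in list(THEMES) + ["unclear"]}

# A few of the January 2011 answers shown on the poster:
january = ["Online encyclopedia type of thing", "Something to do with research",
           "search for research articles", "To search for stuff", "searching"]
print(summarize(january))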

A/B Testing (VisualWebsiteOptimizer.com)

•  What: Website users see one of several page variations as they use the site. Afterwards, we either pick the best-performing variation, or retest new variations.

•  Why: Real user interaction data trumps design arguments.

•  How: We test design element variations to see if they drive desired behavior, e.g. lower bounce rate, longer visits.
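Visual Website Optimizer handles variation assignment and measurement for us. Purely to illustrate the underlying idea of an A/B split, the sketch below shows how a visitor could be deterministically bucketed into a variation; this is a conceptual example with hypothetical variation names, not how the tool is actually implemented:

import hashlib

VARIATIONS = ["control", "related-links-a", "related-links-b"]  # hypothetical variation names

def assign_variation(visitor_id, experiment="profile-related-links"):
    # Hash the experiment name plus visitor ID so the same visitor
    # always lands in the same bucket across visits.
    digest = hashlib.sha256(f"{experiment}:{visitor_id}".encode()).hexdigest()
    return VARIATIONS[int(digest, 16) % len(VARIATIONS)]

print(assign_variation("visitor-42"))  # stable for this visitor on every request
print(assign_variation("visitor-43"))  # may differ from visitor-42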

Web Analytics (Google Analytics)

•  What: The analytics tool logs information about users' interactions with the site; its dashboard lets us answer complex questions.

•  Why: Web analytics are our primary tool to evaluate how the site’s being used, find gaps, and measure improvements.

•  How: We look at pages (home, search results, profiles), audiences (on-campus, off-campus), referrers (Google, UCSF.edu, etc.), interactions (bounce rate, page depth).
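Most of this segmentation happens directly in the Google Analytics dashboard. As a rough sketch of the same idea applied to raw data, the example below computes bounce rate per referrer from a hypothetical session-level export; the file name and column names are assumptions for illustration only:

import csv
from collections import defaultdict

def bounce_rate_by_referrer(path):
    # Expects a hypothetical session-level export with columns:
    # session_id, referrer, pages_viewed
    stats = defaultdict(lambda: [0, 0])  # referrer -> [bounced sessions, total sessions]
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            counts = stats[row["referrer"]]
            counts[1] += 1
            if int(row["pages_viewed"]) == 1:  # a single-page visit counts as a bounce
                counts[0] += 1
    return {referrer: bounced / total for referrer, (bounced, total) in stats.items()}

# Example output might look like {"google.com": 0.62, "ucsf.edu": 0.48, "(direct)": 0.35}
# print(bounce_rate_by_referrer("sessions.csv"))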

How we used crowdsourced 5-second testing to assess and improve homepage usability

January 2011

What new users retained after 5 seconds: What is the purpose of this page?

•  “Online encyclopedia type of thing”
•  “Something to do with research”
•  “search for research articles”
•  “Searching for information”
•  “find research information”
•  “118 have updated profile”
•  “advanced search page”
•  “research information”
•  “To search for stuff”
•  “searching”

The homepage fails to communicate to unfamiliar users that they can find scientific researchers based on expertise.

March 2011

What new users retained after 5 seconds: What is the purpose of this page?

•  “Finding scientists/people at UCSF”
•  “help you find people to do scientific projects for you”
•  “To find people in departments”
•  “search for scientific papers”
•  “find people with a category”
•  “to find translators”
•  “search for stuff”
•  “find people”
•  “translation”
•  “Directory”

Better, but viewers still aren’t seeing the networking angle, and are confused by “translational” in the header.

June 2011

What new users retained after 5 seconds: What is the purpose of this page?

•  “the social website of some university?”
•  “Search engine for scientists to find other scientists, publications, by topic”
•  “social networking for smart people”
•  “To find a person at the university”
•  “search for people and research”
•  “facebook/linkedin for scientists”
•  “to search for other scientists”
•  “find other scientists”
•  “listing of scientists”
•  “school”

Success! The page seems to immediately signal its purpose, even among a non-academic audience.


How we found and fixed a user engagement gap with web analytics and A/B tests

Problem

•  Our site-wide bounce rate was higher than we liked. Who were we failing to engage?

•  Users landing on researcher profiles via search engines (e.g. Google, UCSF.edu search) were the likeliest to leave immediately after arriving.

•  How could we encourage these search-driven users to stay and explore more research networking features?

Experiment

•  We designed several variations of a new block of links to encourage users to click on related researcher profiles.

•  We ran an A/B test over several weeks to see which design most effectively encouraged further exploration.

•  The top-performing variation reduced bounce rates by 15%, so we made it a permanent part of the page.
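One way to sanity-check a result like this before making it permanent is a two-proportion z-test on bounce counts. The sketch below uses made-up visit numbers chosen only to illustrate the calculation, not our actual traffic data:

from math import erf, sqrt

def two_proportion_z(bounces_a, visits_a, bounces_b, visits_b):
    # Two-sided two-proportion z-test: is variation B's bounce rate different from A's?
    p_a, p_b = bounces_a / visits_a, bounces_b / visits_b
    p_pool = (bounces_a + bounces_b) / (visits_a + visits_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / visits_a + 1 / visits_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided normal tail
    return z, p_value

# Made-up example: control bounces on 60% of 5,000 visits; variation on 51% of 5,000
# (a 15% relative reduction, mirroring the size of the effect described above).
z, p = two_proportion_z(3000, 5000, 2550, 5000)
print(f"z = {z:.2f}, p = {p:.4f}")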

Screenshot annotations from the homepage redesigns: add large headings, segment tasks, bigger search buttons, new header focuses on social networking, reduce clutter, remove “translational”.

Profile page screenshot caption: adding the new block of related researcher links reduced bounce rates by 15%.

Clinical and Translational Science Institute (CTSI), University of California, San Francisco. Accelerating Research to Improve Health.