Usability Studies and User-Centered Design in Digital Libraries


This article was downloaded by: [University of Tennessee At Martin] On: 04 October 2014, At: 23:23. Publisher: Routledge. Informa Ltd Registered in England and Wales Registered Number: 1072954. Registered office: Mortimer House, 37-41 Mortimer Street, London W1T 3JH, UK.

Journal of Web Librarianship. Publication details, including instructions for authors and subscription information: http://www.tandfonline.com/loi/wjwl20

Usability Studies and User-Centered Design in Digital Libraries. David J. Comeaux, State University of New York. Published online: 11 Oct 2008.

To cite this article: David J. Comeaux (2008) Usability Studies and User-Centered Design in Digital Libraries, Journal of Web Librarianship, 2:2-3, 457-475, DOI: 10.1080/19322900802190696

    To link to this article: http://dx.doi.org/10.1080/19322900802190696



Usability Studies and User-Centered Design in Digital Libraries

    David J. Comeaux

ABSTRACT. Digital libraries continue to flourish. At the same time, the principles of user-centered design and the practice of usability testing have been growing in popularity, spreading their influence into the library sphere. This article explores the confluence of these two trends by surveying the current literature on usability studies of digital libraries. It focuses on the methodology of studies of multimedia digital libraries.

KEYWORDS. Usability, usability testing, user testing, user-centered design, digital libraries, virtual libraries, multimedia, multimedia libraries

This article will begin with an overview of digitization and a brief discussion of recent digitization trends in academic libraries. That is followed by a general discussion of user-centered design and usability testing. After that discussion, the application of these techniques to the different forms of digital libraries is considered. Then usability studies performed on multimedia digital libraries are examined in detail. The published literature from the library field was searched between February and April 2007. Several databases were consulted, including Library Literature & Information Science Full Text, Emerald FullText, and ERIC.

David J. Comeaux is a Multimedia Developer with experience as a Web and user-interface designer (E-mail: dave@djcomeaux.com). He is a certified usability analyst with advanced knowledge of Web usability and accessibility practices. He completed his master of library science studies at the State University of New York at Buffalo in 2007.

Journal of Web Librarianship, Vol. 2(2-3) 2008. Available online at http://www.haworthpress.com © 2008 by The Haworth Press. All rights reserved.

doi: 10.1080/19322900802190696


    OVERVIEW OF DIGITIZATION

Digitization affords two principal benefits: increased access to library materials and new approaches for preservation of objects and collections. By digitizing materials, such as image collections or audio, then making the digital files available on the Web, libraries enable people from anywhere in the world to study or enjoy their collections. In addition to being accessible from any geographic area, digitized collections can also be accessed at any time of day.1

Digitization can also be instrumental in the preservation of rare or fragile items. For preservation purposes, items can be scanned or photographed at high resolution. Then, lower-resolution digital surrogates can be made available on the Web, thereby increasing access.2 Additional benefits include improving the ability to share resources between collaborating institutions, easing the task of keeping material current, and accommodating new formats such as digital audio or video.3

The digitization of images, rare manuscripts, and other objects has continued to accelerate. In 2006, the Association of Research Libraries published SPEC Kit 294, Managing Digitization Activities. This survey reveals insights into the digitization activities of more than 60 academic libraries. It shows there was an upward trend throughout the 1990s in the number of libraries that began digitizing. Six libraries began digitization in 1992 or before, while 27 initiated digitization between 1995 and 1999. Some fourteen more began their digitization process in 2000 and 2001. The prime motivating factors for commencing digital initiatives were the acquisition of grant funding and the addition of staff with digitization experience. Other factors reported included a desire to make collections more accessible, particularly through the Internet.4

    USER-CENTERED DESIGN AND USABILITY TESTING

According to the Usability Professionals' Association, user-centered design (UCD) "is an approach to design that grounds the process in information about the people who will use the product. UCD processes focus on users through the planning, design, and development of a product."5

In simple terms, there are a few basic steps in a user-centered design process. The first is to identify the target audience or audiences. In most cases there will be discernible groups of users, with varying information needs and skill levels. For example, in academic libraries, researchers and students are typical user groups. Graduate and undergraduate students might be considered different user groups if they are expected to use a system in significantly different ways. Once target users are identified, their needs and expectations must be ascertained and clearly defined. This is a critical step in the user-centered design process that will be discussed in more detail below.

The next step is to conduct formal testing of the existing site or prototype. The results of the testing are then analyzed, and improvements are made to the site or prototype based on those results. If resources and time permit, testing is repeated and improvements are continually made until the system's performance is acceptable.

The term "usability testing" has been applied to different practices. Many of these are actually methods of gathering information about users, not usability testing. Common methods for studying users are focus groups and surveys. Other methods used in library Web site evaluations include site-log analysis and interviews.6

These techniques each have their strengths and should be a vital part of a user-centered design process. For example, focus groups are an excellent means of learning about the target audience's needs and expectations. A well-designed questionnaire can yield quantifiable data on users' perceptions of an existing site, and site-log analysis provides hard data about a site's actual usage patterns. However, while these techniques provide some useful information, they do not alone provide the specific information needed to inform critical design issues.

Validating the effectiveness of a site or system requires carefully observing users interacting with the system in a realistic way. That, in essence, is usability testing. To distinguish this approach from surveys or focus groups, this method has been called "formal usability testing"7 or "user protocol."8

In a typical usability test, the participants are directed to complete a common scenario, such as putting an item on hold, or executing a task, such as searching for an image in the collection. The test administrator closely observes the participant and carefully notes the participant's actions but does not interfere or guide the participant in any way. It is helpful to videotape each participant as they perform the requested tasks.9

In addition to the standard means of observing a user protocol experiment, capturing the participant's actions using screen-capture software can be very beneficial in two ways. One, electronically capturing the user's efforts maintains an accurate record for analysis. Also, highlights of the process can be presented to decision-makers as irrefutable evidence of the need for change.10


For a general introduction to user studies from the librarian's perspective, see "Assessing Web Site Usability" by Kim Guenter11 or "Web Site Usability Testing: A Critical Tool for Libraries" by Shelagh Genus.12 For more in-depth information, the American Library Association has published a guide to usability testing for library Web sites.13

Identifying Users' Needs and Expectations

As noted above, it is critical that designers learn as much as possible about user groups' information needs and expectations. The most commonly used methods are focus groups and surveys. While studies of this kind are integral facets of user-centered design and can be critical to the success of a usability testing effort, they should not be considered a form of usability testing. For this article, they will be classified as user studies.

Focus groups are excellent tools for this purpose. A focus group is an organized and mediated interview of a group of individuals who represent typical users of the system. They are used regularly by libraries that want to learn about user behavior and to obtain "voice of the customer" feedback on library services.14

Surveys and questionnaires are also useful tools for gathering this type of information. Surveys are used by most libraries to learn about user behavior, attitudes, and perceptions. They can help identify problem areas, and if repeated regularly, they can illuminate important trends.15

    Summary of Usability Testing and User Study Methodology

In practice, no single method of studying users alone can gather all of the information needed to inform the design of a user-friendly Web site or digital library system. Typically, a user-centered evaluation process consists of a combination of methods, each of which uncovers different kinds of information. Methods such as focus groups or interviews are conducted to gather information about the needs, wants, and expectations of users. Usability testing is performed to assess how well a site meets those needs and expectations.

    A DISCUSSION OF DIGITAL LIBRARIES

The term "digital library" and the related term "virtual library" have been defined in numerous ways. Judy Jeng16 outlined several representative definitions.


One perspective is very broad. For example, digital libraries were described by Michael Lesk as "organized collections of digital information"17 and by William Arms as "a managed collection of information, with associated services, where the information is stored in digital formats and accessible over a network."18 Another conception of the digital library is as a central access point to various information resources. By this definition, the typical academic library Web site should be considered a digital library.19

Another perspective on digital libraries is as a collection of documents that include text as well as multimedia objects such as graphical images, video, or sound. Such collections often store objects in databases and associate the objects with metadata to facilitate retrieval. The information is accessed and then displayed through a Web-based interface.20 These digital libraries will be referred to as multimedia digital libraries. Though studies done on academic library Web sites and text-based digital libraries will be discussed, studies that focus on multimedia digital libraries will be examined in more detail.

User-centered design practices, including usability testing, have become increasingly accepted both in the commercial sector and in libraries. As libraries face more competition from commercial information vendors, adopting sound usability practices can help libraries to successfully compete in the information marketplace.21 A recent study of users' perceptions of digital libraries found that usability was the most critical criterion for digital libraries.22 Specifically, general navigability, search and browse functionality, and the presence of help features were cited as critical aspects.

    Academic Library Web Sites

There have been numerous publications describing usability tests of academic library Web sites. Most described the process of user testing as part of a library Web site redesign.23 Galina Letnikova24 has published an annotated bibliography on usability testing of academic Web sites.

Jeng has contributed significantly in this area. In "What is Usability in the Context of the Digital Library and How Can It Be Measured?" Jeng featured a discussion of the methodology used in many of the user studies published to date. In "Usability Assessment of Academic Digital Libraries: Effectiveness, Efficiency, Satisfaction, and Learnability," Jeng extensively discussed the concept of usability, including the definitions promulgated by dozens of researchers. That article included a table summarizing user study strategies in about twenty academic library user assessments. It also included sample evaluation tools she designed and tested on two academic library Web sites.25

Several studies have focused on digital libraries that provide electronic access to journal articles. Ann Peterson Bishop studied DeLIver, a testbed collection of articles from journals in science and technology. Susan Makar tested the National Institute of Standards and Technology virtual library. Sueli Mara Ferreira and Denise Nunes Pithan examined InfoHab, a digital library that provides access to Brazilian civil engineering resources. Tina E. Chrzastowski and Alexander Scheeline assessed the usability of the Analytical Sciences Digital Library, a part of the National Science Digital Library.26

One subset of digital libraries with a relative dearth of practical research is the multimedia digital library. This subset includes sites that are primarily intended to display visual information.27 Such collections are especially common in the visual art fields. They are often used by museums and archives to display digital representations of historic photographs, manuscripts, maps, or other visual-based media. Some of the most well-known collections include the Library of Congress's American Memory collections (http://memory.loc.gov/ammem) and the Smithsonian's National Museum of American History (http://americanhistory.si.edu/).28

First, this article will consider some of the unique challenges visually oriented collections present. Then it will look at the articles that feature formal usability testing of this type of digital library. The author will discuss each study's methodology and note any especially informative results or conclusions.

    Challenges of Image-Oriented Collections

The indexing of visual images presents a different and arguably more difficult challenge than text-only objects. Whereas books or articles convey their meaning using words, a visual image conveys its meaning through various attributes, like color, shape, and texture. Because viewers are prone to experience and interpret images in different ways, it is inevitable that some measure of subjectivity is present when terms are chosen to represent an image.29

Among the greatest challenges to successful implementation of digital image collections is the lack of a standard metadata scheme. Unlike the relative stability of MARC and AACR2, the new metadata schemes digital collections rely on are in constant flux.30 Not only is there no agreed-upon metadata scheme, there is no universally accepted definition of the term itself. The almost ubiquitous definition "Metadata is data about data," while not incorrect, lacks the precision to guide one to a meaningful understanding of the concept.31

Despite the great number of definitions that have been proposed for metadata, the essential point of disagreement is fairly simple. Some define metadata broadly enough to apply to descriptive data for material in electronic or print format. Others favor a more focused view of metadata as information that enables classification of electronic resources.32

A detailed discussion of this very fertile research area is beyond the scope of this article. For an excellent discussion of the various emerging schemas and their relationship to specific research communities, see Intner, Lazinger, and Weihs' 2006 book.

    USABILITY STUDIES OF DIGITAL IMAGE COLLECTIONS

In this section, critical aspects of usability studies conducted on digital image collections will be examined. The studies assessed below are limited by date to those published from 2003 to April of 2006. They are limited by scope to those that actually gathered data using formal usability testing methods. Since usability testing is normally a part of a larger usability assessment effort, the user studies that were employed in concert with usability testing will be noted but not addressed in great detail. The author will discuss each study's methodology and note any especially informative results or conclusions.

In "Context and Meaning: The Challenges of Metadata for a Digital Image Library within the University," researchers at Pennsylvania State University describe some of the issues encountered in the Visual Image User Study. Though this article did not present the details of the user studies, it did provide a background on the project and provided a link to the project Web site, where the user studies were described in detail.33

The Visual Image User Study Web site (http://www.libraries.psu.edu/vius/) includes a link to the detailed usability assessment document (http://www.libraries.psu.edu/vius/8.4.pdf). This document explains the methodology employed and the results obtained.34 The process involved several user assessment methods. Several surveys were conducted, covering both students and faculty in a broad range of academic disciplines. Ten focus groups and more than 45 individual interviews were conducted. Also, the site logs of two image databases were analyzed.


The usability testing component involved 25 participants examining three different user interfaces. The participants were all Pennsylvania State University students, including both graduate and undergraduate students from a variety of majors. The three interfaces tested were of varying complexity. Ten participants used one interface; ten others tested the second interface; and five users tested the third interface. Each participant was given the same scenario, one that involved searching for images and saving them to enable future use.

Data was recorded by two observers who took notes. The subjects' comments were also recorded, along with the screen images, using screen-capture software. The results indicated the simplest interface yielded the most accurate results. The third interface, which was the most complex, proved difficult and frustrating to use.

Jason A. Clark evaluated the Belgian-American Research Collection, held by the University of Wisconsin, in a 2004 study.35 This collection contains a variety of media formats, including images, text, and audio. This study included what Clark calls a focus group in addition to a formal usability test. As previously described, a focus group normally consists of representative users interviewed in a group setting for the purpose of gathering information on their needs, attitudes, and perceptions as they relate to a library system. In this instance, the group was made up of graduate students taking a human-computer interaction class. The students were asked to view the site, share their general impressions, and mention any specific usability concerns. This type of usability assessment might be more accurately described as heuristic evaluation: an inspection by individuals knowledgeable in usability practices, who review a system for adherence to usability design principles.

In the usability test, five participants were observed as they performed five representative tasks in the collection, such as searching for media (by browsing topics and by simple and advanced keyword searching), saving records, changing view options, and e-mailing the results. The participants' actions and verbalizations were recorded using screen- and voice-capture software. The study's results underscored the great importance of certain design practices. These include the value of a unified, consistent appearance and navigation scheme, choosing button and field labels based on user expectations, and enabling multiple search strategies (browsing by topic and keyword searching). The study also revealed participants had some difficulty in determining the function of certain icons and therefore stressed the importance of supplementing functional icons with a text equivalent.


In the study "Sustainable Design for Multiple Audiences: The Usability Study and Iterative Redesign of the Documenting the American South Digital Library," the authors describe the redesign of the University of North Carolina's Documenting the American South (DocSouth) digital library.36 In addition to testing DocSouth, the authors incorporated selected commercial digital libraries into the testing for comparative purposes. Three groups of participants were selected to represent the library's three principal user groups: university students and researchers, the general public, and K-12 educators. The first two groups were given a set of tasks including searching for documents related to specified topics. The K-12 educators were given a more complex, scenario-based set of questions. They were given a hypothetical class to prepare for and were asked to use DocSouth and three other sites to find suitable materials for the class. After completing these tasks, the participants were asked several follow-up questions to gather qualitative data about their impressions of the site.

The tests discovered several interface design concerns that may be generalizable to other digital libraries. These include the importance of ensuring there is sufficient contrast for users to easily distinguish links, ensuring labels are meaningful to users, providing adequate white space on pages to allow easy scanning, and minimizing the use of library-related jargon (e.g., "index" or "bibliography"), which most users found confusing. The authors also conducted several focus group sessions after the usability test. The purpose of the focus groups was to further clarify the users' wants and needs. In these sessions, participants were shown the existing site as well as some prototype re-designs and asked to share their impressions.

In "The Flight Plan of a Digital Initiatives Project, Part 2: Usability Testing in the Context of a User-Centered Design," Holley Long, Kathryn Lage, and Christopher Cronin reported on the usability evaluation of a prototype digital image collection.37 The team began the assessment by identifying the primary target audience, which included faculty, staff, and students in geology and geography. The needs of this group were explored with a series of individual interviews. The interviews identified typical tasks performed and critical requirements of users, which proved instrumental in defining the tasks to be included in the usability test. After defining the target audience and establishing their basic requirements, the team conducted a heuristic evaluation of the site's prototype. Again, in a heuristic evaluation, several evaluators assess a site, ensuring that it adheres to a set of established design principles. This assessment was based on the very popular heuristics promulgated by Jakob Nielsen.38


In the usability testing component, four participants were observed as they performed five tasks on the prototype system. The tasks were designed to elicit information regarding the relative usefulness of two search methods, map-based searching and keyword searching. In Task 1, users were instructed to use the keyword search feature. For Tasks 2 and 3, they were instructed to use the map search feature. For Task 4, they were allowed to use either search method. The fifth task was to find directions for printing a retrieved photo.

In addition to the formal usability testing, users were also asked to perform a card-sort exercise. Card sorting is a user study method that is often used to inform decisions on how to group sets of links and to label categories. In a card sort, users are given a stack of notecards, each with the name of a library service, collection, or activity. They are asked to sort the cards into logical groups. Then they are asked to provide a label to identify each group. This type of testing provides unique insights into the way users view library services.39
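The groupings a card sort produces are commonly summarized by counting how often participants placed each pair of cards in the same group; pairs that co-occur frequently are candidates for the same navigation category. A minimal sketch of that tally (the card names and sort results below are hypothetical, not drawn from the study):

```python
from collections import Counter
from itertools import combinations

# Hypothetical card-sort results: each participant sorted the same cards
# (library services and collections) into groups of their own choosing.
sorts = [
    [{"Maps", "Photographs"}, {"Course Reserves", "Interlibrary Loan"}],
    [{"Maps", "Photographs", "Manuscripts"}, {"Interlibrary Loan"}],
    [{"Photographs", "Manuscripts"}, {"Maps", "Course Reserves"}],
]

# Count how often each pair of cards landed in the same group.
pair_counts = Counter()
for participant in sorts:
    for group in participant:
        for pair in combinations(sorted(group), 2):
            pair_counts[pair] += 1

# Pairs grouped together most often suggest which links belong together.
for pair, n in pair_counts.most_common(3):
    print(pair, n)
```

Here "Maps"/"Photographs" and "Manuscripts"/"Photographs" each co-occur twice, hinting that the visual materials belong under one label.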

One conclusion drawn from the study that may be helpful to designers of similar systems was that when given the choice between map-based and keyword searching, users prefer map-based searching if they know where to locate the item, but they prefer the keyword search for unfamiliar terms.

In "Responding to Diverse User Groups: Usability Testing of the Web Interface of the Visual Resources Collection," Jody Walz and Barbara Brenny briefly discussed the usability evaluation process performed on the Visual Resource Collection of the University of Minnesota's College of Architecture and Landscape Architecture.40 The study consisted of a heuristic evaluation, a usability test, and a post-test questionnaire. The heuristic evaluation was conducted by the two authors, based on Nielsen's heuristics. In the user test, about 30 participants were observed as they performed a total of 23 tasks. The tasks were arranged to progress from simple to more complex. The participants' actions were observed, the time to complete each task was measured, and their comments were recorded as they performed the tasks. The questionnaire was intended to gather measures of satisfaction as well as qualitative data to deepen the insights gleaned in the study. Among the findings was a high incidence of confusion regarding the labels and naming conventions.

The article "Digital Image Library Development in Academic Environment: Designing and Testing Usability" described the process of designing a visual image database for the American University of Paris.41 In this study, usability testing was limited to the back-end (data entry) component of the image database. Unfortunately, very little information was reported on the usability testing. Among the data not reported were the number of participants, the number and description of the scenarios or tasks, and the methods of observation and recording of data. Precise results were also excluded. Despite the dearth of details regarding how the data was collected, because this study alone focused on data entry, one unique observation was made: the importance of providing an adequate means of associating appropriate metadata with each image. Failure to provide enough metadata at the point of data entry could often render the image irretrievable to the end user.
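The data-entry concern can be made concrete with a small sketch: a descriptive metadata record for a digitized image, plus a check for the elements retrieval depends on. The element names follow Dublin Core conventions, but the record, values, and required-element list are illustrative assumptions, not details from the study.

```python
# Illustrative only: a descriptive metadata record for a digitized photograph,
# using Dublin Core-style element names. All values are hypothetical.
image_record = {
    "title": "Main Street looking north, ca. 1910",
    "creator": "Unknown photographer",
    "subject": ["street scenes", "commercial buildings"],
    "date": "circa 1910",
    "format": "image/jpeg",
    "identifier": "photo-0042",
}

def missing_elements(record, required=("title", "subject", "identifier")):
    """Return the required descriptive elements absent from a record.

    A record missing these at the point of data entry risks being
    irretrievable to the end user, since search relies on them."""
    return [e for e in required if not record.get(e)]

print(missing_elements(image_record))           # -> []
print(missing_elements({"title": "Untitled"}))  # -> ['subject', 'identifier']
```

A check like this, run at data entry, is one way to enforce the minimum metadata the article argues is essential.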

The article "IUPUI Image Collection: A Usability Survey" assessed the Indiana University-Purdue University-Indianapolis image collection, which consists of images of historic photos and maps.42 The assessment consisted of a usability test and a post-test questionnaire. In the usability test, 70 participants were observed and timed as they searched for a photo. Results were tallied by tracking the number of participants who completed the search within the following buckets: less than one minute, one minute, then upward by 30-second intervals to a maximum of five minutes. After this search, the participants were interviewed to glean information that would improve the site's usability.
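The interval scheme described above can be reproduced with a simple bucketing function. This is a sketch under assumptions: the article gives the buckets only in prose, so the label format and edge handling here are my own choices.

```python
def time_bucket(seconds: int) -> str:
    """Map a task-completion time to the buckets described above:
    under one minute, then 30-second intervals, capped at five minutes."""
    if seconds < 60:
        return "< 1:00"
    start = (min(seconds, 300) // 30) * 30  # start of the 30-second interval
    return f"{start // 60}:{start % 60:02d}"

print(time_bucket(45))   # -> < 1:00
print(time_bucket(95))   # -> 1:30
print(time_bucket(400))  # -> 5:00 (times over five minutes are capped)
```

Tallying the bucket labels across all 70 participants would yield the distribution the study reported.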

In a master's thesis titled "Art Historians' Use of Digital Images: A Usability Test of ARTstor," Alida M. Pask conducted a usability test of the ARTstor image database.43 ARTstor is an image database consisting of hundreds of thousands of art images compiled from university collections and art history textbooks. Before conducting the usability test, Pask interviewed two art historians. The purpose of the interviews was to learn how they find, organize, and use digital images in their teaching and research. The information uncovered in these interviews was instrumental in developing the task list for the usability study.

There were five participants in the usability test, all of whom were graduate students in art history. Each participant performed nine tasks that involved searching, browsing, viewing, saving, and comparing images. The participants were asked to think aloud, and their comments were recorded. The author also recorded whether they successfully completed each task, as well as the time taken on each.

The results of this study found that although the site performed quite well, there were some problem areas. One particular trouble spot was interpreting icons. As in several other studies, participants had difficulty determining the function of several of the icon buttons. This again underscores the importance of providing textual support to supplement graphic icons and of testing icons carefully. The other area of concern was the system's browsing functionality. Participants had difficulty completing a task that involved searching for an image based only on a description and a loose time period. The problem in this case was that the majority of participants (four of five) failed to recognize that broad topics could be expanded to reveal a list of narrower subtopics.

In "Information Seeking Behavior in Digital Image Collections: A Cognitive Approach," Krystyna K. Matusiak studied the information-seeking behavior of digital collection users.44 The collection used for the study, Milwaukee Neighborhoods: Photos and Maps 1885-1992, consists of more than 600 images and twelve historic maps. In addition to a usability test and a post-test interview, Matusiak introduced a novel approach for gathering information, which she called self-reported logs. Participants were asked to use the site on their own, record information about the searches they performed, and comment on their experiences with the system.

In the usability test, seven students and five community members were observed as they navigated through a digital image collection. Each participant performed ten tasks based on searching the image collection and was asked to think aloud while working. The author found that the two groups differed markedly in their search techniques: the student group relied almost exclusively on keyword searching, while the community group preferred browsing the collections by subject. The study emphasized the importance of proper indexing and metadata usage to enable successful keyword searching. It also emphasized the value of creating browsing pathways, such as browse-by-date or browse-by-topic groupings.
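Browsing pathways of the kind Matusiak describes can be derived directly from item metadata by grouping records under each value of a chosen field. The following Python sketch illustrates the idea; the record fields and sample data are hypothetical, not taken from the Milwaukee collection.

```python
from collections import defaultdict

# Hypothetical metadata records for a small image collection
records = [
    {"title": "Mitchell Street, 1905", "decade": "1900s", "topic": "Streets"},
    {"title": "Harbor map, 1891", "decade": "1890s", "topic": "Maps"},
    {"title": "Grand Avenue, 1902", "decade": "1900s", "topic": "Streets"},
]

def build_browse_index(items, field):
    """Group record titles under each value of a metadata field,
    yielding a browse-by-(field) pathway."""
    index = defaultdict(list)
    for item in items:
        index[item[field]].append(item["title"])
    return dict(index)

by_decade = build_browse_index(records, "decade")
by_topic = build_browse_index(records, "topic")
print(by_decade["1900s"])  # ['Mitchell Street, 1905', 'Grand Avenue, 1902']
```

The same function produces browse-by-date, browse-by-topic, or browse-by-collection listings simply by varying the field, which is why consistent metadata entry matters so much for browsing as well as for keyword search.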

In "An Analysis of Image Retrieval Behavior for Metadata Type Image Database," Toru Fukumoto compared users' behavior when presented with two different types of image-searching tasks.45 Unlike most of the other studies examined in this article, the purpose of this study was to learn about user behavior in image collections generally, not to evaluate a specific collection. Hence, no related user studies were involved in the assessment, only the usability test.

Twenty undergraduate students, all of whom were comfortable using Internet search engines to retrieve images, participated in this study. The participants were first shown an image and then instructed to search for and retrieve that image from the test database. These tasks were labeled Task 1-1 and Task 1-2 and considered "closed-task" questions. Then the participants were instructed to search the database for images that would be suitable for a summer greeting card (Task 2-1) and to select their favorite landscape (Task 2-2). These were considered the "open-task" questions. The participants were observed, and their actions were recorded on video.

The author found, not surprisingly, that user behavior differed depending on the nature of the task. While performing the closed tasks, users spent more time inputting keywords and less time browsing the results. Conversely, when performing the open tasks (which required them to choose an image that in their opinion would be suitable for a given purpose), the participants spent more time browsing through the results and less time inputting search terms. The author concluded that to facilitate image selection, retrieved images should be shown to the user as thumbnails, enabling users to quickly view the images before deciding whether to revise their search further.

    FINDINGS

The purpose of this article was to assess the methodology of usability studies on digital image collections. In terms of usability testing, there was considerable variability in the methodology applied. The studies varied in the number of participants, from a minimum of four to a maximum of 70. The number of tasks performed also varied greatly, from one to 23.

Digital library designers have used a number of user study methods to complement usability testing. Of the articles describing evaluations that included usability testing, most also included some form of user study. Four included post-test interviews or questionnaires. Two included focus groups. One performed a card sort, and another used the novel approach of asking participants to self-report their independent evaluations of the system.

This basic methodology information is represented in Table 1.

    TABLE 1. Summary of Research Methodology

    Author(s)        User studies in addition to usability test             # Participants   # Interfaces   # Tasks/scenarios
    Frost            Surveys, focus groups, interviews, site log analysis   25               3              9
    Clark            Heuristic evaluation                                   5                               5
    Norberg et al.   Post-test interviews, focus groups
    Long             Heuristic evaluation, card sort                        4                1              5
    Walz & Brenny    Heuristic evaluation, post-test questionnaire          30                              23
    Roda et al.      None                                                   Not reported
    Kramer           Post-test interview                                    70               1              1
    Pask             Interview                                              5                               9
    Matusiak         Post-test interview, self-reported logs                12                              10
    Fukumoto         None                                                   20               1              4

    RECOMMENDATIONS

    This study revealed a number of issues that designers of digital image collections should consider when designing their sites. One issue was confusion about the use of icons as buttons. It is tempting to use icons in place of words, both to save space on the screen and to improve the site's visual appeal. However, if icons are used as buttons for essential functionality, they should be well tested to ensure their meaning is clear to users, or be supported by explanatory text.

Another was difficulty with terminology, particularly library-specific jargon. It is very important to consider all possible audiences when writing text for a library site. Jargon should be replaced with more general terminology that can be understood by a variety of potential users.

Also, it was found that users greatly appreciate the ability to browse the collection (by date, by collection, by theme, etc.) in addition to keyword searching. Browse functionality should always be included if technically feasible.

Lastly, providing image previews in the form of thumbnail images is highly encouraged. This is especially important if users are expected to explore the collection for items of visual interest, as opposed to searching for specific images, such as those from a particular artist or of a certain time period.


    CONCLUSION

In this article, it was noted that there has been a steady increase in reports on the usability assessment of digital libraries. The article covered the basic principles of user-centered design and showed how libraries have been applying these principles to improve the performance of their digital libraries.

    NOTES

    1. Laurie Lopatin, "Library Digitization Projects, Issues and Guidelines: A Survey of the Literature," Library Hi Tech 7, no. 2 (2006): 273-274.

    2. Anne R. Kenney and Stephen Chapman, Digital Imaging for Libraries and Archives (Ithaca, NY: Cornell University Library, 1996).

    3. William Arms, Digital Libraries (Cambridge, MA: MIT Press, 2000).

    4. Rebecca Mugridge, SPEC KIT 294: Managing Digitization Activities (Washington, DC: Association of Research Libraries, 2006): 11-12.

    5. Usability Professionals' Association, "What Is User-Centered Design: About Usability: UPA Resources," http://www.upassoc.org/usability_resources/about_usability/what_is_ucd.html (accessed April 12, 2007).

    6. Mary Pagliero Popp, "Testing Library Web Sites: ARL Libraries Weigh In" (paper presented at the Association of College and Research Libraries, 10th National Conference, Denver, CO, March 15-18, 2001), 3-4, http://www.ala.org/ala/acrl/acrlevents/popp.pdf (accessed April 5, 2007).

    7. Judy Jeng, "What Is Usability in the Context of the Digital Library and How Can It Be Measured?" Information Technology and Libraries 24, no. 2 (2005): 48.

    8. Denise Troll Covey, Usage and Usability Assessment: Library Practices and Concerns (Washington, DC: Digital Library Federation, 2002), 2-3, http://www.clir.org/pubs/reports/pub105/contents.html (accessed April 12, 2007).

    9. Jennifer L. Ward and Steve Hiller, "Usability Testing, Interface Design, and Portals," Journal of Library Administration 43, no. 1/2 (2005): 158.

    10. Susan Goodwin, "Using Screen Capture Software for Web Site Usability and Redesign Buy-In," Library Hi Tech 23, no. 4 (2005): 610-621.

    11. Kim Guenter, "Assessing Web Site Usability," Online 27, no. 2 (2003): 65-68.

    12. Shelagh K. Genuis, "Web Site Usability Testing: A Critical Tool for Libraries," Feliciter 50, no. 4 (2004): 161-164.

    13. Elaina Norlin and C. M. Winters, Usability Testing for Library Web Sites (Chicago: American Library Association, 2002).

    14. Covey, Usage and Usability Assessment, 15.

    15. Ibid., 13-14.

    16. Jeng, "What Is Usability," 47.

    17. Michael Lesk, Practical Digital Libraries: Books, Bytes and Bucks (San Francisco: Morgan Kaufmann, 1997): 1.

    18. William Arms, Digital Libraries, 2.


    19. Judy Jeng, "Usability Assessment of Academic Digital Libraries: Effectiveness, Efficiency, Satisfaction, and Learnability," Libri 55 (2005): 101-102.

    20. Jason A. Clark, "A Usability Study of the Belgian-American Research Collection: Measuring the Functionality of a Digital Library," OCLC Systems and Services: International Digital Library Perspectives 20, no. 3 (2004): 115.

    21. Holley Long, Kathryn Lage, and Christopher Cronin, "The Flight Plan of a Digital Initiatives Project, Part 2: Usability Testing in the Context of a User-Centered Design," OCLC Systems and Services 21, no. 4 (2005): 329.

    22. Hong Xie, "Evaluation of Digital Libraries: Criteria and Problems from Users' Perspectives," Library and Information Science Research 28 (2006): 439.

    23. Brenda Battleson, Austin Booth, and Jane Weintrop, "Usability Testing of an Academic Library Web Site: A Case Study," The Journal of Academic Librarianship 27, no. 3 (2001): 188-198; Maryellen Allen, "A Case Study of the Usability Testing of the University of South Florida's Virtual Library Interface Design," Online Information Review 26, no. 1 (2002): 40-53; and Janet Chisman, Karen Diller, and Sharon Walbridge, "Usability Testing: A Case Study," College and Research Libraries 60, no. 6 (1999): 552-559.

    24. Galina Letnikova, "Usability Testing of Academic Library Web Sites: A Selective Annotated Bibliography," Internet Reference Services Quarterly 8, no. 4 (2003): 53-68.

    25. Jeng, "What Is Usability?"; "Usability Assessment of Academic Digital Libraries."

    26. Ann Peterson Bishop, "Measuring Access, Use, and Success in Digital Libraries," The Journal of Electronic Publishing 4, no. 2 (2001), http://www.press.umich.edu/jep/04-02/bishop.html (accessed April 12, 2007); Susan Makar, "Stamp of Approval: How to Achieve Optimal Usability," Computers in Libraries 23, no. 1 (2003): 16-21; Sueli Mara Ferreira and Denise Nunes Pithan, "Usability of Digital Libraries: A Study Based on the Areas of Information Science and Human-Computer Interaction," OCLC Systems and Services 21, no. 4 (2005): 311-323; and Tina E. Chrzastowski and Alexander Scheeline, "ASDL: The Analytical Sciences Digital Library Taking the Next Steps," Science and Technology Libraries 26, no. 3/4 (2006): 79-94.

    27. Krystyna K. Matusiak, "Information Seeking Behavior in Digital Image Collections: A Cognitive Approach," Journal of Academic Librarianship 32, no. 5 (2006): 480.

    28. Eric Novotny, "Finding United States Historical Images in Print or Online," Reference User Services Quarterly 45, no. 1 (2005): 11-21.

    29. Robin Wendler, "The Eye of the Beholder: Challenges of Image Description and Access at Harvard," in Metadata in Practice, ed. Dianne Hillman and Elaine Westbrooks (Chicago: American Library Association, 2004), 51; and Johanna Woll, "User Access to Digital Image Collections of Cultural Heritage Materials: The Thesaurus as Pass-Key," Art Documentation 24, no. 2 (2005): 20.

    30. Dianne Hillman and Elaine Westbrooks, eds., Metadata in Practice (Chicago: American Library Association, 2004), xvi.

    31. Sheila S. Intner, Susan S. Lazinger, and Jean Weihs, Metadata and Its Impact on Libraries (Westport, CT: Libraries Unlimited, 2006).

    32. Ibid.

    33. John Attig, Ann Copeland, and Michael Pelikan, "Context and Meaning: The Challenges of Metadata for a Digital Image Library within the University," College and Research Libraries 65, no. 3 (2004): 251-261.

    34. James Frost, "Think Aloud Protocol Study of CONTENTdm Interfaces," http://www.libraries.psu.edu/vius/8.4.pdf (accessed April 12, 2007).

    35. Clark (2004).

    36. Lisa R. Norberg, Kim Vassiliadis, Jean Ferguson, and Natasha Smith, "Sustainable Design for Multiple Audiences: The Usability Study and Iterative Redesign of the Documenting the American South Digital Library," OCLC Systems and Services 21, no. 4 (2005): 285-299.

    37. Long, "Flight Plan."

    38. Jakob Nielsen, "Ten Usability Heuristics" (updated version), http://www.useit.com/papers/heuristic/heuristic_list.html (accessed April 12, 2007).

    39. Covey, Usage and Usability Assessment, 32.

    40. Jody Walz and Barbara Brenny, "Responding to Diverse User Groups: Usability Testing of the Web Interface of the Visual Resources Collection," Visual Resources Association Bulletin 31, no. 2 (2005): 48-49.

    41. Claudia Roda, Ann Murphy Borel, Eugene Gentchev, and Julie Thomas, "Digital Image Library Development in Academic Environment: Designing and Testing Usability," OCLC Systems and Services 21, no. 4 (2005): 264-284.

    42. Elsa F. Kramer, "IUPUI Image Collection: A Usability Survey," OCLC Systems and Services 21, no. 4 (2005): 346-359.

    43. Alida M. Pask, "Art Historians' Use of Digital Images: A Usability Test of ARTstor" (master's thesis, University of North Carolina, 2005), http://etd.ils.unc.edu/dspace/bitstream/1901/195/1/alidapask.pdf (accessed April 1, 2007).

    44. Krystyna K. Matusiak, "Information Seeking Behavior."

    45. Toru Fukumoto, "An Analysis of Image Retrieval Behavior for Metadata Type Image Database," Information Processing and Management 42 (2006): 723-728.

    WORKS CITED

    Allen, Maryellen. "A Case Study of the Usability Testing of the University of South Florida's Virtual Library Interface Design." Online Information Review 26, no. 1 (2002): 40-53.

    Arms, William. Digital Libraries. Cambridge, MA: MIT Press, 2000.

    Attig, John, Ann Copeland, and Michael Pelikan. "Context and Meaning: The Challenges of Metadata for a Digital Image Library within the University." College and Research Libraries 65, no. 3 (2004): 251-261.

    Battleson, Brenda, Austin Booth, and Jane Weintrop. "Usability Testing of an Academic Library Web Site: A Case Study." The Journal of Academic Librarianship 27, no. 3 (2001): 188-198.

    Bishop, Ann Peterson. "Measuring Access, Use, and Success in Digital Libraries." The Journal of Electronic Publishing 4, no. 2 (2001). http://www.press.umich.edu/jep/04-02/bishop.html (accessed April 2, 2007).


    Chisman, Janet, Karen Diller, and Sharon Walbridge. "Usability Testing: A Case Study." College and Research Libraries 60, no. 6 (1999): 552-559.

    Chrzastowski, Tina E., and Alexander Scheeline. "ASDL: The Analytical Sciences Digital Library Taking the Next Steps." Science and Technology Libraries 26, no. 3/4 (2006): 79-94.

    Clark, Jason A. "A Usability Study of the Belgian-American Research Collection: Measuring the Functionality of a Digital Library." OCLC Systems and Services: International Digital Library Perspectives 20, no. 3 (2004): 115-127.

    Covey, Denise Troll. Usage and Usability Assessment: Library Practices and Concerns. Washington, DC: Digital Library Federation, 2002. http://www.clir.org/pubs/reports/pub105/contents.html (accessed April 12, 2007).

    Ferreira, Sueli Mara, and Denise Nunes Pithan. "Usability of Digital Libraries: A Study Based on the Areas of Information Science and Human-Computer Interaction." OCLC Systems and Services 21, no. 4 (2005): 311-323.

    Frost, James. "Think Aloud Protocol Study of CONTENTdm Interfaces." http://www.libraries.psu.edu/vius/8.4.pdf (accessed April 12, 2007).

    Fukumoto, Toru. "An Analysis of Image Retrieval Behavior for Metadata Type Image Database." Information Processing and Management 42 (2006): 723-728.

    Genuis, Shelagh K. "Web Site Usability Testing: A Critical Tool for Libraries." Feliciter 50, no. 4 (2004): 161-164.

    Goodwin, Susan. "Using Screen Capture Software for Web Site Usability and Redesign Buy-In." Library Hi Tech 23, no. 4 (2005): 610-621.

    Guenter, Kim. "Assessing Web Site Usability." Online 27, no. 2 (2003): 65-68.

    Hillman, Dianne, and Elaine Westbrooks, eds. Metadata in Practice. Chicago: American Library Association, 2004.

    Intner, Sheila S., Susan S. Lazinger, and Jean Weihs. Metadata and Its Impact on Libraries. Westport, CT: Libraries Unlimited, 2006.

    Jeng, Judy. "What Is Usability in the Context of the Digital Library and How Can It Be Measured?" Information Technology and Libraries 24, no. 2 (2005): 47-56.

    Jeng, Judy. "Usability Assessment of Academic Digital Libraries: Effectiveness, Efficiency, Satisfaction, and Learnability." Libri 55 (2005): 96-121.

    Kenney, Anne R., and Stephen Chapman. Digital Imaging for Libraries and Archives. Ithaca, NY: Cornell University Library, 1996.

    Kramer, Elsa F. "IUPUI Image Collection: A Usability Survey." OCLC Systems and Services 21, no. 4 (2005): 346-359.

    Lesk, Michael. Practical Digital Libraries: Books, Bytes and Bucks. San Francisco: Morgan Kaufmann, 1997.

    Long, Holley, Kathryn Lage, and Christopher Cronin. "The Flight Plan of a Digital Initiatives Project, Part 2: Usability Testing in the Context of a User-Centered Design." OCLC Systems and Services 21, no. 4 (2005): 346-359.

    Lopatin, Laurie. "Library Digitization Projects, Issues and Guidelines: A Survey of the Literature." Library Hi Tech 7, no. 2 (2006): 273-289.

    Makar, Susan. "Stamp of Approval: How to Achieve Optimal Usability." Computers in Libraries 23, no. 1 (2003): 16-21.


    Matusiak, Krystyna K. "Information Seeking Behavior in Digital Image Collections: A Cognitive Approach." Journal of Academic Librarianship 32, no. 5 (2006): 479-488.

    Mugridge, Rebecca. SPEC KIT 294: Managing Digitization Activities. Washington, DC: Association of Research Libraries, 2006.

    Nielsen, Jakob. "Ten Usability Heuristics." http://www.useit.com/papers/heuristic/heuristic_list.html (accessed April 12, 2007).

    Norberg, Lisa R., Kim Vassiliadis, Jean Ferguson, and Natasha Smith. "Sustainable Design for Multiple Audiences: The Usability Study and Iterative Redesign of the Documenting the American South Digital Library." OCLC Systems and Services 21, no. 4 (2005): 285-299.

    Norlin, Elaina, and C. M. Winters. Usability Testing for Library Web Sites. Chicago: American Library Association, 2002.

    Novotny, Eric. "Finding United States Historical Images in Print or Online." Reference User Services Quarterly 45, no. 1 (2005): 11-21.

    Pask, Alida M. "Art Historians' Use of Digital Images: A Usability Test of ARTstor." Master's thesis, University of North Carolina, 2005. http://etd.ils.unc.edu/dspace/bitstream/1901/195/1/alidapask.pdf (accessed April 1, 2007).

    Popp, Mary Pagliero. "Testing Library Web Sites: ARL Libraries Weigh In." Paper presented at the Association of College and Research Libraries, 10th National Conference, Denver, CO, March 15-18, 2001. http://www.ala.org/ala/acrl/acrlevents/popp.pdf (accessed April 5, 2007).

    Roda, Claudia, Ann Murphy Borel, Eugene Gentchev, and Julie Thomas. "Digital Image Library Development in Academic Environment: Designing and Testing Usability." OCLC Systems and Services 21, no. 4 (2005): 264-284.

    Walz, Jody, and Barbara Brenny. "Responding to Diverse User Groups: Usability Testing of the Web Interface of the Visual Resources Collection." Visual Resources Association Bulletin 31, no. 2 (2005): 48-49.

    Ward, Jennifer L., and Steve Hiller. "Usability Testing, Interface Design, and Portals." Journal of Library Administration 43, no. 1/2 (2005): 155-171.

    Wendler, Robin. "The Eye of the Beholder: Challenges of Image Description and Access at Harvard." In Metadata in Practice, edited by Dianne Hillman and Elaine Westbrooks, 51-69. Chicago: American Library Association, 2004.

    Woll, Johanna. "User Access to Digital Image Collections of Cultural Heritage Materials: The Thesaurus as Pass-Key." Art Documentation 24, no. 2 (2005): 19-27.

    Xie, Hong. "Evaluation of Digital Libraries: Criteria and Problems from Users' Perspectives." Library and Information Science Research 28 (2006): 433-452.
