
Page 1: I Don’t Do Research . . . But

I Don’t Do Research . . . But

Steve Hiller
Director, Assessment and Planning
University of Washington Libraries

[email protected]

Page 2: I Don’t Do Research . . . But

I Do Use Research Methods as Part of Our Assessment and Planning Program for:

• Understanding our user communities
– How they work
– Their library and information needs
– How we can make them successful

• Organizational improvement
– Improving organizational performance, effectiveness and efficiency
– Delivering services and programs that make a difference

Page 3: I Don’t Do Research . . . But

Assessment: More than Numbers

Library assessment is a structured process:
• To learn about our communities
• To respond to the needs of our users
• To improve our programs and services
• To support the goals of the communities

Page 4: I Don’t Do Research . . . But

Why Assess?

• Accountability and justification; demonstrating value
• Improvement of services
• Comparisons with others
• Identification of changing patterns
• Marketing and promotion
• Opportunity to tell our own story
• Using data, not assumptions, to make decisions
– Assumicide!

Page 5: I Don’t Do Research . . . But

What’s Driving the Agenda

• Environmental Changes
– Exploding growth in use and applications of technology
– Increased customer expectations for services, including quality and responsiveness
– “Competition” from other sources

• Budgetary Constraints/Reductions
– Justification for spending $$$ on libraries
– Increasing competition for resources
– Budget reductions and reallocations

• Demonstrating Value
– Accountability
– How do we enable those in our community to succeed?

Page 6: I Don’t Do Research . . . But

Traditional Library Measures: Inputs

Focus on how big/how much
• Budget (staff, collections, operations)
• Staff size
• Collection size
• Facilities
• Other related infrastructure (hours, seats, computers)
• Size of user communities and programs

The ARL “Investment Index” measures inputs related to expenditures and staff numbers.

Page 7: I Don’t Do Research . . . But

Traditional Library Measures: Outputs

Focus on usage
• Collections (print, electronic, ILL)
• Reference services
• Facilities (gate counts)
• Instruction sessions
• Discovery and retrieval
• Other Web sessions

These may indicate whether “inputs” are used, but they don’t tell us what users were able to accomplish as a result.

Page 8: I Don’t Do Research . . . But

These Are Self-Reported Statistics Too!

Page 9: I Don’t Do Research . . . But

The Challenge for Libraries

• Traditional statistics are no longer sufficient
– Emphasize inputs/outputs: how big and how many
– Do not tell the library’s or customers’ story
– May not align with organizational goals and plans
– Do not measure service quality or library impact

• Need better outcome measures that demonstrate the difference the library makes and the value it adds
– To the individual, the community and the organization

• “No longer what makes a good library but how much good does the library do” (Peter Brophy)

Page 10: I Don’t Do Research . . . But

Assessing and Demonstrating the Library Contribution to the Institutional Mission

• The library’s contribution to learning and research
– Student learning (accreditation driven)
– Externally funded research and scholarship

• Value of the library to the community
– Information resources/collections
– Library as place
– Current services

• Changes in library and information needs and use
• Organizational performance and effectiveness
• Collaborations

Page 11: I Don’t Do Research . . . But

Good Assessment Starts Before You Begin . . . Some Questions to Ask

• Define the question: what’s important
– What do you need to know, why, and when?

• How will you use the information/results?
• Where/how will you get the information?
– Methods used
– Existing data
– New data (where or who will you get it from?)

• How will you analyze the information?
• Who will act upon the findings?

Page 12: I Don’t Do Research . . . But

Four Useful Assessment Assumptions

• Your problem/issue is not as unique as you think
• You have more data/information than you think
• You need less data/information than you think
• There are useful methods that are much simpler than you think

Adapted from Douglas Hubbard, “How to Measure Anything” (2007)

Page 13: I Don’t Do Research . . . But

Documenting Library Performance and Impact

• Common library assessment methods
– Surveys (satisfaction, needs, importance)
– Usage and other library statistics
– Qualitative information (interviews, focus groups, etc.)

• Other statistical data
– Institutional
– Comparator (ARL, ACRL, peer groups, customized)
– Government (NCES)

• Collaborations
• Value, impact and return on investment
– Lib-Value (IMLS grant to measure value and return on investment in academic libraries)

Page 14: I Don’t Do Research . . . But

Choosing the Right Assessment Method

Criteria
• Utility
• Relevance/Importance
• Stakeholder needs
• Measurability
• Cost
• Timeliness

Tools
• Usage data
• Surveys (local & standardized)
• Standardized tests
• Performance assessments
• Qualitative methods
• Rubrics
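One way to combine criteria like these is a weighted decision matrix: rate each candidate tool on each criterion, weight, and rank. A minimal Python sketch; the weights, tools and ratings are illustrative assumptions, not values from this presentation.

# Hypothetical weighted decision matrix for choosing an assessment method.
CRITERIA = {  # criterion -> weight (weights sum to 1.0)
    "utility": 0.25, "relevance/importance": 0.20, "stakeholder needs": 0.20,
    "measurability": 0.15, "cost": 0.10, "timeliness": 0.10,
}

# Illustrative 1-5 ratings per tool, in the same order as CRITERIA's keys.
TOOLS = {
    "local survey": [4, 5, 4, 4, 3, 4],
    "usage data":   [4, 3, 3, 5, 5, 5],
    "focus groups": [5, 4, 4, 2, 2, 3],
}

def weighted_score(ratings):
    return sum(w * r for w, r in zip(CRITERIA.values(), ratings))

# Rank candidate tools by weighted score, highest first.
for tool, ratings in sorted(TOOLS.items(), key=lambda kv: -weighted_score(kv[1])):
    print(f"{tool:12s} {weighted_score(ratings):.2f}")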

Page 15: I Don’t Do Research . . . But

Presenting Assessment Findings

• Make sure data/results are:
– Timely
– Understandable
– Usable

• Identify important findings/key results
– What’s important to know
– What’s actionable

• Present key/important results to:
– Library administration/institutional administration
– Library staff
– Other libraries/interested parties/stakeholders

Page 16: I Don’t Do Research . . . But

Success with Assessment

• Use multiple assessment methods
• Mine/repurpose existing data
• Invest in staff training and resources
• Focus on the customer and community
• Learn from our users
• Partner with other campus programs and institutions
• Present assessment information so that it is understandable and usable

Page 17: I Don’t Do Research . . . But

A Skeptical View of Metrics

Page 18: I Don’t Do Research . . . But

Association of Research Libraries (ARL) and Library Assessment

ARL has played a major role in advancing assessment in academic libraries through:

• ARL Statistics
• New measures and standardized methods
– LibQUAL+® user survey, MINES for Libraries
• Individual library consulting
– Effective, Sustainable and Practical Assessment (ESP): 42 libraries visited 2005-2010 to evaluate assessment needs and programs
• Library Assessment Conference

Page 19: I Don’t Do Research . . . But

ESP Insights

• Uncertainty on how to establish and sustain assessment
• Staff lack essential assessment/data analysis skills and knowledge
• Lack of focus and assessment priorities; tenuous link to planning and decision making
• Underutilization of campus assessment resources
• More data collection than data utilization
• Overreliance on surveys for user input
• Organizational issues play a significant role in sustainable assessment

Page 20: I Don’t Do Research . . . But

From Institution-Based Assessment to a Community of Practice

THE NEED
• Bring together library folks interested in assessment
• Focus on effective and practical assessment
• Establish an ongoing venue for presentation of library assessment issues, activities and results
• Build a continuing education component (workshops)
• Make it fun!

AN ANSWER
• Library Assessment Conference
– Organized by ARL, U.Va and UW
– Biennial conference first held in 2006

Page 21: I Don’t Do Research . . . But

Library Assessment Conference Facts

2006, Charlottesville
– Proposals: 80 paper/panel, 15 poster
– Presentations: 38 paper/panel, 20 posters, 3 plenary (proceedings: 3 lbs.)
– Workshops: 3 half-day (each repeated), 120 participants
– Registrants: 220 (30 turned away)

2008, Seattle
– Proposals: 95 paper/panel, 38 poster
– Presentations: 59 paper/panel, 43 posters, 5 plenary (proceedings: 4 lbs.)
– Workshops: 6 half-day, 160 participants
– Registrants: 375 (15 turned away)

2010, Baltimore
– Proposals: 154 papers, 55 posters
– Presentations: 63 papers, 80 posters, 5 plenary
– Workshops: 2 full-day, 4 half-day, 160 participants
– Registrants: 475 (20 turned away)

Page 22: I Don’t Do Research . . . But

Using Assessment for Results at the University of Washington, or How We Contribute to User Success

• Assessment program established in 1991
– Focus on user needs
– Information seeking behavior and use
– Patterns of library use
– Library contribution to learning and research
– User satisfaction with services, collections, overall

• Increasingly tied to strategic goals and priorities
• Provides data to improve programs and services and to demonstrate the library contribution to user success

Page 23: I Don’t Do Research . . . But

University of Washington Libraries Assessment Methods Used

• Large-scale user surveys every 3 years since 1992 (“triennial survey”)
• In-library use surveys every 3 years beginning 1993
• Focus groups/interviews
• User-centered design
• Observation (guided and non-obtrusive)
• Usability
• Usage statistics/data mining (see the sketch below)

Information about the assessment program is available at: http://www.lib.washington.edu/assessment/
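In practice, “usage statistics/data mining” often means repurposing data the library already collects. A minimal pandas sketch of that idea, assuming a hypothetical gate_counts.csv with date, branch and entrances columns (not an actual UW dataset):

import pandas as pd

# Hypothetical gate-count log: one row per branch per day.
df = pd.read_csv("gate_counts.csv", parse_dates=["date"])

# Sum entrances per branch per year, then look at year-over-year change.
yearly = (df.assign(year=df["date"].dt.year)
            .groupby(["branch", "year"])["entrances"].sum()
            .unstack("year"))
print(yearly.pct_change(axis="columns").round(3))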

Page 24: I Don’t Do Research . . . But

UW Libraries Triennial Survey

• Started in 1992 with paper; web-based began in 2004
• Survey designed by library staff and asks about needs, importance, satisfaction, use patterns, and impact (comments valuable too)
• Survey all faculty and a sample of students
• Survey for each group is different and survey questions may change over time (although a core set remains the same over time and between groups)
• Survey can help measure effectiveness of existing programs and provide direction for future ones

Longest running cyclical survey in academic libraries

Page 25: I Don’t Do Research . . . But

Strategic Priorities 2007-2010

• Expand digital and physical delivery services
• Enhance library contributions to research productivity
• Raise visibility and effectiveness of librarian liaisons
• Inform UW researchers/authors about good scholarly communications practices
• Strengthen library role in undergraduate learning
• Reshape library spaces to enhance user experiences
• Ensure content needed is accessible and deliverable
• Implement new models of service

Page 26: I Don’t Do Research . . . But

What We Did 2007-2009

• Began pull-and-scan service; harmonized ILL
• Implemented UW WorldCat as primary access point
• Articulated service expectations for librarian liaisons
• Expanded scholarly communication efforts
• Began revisioning process for undergrad library space
• Brought in consultant on teaching and learning
• Participated in ARL Library Scorecard Pilot (2009-)
• 2009: 12% budget reduction
– Closed several branch libraries; cut hours; cut 29 positions in 2009
– Reduced collections budget; cut serial subscriptions

Page 27: I Don’t Do Research . . . But

Libraries 2010 Triennial Survey Highlights

• Record number of faculty and graduate student responses
• Satisfaction ratings highest ever for faculty and grads; slightly lower for undergrads (at all 3 campuses)
• Library contributions to teaching, learning, research and overall success rated very high by faculty/grad students
• Substantial increase in use and satisfaction with library delivery services (ILL, pull and scan)
• Online access to and delivery of scholarly information, especially journals, are driving research and scholarship

Page 28: I Don’t Do Research . . . But

UW Libraries Triennial Survey: Number of Respondents and Response Rate, 1992-2010

http://www.lib.washington.edu/assessment/

Respondents / response rate by survey year:

                    2010       2007       2004       2001       1998       1995       1992
Faculty             1634/39%   1455/36%   1560/40%   1345/36%   1503/40%   1359/31%   1108/28%
Grad/Prof (UWS)     640/32%    580/33%    627/40%    597/40%    457/46%    409/41%    560/56%
Undergrads (UWS)    365/16%    467/20%    502/25%    497/25%    787/39%    463/23%    407/41%
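A response rate is just respondents divided by the number invited, so the table also lets you back out approximate sample sizes. A small Python sketch using the 2010 column (the derived “invited” counts are estimates, not reported figures):

# response rate = respondents / invited  =>  invited = respondents / rate
respondents = {"faculty": 1634, "grad/prof": 640, "undergrad": 365}  # 2010
rates       = {"faculty": 0.39, "grad/prof": 0.32, "undergrad": 0.16}

for group, n in respondents.items():
    invited = n / rates[group]
    print(f"{group:10s} ~{invited:,.0f} invited -> {n} responses ({rates[group]:.0%})")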

Page 29: I Don’t Do Research . . . But

Overall Satisfaction by Group, 1995-2010

Mean scores by survey year:

             1995   1998   2001   2004   2007   2010
Faculty      4.25   4.33   4.33   4.44   4.56   4.63
Grad         4.18   4.11   4.26   4.34   4.36   4.43
Undergrad    3.97   3.99   4.22   4.32   4.36   4.25

Page 30: I Don’t Do Research . . . But

Library Services and Resources: Overall Importance to Work by Group

(Scale of 1 “Not Important” to 5 “Very Important”)

[Bar chart: mean importance of Collections, Discovery tools, Info services and instruction, and Physical spaces, as rated by Undergrad, Grad and Faculty]

Page 31: I Don’t Do Research . . . But

UW Libraries 2010 Triennial Survey: Libraries’ Contribution to . . .

(Scale of 1 “Minor” to 5 “Major”; mean scores, % = those marking 4 or 5)

                                          Faculty                Graduate students
                                          (1634 surveys,         (680 surveys,
                                          39% response)          32% response)
Keeping current in your field             92%   4.67             90%   4.53
Finding information in related fields
or new areas                              90%   4.56             91%   4.57
Being a more productive researcher        92%   4.63             93%   4.64
Enriching student learning experiences    77%   4.18             n/a
Overall academic success                  n/a                    92%   4.60
Making more efficient use of your time    87%   4.46             80%   4.21
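The two figures reported for each item, a mean and a “% marking 4 or 5” (top-two-box), are both simple summaries of the same 1-5 responses. A minimal sketch with made-up ratings:

# Illustrative 1-5 ratings for one survey item (not real survey data).
ratings = [5, 5, 4, 5, 3, 4, 5, 2, 4, 5]

mean = sum(ratings) / len(ratings)
top2 = sum(r >= 4 for r in ratings) / len(ratings)  # share marking 4 or 5
print(f"mean {mean:.2f}, top-two-box {top2:.0%}")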

Page 32: I Don’t Do Research . . . But

Importance of Books & Journals by Academic Area
(2010, faculty; scale of 1 “not important” to 5 “very important”)

[Bar chart: mean importance of Books, Journals <1990 and Journals >1990 for Health Sciences, Sci-Eng-Env and Hum-Soc Science faculty]

Page 33: I Don’t Do Research . . . But

Importance of Books & Older Journals by School

[Bar chart: mean importance of Books and Journals <1990 for Dentistry, Medicine, Nursing, Pharmacy, Public Health and Social Work; scale shown 3.0-4.5]

Page 34: I Don’t Do Research . . . But

Services Satisfaction and Visibility by Group 2007/2010

                                        2010 Sat.   2010 Vis.   2007 Sat.   2007 Vis.
Instruction (satisfaction up, visibility down)
  Faculty                               4.45        34%         4.27        52%
  Grad                                  4.20        42%         3.80        55%
  Undergrad (usefulness)                3.36        44%         3.21        49%

Staff assistance (satisfaction up, visibility unchanged)
  Faculty                               4.48        75%         4.42        76%
  Grad                                  4.30        75%         4.06        75%
  Undergrad                             4.04        75%         3.94        69%

ILL books and journals (satisfaction up, visibility up)
  Faculty                               4.44        77%         4.25        63%
  Grad                                  4.45        81%         4.19        61%
  Undergrad                             4.06        57%         3.90        46%

Remote access to collections/services (all good!)
  Faculty                               4.64        90%         n/a         n/a
  Grad                                  4.65        90%         n/a         n/a
  Undergrad                             4.20        89%         n/a         n/a

Page 35: I Don’t Do Research . . . But

Subject Librarian Visibility and Satisfaction By Faculty College/School (Balanced Scorecard Metric)

[Scatter plot: subject librarian satisfaction (y axis, approx. 3.00-4.90) vs. visibility (x axis, 30%-67%) for Built Environments, Education, Business, Engineering, Medicine, I School, Pharmacy, Dentistry, Nursing, Public Affairs, Public Health, Social Work, Environment, Fine Arts, Humanities, Social Sciences and Sciences]

Page 36: I Don’t Do Research . . . But

Use Patterns: Frequency of In-Library Visits 1998-2010 (Weekly or more often)

[Line chart: share of Faculty, Grad and Undergrad respondents visiting a library weekly or more often, 1998-2010; y axis 20%-80%]

Page 37: I Don’t Do Research . . . But

Undergraduate Overall Satisfaction 2007-2010

[Bar chart: undergraduate overall satisfaction at UWT, UWS and UWB in 2007 and 2010; scale shown 4.0-5.0]

Page 38: I Don’t Do Research . . . But

Undergrad Satisfaction With Facilities

[Bar chart: Seattle, Bothell and Tacoma undergraduate satisfaction with group work areas, quiet areas, furniture, security, and light & temperature; scale shown 3.0-4.5]

Page 39: I Don’t Do Research . . . But

80% of the 400 comments from UWS Undergrads Dealt with Space and Hours

• Open is one thing, space and available computers / tables with laptop plug-ins is whole other issue

• More seating or computer areas, engineer a reduced noise level in Odegaard.

• 1. More space between the computers 2. More quiet study areas 3. Spaces to eat, drink and take breaks

• Suzzallo-Allen. Quiet, neat, clean, cool, beautiful, access to everything I need. Ode, on the other hand . . .

Page 40: I Don’t Do Research . . . But

What People Do in Libraries by Group, 2008
2008 In-Library Use Survey: 73% undergrads, 22% grads, 5% faculty

[Bar chart: share of Undergrad, Grad and Faculty/Staff respondents who ask for help, look for material, work alone, work in groups, use a library computer or use their own computer; y axis 0%-70%]

Page 41: I Don’t Do Research . . . But

Other Relevant Data

During the past five years at UWS:
• Total number of weekly hours libraries are open dropped 26%
• Number of library seats dropped 3%
• Enrollment increased by 6%
• Gate counts increased by 6%, or 250,000 more entrants (see the arithmetic sketch below)
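Pairing the 6% gate-count increase with “250,000 more entrants” lets you back out the approximate baseline count; this is a derived rough check, not a figure reported on the slide:

# baseline * 0.06 = 250,000  =>  baseline = 250,000 / 0.06
increase, pct = 250_000, 0.06
baseline = increase / pct
print(f"implied baseline: ~{baseline:,.0f} entrants/year")  # ~4.2 million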

Page 42: I Don’t Do Research . . . But

How UW Libraries Has Used Assessment: A Few Examples

• Extend hours in Undergraduate Library (24/5.5)
• Create more diversified student learning spaces
• Enhance usability of discovery tools and website
• Provide standardized service training for all staff
• Review and restructure librarian liaison program
• Consolidate and merge branch libraries
• Change/reallocate collections budget
• Change/reallocate staffing
• Support budget requests to University

Page 43: I Don’t Do Research . . . But

Integrated Organizational Performance Model: The Balanced Scorecard

• A model for measuring organizational performance developed in the 1990s by Kaplan and Norton that:
– Helps identify the important statistics
– Helps ensure a proper balance
– Organizes multiple statistics into an intelligible framework

• Clarifies and communicates the organization’s vision
• Provides a structured metrics framework for aligning assessment with strategic priorities & evaluating progress

• ARL Library Scorecard Pilot in 2009-10 with 4 libraries
– Johns Hopkins, McMaster, Virginia, Washington

Page 44: I Don’t Do Research . . . But

Goals of the ARL Pilot

• Evaluate the Balanced Scorecard as a suitable performance model for academic research libraries
• Assess its value as a structured process to better integrate and strengthen strategy, planning and assessment
• Encourage cross-library collaboration
• Review objectives and measures for commonalities between libraries

Page 45: I Don’t Do Research . . . But
Page 46: I Don’t Do Research . . . But

Closing the Loop: Success with Assessment

• Assess what is important
• Keep expectations reasonable and achievable
• Use multiple assessment methods; corroborate
• Mine/repurpose existing data
• Focus on users: how they work, find & use information
• Use the data to improve and add customer value
• Keep staff, customers and stakeholders involved and informed

Page 47: I Don’t Do Research . . . But

Eye to the Future

“Measuring performance is an exercise in measuring the past. It is the use of that data to plan an improved future that is all important.” (Peter Brophy)

• Data trends can inform the future
• Strategic planning can frame the future
• Organizational performance models can align ongoing operations with future aspirations
• Understanding how customers work, how that work is changing, and the ways we can make customers and institutions successful is key to the future of libraries

Page 48: I Don’t Do Research . . . But

In Conclusion: Can You Answer These Questions?

• What do we know about our communities and customers that lets us provide services and resources to make them successful?

• How do we measure the effectiveness of our services, programs and resources from the customer perspective?

• What do our stakeholders need to know in order to provide the resources needed for a successful library?