
Page 1: Assessment 101

Assessment 101
Joyce Chapman

Project Manager, Triangle Research Libraries Network
Heads of Cataloging Interest Group

ALA annual, Anaheim CA
25 June 2012

Page 2: Assessment 101

WHAT IS ASSESSMENT?

Page 3: Assessment 101

Assessment is a continuous and cyclical process by which we evaluate and improve services, products, workflows, and learning.

Page 4: Assessment 101

A continuous process

"Assessment is a process whose power is cumulative.” –University of Washington’s assessment principles

• “One-shot" assessments can be useful in the right context

• Ongoing assessment is what fosters ongoing improvement

Page 5: Assessment 101
Page 6: Assessment 101

Planning phase

Planning is one of the most difficult phases, yet people skip it surprisingly often.

• Determine your objectives
• Define the questions that need to be answered
• Design a method to answer the questions (set up a study, collect new data, or extract existing data from a system)

Page 7: Assessment 101
Page 8: Assessment 101

Implementation (data gathering)

• NOT numbers for the sake of numbers
• We frequently measure things that are easy to measure, without a good reason for doing so. This data may not help us answer meaningful questions.
• For data collection to foster assessment, we must first determine what it is we really care about, then initiate data collection that will inform meaningful analysis and outcomes.

Page 9: Assessment 101
Page 10: Assessment 101

Assessment is a continuous and cyclical process by which we evaluate and improve services, products, workflows, and learning.

Page 11: Assessment 101

React / refine

• The most frequent piece of the assessment cycle that is ignored is the last: making change based on the findings of data analysis.

• It is often inaction on the part of management that causes the assessment loop to remain incomplete, ending with reporting of data analysis findings and never resulting in action.

Page 12: Assessment 101
Page 13: Assessment 101

Summary

• Continuous
• Cyclical
• Evaluate AND improve: requires that action be taken in response to findings

• Our environment and users are always changing; we are always reacting.

Page 14: Assessment 101

Evidence based practice

A movement to encourage and give practitioners the means to incorporate research into their practice where it may have been lacking.

– Journal of Evidence Based Library and Information Practice

• Assumption that it is impossible to make good evidence-based decisions when our evidence base is weak; therefore, an evidence base must be built.

Page 15: Assessment 101

Evidence based practice

Stresses three aspects contributing to a practice that is evidence-based:

1. the best available evidence is used
2. moderated by user needs and preferences
3. applied to improve the quality of professional judgments

Page 16: Assessment 101

“An approach to information science that promotes the collection, interpretation, and integration of valid, important and applicable user-reported, librarian-observed, and research-derived evidence. The best available evidence, moderated by user needs and preferences, is applied to improve the quality of professional judgments.” – Anne McKibbon

Page 17: Assessment 101

Context of assessment

• Libraries often talk about assessment in the specific context of proving our institutional value to external audiences
– Contribution to student retention, graduation, and employment rates; student learning outcomes
• Equally valuable is assessment to improve internal workflows and services, or assessment of the cost and value of workflows to contribute to knowledge in the field.

Page 18: Assessment 101

Why perform assessment?

1. Improve efficiency
2. Modify workflows to funnel staff time and efforts where they provide the most benefit
3. Prove the value of existing or proposed services/positions to higher administration
4. Contribute to the available data/literature in the cataloging field so that you can work together to implement evidence-based practice across the nation

Page 19: Assessment 101

A CULTURE OF ASSESSMENT

Page 20: Assessment 101

What is a culture of assessment?

A culture of assessment refers to whether the predominating attitudes and behaviors that characterize the functioning of an institution support assessment.

Page 21: Assessment 101

What signals a culture of assessment?

• Do staff take ownership of assessment efforts?
• Does administration encourage assessment?
• Is there a comprehensive assessment program?
• Is assessment present? Do we see ongoing assessment efforts throughout the organization?
• Are there efforts to teach staff about assessment?
• Is assessment included in plans and budgets?

“Establishing a Culture of Assessment” by Wendy Weiner, 2009

Page 22: Assessment 101

What signals a culture of assessment?

• Is assessment mentioned in the strategic plan? Does the organization have an assessment plan?

• Does the organization financially support staff members whose positions are dedicated in whole or in part to assessment-related activities?

• Does the organization fund professional development of staff related to assessment?

• Is the organization responsive to proposals for new endeavors related to assessment?

“Establishing a Culture of Assessment” by Wendy Weiner, 2009

Page 23: Assessment 101

Structural difficulties

• Bottom-up: it is difficult for staff to gain support for conducting assessment projects or for implementing change based on findings when upper admin is not assessment-focused.

• Top-down: an assessment-focused upper admin can have a staff that does not support assessment, so mandates for assessment are resented (the “defiant compliant” culture).

Page 24: Assessment 101

CONTEXT OF ASSESSMENT IN HIGHER EDUCATION

Page 25: Assessment 101

Inputs, outputs, and outcomes

• Input measures: quantify a library's raw materials (collection size, staff size, budget). Longest tradition of measurement in libraries.

• Output measures: measure the actual use of library collections and services (circulation stats, gate counts, reference transactions)

• Outcome measures: measure the impact that using library services, collections, and space has on users (libraries' impact on student learning)

Page 26: Assessment 101

What drives the “assessment agenda”?

• Changing times
– Explosive growth in technologies
– Increased customer expectations for service quality and responsiveness
• Shrinking budgets
– Justifications for spending $ on resources, programs, and services are now required
– Increased competition for resources
– A fight to remain relevant and prove value

Martha Kyrillidou, “Planning for Results: Making the Data Work For You.” 2008.

Page 27: Assessment 101

Why wasn’t there a focus on assessment in libraries for so long?

Page 28: Assessment 101

Return On Investment

“A performance measure used to evaluate the efficiency of an investment… a way of considering profits in relation to capital invested.”

“ROI provides a snapshot of profitability adjusted for the size of the investment assets tied up in the enterprise.”

Sources: Wikipedia and Investopedia
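A minimal worked example of the ROI formula quoted above (a sketch, not from the slides; the dollar figures are invented):

```python
# ROI = (gain from investment - cost of investment) / cost of investment
cost = 100_000   # hypothetical dollars invested
gain = 120_000   # hypothetical dollars returned
roi = (gain - cost) / cost
print(f"ROI = {roi:.0%}")   # -> ROI = 20%
```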

Page 29: Assessment 101

PROFITS!
Cost = $$
Value = $$

For-profits must create their own revenue. Otherwise, they cease to exist.

Livelihood depends on cost/value assessment.

Page 30: Assessment 101

Libraries

Like most of higher education, we are non-profits. Funding sources aren’t directly tied to real value (more “perceived value” and tradition).

Cost = ??
Value = ??

Page 31: Assessment 101

Put simply…

because higher education has historically been given a large percentage of its annual funding by external powers based on perceived value, we have not developed a culture of needing to closely prove value, track inputs against outputs, or tie investment to profit.

Page 32: Assessment 101

ASSESSING THE COST AND VALUE OF BIBLIOGRAPHIC CONTROL

Page 33: Assessment 101

2011 LRTS article by Stalberg & Cronin

Page 34: Assessment 101

• In June 2009, the Heads of Technical Services in Large Research Libraries Interest Group of ALCTS sponsored a Task Force on the Cost/Value of Bibliographic Control.

• Members: Ann-Marie Breaux, John Chapman, Karen Coyle, Myung-Ja Han, Jennifer O’Brien Roper, Steven Shadle, Roberta Winjum, Chris Cronin, and Erin Stalberg

Page 35: Assessment 101

• The task group found that the technical services community has long struggled with making sound, evidence-based decisions about bibliographic control

• If technical services is to attempt cost/value assessment of bibliographic control, one of our first problems is a lack of operational definitions of value; we must create our own operational definitions of value with which to work.

Page 36: Assessment 101

Fundamental questions for defining value

1. Can value be measured in ways that are non-numeric?
2. Is discussing relative value over intrinsic value helpful?
3. Does value equal use?
4. Is it possible to define a list of bibliographic elements that are “high-value” and others that are “low-value”?

Page 37: Assessment 101

While the charge was to develop measures for value, the Task Force determined that doing so would not be helpful until the community has a common vocabulary for what constitutes value and an understanding of how value is attained, and until more user research is conducted into which bibliographic elements result in true research impact.

Page 38: Assessment 101

Operational definitions of value

1. Discovery success
2. Use
3. Display understanding
4. Ability of bibliographic data to operate on the open web and interoperate with vendors and suppliers in the bibliographic supply chain

Page 39: Assessment 101

Operational definitions of value

5. Ability to support the Functional Requirements for Bibliographic Records (FRBR) user tasks
6. Throughput and timeliness
7. Ability to support the library’s administrative and management goals

Page 40: Assessment 101

“Value multipliers”

Extent to which bibliographic data:

• are normalized
• support collocation and disambiguation in discovery
• use controlled terms across format and subject domains
• match the level of granularity users expect
• enable a formal and functional expression of relationships (links between resources) to find “like” items
• are accurate
• allow enhancements to proliferate to derivative records

Page 41: Assessment 101

Measuring cost

While the elements contributing to cost can be outlined, determining whether the costs are too high is impossible without first having a clear understanding of “value.” (A rough calculation combining these elements follows the list below.)

• Salaries & benefits multiplied by time spent on a task
• Cost of cataloging tools, such as software
• Time spent on database maintenance
• Overhead (training, policy development, documentation)
• Opportunity costs

Page 42: Assessment 101

COLLECTING DATA

Page 43: Assessment 101

Types of data

Quantitative methods focus on numbers and frequencies; provide data that is easy to analyze statistically. “Numbers.”

Analysis of log data, systems reports, time data, web usage analytics, survey data (not free text)

Qualitative methods capture descriptive data and focus on experience and meaning. “Words.”

Usability testing, focus groups, user interviews, ethnographic studies

Page 44: Assessment 101

Coding qualitative data

"There's no such thing as qualitative data. Everything is either 1 or 0.”

- Fred Kerlinger

• While qualitative data provides the important whys and hows of user behavior, it is difficult for us to digest large quantities of descriptive data.

• It is often useful to code quantitative data qualitatively for analysis.
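A minimal sketch of coding qualitative data quantitatively, assuming Python; the comments and code labels are invented examples, not from the presentation:

```python
# Invented example: free-text survey comments, each hand-coded with one or more labels,
# then counted so the qualitative data can be summarized numerically.
from collections import Counter

coded_responses = [
    ("I couldn't find the journal I needed",        ["discovery_failure"]),
    ("The display labels confused me",              ["display_confusion"]),
    ("Search results loaded quickly",               ["positive_speed"]),
    ("Couldn't tell which edition this record was", ["display_confusion", "discovery_failure"]),
]

counts = Counter(code for _, codes in coded_responses for code in codes)
print(counts.most_common())
# e.g. [('discovery_failure', 2), ('display_confusion', 2), ('positive_speed', 1)]
```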

Page 45: Assessment 101

• Fear: assessment takes a lot of time
• Reality: it depends on the methodology and data sources used
• Qualitative data gathering, coding, and analysis usually take a lot of time
• Systems can be set up to gather quantitative data programmatically. Such data can be analyzed quickly, given the proper tools and skills (see the sketch after this list)
• Quantitative data might also be gathered manually. Data collection will be a hassle, but analysis will be quick
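A small sketch of programmatic quantitative data gathering, assuming Python and a hypothetical system-exported CSV (the file name and column names are assumptions, not from the presentation):

```python
# Hypothetical sketch: count catalog edits per month from a system-exported CSV log.
import csv
from collections import Counter

edits_per_month = Counter()
with open("catalog_edit_log.csv", newline="") as f:
    for row in csv.DictReader(f):        # columns assumed: timestamp, editor, action
        month = row["timestamp"][:7]     # "2012-06-14T09:31:00" -> "2012-06"
        edits_per_month[month] += 1

for month, count in sorted(edits_per_month.items()):
    print(month, count)
```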

Page 46: Assessment 101

Existing data or new data?

Page 47: Assessment 101

Digital exhaust data

Page 48: Assessment 101

Collect new data

• Know what questions the data needs to be able to answer
• Data requirements; structure of the data
• Make sure you will be able to extract the data
• Make sure the data format you’ve chosen will be interoperable with any other data you are using in an initiative

Page 49: Assessment 101

Bad data planning

Page 50: Assessment 101

METHODOLOGIES / TECHNIQUES

Page 51: Assessment 101

A/B Testing

• Common in web-based marketing research. Involves an online performance comparison between a control group and a single-variable test group.
• Measures differences in web usage stats
• Could test use differences based on: presence or absence of metadata, order of metadata display, metadata display labels (a statistical sketch follows this list)
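A hedged sketch of how the difference in click-through between two displays (A and B) might be checked with a two-proportion z-test, assuming Python; the counts are invented, and a statistics library such as SciPy could replace the hand-rolled formula:

```python
# Invented counts: clicks on "view full record" for two versions of a results display.
import math

clicks_a, views_a = 130, 2400     # display A (control)
clicks_b, views_b = 171, 2500     # display B (extra metadata shown)

p_a, p_b = clicks_a / views_a, clicks_b / views_b
p_pool = (clicks_a + clicks_b) / (views_a + views_b)
se = math.sqrt(p_pool * (1 - p_pool) * (1 / views_a + 1 / views_b))
z = (p_b - p_a) / se
# Two-sided p-value from the normal CDF.
p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

print(f"A: {p_a:.3f}  B: {p_b:.3f}  z = {z:.2f}  p = {p_value:.3f}")
```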

Page 52: Assessment 101

Warning

• If more than one variable is at play during A/B testing, it becomes difficult to know which variable was responsible for any performance differences achieved.

• As of summer 2012, Google Analytics includes a new A/B Testing feature.

Page 53: Assessment 101

Focus groups

• A form of qualitative research in which a group of people are asked about their perceptions and opinions of, or interaction with, a product or service.

• Questions are asked in an interactive group setting where participants are free to talk with other group members.

• A moderator leads a small group of people who share a common experience or characteristic through a discussion using a pre-prepared script of open-ended questions.

Page 54: Assessment 101

Usability testing

• Focuses on measuring a product's ability to meet its intended purpose by gathering direct input about how real users use the system.
• In contrast to a focus group or interview, usability testing captures users' behavior whether they are aware of it or not.
• Code as you go! Software such as Morae includes special features to help with this.

Page 55: Assessment 101
Page 56: Assessment 101
Page 57: Assessment 101

Web usage analytics data

What kinds of things can you track?
• Clicks on any link (Event Tracking)
• The pages people came from
• The pages people go to next on your site
• The pages people exit your site from (see the sketch below)
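A small sketch of answering “which pages do people exit from?” using data exported from an analytics tool, assuming Python; the file name and column names are hypothetical:

```python
# Hypothetical export with columns: page_path, pageviews, exits
import csv

with open("analytics_export.csv", newline="") as f:
    rows = list(csv.DictReader(f))

# Sort pages by exit rate (exits / pageviews), highest first.
rows.sort(key=lambda r: int(r["exits"]) / max(int(r["pageviews"]), 1), reverse=True)

for r in rows[:10]:
    rate = int(r["exits"]) / max(int(r["pageviews"]), 1)
    print(f"{rate:6.1%}  {r['page_path']}")
```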

Page 58: Assessment 101
Page 59: Assessment 101

Important to evaluating “value”
• A/B testing
• Focus groups
• Usability studies
• Web usage analytics data

Important to evaluating “cost”
• Time studies

Page 60: Assessment 101

Watch out for bias

• Biased goal: “To prove the value of X”
• Unbiased goal: “To prove or disprove the value of X”
• It can be considered a serious ethical conflict to have the people who benefit from a certain outcome of assessment be the same people who conduct the assessment.

Page 61: Assessment 101

Institutional Review Board (IRB)

• Charged with protecting the rights of human research subjects
• Mandated by federal law
• Library projects are the least of their worries, but you may need to go through the process
• Check with your Assessment Librarian or assessment committee; they may have guidelines for when to go through review

Page 62: Assessment 101

Institutional Review Board (IRB)

Guidelines provided by an IRB representative to UNC-CH: you must go through IRB if you are using human subjects and plan to…
• Publish results
• Make generalizable claims
• Collect identifying or sensitive information (SSN, sexual orientation, names)

Page 63: Assessment 101

Institutional Review Board (IRB)

• All investigators listed on an IRB proposal must complete CITI online ethics training (~3 hours) the first time they submit one (no need to list every single person involved in the study!)

• Most library research is eligible for expedited approval

• Tip for quick process: list the IRB approval number for similar library studies already approved by the IRB.

Page 64: Assessment 101

THE ROLE OF MANAGEMENT

Page 65: Assessment 101

Earning buy-in

• Discuss/explain assessment and why it’s important

• Consider bringing a speaker to a department meeting to talk about assessment practices, tools, ideas, etc.

• Show examples from other institutions about how assessment benefited a department

• Explain that you aren’t assessing staff; you’re assessing workflows

Page 66: Assessment 101

Dealing with fear of task timing

• It’s scary to be asked to time yourself as you conduct your daily tasks!

• Assumption that data collection is based on a desire to decrease the time spent on tasks, or penalize those who take “too long”

• Explain the big picture and what you’re trying to achieve; consider whether anonymous data could be just as useful to you (a small sketch of summarizing anonymous timings follows)
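A minimal sketch of summarizing anonymous task-timing data (task type and minutes only, no names), assuming Python; the task names and numbers are invented:

```python
# Invented, anonymous timing data: minutes per task, with no staff identifiers.
from statistics import mean, median

timings = {
    "original cataloging": [38, 45, 52, 41, 60, 47],
    "copy cataloging":     [7, 9, 6, 11, 8],
    "authority work":      [15, 22, 18],
}

for task, minutes in timings.items():
    print(f"{task:20s}  n={len(minutes):2d}  mean={mean(minutes):5.1f}  median={median(minutes):5.1f}")
```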

Page 67: Assessment 101

Prove that it makes a difference

• Make sure you don’t forget the 4th step of assessment: take ACTION based on findings

• Inform staff of how their work informed your decision-making and helped the management of the department: show concrete changes

• Praise staff participating in assessment library-wide; publicize the success of the assessment project within the library or department

Page 68: Assessment 101

Provide staff with training

• It’s important to provide staff with the training they need to do what you’re asking

• Does required reporting involve querying an Access database? Crunching data in Excel (writing functions or making pivot tables)? Are you asking them to create a survey?

• These skills are often not listed in required job qualifications, but people are later tasked with data reporting (a small pivot-table sketch follows)
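A hedged sketch of the kind of pivot-table reporting described above, done here with pandas rather than Excel; the column names and counts are invented:

```python
# Invented monthly production data, summarized as a pivot table.
import pandas as pd

df = pd.DataFrame({
    "month":   ["2012-04", "2012-04", "2012-05", "2012-05", "2012-05"],
    "task":    ["copy", "original", "copy", "original", "authority"],
    "records": [420, 35, 388, 41, 60],
})

pivot = df.pivot_table(index="month", columns="task", values="records",
                       aggfunc="sum", fill_value=0)
print(pivot)
```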

Page 69: Assessment 101

Thank you!
• Joyce Chapman
• [email protected]

“Digital exhaust” base matrix image credit: http://alwaysoncommunications.com/22/advanced-online-audience-buying/

Resources
• Kyrillidou, Martha. (Feb. 2008). “Planning for Results: Making the Data Work For You. Why Assess? What is assessment? What do we mean by actionable data?” FLICC meeting, The Cato Institute, Washington, D.C.

• McKinsey Global Institute report (May 2011), “Big data: the next frontier for innovation, competition, and productivity.”

• Stalberg, Erin & Christopher Cronin. (2011). “Assessing the Cost and Value of Bibliographic Control.” Library Resources & Technical Services, 55(3), 124-137.

• Weiner, Wendy. (2009). “Establishing a Culture of Assessment.” Academe 95(4).

Page 70: Assessment 101

SPEAKING OF ASSESSMENT…

Please fill out the evaluation survey for the ALCTS events that you attended at ALA annual conference 2012 in Anaheim, CA:

http://www.surveymonkey.com/s/alctsevents2012?

Page 71: Assessment 101

GROUP DISCUSSION

Page 72: Assessment 101

Discussion questions

1. What are the kinds of things we always want to know about ourselves (TS) but never invest the time in assessing?

2. What are your ideas about / experiences with how to create a culture of assessment?

3. What important assessment work do we need to do as a community? Can we initiate any of that collaborative work now?