
Value and Impact Measurement Programme
Report for WGPI, SCONUL

Suzanne Lockyer, LISU
Angela Conyers, Evidence Base
Claire Creaser, LISU


Contents

1. Introduction
2. Critical Review
   2.1 Introduction
   2.2 The Effective Academic Library and beyond: the SCONUL contribution
   2.3 Projects in the HE Library sector
       2.3.1 The SCONUL/LIRG impact project
       2.3.2 EQUINOX (1998-2000)
       2.3.3 The JUBILEE project (2000-2004)
       2.3.4 The JUSTEIS project (1999-2002)
       2.3.5 The Evalued project and the Evalued toolkit (2003-4)
       2.3.6 The e-measures project (2003-5)
       2.3.7 The Outcomes project (2003-5)
       2.3.8 SCONUL Process benchmarking projects
       2.3.9 General comments on projects
   2.4 Other UK library examples
       2.4.1 British Library
       2.4.2 Public libraries
       2.4.3 Health libraries
       2.4.4 Further education libraries
       2.4.5 School libraries
   2.5 Examples from academic library communities in other countries
       2.5.1 United States (ARL)
       2.5.2 Australia (CAUL - Council of Australian University Librarians)
       2.5.3 South Africa
   2.6 Summary of Critical Review
3. Survey of members
   3.1 Questionnaire
   3.2 Interviews
       3.2.1 Understanding of the term 'value and impact measures'
       3.2.2 Institutional requirements and drivers
       3.2.3 Barriers to undertaking value and impact measures
       3.2.4 Advantages of undertaking value and impact measures
   3.3 Summary of survey of members
4. Synthesis and recommendations
   4.1 Synthesis of results
   4.2 Gap analysis
   4.3 Recommendations
5. References
Appendix 1: Questionnaire
Appendix 2: Interview Schedule


1. Introduction

Library managers are increasingly concerned with the impact of their services on learning, teaching and research. A recent (January 2006) survey of the top concerns of SCONUL representatives noted the increasing need to review services and demonstrate the essential role of the library within the institution.

The Working Group on Performance Improvement (WGPI) has an active role in providing tools and support for members undertaking measures. However, the current tools may not be sufficient to demonstrate value and impact. The Value and Impact Measurement Programme aims to review the tools currently available and establish members’ understanding and use of value and impact measures in order to identify gaps and inform future work.

This report outlines the findings of the first phase of the programme. The report opens with a critical review of some of the major tools and methodologies currently available. A survey of members, including a questionnaire sent to all members and follow-up interviews with a selection of respondents, demonstrates the current level of activity within libraries. From these two stages, the gaps in current provision are identified and recommendations made for the next phase of the programme.


2. Critical Review

2.1 Introduction

This review aims to cover the major methods and tools currently available to SCONUL member libraries which can be used to measure the impact, worth or value of the library service. It covers methods and tools available through SCONUL itself or promoted by SCONUL and its working groups, and those developed outside SCONUL but aimed mainly at the UK higher education community. It also looks at related work done for the British Library and the UK further education, school, public and health library sectors, at the services available to members of the Association of Research Libraries (ARL) in the US and the Council of Australian University Librarians (CAUL), and at work in South Africa. Although a number of references are given in the review, this is not intended as a full survey of the very extensive research literature in this area. A regularly updated bibliography on 'The impact and outcome of libraries', produced as part of the work of the IFLA Section for Statistics and Evaluation, is available from the Münster University Library web-site (Poll, 2006).

2.2 The Effective Academic Library and beyond: the SCONUL contribution

The starting-point for this review is The Effective Academic Library: a framework for evaluating the performance of UK academic libraries, which was published in March 1995 (HEFC 1995). This was a consultative report from an ad hoc group on Performance Indicators for libraries set up by the Joint Funding Councils following the publication of the Follett Report (HEFC 1993).

This report proposed a framework to identify library effectiveness in five key areas:

• Integration

• User satisfaction

• Delivery

• Efficiency

• Economy

Within these five areas, a set of 33 indicators was recommended. It is not the purpose of this review to look in detail at all these recommendations and how they have been followed through but rather to identify what SCONUL itself has done to address the issues raised and what resources are available from the SCONUL web-site to assist member libraries.

A list of performance indicators in each of the five key areas is followed by a commentary indicating what work has been done. The 'EAL recommendation' notes indicate specific recommendations made to SCONUL or the Advisory Committee on Performance Improvement (ACPI) within the Effective Academic Library report.


P.1 Integration

Identifying the level of integration between the mission, aims and objectives of the institution and those of the library service.

Recommended Performance Indicators

1.1 Strategic cohesiveness: Availability of documents e.g. strategic plans, information strategies

1.2 Resourcing mechanisms: Evidence of formal budget mechanisms and links with strategies

1.3 Planning process: Evidence of formal processes for new course development, research development etc.

1.4 Service-user liaison: Evidence of formal and informal communication channels between library service, senior management, academics and students

1.5 Assessment and audit mechanisms: Regular assessment and audit e.g. library reviews or via subject reviews. Degree of involvement of academic and external library colleagues.

Commentary

SCONUL Working Group on Quality Assurance

Much of the work of the Working Group on Quality Assurance has been concentrated on assisting SCONUL libraries with preparations for Institutional Audit. It successfully produced an aide-memoire for the Quality Assurance Agency (QAA) to assist reviewers in earlier subject reviews and has produced the SCONUL guidelines for QAA Institutional Audit in England. Its web-site contains examples of documents produced for Institutional Audit and members' experiences of the process.

The working group’s current action plan emphasises this role:

• Develop resources available for SCONUL/UCISA members via the Group’s web pages so as to provide members with a range of resources and information to support their input to quality assurance processes in their own institutions

Although this work relates directly to advising on the library's contribution to institutional review, it provides the main example of ways of demonstrating the integration of library services. Other aspects considered in sections P.2 and P.3 (e.g. HELMS, student surveys) are also relevant here.


P.2 User satisfaction

Measures devised should reflect both:

• Views of user constituencies on extent to which library meets expectations

• Users’ reports of experiences and level of satisfaction when using particular services

Recommended Performance Indicators

2.1 Overall user satisfaction

2.2 Document delivery services

2.3 Information services

2.4 Study facilities

2.5 Information skills programme

EAL recommendation

It was suggested that HEFCE funding should be made available for ACPI to develop & test suitable instruments for user satisfaction data, draw up guidelines on sample sizes and document outcomes of work

Achieved (without HEFCE funding)

Commentary

2.1 Overall user satisfaction

SCONUL Working Group on Performance Improvement

The Working Group has been responsible for a number of SCONUL initiatives in the area of user surveys:

• SCONUL user survey templates

Originally introduced in 1996, the survey has been regularly updated to take account of e-resources and converged services. The templates allow for local changes. A number of libraries are using Libra from Priority Research, and WGPI keeps in touch with Priority Research to ensure that the version used by their customers is appropriate to their needs.

• LibQUAL+™

The Working Group has overseen the participation of SCONUL libraries in the worldwide annual LibQUAL+ survey run by ARL. A total of 20 libraries are participating in the 2006 SCONUL LibQUAL+ consortium. Eight are repeat participants, making a total of 54 different institutions participating in the SCONUL consortium over the four years. In 2005, seventeen SCONUL libraries took part, several of these participating for the second or third time. SCONUL has obtained a reduced participation fee for member libraries. Training is also organised.

LibQUAL+ has been popular with most SCONUL libraries that have participated. It has several advantages, including ease of analysis (all analysis is done as part of the LibQUAL+ service), standard and detailed outputs, and the benchmarking of results against other HE libraries (Town & Lock, 2005).

Unlike the SCONUL user survey, however, it offers no room for local customisation, although additional questions have been added for the UK version.

• Other surveys and results

The WGPI web-site also includes the LSE Library survey as an example of ‘best practice’ and contains the results of some library surveys as submitted by member libraries.

• Benchmarking

A report by LISU on ‘Benchmarking the SCONUL standard user survey’, published in January 2005, gives the results of a pilot analysis into possibilities of benchmarking using the standard SCONUL user survey template (Creaser, 2005). Nine member libraries took part.

Some preliminary attempts have also been made to benchmark some UK results of the LibQUAL+ surveys, though there are not yet sufficient results available for the findings to be significant.

• Take-up of surveys among member libraries

A survey carried out in December 2003 (West, 2004) showed that out of 65 respondents, 62 (95%) carried out user surveys. The SCONUL user survey template was mentioned by 16 respondents and the number of actual users may have been higher.

2.2 Document delivery services
2.3 Information services
2.4 Study facilities

• SCONUL Annual statistics

The SCONUL annual library statistics have been processed by LISU since 1995, though they go back well beyond that date. Since 1995, they have undergone some major overhauls:

- In 2000/2001, library statistics for HE Colleges, previously maintained separately, were incorporated into the SCONUL return. This change was used as an opportunity to revise the questions and ensure that they were still relevant to libraries' interests, while maintaining the continuity of the existing structure.


- In 2003/4 a number of new questions were added on a trial basis to record holdings, use and costs of e-resources. These questions were derived from the experiences of 25 SCONUL libraries that took part in the HEFCE-funded e-measures project run by evidence base with the support of SCONUL (see The e-measures project: 2.3.6, below). Following the trial run, further discussions are ongoing on the form of these questions and the ratios that can be derived from them.

- The inclusion of e-measures questions within the SCONUL return has led to a number of queries from member libraries on how these statistics should be kept, and the results of the first trial year appear promising (Conyers, 2005).

• Benchmarking with the SCONUL annual statistics

Libraries are able to request reports from LISU at moderate cost based on their own usage or on that of selected ‘peer groups’. From 2005, the task of analysing statistics and benchmarking results with other libraries has been made far easier for individual libraries with the introduction of a web-based version of the SCONUL statistics.

2.5 Information skills programme

SCONUL Working Group on Information Literacy

This group has been active in promoting information literacy, with its now established Seven pillars of information literacy model. A report in 2004 on Learning outcomes and information literacy, produced with the support of the Higher Education Academy, presents a series of case studies as examples of good practice where information literacy has been incorporated into the curriculum (Peters, 2004). A series of performance measurement workshops was developed in 2001 by the SCONUL Task Force on Information Skills and was used to develop the Critical Success Factors in the area of information literacy.

There are links from the web-site to the large number of articles and conference papers given by members of the working group, and to IL tutorials from SCONUL libraries and also from libraries in the United States and Australia.

In addition, questions have been added to the LibQUAL+ survey to gather some information in this area.

P.3 Delivery

Recommended Performance Indicators

3.1 Meeting service standards (local): Setting of locally agreed standards, e.g. how long to get an ILL or reshelve a book

EAL recommendation
HEFCE to commission ACPI to create specimen set of service standards
No HEFCE funding – not achieved


3.2 Meeting development targets (local): Targets to be reported on annually and in reviews of library services

3.3 Documents delivered per FTE (need to develop measures for e-documents)

3.4 Enquiries answered per FTE

3.5 Information skills instruction per FTE

3.6 Library study hours per FTE

3.7 Volumes in stock per FTE

Commentary

A number of HE libraries have been awarded the Charter Mark, either directly or through institutional applications. Charter Mark holders are required to meet six criteria:

• Set standards and perform well
• Actively engage with your customers, partners & staff
• Be fair and accessible to everyone and promote choice
• Continuously develop and improve
• Use your resources effectively and imaginatively
• Contribute to improving opportunities and quality of life in the communities you serve

This development fits with the set of performance indicators relating to service standards and development targets which the Effective Academic Library recommended for local collection (3.1 and 3.2, above). Charter Mark holders and potential holders have their own mailing list (lis-chartermark) and the process has been well described in SCONUL Focus (Morrow, 2005).

Service standards have been a major emphasis in UK public libraries and are discussed in Public Library Service Standards and Impact Measures: 2.4.2.1 (below).

• SCONUL Annual statistics

The SCONUL statistics described in section 2.2 above have a 'Library provision and use' section which contains a number of indicators based on the number of FTE students. Discussions are ongoing about incorporating ratios for the new e-measures questions in this section.

The web-based version of the statistics makes sorting by FTE or other variable much simpler.

• HELMS

The UK Higher Education Library Management Statistics (HELMS) have been produced since 1997/98, as a result of work following the publication of the Effective Academic Library. They present a small sub-set of management statistics from the SCONUL Annual Library Statistics, based mainly on FTE user ratios, with accompanying library and institutional contextual data and charts. This publication is aimed at Vice-Chancellors, to present a ‘snapshot’ view of libraries.


P.4 Efficiency

Recommended Performance Indicators

4.1 Items processed per library staff FTE

4.2 Total library expenditure per items processed

4.3 Documents delivered per library staff FTE (need to establish a measure for delivery of e-documents)

4.5 Enquiries answered per library staff FTE

4.6 Total Library expenditure per enquiries answered

4.7 Total library expenditure per study hours pa

4.8 Volumes in stock per library FTE

4.9 Total library expenditure per volumes in stock

EAL recommendation

SCONUL to carry out investigation of model to analyse staff time and costs, so these can be divided among different library activities

Some discussion, but not conclusive

ACPI to devise measures of library efficiency in support of research

No action – but worth revisiting in light of institutional repositories?

Commentary

• SCONUL Annual Statistics

All these 'efficiency' indicators can be derived from the SCONUL annual statistics, though not all are currently included in the ratios section, and those not included may no longer be considered relevant. Although there has been discussion from time to time on how to attribute staff time to different activities, this has never been attempted at a national level, though there may be local examples. The only exception is the division of time between 'library' and 'computing' activities for those working in a converged service.
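To make this concrete, the short sketch below shows how a few of the 'efficiency' ratios listed above could be calculated from a single annual statistics return. It is an illustrative sketch only: the field names and figures are invented for the example and do not reproduce the actual SCONUL statistics schema.

```python
# Illustrative sketch only: invented field names and figures, not the SCONUL
# statistics schema. It shows how EAL 'efficiency' ratios such as 4.1, 4.3 and 4.9
# could be derived from a library's annual statistics return.

annual_return = {
    "items_processed": 18_500,        # hypothetical items processed in the year
    "documents_delivered": 425_000,   # hypothetical loans plus document supply
    "volumes_in_stock": 600_000,
    "total_expenditure_gbp": 3_200_000.0,
    "library_staff_fte": 52.0,
}

ratios = {
    "4.1 Items processed per library staff FTE":
        annual_return["items_processed"] / annual_return["library_staff_fte"],
    "4.3 Documents delivered per library staff FTE":
        annual_return["documents_delivered"] / annual_return["library_staff_fte"],
    "4.9 Total library expenditure per volume in stock":
        annual_return["total_expenditure_gbp"] / annual_return["volumes_in_stock"],
}

for name, value in ratios.items():
    print(f"{name}: {value:,.2f}")
```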

P.5 Economy

Recommended Performance Indicators:

5.1 Total Library expenditure per FTE

5.2 Library staff expenditure and operating costs per FTE

5.3 Space per FTE

5.4 FTE per number of libraries


5.5 Acquisition costs per FTE

5.6 FTE per professional library staff

5.7 FTE per seat

Commentary

SCONUL Annual statistics

These 'economy' ratios are found in the SCONUL Annual statistics and also in HELMS.

Conclusion on the Effective Academic Library

The section above has attempted to map SCONUL's current support to member libraries onto the most relevant set of performance indicators proposed in the Effective Academic Library. SCONUL Focus and e-bulletins also keep members in touch with developments and provide a forum for libraries to write up their own projects in this area.

2.3 Projects in the HE Library sector

There have been a number of other projects concerned with demonstrating the impact, value or worth of the UK HE library service over the past few years, generally funded by HEFCE or JISC and often supported by SCONUL. Many of these relate to the new electronic environment, and take up the challenges in the Effective Academic Library on how these new resources should be measured and valued. Some of the major projects are described below.

2.3.1 The SCONUL/LIRG impact project

The SCONUL/LIRG impact initiative was started in 2003 as a joint initiative with the CILIP Library and Information Research Group (LIRG). Two cohorts have taken part: 10 libraries in 2003 and a further 12 in 2004. Each programme ran for one year, with regular meetings and workshops and support through an email list. Project aims, interim and final reports were made available to all participants on a project web-site.

The project has been extensively written up: in a special issue of Library and Information Research (2005), by individual libraries reporting on their projects in articles and conference presentations, by the originator Philip Payne (Payne & Conyers, 2005), and by the facilitators, Sharon Markless and David Streatfield, whose recent article (2005) describes the methodology used for the project.

The Impact initiative began with a well-attended seminar on the topic 'Do libraries aid learning?', which explored impact projects in higher education and other sectors. Discussion at this event showed that more work was needed if a research methodology was to be developed which libraries could use to demonstrate their impact on users and stakeholder groups.

Libraries taking part were asked from the outset to propose their own subject of research. Significantly, perhaps, the 22 topics chosen concentrated on some aspect of assessing the impact of electronic services or of information literacy work.


At the final session, a number of points were raised by participating libraries which concerned the future of this and other such initiatives. These are reproduced in full here, as they have a bearing on this new VAMP project and how it might be followed through:

• The need for greater clarity on what is expected of participating institutions.

• The need for ongoing support for participating institutions and a more pro-active approach to nudging and encouraging participants

• The need for more assistance in relation to the use of qualitative research methods and the analysis of qualitative data.

• The need to be able to share, and draw upon, materials and resources that have already been produced [from the Initiative but also from projects such as eValued].

• The need to encourage greater sharing of experience between participating institutions. [Bournemouth and the UWE worked collaboratively on their project and this was very successful.]

• The need to take account of prior experience of conducting research and adjust the events according to the needs of those who will be conducting the projects.

There was strong support for seeking to develop a ‘community of practice’, building upon involvement in the Initiative and seeking to support others who wished to measure impact.

2.3.2 EQUINOX (1998-2000)

The EQUINOX project (Library Performance Measurement and Quality Management System), led by Peter Brophy of CERLIM, was funded under the European Commission's Telematics for libraries programme and aimed to develop international agreement on performance measures for the electronic library and also to test a quality management and performance measurement tool for library managers. The project was completed in November 2000 and has been important in informing international standards work. The toolkit developed was offered to SCONUL members (Brophy, 2001).

2.3.3 The JUBILEE project (2000-2004)

The JUBILEE project (JISC User Behaviour in Information Seeking: Longitudinal Evaluation of EIS) was run by the Information Management and Research Institute (IMRI) at Northumbria University and funded by JISC (Coulson & Banwell, 2004). The project involved in-depth surveys with around 25 HE libraries and also included FE. One of its aims was to produce "an Action Plan for HE and FE managers to adopt in order to facilitate the embedding of optimal EIS use into their institution's customs and practices". The result of this has been the JUBILEE toolkit, which uses data collected during the project to help libraries record progress in embedding electronic information services from baseline through to full integration. The toolkit is currently under review.

2.3.4 The JUSTEIS project (1999-2002)

The JUSTEIS project (JISC Usage surveys: trends in Electronic Information Services) was funded by JISC and run by the Department of Information Studies at the University of Wales Aberystwyth. It used critical success factors and critical incident techniques to assess user behaviour (Urquhart & others, 2003).


2.3.5 The Evalued project and the Evalued toolkit (2003-4)

The Evalued project was funded by HEFCE and run by evidence base (McNicol, 2004b). An important outcome of the project was the setting up of a toolkit, which is "designed to support information services staff in Higher Education Institutions with the evaluation of electronic information services (EIS)". It has four main sections: 'How to evaluate EIS', 'EIS evaluation themes', 'evaluation tools' (e.g. questionnaires, surveys) and 'custom tools', where users can produce their own material. It provides a readily available source of information on good practice, model questions, evaluation themes, case-studies, and practical advice on how to conduct evaluations with library staff, students and academic staff.

Although its starting point is the evaluation of electronic services, its coverage in fact is more generally applicable and it contains a wide range of material for libraries to use. Training events run in association with the project have attracted participants across a large number of academic libraries.

2.3.6 The e-measures project (2003-5)

The e-measures project was funded by HEFCE as a continuation of the Evalued project. As noted above, it had as one of its primary aims the setting up of a series of e-measures questions in the SCONUL Annual Statistics. At the same time, it was designed to help libraries with using e-measures statistics in their own budgeting and decision-making. Twenty-five SCONUL libraries took part in the project (Conyers, 2004).

2.3.7 The Outcomes project (2003-5) (http://www.ebase.uce.ac.uk/projects/outcomes.htm)

The Outcomes project was part of the HEFCE-funded Library Outcomes and Measures project run by evidence base in 2003-5. The project "aims to devise, pilot and test the feasibility of approaches to aligning library and information services outcomes with institutional aims through a case study approach".

The final report of the project, Academic libraries: planning, outcomes and communication (McNicol, 2004b), gives the results of surveys of UK higher education library directors and senior managers and covers areas such as the library's involvement in the institutional planning process. It also covers comparable areas in selected overseas and special libraries. The work led to the development of the Outcomes database.

The Outcomes database forms part of the Evalued toolkit (see 2.3.5, above). It is intended to provide ideas for ways in which academic libraries can contribute to wider institutional goals such as Learning and Teaching, Widening Participation or ICT. The database contains examples from a range of institutions showing how libraries can contribute and how their contribution can be evaluated. Libraries are invited to add examples from their own institution by using the online form.

2.3.8 SCONUL Process benchmarking projects

In 1997 SCONUL undertook seven pilot process benchmarking projects covering areas such as advice desks, library skills, counter services, interlibrary loans and library environment. The project resulted in the production of the SCONUL Benchmarking Manual (Town, 2000) and has inspired the development of other benchmarking groupings, for example the 1994 Universities Benchmarking Consortium and the Northern Consortium.

2.3.9 General comments on projects

Although these projects have been extensively written up in the professional literature, are based on sound research methods and have involved a large number of HE libraries, there is less evidence of any use of the various toolkits produced beyond the project participants. Thebridge and Hartland-Fox (2002), for example, report little knowledge or use of two current toolkits in their study for the Evalued project. This is an area that is worth further exploration.

2.4 Other UK library examples

A selection of examples of methods and tools used in UK library sectors is presented here. These examples are representative rather than definitive.

2.4.1 British Library

In its recent report "Measuring our value", the British Library (2005) gave the results of an independent economic impact study to measure the Library's direct and indirect value to the UK economy.

The British Library used a technique known as 'Contingent Valuation' to assess its value. It set out to discover the value enjoyed directly by library users and indirectly by all UK citizens, who benefit from the world-class research underpinned by the Library's collections and services. The 'consumer surplus' (i.e. the value gained above the actual cost of library services to their consumers) was measured through a series of surveys which asked, among other questions, how much people would be willing to pay for the Library's continued existence and how much they would have to pay for alternatives if the Library did not exist. Over 2,000 people – both users and indirect beneficiaries – were interviewed, and the results showed that the value generated by the British Library was 'around 4.4 times the level of its public funding'.
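As a rough illustration of how a contingent-valuation estimate of this kind fits together, the sketch below aggregates stated willingness-to-pay figures for direct users and indirect beneficiaries and compares the total with public funding. All figures are hypothetical and the calculation is deliberately simplified; it is not the British Library's data or its detailed methodology.

```python
# Illustrative sketch only: hypothetical figures and a deliberately simplified
# calculation, not the British Library's data or detailed methodology. The general
# shape is: aggregate stated willingness to pay (WTP) across user and non-user
# populations, then compare the total with annual public funding.

mean_wtp_per_user = 300.0         # hypothetical mean annual WTP of direct users (GBP)
number_of_users = 400_000         # hypothetical number of direct users

mean_wtp_per_citizen = 5.0        # hypothetical mean annual WTP of indirect beneficiaries (GBP)
number_of_citizens = 45_000_000   # hypothetical population benefiting indirectly

annual_public_funding = 80_000_000.0   # hypothetical annual public funding (GBP)

direct_value = mean_wtp_per_user * number_of_users
indirect_value = mean_wtp_per_citizen * number_of_citizens
total_value = direct_value + indirect_value

# Ratio of estimated value to public funding; the BL study reported around 4.4 overall.
value_to_funding_ratio = total_value / annual_public_funding

print(f"Estimated total annual value: GBP {total_value:,.0f}")
print(f"Value generated per GBP 1 of public funding: {value_to_funding_ratio:.1f}")
```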

2.4.2 Public libraries

2.4.2.1 Public Library Service Standards and Impact Measures

The Public Library Service Standards (PLSS) (DCMS, 2004), first introduced in 2001, have set a framework for performance monitoring in public libraries. These standards are being reviewed in the light of Framework for the Future; the aim is to develop standards for measuring performance across core activities and also to introduce impact measures that assess the impact of the library on identified priority areas and local needs.

The Public Library Service Standards set targets which libraries are expected to reach. As an example:

• PLSS5 – Requests – suggests that 50% of requests for books should be met within 7 days, 70% within 15 days and 85% within 30 days. This type of standard would fit with the set of P.2 indicators in the Effective Academic Library; a simple compliance check against thresholds of this kind is sketched below.
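That check could look something like the following sketch. It is illustrative only: the turnaround figures are invented, and the logic simply compares the proportion of requests satisfied within each period against the quoted PLSS5 targets.

```python
# Illustrative sketch only: invented turnaround data, checked against the PLSS5-style
# targets quoted above (50% of requests met within 7 days, 70% within 15 days,
# 85% within 30 days).

days_to_satisfy_request = [2, 3, 4, 5, 6, 6, 7, 8, 9, 10, 11, 12, 13, 14, 16, 20, 25, 28, 33, 40]

targets = [(7, 0.50), (15, 0.70), (30, 0.85)]  # (days, required proportion)

for days, required in targets:
    met = sum(1 for d in days_to_satisfy_request if d <= days)
    proportion = met / len(days_to_satisfy_request)
    status = "meets" if proportion >= required else "misses"
    print(f"Within {days} days: {proportion:.0%} (target {required:.0%}) -> {status} target")
```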

In developing impact measures, the MLA's report on Developing performance indicators for local authority museums, libraries and archives (2005) proposes, in addition to the PLSS, a series of impact measures which relate to identified priorities; for example, provision and take-up of health-related stock would relate to the aim of 'promoting healthier communities' and also to 'quality of life'.

The guidelines recommend a minimum level of community profiling, for which a toolkit is currently being developed, and a set of measures which demonstrate reach, satisfaction, service impact and value for money (VFM).

2.4.2.2 Libraries Impact Project (LASER Foundation)

The study, conducted by PricewaterhouseCoopers and commissioned by the LASER Foundation in 2005, followed the work of seven pilot public libraries and took into account four of the Comprehensive Performance Assessment (CPA) areas set by the government (Laser Foundation, 2005). It aimed to show the public libraries' impact on learning, health, social inclusion and community cohesion. The report included both quantitative and qualitative information and developed methods for libraries to produce their own measures to demonstrate value.

In providing practical examples, the report has contributed to the setting of new measures and provided detailed guidance to libraries. It supports the development of an evidence base and further work on approaches to measuring impact.

2.4.2.3 Longitude II. A library networking impact toolkit for a user-driven environment

The aim of Longitude II is to design & produce a web-based toolkit which can be adopted by UK public libraries to evaluate the longitudinal impact of IT-based services on end users (Brophy & Craven, 2004). It will include qualitative assessment to indicate why people use services, with illustrations of usage.

2.4.3 Health libraries

The 'evidence-based' approach to libraries (Booth & Brice, 2004) has its origins in the health sector. Studies such as the Value and EVINCE projects conducted by Aberystwyth University have attempted to show the impact of libraries on patient care or clinical competence, though, as Urquhart (2004) points out, seeking such direct results is not necessarily productive. Her review of a number of impact studies in health libraries leads to some practical suggestions for conducting such studies.

In a further article based on her extensive experience of conducting impact studies, Urquhart (2005) advises on using methods which others have already tried and tested and warns against ignoring negative impacts.

2.4.4 Further education libraries

The Council for Learning Resources in Colleges (CoLRiC) (2005) has produced a set of six questions as 'Performance and Impact Indicators', aimed at either library managers or students and intended for inclusion within annual library surveys. These cover Service usage, User attitudes to satisfaction, User-service interface (quality control), Finding and using information and Integration of the LRS with curriculum delivery in the college. fforwm, the body representing the 25 further education colleges in Wales, has recently produced a quality toolkit to enable colleges to evaluate their library and ICT services (fforwm, 2006). This was based on a model already produced for Scottish FE Colleges (SLAINTE, 2003), and both use a similar tabular approach, identifying key questions with quality indicators and evidence against each.

2.4.5 School libraries

In the school library sector, a study conducted for Resource by the Robert Gordon University used a focus group and case study approach to develop a framework of learning experiences and examples of indicators (Williams & Wavell, 2001). This study demonstrated the need to look beyond the school library itself for evidence of impact.

A set of school library self-evaluation frameworks has been produced by the Department for Education and Skills (Streatfield & Markless, 2004). These have been designed to assist school librarians with assessing the quality of their service and measuring outcomes.

2.5 Examples from academic library communities in other countries

2.5.1 United States (ARL)

This information has been obtained mainly from the ARL web-site (www.arl.org). The aim has been to show what services in this area are available to ARL members.

ARL offers a suite of services in its 'Statistics and Measurement program' under the generic title 'New measures initiatives'. These have been designed to address issues of service quality, electronic resource use, and value and outcomes assessment. They include the following:

2.5.1.1 LibQUAL+™

The LibQUAL+ web-based survey is now in use in over 500 libraries in the US, UK and elsewhere. Its aims are to:

• Foster a culture of excellence in providing library service

• Help libraries better understand user perceptions of library service quality

• Collect and interpret library user feedback systematically over time

• Provide libraries with comparable assessment information from peer institutions

• Identify best practices in library service

• Enhance library staff members' analytical skills for interpreting and acting on data

The survey itself is supported by training (see section 2.2 above), and it is now becoming an established tool within the UK HE sector.

2.5.1.2 DigiQUAL™

The DigiQUAL project is based on the LibQUAL+ protocol and is being developed to assess the services provided for the user communities of the National Science, Math, Engineering and Technology Education Digital Library (NSDL) program.

2.5.1.3 MINES for Libraries™

MINES (Measuring the Impact of Networked Electronic Services) is "an online transaction-based survey that collects data on the purpose of use of electronic resources and the demographics of users." As well as its use in the US, this service is being used by the Ontario Council of University Libraries (OCUL) to assess their use of electronic services through a common portal.

Results of a survey using MINES across seven main campus libraries and seven academic health libraries in 2003-5 are described by Kyrillidou and others (2005). Participation in the survey was mandatory and controlled through access arrangements to the electronic resources.

2.5.1.4 E-metrics

The E-metrics project attempted to define data to be collected on the use and value of electronic resources. The e-measures project (see 2.3.6 above) drew on this work and had similar aims.

Associated with the e-metrics project, work has been done on developing models for measuring the impact of libraries on institutional outcomes (Fraser & McClure, 2002). Frameworks for outcome measurement are frequently used in US public libraries (Florida Dept of State, 2002; California State Library, 2003).

2.5.1.5 SAILS

SAILS (Standardized Assessment of Information Literacy Skills) is a knowledge test with multiple choice questions aimed at testing standards of information literacy among students. Over 80 institutions in the US and Canada took part in the research and development phase.

2.5.2 Australia (CAUL - Council of Australian University Librarians)

The information below has been taken from the CAUL web-site (www.caul.edu.au). The aim has been to show what help is available to Australian university librarians in measuring the impact, worth and value of their services.

CAUL has a ‘Best Practice’ programme. Among the activities it promotes are:

2.5.2.1 The CAUL Performance Indicator database

This enables comparisons of outcomes across member libraries. Members are also asked to contribute case studies detailing improvements or good practice. The three areas covered are:

A. Library/clientele congruence (i.e. satisfaction) indicator

B. Document delivery quality indicator - a new edition of the manual was produced in 2003 (CAUL, 2003)

C. Proportion of sought material obtained at time of visit (Materials availability)

2.5.2.2 Case studies

Case studies and supporting documents relating to aspects of Best Practice from member libraries are provided.


2.5.2.3 Client satisfaction and customer surveys

These surveys use either Rodski or LibQUAL+. For Rodski, there is a CAUL portal where libraries can view their own data and download survey forms. This can also be used for benchmarking.

There is also a report on ‘Top performing libraries in Rodski client satisfaction survey’ (CAUL, 2004) where libraries which have done well identify the key factors in their success and their future action plans and benchmarking activities.

2.5.2.4 Publications and guidelines

These include two related publications: Guidelines for the application of best practice in Australian university libraries (CAUL, 2000a) and a Handbook of best practice (CAUL, 2000b). Both were published in 2000 and draw heavily on existing literature and projects both in Australia and abroad.

It is interesting to note, in the surveys reported in the above publications, the variety of methods in use (e.g. balanced scorecard, TQM) in addition to the 'Best Practice' database.

Much work has also been done on digital reference services, including a literature review and the identification of key performance indicators for such services by the University Librarians in the State of New South Wales (UNISON) group (UNISON, 2006).

It should be noted that the main sections of the Best Practice web-site have not been updated since 2004, and several publications cited have not been recently updated. Anecdotal evidence suggests that these materials may not currently be widely used, but this has not been examined.

2.5.3 South Africa

Searching for direct links between students' achievement and their library use, Karin de Jager of the University of Cape Town describes a study which took the final-year students in certain subjects with the highest and lowest exam marks and compared these with their library borrowing records (de Jager, 2002). Results showed that in most subjects studied, students with high scores borrowed more books, though there were also instances of students with high marks who did not borrow books at all.

De Jager has also written on methods of performance measurement for public libraries in South Africa (de Jager & Nassimbeni, 2005).

2.6 Summary of Critical Review

This review has shown that, since the publication of the Effective Academic Library in 1995, SCONUL, through its various working groups, has introduced or adapted a number of resources which address the report's original recommendations. These resources are available through the SCONUL web-site and are also publicised in SCONUL Focus and on lis-sconul. Such resources are normally found through the relevant working group's section of the web-site and are not collectively promoted or branded as examples of 'value and impact studies'.


In addition, SCONUL has provided active support to other projects in the HE library sector, most notably the SCONUL/LIRG impact initiative and the e-measures project. There have been a number of other 'value and impact' studies which have produced toolkits and other resources of potential benefit to SCONUL members. As externally funded projects, these have generally been time-limited, with no provision for continued sustainability once the projects end.

Other UK library sectors have also recognised the importance of demonstrating the impact of their services. Approaches range from the British Library’s use of the ‘Contingent Valuation’ technique, to the detailed service standards and targets set for the public library sector. Attempts have been made in the health sector to show the impact of libraries on patient care and clinical competence and in school and FE sectors to capture the impact of libraries on learning.

Examples drawn from the United States, Australia and South Africa show how academic libraries in other countries are attempting to address the measurement of impact or outcome of their services. In the United States, the Association of Research Libraries (ARL) is leading on several important initiatives, including LibQUAL+, which has been taken up by a number of UK academic libraries with the support of SCONUL. In Australia, the Council of Australian University Librarians (CAUL) has made available to its members a number of detailed manuals and case studies to assist in the identification of good practice. The Critical Review has given examples of the material available in other countries, but has not attempted to show how successful it has been or how widely it is currently used.


3. Survey of members

A survey of SCONUL members was carried out in parallel with the critical review. Its aim was to investigate the level of current activity with regard to value and impact measurement, and to identify the key gaps in the range of available tools perceived by members.

3.1 Questionnaire

For the first stage of the survey, a short questionnaire was distributed electronically in April 2006 via the main SCONUL mailing list, with a reminder about ten days later. The timescale for replies was necessarily short, and a total of 38 institutions, representing a broad cross section of the SCONUL academic library membership, responded. A copy of the questionnaire is given in Appendix 1.

Of the 38 institutions responding,

• 26 had undertaken some value and impact measures.

• Six institutions had participated in the LIRG/SCONUL Impact Initiative (2 in Phase 1 and 4 in Phase 2).

• Only eight institutions were required to undertake measurements by their parent institution, and of these only three specified data and three specified methodology (never both). However, there was some discrepancy between Question 1 (does your institution require you to undertake value and impact measurement? – 8 'Yes' responses) and Question 7 (reasons for undertaking measures – 6 responses to 'required by institution').

Table 1 shows the reasons given for undertaking value and impact measures.

Table 1 Reasons given for undertaking value and impact measures

Reason for measures                     Institutions/number (total respondents = 26)
Advocacy in institution                 19
Improve services                        20
Comparison with other institutions      17
Required by institution                  8
Other                                   10

Most libraries gave more than one response.

Other reasons included response to external drivers (for example HEFCE ‘Value for Money’) or to provide evidence for library management decisions, for example collection management.


Respondents were asked who, in the library service, was responsible for instigating value and impact measures. Figure 1 illustrates the results; several libraries which had not undertaken any measures to date also responded.

Figure 1 Person responsible for instigating value and impact measures in the library

Head of service     21
Deputy               3
Management team      5
Other                5

Other roles mentioned were those responsible for development, planning and/or resources. Number of respondents = 34.

Libraries were aiming to measure a variety of aspects of the service, which can be roughly categorised as shown in Table 2. Categories were assigned by the researcher and responses allocated to the ‘best-fit’ category.

Table 2 Purpose of value and impact measures undertaken by 26 institutions

Purpose of value and impact measure exercise                                          Institutions/number (total respondents = 26)
Customer satisfaction/service evaluation                                              14
Benchmarking/comparison with other institutions                                        8
Staffing/operational issues                                                            4
Use of service (e.g. by a particular category of user, or of a particular service)     3
Impact of library overall, or of specific aspects of service                           3
e-resources                                                                            2
Library service evaluation specifically related to institutional strategies            3

Some projects fitted more than one category.

The frameworks used reflect these purposes, with survey packages such as LibQUAL+ mentioned by eight institutions. Such packages were seen as useful in providing objective measures, allowing benchmarking, and prioritising issues (Libra). However, respondents noted a lack of flexibility, particularly in relation to local issues, as a disadvantage of these packages. It is therefore not surprising that many libraries devised their own research tools (nine institutions). A summary of the tools and frameworks specifically mentioned is given in Table 3.


Table 3 Frameworks/tools used by the 26 libraries which had undertaken measurement projects

Frameworks/tools used                                   Institutions/number
In-house methodology                                    10
Survey packages e.g. LibQUAL+, Libra, SCONUL             8
Internal data                                            5
External statistics e.g. SCONUL ALS                      5
Business planning methods (e.g. Balanced Scorecard)      3
Consultants                                              2
Data shared with other institutions                      2

Some libraries had done more than one exercise and/or used a variety of tools.

Most respondents reported some success with their value and impact measures, but did not elaborate. Six respondents thought the exercise valuable for advocacy. Two institutions mentioned that the methodology used had now become part of normal working practice.

The problems reported with the methods used and with the overall process could be categorised as shown in Table 4.

Table 4 Problems reported by the 26 libraries which had undertaken measurement exercises

Problems reported                         Institutions/number
Time                                       9
Poor response rates                        3
Reliance on external data                  3
Lack of comparator institutions            3
Lack of robust methodology                 3
Lack of staff knowledge/skills             2
Stakeholder buy-in                         3
Lack of flexibility in packages            2
Quantitative basis ('cost not value')      1
Lack of measures for e-resource use        1
Cost                                       1

Some reported more than one problem.

The problems can be divided into library-related problems (e.g. lack of time, lack of knowledge and skills) and problems related to the tools themselves (such as a lack of suitable methodologies and reliance on external data). The problems listed above are based on the experience of carrying out measures. However, similar issues arise from Question 14 (what do you perceive as barriers to the use of such tools/frameworks?), which also includes responses from libraries which have not carried out any measurements (Table 5).


Table 5 Perceived barriers to value and impact measurement

Perceived barriers                            Measurements undertaken       No measurements undertaken
                                              (no. of responses, N = 24)    (no. of responses, N = 10)
Lack of appropriate methodology/statistics    13                            7
Lack of resources (time/cost)                 12                            6
Skills and knowledge                           7                            3
Stakeholder buy-in                             7                            4

Some respondents noted more than one barrier.

Figure 2 illustrates the percentage of respondents in each category (those who had carried out measures and those who had not) citing specific barriers.

Figure 2 Percentage of respondents perceiving specific barriers (% of responses in each group)

Barrier                 Measures    No previous measures
method                     54%              70%
resources                  50%              60%
skills                     29%              30%
stakeholder buy-in         29%              40%
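The percentages shown in Figure 2 follow directly from the counts in Table 5, each count divided by the number of respondents in its group; the short sketch below reproduces that calculation for illustration.

```python
# Derivation of the Figure 2 percentages from the Table 5 counts
# (24 respondents had undertaken measures, 10 had not).

table5_counts = {
    "Lack of appropriate methodology/statistics": (13, 7),
    "Lack of resources (time/cost)": (12, 6),
    "Skills and knowledge": (7, 3),
    "Stakeholder buy-in": (7, 4),
}
n_measures, n_no_measures = 24, 10

for barrier, (with_measures, without_measures) in table5_counts.items():
    pct_with = 100 * with_measures / n_measures
    pct_without = 100 * without_measures / n_no_measures
    print(f"{barrier}: {pct_with:.0f}% (measures) vs {pct_without:.0f}% (no previous measures)")
```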

Lack of appropriate methodology covered several issues, in particular the need for tools to be flexible enough to allow adaptation to local needs while still allowing comparison between institutions. Several specialist institutions mentioned the lack of comparator institutions and/or the lack of relevance of national statistics as a barrier.

Time-related issues included time to plan and carry out measures, but also time to learn a methodology. This is related to the issues of skills and knowledge, which covered both management knowledge to select tools and staff knowledge to apply them. These issues emerged in comments about appropriate methodology, e.g. that it should be 'accessible' and 'easy to use'. The questionnaire covered the current level of knowledge of performance measurement tools; responses to this represent senior library management knowledge. Figure 3 illustrates the level of knowledge reported by the 38 respondents.

Value and Impact Measurement Programme 21

Page 24: Value and Impact Measurement Programme · 2014-11-21 · LibQUAL+ has been popular with most SCONUL libraries who have participated. It has several advantages, including ease of analysis

Figure 3 Level of knowledge of performance measurement tools/frameworks

[Chart of the 38 responses: none 4, a little 20, moderate 12, extensive 2.]

Stakeholder buy-in was mentioned as having been a problem in previous measurement exercises and was also seen as a perceived barrier. This included engaging staff in the process, maximising user participation in the exercise and also stimulating interest in the results. Problems (actual and perceived) within the wider institution included lack of understanding of results and doubts about the ability of such measurements to influence decisions in the library's favour.

Although most respondents listed barriers, all respondents were keen to undertake measures, and Table 6 summarises the areas libraries would like to measure:

Table 6 Areas of service libraries would like to measure

Areas to measure | Number of institutions

Staffing/operational issues 16

Impact of library: general and specific services 12

e-resources 11

Impact of library related to institutional strategies 5

Cost issues 5

Customer satisfaction 5

Use of library: general and specific services 2

All (38) respondents completed this question and several gave more than one area of interest

Compared with the measures previously undertaken, the desired future measures concentrate more on the impact of the library (12 institutions, compared with 3 that had measured impact to date) and show more awareness of the library's relation to institutional strategies (5 responses compared with 3). Staffing and operational issues are of particular concern. Areas to investigate included use of subject librarians, technical services, shelving, the enquiry service and the introduction of self-service.

E-resources are given as a separate category because they were mentioned in several of the answers. Lack of measures for e-resources was seen as a barrier, but the growth of these resources impinges on many other issues. Some of the aspects mentioned by respondents were uptake of e-resources, accessibility of e-resources, cost of e-resources vs print, value of e-resources to research, impact of e-resources, and comparison of e-resource use within institutions and nationally.

Respondents were asked to cite examples of good practice. Several of those involved in the LIRG/SCONUL Impact Initiative thought this represented good practice which should be more widely disseminated; it was also mentioned as an 'ideal method'. Other methodologies developed specifically for libraries and considered good practice or an ideal methodology included Evalued (www.evalued.uce.ac.uk), the Value and Evince studies (Urquhart and Hepworth 1995; Davies et al 1997) and the British Library 'Measuring our Value' exercise (British Library 2005). Business planning tools such as the Balanced Scorecard were also suggested by some respondents. However, several respondents admitted having no suggestions or stated that no ideal methodology existed.

3.2 Interviews

Twelve respondents were approached to provide follow-up interviews. These were selected from volunteers to give a range of institutions and levels of involvement in value and impact measures. However, given the short timescale of the project, only ten respondents were able to take part in the interview stage.

The aim of the interviews was to investigate in more depth some of the issues raised in the questionnaire. A copy of the interview schedule is given in Appendix 2.

3.2.1 Understanding of the term 'value and impact measures'

To start the interview, all respondents were asked what they understood by the term 'value and impact measures'. This was thought to be a crucial point, both to put the interview into context and to help explain the range of measures being undertaken, as revealed by the questionnaire survey.

Although all respondents readily provided some definition, one respondent noted that she believed there was not yet a 'sectoral understanding' of value and impact measures. One respondent noted that she had not come across this as a phrase – linking the two terms – before. This seems to be a common view, with respondents either defining the two terms separately, or taking only one of the terms. However, in all cases similar elements emerged as important:

• The need for evidence: eight respondents mentioned the need to demonstrate that the library was providing value for money and/or making a difference to users

• Linking measures and the outcomes from measures to the University’s mission or strategy: three respondents mentioned value and/or impact in terms of the University’s mission, and a further four mentioned the importance of demonstrating the library’s worth to the University. ‘How satisfactorily the library is performing in the context of the University mission – are we highly valued in our community’.

• Different stakeholders' interests: four respondents noted that different measures may be of interest to different stakeholders. 'The people for whom there is a vested interest in those two things [value for money and impact on users] are slightly different sets of stakeholders'

• Quantitative measures: value in terms of cost or other quantitative measures was mentioned by six respondents. These were considered important to demonstrate value for money and for comparison with other institutions. However, all respondents stressed the need to go beyond this. 'it's different to simple cost-benefit analysis – the value you add and extent to which you can show you have added it'.

In summary, the understanding of what value and impact measures should achieve did seem to go beyond the measures currently being undertaken. There is some evidence that this interpretation may have affected the results of the questionnaire survey. For example, both of the interviewees selected because they reported not having undertaken any value and impact measures had, in fact, done so: one institution did benchmarking using SCONUL statistics and undertook an annual student survey using the SCONUL survey, and one institution had run focus groups to investigate issues raised by external assessors.

3.2.2 Institutional requirements and drivers

The institutional requirements for value and impact measures were of particular interest. The questionnaire responses suggested that there was little institutional requirement for libraries to undertake value and impact measures. However, there was some discrepancy in the results, and the interviews provided an opportunity to investigate this issue further. Four respondents undertook measures in response to University requirements. Of these, two institutions had undertaken measurements as part of an institution-wide review with external drivers, but in both cases had gone beyond the basic requirements:

• One respondent completed a self assessment report on the library service as part of the 5-year Quality Review of the University (an Irish University requirement); the library also did a benchmarking exercise that went beyond the requirements of the review.

• One library extended the investigation undertaken by external consultants as part of the HEFCE Value for Money requirements

Two had responded to more implicit requests to provide evidence as part of overall institution management:

• One library was ‘invited’ to provide evidence of performance for annual budget review

• One library stated that it was ‘not a written requirement, but was implicit’

All (six) of the remaining respondents had done some measures. Of these:

• Three mentioned reporting results to University management and all of these had received positive outcomes, in terms of recognition and/or additional investment

• Two mentioned one of the reasons for undertaking measures was in preparation for questions from University management

Overall, six respondents stressed that undertaking measures was an essential part of library management, whether required by the institution or not. In all cases, measures had been well accepted by the institution and led in some cases to positive outcomes, such as increased investment.

‘it was instrumental in getting additional investment’

‘where you can demonstrate [value for money and impact] it impresses’

This is contrary to the views expressed in some of the questionnaire responses, which suggested that library measures were not understood and/or doubted that such measures would have any positive outcome for the library.

Respondents were asked if they thought that the institutional requirements for value and impact measures would increase in the future. Responses were influenced by current institutional culture: three interviewees were uncertain whether there would be any firm requirements for value and impact measures in the short term. Of these, two respondents noted that it was not the University culture, but that external influences could change this focus. The remaining seven respondents thought that there would be increased requirements for value and impact measures. Some of the drivers were internal, such as an institutional culture of transparent accounting and the extension of current measures from value for money to impact on overall university strategy and mission, in particular impact on research output. External drivers included increased competitiveness and raised expectations linked with the National Student Survey.

3.2.3 Barriers to undertaking value and impact measures

The questionnaire survey had highlighted common barriers to undertaking value and impact measures. These issues were explored further in the interviews.

3.2.3.1 Lack of appropriate methodology/tools

This was the most frequently mentioned barrier in the questionnaire survey. Interviewees reported mixed experiences with tools.

Methodologies used to date included:

• Internal and external statistics, e.g. SCONUL statistics. These are seen as useful for benchmarking and for PR – especially if the library compares well with its comparator institutions. The problem of finding comparator institutions was mentioned by two respondents. Quantitative data may reveal issues requiring more in-depth studies of a part of the library service.

• ‘Off the shelf’ survey tools which may or may not be adapted to local use. Examples mentioned were:

- Websurveyor software for designing web questionnaires and presenting results.

- LibQUAL+: one respondent had been involved in a previous LibQUAL+ survey, and noted the advantages as highlighting user expectations and the facility to benchmark. Two further respondents expressed an interest in future surveys for these reasons.

- Libra (Priority Research): one respondent had used this, but the survey was sub-contracted; another respondent had rejected this as ‘inflexible’

• In-house surveys. These aimed to gain more qualitative data about a particular aspect of the service, and may be used in addition to quantitative measures. For example, one institution held focus groups to investigate in more depth an issue raised by external consultants, and another institution held interviews to evaluate a new student facility. However, one institution had an ongoing programme of measures throughout the service, undertaken by staff at a local level.

• Three of the interviewees had been involved in the LIRG/SCONUL impact initiative and a further two respondents commented on the project and its outcomes. The initiative was viewed as interesting, but the methodology was considered quite complex and time-consuming – as one respondent put it ‘it is not an instant toolkit you can plug into’. Another respondent considered that the initiative revealed the lack of familiarity with robust methodologies within the library sector. However, another respondent noted the usefulness of the approach for linking measures to overall strategy, which was often overlooked in more quantitative methods. Lack of robustness, and inability to provide firm evidence were considered the main problems with in-house, qualitative methods, well summarised by one respondent ‘it is very difficult to get a methodology that wouldn’t be quickly rubbished by someone who wanted to do that’.

• External consultants. Five respondents have used external consultants to some extent:

- In three institutions consultants reviewed the library service as part of an institution-wide review

- One institution had commissioned a consultant to undertake a benchmarking exercise as a starting point for future measures by the library.

- One institution commissioned a survey of staff, and also participated in an annual student survey undertaken by external consultants.

In all cases, the library either had input to the process (e.g. by contributing to a questionnaire) or added to the work done by the external reviewers. Only two respondents mentioned particular benefits in outsourcing surveys: one considered that it reduced time spent on analysis, but noted that the library must always have input to any questionnaire; the other welcomed the statistical expertise of external consultants, but stressed that results must be well presented to aid understanding by those with limited statistical knowledge.

All respondents intended to undertake some value and impact related measures in the future, and expressed interest in having a set of tools to select from. However, several respondents doubted that the ‘ideal’ tool existed and problems highlighted included:

• Difficulty in proving library impact in an HE environment, for example isolating library impacts from other institutional inputs; some areas of interest to HE libraries, such as impact on research output, are also inherently difficult to measure.

• Providing evidence beyond the anecdotal – robustness

• Flexibility to consider local needs as well as comparisons

• Lack of comparator institutions

• Reconciling the need for good time series data with the need to change questions

Additional requirements for tools and methodologies were:

• Ease of use

• Credibility

• Go beyond cost efficiency

The advantages of having a ‘toolkit’ for HE library value and impact measures included time saved on investigating methodologies, and involvement in a ‘community of people doing similar work’.

3.2.3.2 Staff involvement

Staff skills and resistance were mentioned as barriers in the questionnaire survey; however, the interviews suggest this was not a notable issue.

• Three respondents noted that measures were undertaken following a push by staff. For example, involvement of one institution in the LIRG/SCONUL impact initiative had been driven by the enthusiasm of certain members of staff; another institution mentioned that senior managers were constantly alert to 'pick up' likely issues

• Two respondents noted that measures involved all staff, and it was a particular policy to do so. For example, in one case measurements are undertaken at a local level by relevant staff.

• The remainder (five) saw the measures currently undertaken as a management responsibility – usually involving the head of service and/or senior management. This was due to the type of measurement, e.g. use of SCONUL statistics, or the number of staff in the institution.

3.2.3.3 Time

The time issues raised in the questionnaire survey included management time to select tools, and time to carry out and analyse surveys. This did not emerge as a major issue in the interviews, but respondents did raise the following points:

• Two interviewees mentioned lack of staff time as an issue, one was a small library and the other had very recently had considerable staff cuts. However, both institutions had undertaken measurements and intended to do more in future.

• One respondent stressed the importance of repeating measures to evaluate changes – so time investment was ongoing.

• Two institutions noted that undertaking measures was now so embedded that time was no longer an issue – it was part of the work. However, considerable time was required for the initial push.

'the initial stage and doing it the first time is extremely time consuming, but we build on this every year'

Overall, respondents recognised that there was a time commitment but they considered it essential to undertake some sort of measures. One respondent noted that more accessible tools may reduce some of the ‘thinking time’.

3.2.4 Advantages of undertaking value and impact measures

All of the interviewees reported some level of success with the value and impact measures they had undertaken. These ranged from improving library services to obtaining increased funding.

3.3 Summary of survey of members

The questionnaire survey found that 68% of respondents had carried out value and impact measures. However, the interviews suggest that the true proportion may be higher. The problem with assessing the level of current activity may be associated with the understanding of the term 'value and impact measures'. This was not asked about in the questionnaire survey, but some interviewees expressed doubt about a sector-wide understanding. However, there was some consensus, particularly on the need to demonstrate value and impact, and that such measures should go beyond quantitative 'cost-effectiveness' type measures. In particular, linking library activity to the institution's mission was seen as an essential part of measures.

The questionnaire survey also tended to underestimate the institutional requirements for measures, albeit often implicit ones; some questionnaire respondents also doubted the benefit of measures for the library. However, all interviewees had received recognition for the measures they had done, particularly where these were not requested. In several institutions, the library was seen as 'ahead of the game'. Those institutions that were not required to undertake measures did so in preparation for questions from University/College management. This reflected the view of interviewees that institutional requirements for value and impact measures would increase in the future.

Staff buy-in and time pressures were noted as barriers in the questionnaire survey. However, these were not considered notable issues by interview respondents. In several cases staff were keen to be involved in measures. Time was recognised as a barrier, but the time required decreased somewhat after the initial round of measures. Both of these issues can be linked to the availability of suitable methodologies and tools.

Lack of suitable tools and methodologies emerged as the most significant barrier, both in the questionnaire survey and in the interviews. The main difficulty was the large range of methodologies available, which required time and expertise to select from; once selected, these required adaptation to the library's needs. More generally, the impact of the library within HE institutions was considered a difficult, if not impossible, factor to measure. However, all of the libraries had active plans for future measures, and welcomed the availability of a toolkit to select from.

4. Synthesis and recommendations

The critical review and survey of members were undertaken in parallel, so this section seeks to provide a synthesis of the findings from the two areas, and to identify any gaps in provision where further work is required. A set of recommendations has been made for future development of value and impact measurement in academic and research libraries.

The survey of members uncovered an apparent lack of understanding of the term 'value and impact', and other terms such as 'outcome' were used in the evidence uncovered in the critical review. In her talk to the 2005 IFLA conference, Poll (2005) considers the terms 'impact' and 'outcome' interchangeable and sees the definition of impact or outcome in the Encyclopedia of Library and Information Science as still valid:

“Outcomes can be seen as the eventual result of using library services, the influence the use had, and its significance to the user” (Revill, 1990)

Markless and Streatfield (2006) prefer the more generic term ‘impact’ which they define as: “any effect of the service (or an event or initiative) on an individual or group”

to the term ‘outcomes’ which they suggest may be confused with ‘outputs’.

The current project team feel that there is a useful distinction to be drawn between the effect the library service has on its institutional stakeholders, in particular in the areas of the institution’s mission, aims and objectives, and the effects which it has on its user stakeholders, with regard to their experience of the library and its value to their work. The following definitions are therefore offered for use in this context:

Value The worth of the library to its institution in terms of the value for money provided and cost-effectiveness of its operations, in the context of the quality of its services

Impact The effect which the library has on its stakeholders, including:

Impact on the institution: the significance of its contribution to the institution’s mission, aims and objectives;

Impact on users: the importance of the library to its users

Outcome The consequences for individual users of having used the library service. This is a particularly difficult concept to measure, encompassing effects ranging from enhancing academic success through increasing the quality of academic research to improving the quality of life of library users.

4.1 Synthesis of results

The Critical Review identified a wide range of projects and available toolkits, covering several aspects of value and impact measurement. Although survey respondents were aware of some of these tools, few had apparently been used and several were not mentioned at all. Table 7 compares the tools identified in the Critical Review with those mentioned in the survey.

Table 7 Tools identified in the Critical Review and those used or cited by survey respondents

Tools/projects identified in Critical Review | Tools used (no. of respondents, n=26) | Mentioned but not used (no. of respondents)+

SCONUL Annual Statistics/HELMS 4 1

SCONUL User Survey Templates 2 2

LibQUAL+ 4

EQUINOX

JUBILEE

JUSTEIS

Evalued 1

e-measures 1

Benchmarking Activity (including SCONUL Framework) 5

SCONUL/LIRG Impact Initiative 6 4

Outcomes

SCONUL QAA Guidelines

WGIL Resources

British Library – Contingent Valuation 1

Public Library examples, including PLSS & Longitude II

Health Libraries (Evince etc) 2

USA examples (other than LibQUAL+)

Australia examples 1

+ Based on examples of good practice and ideal methods; some respondents cited tools they had used.

Not all respondents answered this question.

This table includes examples of tools and resources available to librarians in other UK library sectors and in academic library organisations in other countries. No attempt has been made to examine how widely these resources are being used or how successful they have been outside the UK academic library sector, though in some cases there is evidence that material has not been kept up-to-date.

The survey demonstrated that a wide variety of tools and methods are being used by academic libraries to measure their performance. In some cases, however, respondents did not consider the measures they were using to be 'value and impact measures'. There appears to be a lack of understanding of the term 'value and impact', which could be overcome by clearer dissemination of good practice within the academic library community. By demonstrating how the measures already in use can be used to convince senior institutional stakeholders of the value and worth of the library, the concept of 'value and impact' may become more widely understood and used.

Relatively few of the available tools had been used by respondents, although further tools were mentioned, for example as being 'good practice'. Note that the survey did not provide a list of available tools to prompt respondents, but collected data via an open-ended question. If respondents had been prompted with a list of available tools, higher levels of use might have been recorded; however, the open-ended question gives results based on the most recently used tools and those seen as most useful and relevant. This suggests there may be a need to promote the tools available and show how they can be used to provide evidence of value and impact.

4.2 Gap analysis

The gaps identified are summarised below:

• The critical review and survey results suggest that there is not a shortage of tools, but that it is difficult to locate the right one for the job. This may be due in part to a lack of knowledge of the tools available.

• Related to this is a lack of confidence in the available tools. This includes confidence of staff to use the tools (highlighting training needs) and confidence in the quality and ‘branding’ of instruments. Lack of robustness of methods was a concern, particularly for more qualitative measures.

• The survey revealed a need for instruments which could be adapted to incorporate local issues, but still allow comparison between institutions. However, some specialist institutions had problems identifying comparators, so guidelines are required in this area.

• There is a lack of easy-to-use tools. This includes tools being readily identifiable (see above), as well as ease of use itself.

• Although only one survey respondent mentioned The Effective Academic Library, it was clear that many of the measures being undertaken followed the framework outlined in the report. Therefore, new performance indicators do not seem to be required. The problems lie in taking measures beyond this – measuring the impact of the library.

• There has been much work on the impact of various initiatives and developments on the library, but less to measure the impact of the library, and demonstrate the value of the library in the light of institutional aims. This was recognised by most respondents in the survey, but the problem of how to do this remained.

• As noted above, libraries may be undertaking measurements but not using the outcomes as evidence of the library’s value to the institution. This suggests a lack of promotion and dissemination of best practice in getting the most out of work already being done, as well as developing new measures.

4.3 Recommendations

The four recommendations arising from the critical review and the survey fall into two distinct categories. The first two relate to ways of ensuring the take-up and sustainability of both new and existing tools and resources, and the second two address areas where gaps in provision have been noted.

Recommendation 1 Establishing a 'community of practice'

The evidence of the Impact project in particular shows that an important element in the success of any initiative is the establishment of a 'community of practice' where those involved can share knowledge and experience and disseminate examples of good practice.

It is recommended that WGPI establish such a community to share information on the work currently being undertaken. This would also promote the sustainability which has been identified as a missing element in many projects and assist in providing support and training.

Recommendation 2 Review of existing tools; provision of support and ensuring sustainability

The Critical Review and survey suggest there is not a great shortage of tools, but clarification of what tools are available for what purpose is needed. Therefore, the following three-part strategy is recommended:

2.1 An in-depth assessment of the value of the existing tools

Work should be commissioned to report in more depth on the toolkits currently available and their relevance to academic libraries. This should include:

• Categorising the tools according to their application to different types of measure

• Mapping the current use of tools – if any are not being used, identify why not; an example of note is EQUINOX

2.2 Promotion, branding and sustainability of existing tools

• Add branding and publicise relevant and potentially useful tools. This is an important role for SCONUL, possibly via the WGPI, as tools currently provided and promoted via SCONUL are well used

• Design a mechanism to promote ownership of methodologies: keeping tools up to date when the initial project ends, leading to ownership in the profession and providing an exit strategy for individual research projects

2.3 Providing training on existing processes and methods

• The gap analysis has shown that staff may lack confidence in using the tools that are available. Training and support are therefore seen as a vital element

• Such training and support should also make provision for adapting existing tools to local practice

Recommendation 3 Value and Impact: a study of Outcomes

Rather than a toolkit, we recommend that a project be commissioned to develop a set of guidelines to address the following:

• Value of the library to the institution, including value for money studies and economic impact

• Impact of the library on learning and teaching

• Impact of the library on research

The aim would be to provide senior library managers with a set of resources to assist in demonstrating the value and impact of the service. This might be achieved by a collection of examples drawn from academic libraries and elsewhere. It might also give an indication of those areas or ‘outcomes’ as defined in section 4, where it is thought impossible to demonstrate direct value and impact in any quantitative way. Satisfaction surveys could provide models for these less tangible areas.

It is not thought that a simple tabulated toolkit approach is appropriate at this level.

Recommendation 4 Staffing and operational issues

Staffing and operational issues, in particular optimal use of staff in times of change, emerged as an important area for investigation from the survey, but respondents noted a gap in appropriate tools.

A project should be commissioned to develop useful, flexible and sustainable methods to monitor and enhance performance in relation to staffing and operational issues. This should include:

• A toolkit which libraries could use to assign unit costs to various process activities, building on existing resources (an illustrative sketch of the unit-costing idea follows this list)

• A set of guidelines and examples of best practice drawn from academic libraries and elsewhere to support senior library managers in promoting change management and assessing the contribution of individual roles and structures.
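To make the unit-costing idea in the first bullet above concrete, the sketch below shows one possible calculation only, not a prescribed method: the activity names, volumes and staff costs are hypothetical, invented purely for illustration.

```python
# Illustrative sketch only: one way a unit-costing toolkit might derive unit costs
# for library process activities. All figures and activity names are hypothetical.

activities = [
    # (activity, staff hours per year, hourly staff cost in GBP, transactions per year)
    ("shelving",            1200, 12.50, 90000),
    ("enquiry desk",        2500, 16.00, 30000),
    ("inter-library loans",  800, 16.00,  4000),
]

for name, hours, hourly_cost, transactions in activities:
    annual_cost = hours * hourly_cost        # total staff cost of the activity
    unit_cost = annual_cost / transactions   # cost per transaction or item handled
    print(f"{name}: £{annual_cost:,.0f} per year, £{unit_cost:.2f} per transaction")
```

A real toolkit would also need to apportion overheads and non-staff costs, which this sketch deliberately ignores.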

5. References

Booth, Andrew and Brice, C, eds. (2004), Evidence-based practice for information professionals: a handbook. Facet.

British Library (2005), Measuring our value: results of an independent economic impact study commissioned by the British Library to measure the Library’s direct and indirect value to the UK economy. 2005. Available at http://www.bl.uk/pdf/measuring.pdf

Brophy, Peter (2001), Electronic library performance indicators: the EQUINOX project. Serials 14(1), 2001, pp5-9

Brophy, Peter and Craven, Jenny (2004), Longitude II: a library networking impact toolkit for a user-driven environment. MLA, 2004

California State Library (2003), Outcomes measurement basics. Available at http://www.library.ca.gov/assets/acrobat/lsta/OutcomesMeasurementBasics.pdf

CAUL (2000a), Guidelines for the application of best practice in Australian university libraries. Available at http://www.dest.gov.au/archive/highered/eippubs/eip00_11/00_11.pdf

CAUL (2000b), Handbook of best practice. Available at http://www.dest.gov.au/archive/highered/eippubs/eip00_10/00_10.pdf

CAUL (2003), Performance indicator manual. Available at http://www.caul.edu.au/best-practice/DocumentDeliveryPerformanceIndicatorManual.doc

CAUL (2004), Top performing libraries in Rodski client satisfaction survey. Available at http://www.caul.edu.au/surveys/library-performance2004.doc

Conyers, Angela (2004), E-measures: developing statistical measures for electronic information services. Vine, 34(4), 2004. pp 148-153.

Conyers, Angela (2005), E-resources in SCONUL member libraries: what the statistics tell us. SCONUL Focus, 36 (winter 2005), pp65-7. Available at http://www.sconul.ac.uk/pubs_stats/newsletter/36/22.pdf

Coulson, Graham and Banwell, Linda (2004), The impact of the JUBILEE toolkit in institutions. Vine, 34 (4), 2004. pp 154-160.

Council for Learning Resources in Colleges (CoLRiC) (2005), Performance and impact indicators. Available at http://www.colric.org.uk

Creaser, Claire (2005), Benchmarking the standard SCONUL user survey: report of a pilot study. SCONUL Focus, 34, 2005, pp 61-65

Davies, Rebecca, Christine J Urquhart, Jill Smith, Catherine Massiter and John B Hepworth (1997), Establishing the value of information to nursing continuing education: report of the evince project, British Library RIC Report 44

de Jager, Karin (2002), Impacts and outcomes: searching for the most elusive indicators of academic library performance. Proceedings of the 4th Northumbria International Conference on Performance Measurement in Libraries and Information Services, pp. 291-297. Washington DC:ARL

de Jager, Karin and Nassimbeni, Mary (2005), Towards measuring the performance of public libraries in South Africa, South African Journal of Libraries and Information Science 71(1) pp 39-50

Dept for Culture, Media & Sport (2004), Public Library Service Standards. Available at http://www.culture.gov.uk/global/publications/archive_2004/library_standards.htm

Fforwm (2006), Services supporting learning in Wales: a quality toolkit for evaluating learning resource services in further education colleges. Available at http://www.fforwm.ac.uk/index/work/publications.html

Florida Dept of State, Division of Library and Information Services (2001). Workbook: Outcomes measurement. Available at http://dlis.dos.state.fl.us/bld/Research_Office/OutcomeEvalWkbk.doc

Fraser, Bruce T. and McClure, Charles R. (2002), Toward a framework of library and institutional outcomes. Information use management and policy institute, Florida State University. Available at http://www.arl.org/stats/newmeas/emetrics/phase3/ARL.Emetrics.Outcomes.Paper.Final.Jan.8.02.pdf

Higher Education Funding Councils (1993), Joint Funding Councils Libraries Review Group: Report (The Follett Report), Bristol: Higher Education Funding Councils

Higher Education Funding Councils (1995), The Effective academic library: a framework for evaluating the performance of UK academic libraries. A consultative report to the HEFCE, SHEFC, HEFCW and DENI by the Joint Funding Councils’ Ad-hoc group on performance indicators for libraries.

Kyrillidou, Martha and others (2005), Assessing the value of networked electronic services: the MINES survey. Available at : http://www.libqual.org/documents/admin/MINES%20Panel%20ACRL.ppt

Laser Foundation (2005), Libraries impact project. July, 2005.

Library & Information Research, (2005) 29 (91), Spring 2005.

Markless, Sharon and Streatfield, David (2005), Gathering and applying evidence of impact of UK university libraries on student learning and research. International journal of information management, 26 (2005), pp 3-15

Markless, Sharon and Streatfield, David (2006), Evaluating the impact of your library. Facet.

McNicol, Sarah (2004a), Academic libraries: planning, outcomes and communication. Evidence base. Available at http://www.ebase.uce.ac.uk/docs/Outcomes_project_report.pdf

McNicol, Sarah (2004b), The eVALUEd toolkit: a framework for the qualitative evaluation of electronic information services. Vine, 34(4), pp 172-175

Morrow, John (2005), Newcastle University Library and the Charter Mark. SCONUL Focus 35, pp17-19. Available at http://www.sconul.ac.uk/pubs_stats/newsletter/35/6.pdf

Museums, Libraries and Archives Council (2005), Developing performance indicators for local authority museums, libraries and archives. MLA, 2005

Payne, Philip and Conyers, Angela (2005), Measuring the impact of higher education libraries: the LIRG/SCONUL impact implementation initiative. Library and Information research, 29 (91), Spring 2005, pp 3-9

Peters, Janet (2004), Learning outcomes and information literacy. SCONUL. Available at http://www.sconul.ac.uk/activities/inf_lit/papers/outcomes.pdf

Poll, Roswitha (2006). Bibliography “Impact and outcome of libraries”. Available at http://www.ulb.uni-muenster.de/projekte/outcome/downloads/bibliography-impact+outcome.pdf

Revill, Don (1990), Performance measures for academic libraries, Encyclopedia of Library and Information Science, 45 (10), p 316

SCONUL, Annual Library Statistics, London: SCONUL

SCONUL, UK Higher Education Library Management Statistics, London: SCONUL

SLAINTE (2003), Library service development toolkit. Available at http://www.slainte.org.uk/files/pdf/fenet/toolkit03.pdf

Streatfield, D.R. and Markless, S. (2004), Improve your library: a self-evaluation process for primary schools. Dept for Education and Skills. Available at http://www.teachernet.gov.uk/teachingandlearning/resourcematerials/schoollibraries/

Thebridge, Stella and Hartland-Fox, R. (2002), Towards a toolkit for evaluating electronic information services. SCONUL Newsletter, 2002, pp 37-43.

Town, J. Stephen (ed) (2000), The SCONUL benchmarking manual. London: SCONUL

Town, J. Stephen and Lock, Selena A. (2005), LibQual+ in the UK and Ireland: three years' findings and experience. Available at http://www.libqual.org/documents/admin/UK_&_Ireland_paper_final.doc

UNISON (University Librarians in the State of New South Wales) (2006), Digital reference key performance indicators. Project report. Available at http://www.caul.edu.au/best-practice/digitalreference2006.doc

Urquhart, Christine (2004), How do I measure the impact of my service? (Guideline) in Booth, Andrew and Brice, C, eds. Evidence-based practice for information professionals: a handbook, pp 210-222

Urquhart, Christine (2005), Assessing impact: let us count the ways? Library and information update, 2005. Available at http://www.cilip.org.uk/publications/updatemagazine/archive/archive2005/december/urquhartdecupdate.htm

Urquhart, Christine and others (2003), Uptake and use of electronic information services: trends in UK higher education from the JUSTEIS project. Program 37 (3), 2003, pp 168-180

Urquhart, Christine J. and John B Hepworth (1995), The value to clinical decision making of information supplied by NHS Library and Information Services, British Library R&D Report 6205

West, Christopher (2004), User surveys in UK and Irish HE Libraries. Available at http://www.sconul.ac.uk/activities/performance/papers/SCONULSurveys2003.doc

Williams, Dorothy and Wavell, Caroline (2001). The impact of the school library resource centre on learning. Robert Gordon University. (Library and Information Commission research report, 112). Available at http://www.rgu.ac.uk/abs/research/page.cfm?pge=6924

APPENDIX 1 - QUESTIONNAIRE

SCONUL – VALUE ADDED AND IMPACT MEASUREMENT PROGRAMME
Survey of Members

Please provide as much detail as you wish – boxes will expand

A. USE OF VALUE FOR MONEY AND IMPACT MEASURES WITHIN YOUR INSTITUTION

1. Does your institution require you to undertake value and impact measurement? Please tick as applicable.
No
Yes (please give details below)
Who in your institution specifies such requirements? (Please give job title and department)

2. Does your institution specify any requirements for such measures where used?

Please tick as applicable.
No (please go to Question 3)
Yes: specifies type of data required. (Please give details)

Yes: specifies frameworks or tools to use. (Please give details)

3. Were you required to use specific frameworks in the past, but are no longer required to do so? Please tick as applicable.
No
Yes (Please give details)

B. CURRENT USE OF VALUE FOR MONEY AND IMPACT MEASURES WITHIN ACADEMIC LIBRARIES

4. Who is responsible for instigating value and impact measurement for the library service? (Please give job title)

5. Did you participate in the LIRG/SCONUL Impact Initiative? Please tick as applicable

Yes: Phase 1 ( ) Yes: Phase 2 ( ) No ( )

6. Has your library undertaken any other value and impact measurements?

No ( ) Please go to Question 12 Yes ( )

7. What were the reason(s) for this exercise? Please tick all that apply.
Required by Institution
To compare our service to that provided by other libraries
For advocacy within the institution
To improve services
Other (please give further details)

8. What were you aiming to measure?

9. What value and impact measurement frameworks or tools did you use?

10. How successful were these in achieving your objectives?

11. What problems did you encounter in using these methods?

12. Good practice in Value and Impact Measures. If you know of any examples of good practice in Value and Impact Measures please give brief details. This may be in your own institution or elsewhere and does not need to be restricted to libraries.

C. REQUIREMENTS FOR FURTHER PERFORMANCE MEASURES FOR ACADEMIC LIBRARIES

13. What is your current level of knowledge of performance measurement tools or frameworks? Please tick as applicable

None ( ) A little ( ) Moderate ( ) Extensive ( )

Further details:

14. What do you perceive as barriers to the use of such tools/frameworks?

15. What areas of performance within the library service would you like to be able to measure/benchmark?

16. What do you consider an ideal methodology to do this?

Thank you for taking the time to complete this questionnaire. Please return to Suzanne Lockyer: [email protected] by 14 April 2006. Follow-up telephone interviews will be carried out to investigate in depth some of the issues raised in this questionnaire. If you would be prepared to be interviewed, please provide contact details. Interviews are scheduled for the period 24 April to 5 May.
Name:
Job title:
Institution:
Email:
Telephone:
_________________________________________________________________________

Contact: Suzanne Lockyer, LISU, Loughborough University, Loughborough LE11 3TU, [email protected], Tel 01509 635690 – Fax 01509 635699

APPENDIX 2 – INTERVIEW SCHEDULE

Interview schedule

What do you understand by the terms value and impact measures?

Institutional requirements and drivers
- Clarification of institutional requirements for value and impact measures, based on questionnaire response.
- If the institution does not require you to undertake value and impact measures, but you have done it anyway, why? How was this received by the institution?
- Is the requirement to undertake value and impact measures likely to become an issue at institution level in the future? What are the drivers for this?

Tools and frameworks
- What methods have been used so far? If none, why not?
- Explore problems and successes mentioned in the questionnaire.
- Discussion of barriers mentioned by several respondents:
  Time – why was this an issue for you, or how did you overcome time constraints?
  Skills – what staff and management skills are missing? Did you provide training to overcome this?
  Methodology – what tools/methodologies are needed? What is missing from currently available tools?
- If future projects are mentioned: do you consider suitable tools exist for these? What type of thing would you need?
