Recent developments in library evaluation, statistics and measurement – and why they are important
Abstract
This paper identifies six key changes in library evaluation over the past five years and suggests why they may be important for IFLA delegates. Deciding which changes to feature was difficult because library evaluation is a highly contentious area (as is evaluation more generally) and because practice across the world is partial, patchy and uneven. The authors summarize their choices by noting a gradual move in the library evaluation field towards deeper understanding of the nature and significance of evaluation. Their key propositions are that: the demand for evaluation expertise is beginning to outstrip supply; there is a technology-enabled shift in research and evaluation focus from the world of the library to the information world of potential library users; there is growing experience of conducting library impact evaluation across countries; there is a dawning recognition that sustaining impact evaluation efforts is a core element in ensuring the sustainability of libraries; there are early signs of movement towards more ethical evaluation; and, as library services become more innovative, the limitations of simple evaluation models are beginning to show. The implications of each of these propositions for IFLA delegates are adumbrated.