
Library Services Assessment


Presentation Transcript


  1. Library Services Assessment Isla Jordan, Carleton University Julie McKenna, University of Regina February 2, 2007 OLA Super Conference 2007: Session 1408

  2. Outline • Definition and Purpose • Survey of Assessment Practices • Types of Assessment • Benchmarks, Standards and EBL • Drivers of Assessment • Tools and Techniques • Assessment Strategy • Questions

  3. Assessment “… a critical tool for understanding library customers and offering services, spaces, collections, and tools that best meet their needs. Without good assessment, libraries could lose touch with users’ desires and needs and even become irrelevant.” Nardini (2001)

  4. Assessment “…any activities that seek to measure the library’s impact on teaching, learning and research as well as initiatives that seek to identify user needs or gauge user satisfaction or perceptions with the overall goal being the data-based and user-centered continuous improvement of our collections and services.” Pam Ryan, libraryassessment.info

  5. The purpose of assessment in libraries • To understand user interaction with library resources and services; and • To capture data that inform the planning, management and implementation of library resources and services. Bertot, 2004

  6. Winter 2007 Survey of Assessment Practices in Canadian University Libraries

  7. Survey of Assessment Practices - Purpose • Benchmark service assessment practices • Capture measures of the culture of assessment

  8. Survey Sections • Demographic Information • Assessment Planning • Involvement in Assessment in Organization • Collection and Use of Data to Inform Decision-Making • Final Comments

  9. Survey Participants • Invitation to complete a web-based survey sent to all University Librarians of: • Council of Prairie and Pacific University Libraries (COPPUL) • Ontario Council of University Libraries (OCUL) • Council of Atlantic University Libraries (CAUL/AUBO) • Invitation (February 12, 2007) to complete a French edition of the web-based survey sent to: • members of Conférence des recteurs et des principaux des universités du Québec (CREPUQ)

  10. Survey of Assessment Practices • English Survey • 60 invitations; 39 respondents • 65% response rate • French Survey • To launch February 12, 2007

  11. Thank you to … University of Toronto, UWO, Queen’s University, McMaster, University of Windsor, York University, Guelph University, Nipissing University, University of Waterloo, Carleton University, Brock University, Memorial University, University of Saskatchewan, UBC, University of Alberta, and many more…

  12. Types of Assessment • Input & Output • Service Quality • Performance Measures • Outcomes or Impact

  13. 1. Input & Output • Input measures: expenditures & resources • Funding allocations, # of registered students, print holdings, etc. • Output measures: activities & service traffic • Reference transactions, lending and borrowing transactions, # of instruction sessions, program attendance, etc. • Ratios • Students/librarians, print volume holdings/student, reference transactions/student, etc.

  14. Survey Results – how output data are used • Type of data: gate count, body counts, reference transactions, circulation statistics • Decision-making: hours, staffing & scheduling, service points, collection decisions

  15. 2. Service Quality • Services defined as all programs, activities, facilities, events, … • Measures capture results from user interactions with services • Subjective evaluation of “customer service” • A measure of the affective relationship

  16. “The only criteria that count in evaluating service quality are defined by customers. Only customers judge quality; all other judgments are essentially irrelevant.” (Zeithaml, Parasuraman and Berry, 1990)

  17. LibQUAL+ • Association of Research Libraries • Standard for service quality assessment (2003) • Total market survey • Based on gap analysis theory • User perceptions and expectations of services • Measures outcomes and impacts
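
LibQUAL+ asks respondents to rate the minimum, desired, and perceived level of service for each item on a 1–9 scale, and the gap scores are then simple differences. A minimal sketch with made-up ratings:

    # One respondent's ratings for a single survey item (1-9 scale, hypothetical)
    minimum, desired, perceived = 6, 8, 7

    adequacy_gap = perceived - minimum     # positive: meets minimum expectations
    superiority_gap = perceived - desired  # negative: falls short of desired level

    print(f"Service adequacy gap: {adequacy_gap:+d}")
    print(f"Service superiority gap: {superiority_gap:+d}")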

  18. 3. Performance Measures Involves the use of efficiency and effectiveness measures • Availability of resources • Usability of programs, resources and services • Web page analysis • Content analysis • Functionality analysis • Cost analysis

  19. 4. Outcomes or Impacts “the ways in which library users are changed as a result of their interaction with the Library's resources and programs” Association of College & Research Libraries Task Force on Academic Library Outcomes Assessment Report, 1998

  20. Examples • The electronic journals were used by 65 scholars in the successful pursuit of a total of $1.7 million in research grants in 2004. • In a 2003 study, eighty-five percent of new faculty reported that library collections were a key factor in their recruitment.

  21. LibQUAL+ Measures Outcomes • The library helps me stay abreast of developments in my field(s) of interest. • The library aids my advancement in my academic discipline. • The library enables me to be more efficient in my academic pursuits. • The library helps me distinguish between trustworthy and untrustworthy information.

  22. Benchmarks, standards and EBL • Standards: “Measures that tie the value of libraries more closely to the benefits they create for their users” NISO 2001 (National Information Standards Organization) • Benchmarking: improving ourselves by learning from others (UK Public Sector Benchmarking Service)

  23. Benchmarks, standards and EBL • EBL (Evidence Based Librarianship): “attempts to integrate user reported, practitioner-observed and research-derived evidence as an explicit basis for decision-making.” (Booth, “Counting What Counts” 2006)

  24. Example of a Standard: Information Literacy Standards for Science and Engineering Technology (ACRL 2006) • Standard #1: The information literate student determines the nature and extent of the information needed. • Performance Indicator #3: The information literate student has a working knowledge of the literature of the field and how it is produced. • Outcome #a: … student knows how scientific, technical, and related information is formally and informally produced, organized, and disseminated.

  25. CACUL Standards Committee • Goals: • Add Canadian context to existing standards in college and university libraries, e.g. ACRL • Prepare report for CACUL AGM at CLA 2007 • Form new team in summer 2007 • Contact: Jennifer Soutter jsoutter@uwindsor.ca

  26. Survey Results: Drivers of Assessment • University library administration: 92% • Need for evidence to inform planning: 87% • University administration: 62% • CARL, ARL or regional library consortium: 54%

  27. Multiple Methods of Listening to Customers • Transactional surveys • Mystery shopping • New, declining, and lost-customer surveys • Focus group interviews • Customer advisory panels • Service reviews • Customer complaint, comment, and inquiry capture • Total market surveys • Employee field reporting • Employee surveys • Service operating data capture Note. A. Parasuraman. The SERVQUAL Model: Its Evolution And Current Status. (2000). Paper presented at ARL Symposium on Measuring Service Quality, Washington, D.C.

  28. Canadian Adoption of LibQUAL+: Benefits • Quick, inexpensive • Standardized and tested instrument and practice • Data set of comparables for Canada • Insight into best practices at peer institutions • Build staff expertise and encourage evidence-based practice and practitioners • Opportunity to introduce Canadian changes to instrument

  29. User Surveys: LibSAT, LibPAS • continuous customer feedback • LibSAT measures satisfaction • LibPAS (beta) measures performance http://www.countingopinions.com/

  30. Usability testing • gives user perspective • often for website design: • e.g. “user driven web portal design” (U Toronto 2006) • also for physical space: • e.g. “wayfinding” in library: http://www.arl.org/arldocs/stats/statsevents/laconf/2006/Kress.ppt

  31. Instruction Program Example: Assessment Methods • Learning outcomes • Student performance on examinations, assignments • Pre- and post-test results • Level of "information literacy" • Program service measures (outputs) • # of instruction sessions offered, requests for course specific support, # of session attendees, by discipline, by faculty member, by course, logins to library-created online tutorials, # of course pages created within university’s learning portal, etc. • Student course evaluations & peer evaluations • Qualitative and quantitative • Service quality assessment • LibQUAL+ (gap between expectations and perceptions)
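
As an illustration of the pre- and post-test measure listed above, the mean score gain for a session can be computed directly. A sketch with invented scores:

    # Hypothetical pre- and post-test scores (out of 20) for one instruction session
    pre  = [11, 9, 14, 12, 10, 13, 8, 12]
    post = [15, 12, 17, 14, 13, 16, 11, 15]

    gains = [b - a for a, b in zip(pre, post)]
    mean_gain = sum(gains) / len(gains)
    print(f"Mean score gain: {mean_gain:.1f} points ({mean_gain / 20:.0%} of maximum)")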

  32. Examples • Use patterns • laptop loans, GIS over paper maps, eBooks… • Space usage studies • e.g. Learning Commons study (University of Massachusetts Amherst) • Instruction and Information Literacy • e.g. use of online learning modules

  33. Electronic resources assessment • statistics are not being systematically captured for digital collections or services • standard measures for use of digital collections are increasingly important: • to justify the large cost of electronic collections • because use of traditional services (reference, ILL) is declining

  34. Electronic resources assessment COUNTER: Real-time acquisition of usage statistics: • imports usage statistics from content vendors in a uniform format (COUNTER - Counting Online Usage of Networked Electronic Resources) • reduces need to retrieve statistical data on a resource-by-resource basis • can compare usage statistics with cost information to evaluate service benefits of e-resources
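
Once usage counts arrive in a uniform COUNTER format, comparing them with cost information reduces to a cost-per-use calculation. A minimal sketch; the resource names and figures are hypothetical:

    # Hypothetical annual subscription costs and COUNTER-style full-text request counts
    resources = {
        "Journal Package A": {"cost": 85_000, "requests": 42_300},
        "Journal Package B": {"cost": 40_000, "requests": 3_900},
    }

    for name, r in resources.items():
        cost_per_use = r["cost"] / r["requests"]  # cost per successful full-text request
        print(f"{name}: ${cost_per_use:.2f} per article request")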

  35. Electronic resources assessment • Output statistics for ScholarsPortal databases and e-journals, e.g. • the number of requests for articles • holdings of different aggregators, to see overlap • Web logs, to see patterns of use
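
Web logs can be mined for such usage patterns with very little code. A sketch that tallies requests per month from a common-log-format file; the file name is an assumption:

    import re
    from collections import Counter

    # Tally requests per month from a common-log-format web log (path hypothetical)
    month_counts = Counter()
    timestamp = re.compile(r"\[\d{2}/(\w{3})/(\d{4})")  # e.g. [02/Feb/2007:13:55:36 ...]

    with open("access.log") as log:
        for line in log:
            m = timestamp.search(line)
            if m:
                month_counts[f"{m.group(1)} {m.group(2)}"] += 1

    for month, n in month_counts.most_common():
        print(f"{month}: {n} requests")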

  36. Survey Results: Statistics - electronic resources

  37. Survey Results: Electronic resources assessment • “we are gathering e-resources stats as part of an overall journal review” • “The Library is currently reviewing Scholarly Statistics, a product designed to gather and present for analysis e-resource statistics. Also under consideration is an ERM which, along with its other capabilities, will provide statistic analysis.”

  38. Electronic resources assessment “I have been busy this week with the compilation of electronic journal usage statistics for ARL. To complete Section 15 (Number of successful full-text article requests) in the Supplementary Statistics section, I am limiting myself to COUNTER-compliant JR1 statistics provided by the publisher. Still, I am encountering unexpected complexities… The JR1 format is based on the calendar year, but the ARL statistics are reported on the budget year. This means for every publisher I have to compile two years’ worth of data and manipulate it.” http://www.libraryassessment.info/
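
The manipulation the quote describes is mechanical: given monthly JR1 counts for two calendar years, the budget-year total is a re-slice of the months. A sketch assuming a May–April budget year, with made-up counts:

    # Monthly JR1 full-text request counts for two calendar years (hypothetical)
    jr1 = {2005: [310, 290, 355, 400, 380, 220, 180, 175, 410, 450, 430, 300],
           2006: [335, 310, 370, 415, 390, 240, 195, 180, 425, 470, 445, 320]}

    # Re-slice calendar months into a May 2005 - April 2006 budget year
    budget_year = jr1[2005][4:] + jr1[2006][:4]  # May-Dec 2005 + Jan-Apr 2006
    print(f"Budget-year total: {sum(budget_year)}")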

  39. Surveys, Interviews, Focus Groups • Surveys • quick to implement, difficult to design • identify issues, pick up anomalies • wording is critical • test, test, test …. • users over-surveyed • Interviews and focus groups • more scope for follow-up, explanation • subjective, time-consuming

  40. Survey Results: Top 5 planned assessment studies • User satisfaction survey / LibQUAL+ • Gate traffic study • Electronic database use • Electronic journal use • Usability of the website

  41. Survey Results: Staff Abilities • Strengths: formal presentations, formal reports, drawing conclusions, making recommendations, project management, facilitating focus groups • Weaknesses: sampling, research design, focus group research, survey design, qualitative analysis

  42. Challenges of assessment • Gathering meaningful data • Acquiring methodological skills • Managing assessment data • Organizing assessment as a core activity • Interpreting data within the context of user behaviours and constraints. (Troll Covey, 2002)

  43. Survey Results: Where is assessment placed? • Assessment Librarian (2 institutions) • Assessment Coordinator • Libraries Assessment and Statistics Coordinator • Library Assessment and Information Technology Projects Coordinator • Librarian, Evaluation & Analysis • Manager, Evaluation & Analysis

  44. Survey Results: Who else is assigned assessment responsibility? • distributed to all unit heads or team leaders (4) • AULs have responsibility (6) • UL or Director (3) • administrative or executive officer (4) • access services or circulation (3) • other positions (12)

  45. Survey Results: Committees • Assessment Committee • Priorities and Resources Committee • Statistics Committee • LibQual Committee • LibQUAL+ Working Group • Library Services Assessment Committee • Community Needs Assessment Committee • PR/Communications Committee • Accreditation Self-Study Steering Committee • Senior Management Group • Cooperative Planning Team

  46. Services Assessment Strategy “The evaluation environment is increasingly complex, and requires knowledge of multiple evaluation frameworks, methodologies, data analysis techniques, and communication skills” Note. J.T. Snead et al. Developing Best-Fit Evaluation Strategies. (2006). Paper presented at Library Assessment Conference, Virginia.
