
Presentation Transcript


  1. Making Incremental Improvements to Public Library Comparative Statistical Practices • Ray Lyons • Jason Holmes • Library Assessment Conference, Seattle, Washington, August 5, 2008

  2. The Basic Problem • The ultimate goal of the library is public enlightenment. • It is difficult to assess our impact on enlightenment of the community because we have no way to measure it. • We measure what we CAN measure • We compare what we CAN compare

  3. Context of Comparative Statistics • Assessment • Measures used as part of a more general process of assessing library value and effectiveness • Management practice • Measures intended for iterative and ongoing process of performance measurement

  4. Public Library Assessment • Library profession traditionally applies systems or industrial model: • Inputs: resources supplied to the library • Outputs: products and services • [Diagram: Input → Throughput → Output]

  5. Performance Measurement Model • [Diagram: EFFORTS (Inputs → Outputs) lead to RESULTS (Intermediate Outcomes → End Outcomes); outcome measures track both intermediate and end outcomes]

  6. Performance Measurement Steps • Define long term goals (desired outcomes) • Define medium and short term objectives • Develop programs aimed at objectives • Specify measurement ‘indicators’ • Monitor indicators to track accomplishments

  7. Rationale for Standardized Statistics • PLA Planning-for-Results approach to library management • Abandonment of established operational and performance standards • ALA / PLA 1987 publication, Output Measures for Public Libraries: A Manual of Standardized Procedures, defines standard statistics and collection procedures

  8. ALA / PLA Approach to Standardized Statistics • Useful for self-evaluation based on service response choices library makes • Should be interpreted with respect to library mission, goals, objectives • Interpretation left up to the library

  9. Current Practices in Comparative Library Statistics • How are library statistics currently used for comparing public libraries? • What are the bases for these uses? • What purposes do they serve?

  10. Survey of Ohio Public Libraries on Use of Comparative Statistical Measures • Exploratory study, available at: http://www.plstatreports.com/compare/UsePerceptComparStats.pdf • Responses via interview or online questionnaire • Stratified random sample of 90 Ohio public libraries • Two strata: urban & rural counties • Response rate = 47% (42 libraries)
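A minimal sketch of the two-stratum sampling design described above: the sampling frame, the library names, and the 30/60 urban/rural split are illustrative assumptions (the slide reports only a two-stratum design totaling 90 libraries and a 47% response rate).

```python
import random

random.seed(42)

# Hypothetical sampling frame of Ohio public libraries tagged by county stratum.
# Names, frame size, and the urban/rural mix are illustrative assumptions.
frame = [(f"Library {i}", "urban" if i % 3 == 0 else "rural") for i in range(1, 252)]

def stratified_sample(frame, allocations):
    """Draw a simple random sample within each stratum."""
    sample = []
    for stratum, n in allocations.items():
        members = [name for name, s in frame if s == stratum]
        sample.extend(random.sample(members, n))
    return sample

# Allocation summing to the reported total of 90 (the exact split is an assumption).
sampled = stratified_sample(frame, {"urban": 30, "rural": 60})

# 42 completed responses out of 90 sampled libraries, as reported on the slide.
responses = 42
print(f"Response rate: {responses / len(sampled):.0%}")  # -> Response rate: 47%
```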

  11. Survey Findings

  12. Survey Findings: Frequency of Managerial Review of Input Measures • [Table: each input measure by review frequency (Annually, Quarterly, Monthly, Weekly, Rarely, Not Sure)] • NOTE: Sky blue highlighting indicates measures that 50% or more library managerial teams review periodically. Light blue highlighting indicates higher frequencies that, combined, total 50% or more.

  13. Survey Findings: Frequency of Managerial Review of Output Measures • [Table: each output measure by review frequency (Annually, Quarterly, Monthly, Weekly, Rarely, Not Sure)] • NOTE: Sky blue highlighting indicates measures that 50% or more library managerial teams review periodically. Light blue highlighting indicates higher frequencies that, combined, total 50% or more.

  14. Survey Findings

  15. Survey Findings

  16. Survey Findings

  17. Survey Findings: Statistical Measures Libraries Use in Comparisons with Other Libraries (Table Format) • [Table: each measure with the % of libraries using it in comparisons]

  18. Interpreting Library Measures There are no ‘right’ or ‘wrong’ scores on an output measure; ‘high’ and ‘low’ values are relative. The scores must be interpreted in terms of library goals, scores on other measures, and a broad range of other factors. - Van House, Weill, and McClure (1990)

  19. Interpreting Library Measures • ALA / PLA policy since the 1980s: Leave data interpretation to the local library • “Each library staff should decide for themselves whether the [statistical] findings for that library were acceptable in terms of performance expectations.” - Ellen Altman (1990) describing the Public Library Performance Measurement Study by DeProspo et al. (1973)

  20. Key Problems with Library Statistics • Lack of criteria for evaluating measures • Collection of standard statistics assumes all library resources/activities counted to be equivalent

  21. Key Problems with Library Statistics • Standardization ignores differences in: - Complexity - Sophistication - Relevance - Quality (Merit) - Value (Worth) - Effectiveness - Efficiency - Significance

  22. Key Problems with Library Statistics • Data imprecision due to: • Inconsistent collection methods • Mistakes • Sampling error • “Gaming” • Statistical imputation • Imprecision makes individual library comparisons less valid

  23. Key Problems with Library Statistics • Lack of reliable methods for identifying peer libraries • Comparisons are either approximate or inaccurate • Can result in incorrect or misleading conclusions • Data are self-reported and unaudited

  24. Key Problems with Library Statistics • The More-is-Better Myth • Views higher numbers as favorable performance, lower as unfavorable “More activity does not necessarily mean better activity” - Van House, Weill, and McClure (1990) • Striving to earn higher numbers may compromise service quality

  25. Key Problems with Library Statistics • Statistics say nothing about performance adequacy, quality, effectiveness, or efficiency of library resources/activities • No consensus on constructs that statistics can realistically reflect • Difficult to determine remedies for problems which statistics might reveal

  26. Key Problems with Library Statistics • Variety of reasons for insufficient scores: - Inadequate knowledge of community needs - Staff skill deficiencies - Inadequate staffing - Inefficient workflows - Inadequate planning - Limited user competencies . . . and others • Adapted from Poll and te Boekhorst (2007)

  27. Output measures “reflect the interaction of users and library resources, constrained by the environment in which they operate. The meaning of a specific score on any measure depends on a broad range of factors including the library’s goals, the current circumstances of the library and its environment, the users, the manner in which the measure was constructed, and how the data were collected.” [emphasis added] - Van House, Weill, and McClure (1990)

  28. Policy-Level Problems with Library Statistics • PLA managing-for-results approach has produced undesirable results • Confusion about meanings of statistical indicators • Expectations that local libraries are able to interpret data productively have been too optimistic

  29. Policy-Level Problems with Library Statistics • Exaggerated or inaccurate advocacy campaigns undermine credibility of assessment process, methods, and data • Biased advocacy studies at cross-purposes with need for accountability

  30. Negative Side-Effects of Library Advocacy Practices • Advocacy narratives can ‘dumb down’ data analysis and assessment efforts • Encourage absurd interpretations of library statistics • Misuse key assessment terms and concepts • Promote unjustifiable conclusions drawn from studies that have employed limited research methods

  31. Improvements Needed • Commit to ensuring credibility of assessment data and performance measurement findings • Discourage naïve, disingenuous, and unsupportable use of statistics or assessment findings • Specify profession’s ideology regarding “rules of evidence”

  32. Improvements Needed • Fuller understanding of limitations of statistical indicators and comparison methods • Discourage describing performance solely based on standard statistics “The measures are best used with other information about the library.” - Van House, Weill, and McClure (1990)

  33. Improvements Needed • Relate levels of resources/services to levels of community needs • Explore relationships among entire set of standard statistical indicators, i.e. complementary and conflicting dimensions
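One way to act on the second point is a simple correlation matrix across the standard indicators. The sketch below assumes hypothetical per-library values and indicator names; the slide itself does not prescribe a method.

```python
import pandas as pd

# Hypothetical per-library indicator values; names and numbers are illustrative only.
indicators = pd.DataFrame({
    "visits_per_capita":      [4.1, 6.3, 2.8, 5.0, 7.2],
    "circulation_per_capita": [9.5, 14.2, 6.1, 11.0, 16.8],
    "program_attendance":     [1200, 3400, 800, 2100, 4100],
    "reference_per_capita":   [0.9, 1.4, 0.6, 1.1, 1.7],
})

# Strong positive correlations suggest complementary dimensions; negative or
# near-zero correlations flag potentially conflicting ones worth closer study.
print(indicators.corr().round(2))
```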

  34. Improvements Needed • Identify peer libraries using multiple indicators: Community population + library budget + key demographic characteristics • Explore feasibility of alternative sets of indicators depending on library type, size, mission, etc.
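A minimal sketch of multi-indicator peer identification as suggested above: standardize each indicator so scale differences do not dominate, then take nearest neighbors. The library names, indicator values, and the choice of Euclidean distance are assumptions for illustration, not part of the presentation.

```python
import numpy as np

# Hypothetical indicators: population served, operating budget ($), and one
# demographic characteristic (e.g., share of residents under 18). All values
# and names are illustrative assumptions.
names = ["Lib A", "Lib B", "Lib C", "Lib D", "Lib E"]
X = np.array([
    [52_000,   2_400_000, 0.31],
    [48_000,   2_100_000, 0.29],
    [310_000, 18_500_000, 0.42],
    [55_000,   2_600_000, 0.32],
    [9_000,      400_000, 0.18],
])

# Standardize each indicator (z-scores) so no single scale dominates the distance.
Z = (X - X.mean(axis=0)) / X.std(axis=0)

def peers(target, k=2):
    """Return the k libraries closest to `target` in standardized indicator space."""
    i = names.index(target)
    distances = np.linalg.norm(Z - Z[i], axis=1)
    order = [j for j in np.argsort(distances) if j != i]
    return [names[j] for j in order[:k]]

print(peers("Lib A"))  # -> ['Lib D', 'Lib B'] for these illustrative values
```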

  35. Improvements Needed • Increased understanding of measurement and interpretation • Draw reasonable conclusions and interpretations • Basic behavioral science measurement practices

  36. Behavioral Science Measurement Model • Conceptualization → Nominal Definition → Operational Definition → Measurement in the Real World • Babbie (2007)

  37. References
  Ellen Altman. “Reflections on Performance Measures Fifteen Years Later.” In Library Performance, Accountability, and Responsiveness: Essays in Honor of Ernest R. DeProspo, C. C. Curran and F. W. Summers, eds. (Norwood, NJ: Ablex, 1990)
  Earl Babbie. The Practice of Social Research, 11th ed. (Belmont, California: Thomson, 2007)
  Roswitha Poll and Peter te Boekhorst. Measuring Quality: Performance Measurement in Libraries, 2nd ed. (Munich: K. G. Saur, 2007)
  Nancy A. Van House et al. Output Measures for Public Libraries: A Manual of Standardized Procedures, 2nd ed. (Chicago: American Library Association, 1987)
  Nancy A. Van House, Beth T. Weill, and Charles R. McClure. Library Performance: A Practical Approach (Chicago: American Library Association, 1990)
