
Collection Analysis: Overview Sponsored by ALCTS CMDS Measures & Education Committees



Presentation Transcript


  1. Collection Analysis: Overview. Sponsored by ALCTS CMDS Measures & Education Committees. Peggy Johnson, Associate University Librarian, University of Minnesota. m-john@umn.edu

  2. “Culture of Assessment/Evaluation” • A way to demonstrate • Relevance • Value • Impact • Considered from the view of • Users • Stakeholders (Amos Lakos & Shelley Phipps – “Culture of Assessment”; John Crawford – “Culture of Evaluation”)

  3. “Those who fail to move in the direction of systematic assessment will be unable to cope with the increasingly difficult questions that promise to confront collection officers in years to come.” Mark Sandler, Univ. of MI

  4. Why do we do it? As part of good management. • Accountability: To demonstrate to funders and clients that the service is delivering the benefits expected when the investment was made • Decision-making: To ensure that resources are being used efficiently and effectively (an internal control mechanism) • Marketing: To report success and accomplishments (public relations)

  5. Collection assessment assumes that the criteria for success are defined and understood by those doing the assessment and those to whom it is being reported.

  6. What is it? A mechanism to determine: • If the collection is meeting its objectives • How well it is serving its users • In which ways or areas it is deficient, and what remains to be done to develop the collection • If selectors are performing their responsibilities effectively • How to allocate collections/access funds

  7. How is assessment different from evaluation? • Evaluation determines how well the collection supports the goals, needs, and curriculum of the parent organization. • Assessment examines or describes collections either in their own terms or relative to other collections and checklists.

  8. Who is the audience? • Accreditation agencies • Parent organization (administration, board, senior management) • Library administration • CDM supervisor • Selector • User community or communities • Consortial partners

  9. How can we do it well? • Simple • Practical • Repeatable • Clear focus • Understandable results • Meaningful results • Results lead to action

  10. Collection-based Measures Look at: • Size • Growth • Coverage (depth, breadth, balance)

  11. Collection-based Measures • Checking lists, catalogs, bibliographies • Evaluating the collection directly • Compiling comparative statistics • Application of collection standards

  12. Use- and User-based Measures Look at: • Who is using the collection? • How often? • What are users’ expectations? • What are users’ needs? • What are their perceptions?

  13. Use- & User-based Measures • Circulation studies • In-house use studies • Survey of users • Shelf availability studies • Analysis of online usage of electronic resources • Analysis of ILL statistics • Citation studies • Document delivery tests • Cost-per-use
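
Cost-per-use, the last measure in the list above, is simple arithmetic: divide what a resource costs per year by how often it was used. A minimal sketch in Python; the titles, costs, and use counts are hypothetical, not figures from the presentation:

```python
# Cost-per-use: annual cost of a resource divided by its recorded uses.
# All titles, costs, and use counts below are hypothetical.
resources = [
    {"title": "Journal Package A", "annual_cost": 24000.00, "uses": 8200},
    {"title": "Database B", "annual_cost": 15500.00, "uses": 310},
    {"title": "E-book Collection C", "annual_cost": 9000.00, "uses": 4500},
]

for r in resources:
    # Guard against division by zero for resources with no recorded use.
    cpu = r["annual_cost"] / r["uses"] if r["uses"] else float("inf")
    print(f"{r['title']}: ${cpu:,.2f} per use")
```

The absolute numbers matter less than the outliers: a resource costing $50.00 per use alongside peers at under $3.00 is the kind of understandable, actionable result slide 9 calls for.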

  14. Quantitative Measures • Count things • Use • Expenditures • Titles • Physical items

  15. Quantitative Measures • Titles • Circulation transactions • Expenditures • E-metrics • ILL transactions • Ratios (monographs/serials; volumes/students; expenditures/degree programs; electronic/print)
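
The ratio measures above are plain division over counts the library already keeps. A short sketch, assuming hypothetical annual totals for a single library (none of these figures come from the slides):

```python
# Hypothetical annual totals for one library; figures are illustrative only.
stats = {
    "monograph_expenditure": 1_200_000,
    "serial_expenditure": 2_800_000,
    "volumes_held": 3_400_000,
    "students": 42_000,
    "electronic_expenditure": 2_100_000,
    "print_expenditure": 1_900_000,
}

# Ratios named in slide 15: monographs/serials, volumes/students, electronic/print.
print(f"monographs/serials spend: {stats['monograph_expenditure'] / stats['serial_expenditure']:.2f}")
print(f"volumes per student:      {stats['volumes_held'] / stats['students']:.1f}")
print(f"electronic/print spend:   {stats['electronic_expenditure'] / stats['print_expenditure']:.2f}")
```

Ratios like these are most meaningful tracked year over year or set beside peer institutions, in the spirit of the comparative statistics listed under collection-based measures.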

  16. E-Metrics • Online sessions • Documents downloaded • Records downloaded • Virtual visits • Turn-aways • Alert usage • Personal profile users • Remote versus onsite usage
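
In practice, e-metrics like these are harvested from vendor usage reports. A sketch of tallying sessions, downloads, and the remote-versus-onsite split from a CSV; the file name and column layout are assumptions for illustration, not a real vendor or COUNTER report format:

```python
import csv
from collections import Counter

# Assumed layout of usage.csv (hypothetical, not a real vendor format):
# platform, sessions, downloads, location  -- location is "remote" or "onsite".
totals = Counter()
sessions_by_location = Counter()

with open("usage.csv", newline="") as f:
    for row in csv.DictReader(f):
        totals["sessions"] += int(row["sessions"])
        totals["downloads"] += int(row["downloads"])
        sessions_by_location[row["location"]] += int(row["sessions"])

print(dict(totals))
remote_share = sessions_by_location["remote"] / max(sum(sessions_by_location.values()), 1)
print(f"remote share of sessions: {remote_share:.0%}")
```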

  17. Qualitative Research “A process of inquiry that draws data from the context in which events occur . . . using induction to derive possible explanations based on observed phenomena” Gorman and Clayton, Qualitative Research for the Information Professional: A Practical Handbook, 2nd ed. (London: Facet, 2005)

  18. Qualitative Measures Look at: • Strengths • Weaknesses • Non-strengths

  19. Qualitative Measures • Provide the context • Offer a way to understand the attitudes that inform the statistics

  20. Qualitative Measures • Focus groups • Online or printed surveys • Interviews (structured or unstructured) • Observation

  21. Collection Analysis Methods

  22. Steps in a Collection Analysis Project

  23. Where to start? • Define the question or problem • Determine metrics to use • Decide: • Where to locate the information • Who will collect the information • Who will analyze and report the information • Who will act on the information

  24. Remember • Choose measures that matter • Choose an approach that is simple • Don’t aim for perfection (good ’nuff is OK) • Don’t do it once and never again • Know your audience • Present data in a context: explain what it means
