
The South Dakota Regental Information Literacy Exam : A Tool to Document and Assess Information Literacy


Presentation Transcript


  1. The South Dakota Regental Information Literacy Exam : A Tool to Document and Assess Information Literacy Carol A. Leibiger, Ph.D. Head of Public Services and Reference, Information Literacy Coordinator, I.D. Weeks Library, University of South Dakota William Schweinle, Ph.D. Assistant Professor of Psychology Boise State University

  2. Information Literacy and Life-long Learning • An information literate person is able to: • determine the extent of the information needed • access the needed information effectively and efficiently • evaluate information and its sources critically • incorporate selected information into one’s knowledge base • use information effectively to accomplish a specific purpose • understand the economic, legal, and social issues surrounding the use of information, and access and use information ethically and legally American Library Association/Association of College and Research Libraries, Information Literacy Competency Standards for Higher Education (2000)

  3. Rise of IL Instruction and Assessment • Higher education assessment movement • Rise of strategic planning and Total Quality Management (TQM) in higher education • Library instruction movement: change in focus from library skills to IL in academic libraries • General education reform movement • Inclusion of IL in accreditation standards Meulemans, Y. (2002); Rockman, I. (2002)

  4. Other Documents Supporting IL • Reform on Campus (1972) and follow-up reports of the Carnegie Commission on Higher Education • Presidential Committee on Information Literacy: Final Report (American Library Association, 1989) • SCANS Report (Secretary’s Commission on Achieving Necessary Skills, Dept. of Labor, 1991) • Goals 2000: Educate America Act (1994) • Information Power: national school library IL standards (1998) • Greater Expectations: A New Vision for Learning as a Nation Goes to College (2002, Association of American Colleges and Universities)

  5. Higher Learning Commission (2003) • Criterion 4, “Acquisition, discovery, and application of knowledge” • 4a: The organization demonstrates, through the actions of its board, administrators, students, faculty, and staff, that it values a life of learning. • 4b: The organization demonstrates that acquisition of a breadth of knowledge and skills and the exercise of intellectual inquiry are integral to its educational programs. • 4c: The organization assesses the usefulness of its curricula to students who will live and work in a global, diverse, and technological society. • 4d: The organization provides support to ensure that faculty, students, and staff acquire, discover, evaluate, and apply knowledge responsibly.

  6. Graduate Student IL Assessment Graduate students… • often overestimate their information-finding skills • operate on the principle of least effort • tend to choose the easiest and most convenient resources • don’t yet know enough about their disciplines to be effective searchers • don’t have the critical skills to handle the information explosion George, C., Bright, A., Hurlbert, T., Linke, E. C., St. Clair, G., & Stein, J. (2006); Bellard, E. M. (2005); Chu, S. K.-W., & Law, N. (2003); Grant, M., & Bert, M. (2003).

  7. Graduate Student IL Assessment • Demographic changes in the graduate student population • the average graduate student is female and over 35 • has been away from education for at least 2 years • linguistically and ethnically diverse • work and family responsibilities • time management issues • feelings of inadequacy • technology anxiety Bellard, E. M. (2005); Gordon, C. (2002)

  8. Graduate Student IL Assessment • Graduate students can benefit from IL instruction because they also: • are highly self-motivated learners • are cognitively mature • understand their own learning styles • apply meta-cognitive strategies to their information seeking • are familiar with higher-quality search engines like Google Scholar • are an important conduit of IL instruction for undergraduate students

  9. Graduate Student IL Assessment • no systematic IL assessment of advanced learners • isolated suggestions and tools • research paper required of all applicants to graduate programs • IL skills audit or test required of entering graduate students (University of Missouri-Columbia, Boston College, and Australian National University)

  10. Graduate Student IL Assessment • SDILE can serve as an entrance assessment of IL for advanced learners • skill set assumed for graduate students is congruent with that of undergraduate students who are information literate • short yet valid and reliable instrument documenting and assessing IL • allows identification of deficiencies in IL and formulation of appropriate remediation

  11. South Dakota Regental Universities • Black Hills State University • Dakota State University • Northern State University • South Dakota State University • South Dakota School of Mines and Technology • University of South Dakota (USD)

  12. South Dakota BOR ITL Requirement • Information Technology Literacy (ITL) was defined institution by institution. • All universities except USD defined ITL as IT. • The definition affected how ITL was taught and assessed. • Only USD taught and assessed ITL as IL.

  13. South Dakota System General Education Requirements (2005) • Goal #1: Students will write effectively and responsibly and will understand and interpret the written expression of others. • Goal #2: Students will communicate effectively and responsibly through listening and speaking. • Goal #3: Students will understand the organization, potential, and diversity of the human community through study of the social sciences. • Goal #4: Students will understand the diversity and complexity of the human experience through study of the arts and humanities. • Goal #5: Students will understand and apply fundamental mathematical processes and reasoning. • Goal #6: Students will understand the fundamental principles of the natural sciences and apply scientific methods of inquiry to investigate the natural world. • Goal #7: Students will recognize when information is needed and have the ability to locate, organize, critically evaluate, and effectively use information from a variety of sources with intellectual integrity.

  14. IL Student Learning Outcomes Students will… • determine the extent of information needed; • access the needed information effectively and efficiently; • evaluate information and its sources critically; • use information effectively to accomplish a specific purpose; • use information in an ethical and legal manner. (ALA/ACRL IL Competency Standards)

  15. Seeking an IL Assessment: Problems With the ITL Exam • Characteristics of the ITL Exam • Delivered in WebCT • 20 multiple-choice questions • Passing score 13/20 (65% correct) • Problems • Function = documentation of IL • No assessment value (KR-20 ≈ .30) • Privileged students who had passed SPCM 101 at USD
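
For context: KR-20 (Kuder-Richardson Formula 20) is a reliability coefficient for tests scored right/wrong; a value near .30 means the exam could not reliably rank examinees, which is why the ITL Exam had no assessment value. Below is a minimal sketch of the calculation, using made-up response data rather than actual ITL Exam results.

```python
import numpy as np

def kr20(responses):
    """Kuder-Richardson Formula 20 for a matrix of 0/1 item scores
    (rows = examinees, columns = items)."""
    k = responses.shape[1]                         # number of items
    p = responses.mean(axis=0)                     # proportion correct per item
    q = 1 - p
    total_var = responses.sum(axis=1).var(ddof=1)  # variance of total scores
    return (k / (k - 1)) * (1 - (p * q).sum() / total_var)

# Made-up data: 100 examinees answering 20 items independently at ~65% correct
rng = np.random.default_rng(0)
scores = (rng.random((100, 20)) < 0.65).astype(int)
print(round(kr20(scores), 2))  # items that don't "hang together" yield a low KR-20
```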

  16. Seeking an IL Assessment: National Standardized Exams • Project SAILS • ETS Information and Communication Technology (ICT) Literacy Assessment • James Madison Information Literacy Test (ILT)

  17. Seeking an IL Assessment The BOR institutions’ needs: • true assessment of IL (not ITL) • short, yet valid and reliable, instrument • student-level information

  18. IL Subcommittee • Co-chaired by BOR assessment expert and a librarian (IL Coordinator, USD) • 5 assessment experts (including one psychometrician, USD’s Director of Assessment) • 5 library faculty • 2 English instructors • 1 Communication Studies instructor • Charged with creating an assessment with special properties

  19. Special Properties Required of the SD IL Exam (SDILE) • Brevity: 25 multiple-choice questions • Online delivery • Content valid vis-à-vis the Association of College & Research Libraries’ (ACRL) IL Competency Standards for Higher Education • Discrete cutoff (proficiency threshold) • Continuous (assessment) scores • Both documents and assesses IL

  20. IL Exam Questions • The Solution: Two scoring methods • The documentation questions will have low and very similar item “difficulties” (locations in IRT terms). • The assessment questions will be more difficult and more varied in their “difficulty.” • For each set of 5 questions: • 3 documentation questions document attainment of a minimum level of Information Literacy, i.e., cluster close to a fixed point • 2 assessment questions assess levels of Information Literacy along a continuum, i.e., spread along a line [Diagram: documentation items clustered near the proficiency point on the “difficulty” axis; assessment items spread along it]
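
The IRT terminology above can be made concrete with the two-parameter logistic (2PL) model mentioned in the analyses below. The formulation is a standard one; the item-placement comments simply restate the documentation/assessment design in symbols.

```latex
% 2PL item response function: probability that examinee j answers item i correctly
P(X_{ij} = 1 \mid \theta_j) = \frac{1}{1 + \exp\left[-a_i(\theta_j - b_i)\right]}
% \theta_j = examinee ability, b_i = item location ("difficulty"), a_i = discrimination
% Documentation items: b_i low and nearly equal, clustered near the proficiency point
% Assessment items:    b_i higher and spread out along the ability continuum
```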

  21. SLO 2: Documentation Question (Access the needed information effectively and efficiently) Why is Interlibrary Loan so valuable to a student’s research? a. It allows a student to visit and check out materials from a library that is not his/her local library. b. It allows a student to request materials from a library that is not his/her local library. c. It allows a student to access online materials at a library that is not his/her local library. d. It allows a student to purchase materials not located in his/her local library. Classical Difficulty = .87

  22. SLO 2: Assessment Question (Access the needed information effectively and efficiently) Your instructor has given an assignment that requires the use of primary source materials. Which would you consult? a. a biography of someone involved in the issue with criticism b. a diary written by someone who was involved in the issue c. a textbook article about someone who was involved in the issue d. a journal article about someone who was involved in the issue Classical Difficulty = .63

  23. Analyses • Classical: the average item difficulty across ALL item types is around 0.70, giving a pass rate of about 98% with a cut-score of 13/25. • Two IRT analyses for the items relevant to each ACRL Standard (unidimensionality) • 2PL (Rasch) to look for location (Θ) and discrimination (slopes) • Nominal (Bock)
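
The ~98% figure can be reproduced with a back-of-the-envelope binomial check: treat the 25 items as independent, each answered correctly with probability .70, and ask how often an examinee reaches the 13/25 cut-score. This is only a rough approximation, since real responses are neither independent nor equally difficult.

```python
from scipy.stats import binom

# Rough check: 25 items, average classical difficulty ~ .70, cut-score 13 correct
# P(score >= 13) = P(X > 12) under a simple binomial model of independent items
pass_rate = binom.sf(12, n=25, p=0.70)
print(f"approximate pass rate: {pass_rate:.1%}")  # ~98%
```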

  24. 2PL (Rasch) Traces for Items

  25. “Partial Credit” Traces for SLO2 Documentation [Figure: Nominal Response Model item characteristic curves for response options a-d, plotting probability (0-1.0) against ability (-3 to 3)]
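
For readers unfamiliar with Bock’s nominal response model: each response option gets its own slope and intercept, and the probability of choosing an option at ability θ is a softmax over the options, which is what produces trace lines like those in the figure. The sketch below uses illustrative parameter values, not the actual SDILE estimates.

```python
import numpy as np

def nominal_probs(theta, slopes, intercepts):
    """Bock nominal response model: P(option k | theta) is a softmax of
    a_k * theta + c_k across an item's response options."""
    z = np.asarray(slopes) * theta + np.asarray(intercepts)
    ez = np.exp(z - z.max())  # subtract the max for numerical stability
    return ez / ez.sum()

# Illustrative parameters for options a-d of a four-option item (not SDILE values);
# the keyed option (b) is given the largest slope
slopes     = [-0.5,  1.2, -0.3, -0.4]
intercepts = [ 0.2,  0.8, -0.1, -0.9]

for theta in (-2, 0, 2):
    print(theta, np.round(nominal_probs(theta, slopes, intercepts), 2))
# The keyed option's probability rises with ability; the distractors' probabilities fall.
```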

  26. Ongoing Test Revision • Items are and will be added, revised, tested and dropped with each IL testing cycle. • Biased (DIF) items will be removed. • Gender • Ethnicity • Location • Etc.
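
One common way to screen for DIF, offered here only as an illustration since the slides do not specify the SDILE’s procedure, is logistic regression: an item is flagged if group membership still predicts the item response after conditioning on total score. A minimal sketch with made-up data:

```python
import numpy as np
import statsmodels.api as sm

def lr_dif_pvalue(item, total, group):
    """Logistic-regression DIF screen: p-value for the group term (e.g., gender)
    in a model of the item response conditioned on total score."""
    X = sm.add_constant(np.column_stack([total, group]))
    fit = sm.Logit(item, X).fit(disp=0)
    return fit.pvalues[2]  # columns: constant, total score, group indicator

# Made-up data: 500 examinees, a 0/1 item score, and a binary group indicator
rng = np.random.default_rng(1)
total = rng.integers(5, 26, 500)
group = rng.integers(0, 2, 500)
item = (rng.random(500) < total / 30).astype(int)
print(lr_dif_pvalue(item, total, group))  # a small p-value would flag the item
```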

  27. Nominal (Bock) Scores for the SD BOR Institutions

  28. Additional Evidence: Correlations Between Nominal and Classical Scores (n = 2171) (Correlations shown in red in the original table are not significant at p < .05)
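
The comparison behind this slide is a set of Pearson correlations between each examinee’s nominal-model (Bock) score and classical number-correct score, with n = 2171 overall. The sketch below shows the computation itself on made-up stand-in scores; the actual correlation values are not reproduced here.

```python
import numpy as np
from scipy.stats import pearsonr

# Made-up stand-ins for the two scoring methods (the study reports n = 2171)
rng = np.random.default_rng(2)
classical = rng.integers(0, 26, 300).astype(float)  # number-correct scores
nominal = classical + rng.normal(0, 3, 300)         # noisy IRT-style scores

r, p = pearsonr(nominal, classical)
print(f"r = {r:.2f}, p = {p:.3g}")  # correlations with p >= .05 would be flagged
```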

  29. SLO 4: Documentation Question (Old 4D2a) To best demonstrate the scope of a problem one should use... a. pictures. b. statistics. c. books. d. articles.

  30. Bock Nominal Response Model: Item Characteristics (Old 4D2a) [Figure: item characteristic curves for options a-d, plotting probability (0-1.0) against ability (-3 to 3)]

  31. ACRL Student Learning Outcome #4 An information literate student will use information effectively to accomplish a specific purpose. Problems: • Does the question lack a context? • Does this SLO require higher cognitive processing that is difficult to test with MC items? • This SLO represents the intersection between research (taught by the library) and argumentation (taught in ENGL and SPCM). Should the use of information in argumentation be taught more effectively in gen. ed. courses? What do you think?

  32. 4D2a Revised The best visual aid for a speech comparing changes in the profits of two or three competing companies over a three-year period is... a. a spreadsheet. b. a market analysis. c. a line chart. d. a table.

  33. Item Characteristics (New 4D2a)

  34. Benefits of the SDILE • A dual-purpose IL Exam • Documentation • Assessment • Random item rotation • Continuous improvement and refinement • Low cost – WebCT administration

  35. The End of the SDILE: Bureaucratic Blunder • Feb. 2005: The IL Exam questions (with answers) were posted to the BOR web site (www.sdbor.edu) • 2006-2007: Conference presentations on the IL Exam actively sought beta-testing partners • April 2007: Acclaimed presentation at ACRL, interest in beta-testing/cooperation from Project SAILS and 9 prestigious colleges and research universities

  36. The End of the SDILE: Bureaucratic Blunder • April 2007: Student taking the IL Exam discovered the questions/answers online at the SD BOR web site (invalidated the pilot) • Investigators discovered that the IL Exam questions had been downloaded 293 times (in-state, in-country, and abroad) • Beta-testing partners were notified to stop the pilot

  37. The End of the SDILE: The Vultures Gather… • A member regental university had been a “less than enthusiastic” participant • Once the IL Exam pilot was invalidated, this university immediately proposed dropping the exam • New BOR academic officer had also problematized the notion of an IL Exam • May 2007: AAC/BOR persuaded to drop the IL Exam as a system IL requirement

  38. The End of the SDILE:USD Drops the IL Exam • Spring 2007: USD decides to reconstitute the IL Exam as an institutional measure • Summer 2007: Will leaves USD for Boise State U. • USD refuses to hire Will as consultant • USD claims IP ownership of the IL Exam

  39. Lessons Learned • Avoid unfunded mandates • Get release time or some other tangible benefit, in writing, before beginning • Get buy-in rather than imposing mandates • Institutions should genuinely support the project • Members should support the project, even if their institutions don’t • Give creative teams room to work; don’t oversee or micro-manage

  40. Lessons Learned • Choose participants carefully. Members should… • be chosen for their subject knowledge and competence • be competent to understand all members’ roles in the project • promote the project’s agenda rather than their own or their institution’s agenda

  41. Lessons Learned • Clarify IP issues • Work out IP ownership issues up-front • BOR IP • Institutions’ IP • Individual members’ IP vs. work product • Protect secret information • Clear those who handle information • Don’t post to unprotected sites • Provide consequences for divulging IP • Don’t expect kudos, gratitude, or apologies from administration during or after the project

  42. For further information… • On the SDILE • Carol Leibiger, Head of Public Services and Reference, Information Literacy Coordinator, University of South Dakota, C.Leibiger@usd.edu • William Schweinle, Assistant Professor of Psychology, Boise State University, WillSchweinle@boisestate.edu • On IRT • Reise, S. P., Ainsworth, A. T., & Haviland, M. G. (2005). Item response theory: Fundamentals, applications, and promise in psychological research. Current Directions in Psychological Science, 14(2), 95-101.
