
Assessment: The “Hidden Variable” of Achievement


Presentation Transcript


  1. Assessment: The “Hidden Variable” of Achievement
  A Field-tested Learning Assessment Guide (FLAG) for STEM Instructors
  Michael Zeilik, University of New Mexico (www.flaguide.org)

  2. Pop Quiz! In terms of learning gains, which instructor attribute enhances achievement the most?
  • A. Teaching experience
  • B. Clarity of presentations
  • C. Energy and enthusiasm
  • D. Deep knowledge of subject
  • E. None of the above

  3. What Works? 20th-Century Gain Results
  • Active Learning, Mastery Learning (SD ≥ 0.5; PSI, discussion, debates, games, role playing, controversy)
  • Cooperative Learning (SD ≥ 0.5, cognitive and affective; a century of research, all disciplines)
  • One-on-One Tutoring (SD = 2, with trained tutors)

  4. What Does Not Work? (Small gains, SD < 0.3)
  • EVERYTHING ELSE!
  • Lectures (the “standard model”) reinforce memorization!
  • Unstructured discussion, supervised independent study, autonomous small groups, self-study
  • Audio-tutorials, programmed instruction, computer-based instruction, instructional television, Web-based instruction

  5. What Works in Context: Physics & Astronomy
  • Disciplinary education research uses the “tools of the trade” to conduct experiments
  • Creates a mostly empirical, robust knowledge base about learning in physics & astronomy
  • Probes initial state (prior knowledge), final state (learning outcomes), and student thinking
  • Measurement: Assessment!

  6. Galileo Galilei: “Measure what is measurable, and make measurable what is not so.” But: “Measure what you value, and value what you measure.” (M. Zeilik)

  7. FLAG Features
  • Assessment Basics: “What is this assessment business all about?”
  • Making Goals: “What do you want to measure?”
  • Classroom Assessment Techniques (CATs): “How do you measure it?”
  • Searchable Database (Toolbox): “What are good tools for measurement?”
  • All peer reviewed and evidence based

  8. FLAG CATs
  • Attitudinal Surveys: E. Seymour, E. Lewis
  • Concept Tests: A. Ellis
  • Concept Maps: M. Zeilik
  • Conceptual Diagnostic Tests: M. Zeilik
  • Interviews: M. Smith, S. A. Southerland
  • Performance Assessments: T. Slater
  • Portfolios: T. Slater
  • Scoring Rubrics: D. Ebert-May
  • Student Assessment of Learning Gains: E. Seymour
  • Weekly Reports: E. Etkina
  • Mathematical Thinking: M. Swan, J. Ridgway
  • Multiple-Choice Tests: J. Parkes
  • Minute Papers: M. Zeilik

  9. Minute Paper
  • Take a few minutes at the end of class and ask for a written response to:
  • “What was the most important concept you learned in class?”
  • “What important question remains unanswered?”
  • “What was the muddiest point of this class?”
  • Few Minute Paper: teams reach consensus, submit a written report
  • Analysis: sort responses into themes (cards); see the sketch below
  • Weekly Report: extended minute paper
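A minimal sketch of the theme-sorting step, assuming naive keyword matching; the theme names, keywords, and sample responses are invented for illustration (the slide describes sorting physical cards by hand):

```python
# Naive keyword bucketing of minute-paper responses into themes.
# Themes, keywords, and responses below are invented examples.
themes = {
    "gravity": ["gravity", "force", "orbit"],
    "seasons": ["season", "tilt", "axis"],
}
responses = [
    "I finally understood why Earth's tilt causes the seasons.",
    "How does gravity keep the Moon in orbit?",
    "The grading policy is still unclear to me.",
]

buckets = {theme: [] for theme in themes}
buckets["other"] = []
for response in responses:
    text = response.lower()
    for theme, keywords in themes.items():
        if any(word in text for word in keywords):
            buckets[theme].append(response)
            break
    else:  # no keyword matched: park the card in "other"
        buckets["other"].append(response)

for theme, cards in buckets.items():
    print(f"{theme}: {len(cards)} response(s)")
```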

  10. Concept Tests
  • Quick feedback on conceptual (not factual) understanding
  • Instructor poses a conceptual question with answer choices (including common “misconceptions”)
  • After a minute, the whole class responds (hands, flash cards, class polling system)
  • Instructor assesses responses: if most are incorrect, students pair up to discuss (peer teaching)
  • Class responds again to gauge mastery; instructor adapts in real time (see the sketch below)
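A minimal sketch of the decision logic in one concept-test round; the two-thirds mastery threshold and the sample poll are illustrative assumptions, not values prescribed by FLAG:

```python
from collections import Counter

def concept_test_round(responses, correct, threshold=2 / 3):
    """Tally a class poll and decide whether to trigger peer discussion."""
    tally = Counter(responses)
    frac_correct = tally[correct] / max(len(responses), 1)
    if frac_correct < threshold:
        action = "pair up to discuss (peer teaching), then re-poll"
    else:
        action = "mastery looks adequate; move on"
    return tally, frac_correct, action

# First poll on a hypothetical question: most of the class is wrong,
# so the sketch recommends peer discussion before re-polling.
print(concept_test_round(["a", "a", "b", "c", "a"], correct="b"))
```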

  11. [Diagram: Earth and an asteroid] The magnitude of the force exerted by the asteroid on the Earth is:
  a) larger than the magnitude of the force exerted by the Earth on the asteroid
  b) the same as the magnitude of the force exerted by the Earth on the asteroid
  c) smaller than the magnitude of the force exerted by the Earth on the asteroid
  d) zero (the asteroid exerts no force on the Earth)
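The correct choice is (b): Newton's third law makes the two force magnitudes equal regardless of the masses. A short check, with purely hypothetical mass and separation values:

```python
# Newton's third law check for the Earth-asteroid concept test.
# The asteroid mass and separation below are invented for illustration.
G = 6.674e-11        # gravitational constant, N m^2 / kg^2
m_earth = 5.972e24   # kg
m_asteroid = 1.0e12  # kg (hypothetical)
r = 1.0e9            # m  (hypothetical separation)

# Newton's law of gravitation yields one magnitude for the pair of forces;
# by the third law it applies to the force on each body (answer b).
F = G * m_earth * m_asteroid / r**2
print(f"|F| on each body: {F:.3e} N")
# Only the accelerations differ, in inverse proportion to the masses:
print(f"a_earth = {F / m_earth:.3e} m/s^2, a_asteroid = {F / m_asteroid:.3e} m/s^2")
```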

  12. [Chart: Results from Midwestern universities (Dostal); P111 = conceptual, P221 = calculus-based, P112 = algebra-based courses]

  13. Student Assessments
  • Attitude Surveys: perceptions about the course and the discipline; they seem easy to write, but are not; best based on a robust, field-tested model
  • Minute Paper: “Given limited resources, what one change would you make to improve this course?”; sort responses by themes
  • Student Assessment of Learning Gains (SALG): probes the learning gains that students perceive; avoids performance critiques; easily customized (15 min); available on-line

  14. [Chart: Cooperative quiz gains by gender]

  15. [Chart: Does it stick? (Coop quizzes vs. test)]

  16. [Chart: Attitude results, intro astronomy & physics]

  17. Conceptual Diagnostic Tests
  • Ideally research-based, built on “misconceptions” revealed by student “think-aloud” interviews
  • Measures pre/post conceptual gains as a summative assessment
  • Force Concept Inventory (FCI; 1985); Astronomy Diagnostic Test (ADT; 1999), version 2; national baselines; large data sample (about 5,000)
  • Follow the protocol!

  18. ADT 2, UNM Fall 2000 vs. National
  ES (F) = 0.84 => 80% of postscores above the mean of the prescores
  ES (M) = 0.53 => 70% of postscores above the mean of the prescores
  (Pre n = 5346; Post n = 3842)
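The “percent of postscores above the mean of the prescores” readings follow from the effect size if scores are roughly normal: the percentage is the standard normal CDF evaluated at the effect size. A quick check, which also covers the 0.5 SD figure quoted on slide 20:

```python
from statistics import NormalDist

# Under a normality assumption, an effect size ES (in SD units) puts
# Phi(ES) of the post distribution above the mean of the pre distribution.
for label, es in [("ADT 2, female", 0.84),
                  ("ADT 2, male", 0.53),
                  ("formative assessment, slide 20", 0.50)]:
    pct = NormalDist().cdf(es) * 100
    print(f"{label}: ES = {es:.2f} -> about {pct:.0f}% above the prescore mean")
```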

  19. Pre/post: ADT National Project vs. UNM
  Normalized gain: <g> = (post% - pre%) / (100% - pre%)
  [Chart: normalized gains with standard errors plotted; UNM points marked]
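A minimal sketch of the normalized-gain formula from this slide; the pre/post percentages in the usage line are invented for illustration:

```python
def normalized_gain(pre_pct: float, post_pct: float) -> float:
    """Hake's normalized gain <g>: the fraction of the available
    improvement (100% - pre%) that the class actually achieved."""
    return (post_pct - pre_pct) / (100.0 - pre_pct)

# Hypothetical class: 35% on the pre-test, 60% on the post-test.
print(normalized_gain(35.0, 60.0))  # 0.3846...; ~38% of the possible gain realized
```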

  20. Classroom Assessment: Good News! Well-done formative assessment results in a pre/post gain of about 0.5 standard deviation (a typical student scoring about 70% rather than 50% on a “standardized” test)
