Teaching Day 2013 How Are Our Students Measuring Up? Using Assessment to Align Our Practices and Inform Our Decisions Dr. Peggy Maki, Education Consultant Specializing in Assessing Student Learning
Foci • The Context of a Meaningful and Useful Assessment Process • The Principles and Processes of a Collaborative Commitment to Assessment
Coding Curricular Maps • Aligning for Learning and Assessing • Identifying or Designing Assessment Methods That Provide Evidence of Product and Process • Chronologically Collecting and Assessing Evidence of Student Learning • Reporting on Results of Scoring Evidence of Student Learning • Establishing Soft Times and Neutral Zones for Faculty and Other Professionals to Interpret Analyzed Results and Develop a Plan to Improve or Advance Student Learning
Problem? Restructuring students’ incorrect understanding of physics concepts became the work of physics faculty at the University of Colorado (the PhET project). That is, physics faculty became intellectually curious about how they could answer this question to improve students’ performance over the chronology of their learning.
Within a course or module or learning experience? • Along the chronology of their studies and educational experiences? • From one subject or topic or focus or context to another one such as from an exercise to a case study or internship?
Integrated Learning: cognitive, psychomotor, and affective domains; forms of representation within contexts
Some Research on Learning that Informs Teaching, Learning, and Assessment • Learners Create Meaning
Threshold Concepts • pathways central to the mastery of a subject or discipline that change the way students view a subject or discipline, prompting students to bring together various aspects of a subject that they heretofore did not view as related (Land, Meyer, & Smith).
People learn differently and may hold onto folk or naive knowledge, incorrect concepts, misunderstandings, or false information • Deep learning occurs over time, as does transference
Learning Progressions • Knowledge-based, web-like, interrelated actions, behaviors, or ways of thinking; transitioning; self-monitoring. May not develop successfully in a linear progression, thus necessitating formative assessment along the trajectory of learning. Movements toward increased understanding (Hess).
Deep Learning Occurs When Students Are Engaged in Their Learning
Learning Strategies of Successful Students in All Majors • Writing beyond what is visually presented during a lecture • Identifying clues to help organize information during a lecture • Evaluating notes after class • Reorganizing notes after class
Comparing note-taking methods with peers • Using one’s own words while reading to make notes • Evaluating one’s understanding while reading • Consolidating reading and lecture notes Source: Calvin Y. Yu, Director of the Cook/Douglass Learning Center, Rutgers University
What do you expect your students to demonstrate, represent, or produce by the end of their program of study or by the end of their undergraduate or graduate studies? • What chronological barriers or difficulties do students encounter as they learn--from the moment they matriculate? • How well do you identify and discuss those barriers with students and colleagues and then track students’ abilities to overcome them so that increasingly “more” students achieve at higher levels of performance?
Research or Study Questions • Collaboratively developed • Open-ended • Coupled with learning outcome statements • Developed at the beginning of the assessment process
II. The Principles and Processes of a Collaborative Commitment to Assessment • A. Coding in Curricular Maps: Where and How Students Learn • Helps us determine coherence among our educational practices, which enables us, in turn, to design appropriate assessment methods (See Handouts 3-4) • Identifies gaps in learning opportunities that may account for students’ achievement levels • Provides a visual representation of students’ learning journey
Cognitive Levels of Learning: Revised Bloom’s Taxonomy (Anderson et al.)
• Helps students make meaning of the journey and holds them accountable for their learning over time • Helps students develop their own learning map, especially if they chronologically document learning through eportfolios (see the coded-map sketch below)
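One way to see how coded entries make gaps visible is to prototype the map in a few lines of code. This is a minimal sketch, assuming the common I/R/M convention (Introduced, Reinforced, Mastered) and wholly hypothetical course and outcome names; it is an illustration, not the coding scheme used in the handouts.

```python
# Minimal sketch of a coded curricular map (hypothetical courses and outcomes).
# Codes follow the common I/R/M convention: Introduced, Reinforced, Mastered.
curricular_map = {
    "ENG 101":  {"written_comm": "I", "critical_thinking": "I"},
    "HIST 210": {"written_comm": "R", "critical_thinking": "R"},
    "SOC 305":  {"critical_thinking": "R"},
    "CAPSTONE": {"written_comm": "M"},
}

outcomes = ["written_comm", "critical_thinking", "quantitative_reasoning"]

def find_gaps(cmap, outcomes):
    """Flag outcomes that are never carried to mastery (or never appear at all)
    anywhere in the mapped curriculum."""
    seen = {o: set() for o in outcomes}
    for codes in cmap.values():
        for outcome, code in codes.items():
            seen.setdefault(outcome, set()).add(code)
    return {o: sorted(codes) for o, codes in seen.items() if "M" not in codes}

print(find_gaps(curricular_map, outcomes))
# {'critical_thinking': ['I', 'R'], 'quantitative_reasoning': []}
```

Reading the output against the map is the same move the slide describes: the visual (or here, programmatic) representation surfaces learning opportunities the curriculum never brings to mastery.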
B. Aligning for Learning and Assessing (See Handouts 5, 6-9, and 10)
C. Identifying or Designing Assessment Methods that Provide Evidence of Product and Process
Some Direct Methods to Assess Students’ Learning Processes • Think Alouds: Pasadena City College, “How Jay Got His Groove Back and Made Math Meaningful” (Cho & Davis) • Word edit bubbles • Observations in flipped classrooms • Students’ deconstruction of a problem or issue (PLEs in eportfolios can reveal this - tagging, for example)
Student recorder’s list of trouble spots in small group work or students’ identification of trouble spots they encountered in an assignment • Results of conferencing with students • Results of asking open-ended questions about how students approach a problem or address challenges • Use of reported results from adaptive or intelligent technology
• Focus on hearing about or seeing the processes and approaches of successful and not-so-successful students • Analysis of “chunks of work” as part of an assignment because you know what will challenge or stump students in those chunks
Some Direct Assessment Methods to Assess Students’ Products • Scenarios—such as online simulations • Critical incidents • Mind mapping • Questions, problems, prompts
Problem with solution: Any other solutions? • Chronological use of case studies • Chronological use of muddy problems • Analysis of video • Debates • Data analysis or data conversion
Some Indirect Methods that Probe Students’ Learning Experiences and Processes • SALG (salgsite.org): Student Assessment of Their Learning Gains • Small Group Instructional Diagnosis (SGID) • Interviews with students about their learning experiences: how those experiences did or did not foster desired learning, and the challenges they faced and continue to face. (See Handout 11)
D. Chronologically Collecting and Assessing Evidence of Student Learning
Your Method of Sampling Ask yourself what you want to learn about your students and when you want to learn it: • All students • A random sample of students • A stratified random sample based on your demographics, informative about patterns of performance that can be addressed for specific populations, such as non-native speakers (see the sampling sketch below)
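As one concrete illustration of the stratified option above, here is a minimal Python sketch. The student records, the demographic field name (non-native speaker status), and the sampling fraction are all invented; real data would come from the registrar or the LMS.

```python
import random
from collections import defaultdict

def stratified_sample(students, stratum_key, fraction, seed=2013):
    """Draw the same fraction of student work from each stratum
    (e.g., native vs. non-native speakers) so smaller populations
    are still represented in the scored sample.
    A fixed seed keeps the draw reproducible across scoring sessions."""
    rng = random.Random(seed)
    strata = defaultdict(list)
    for s in students:
        strata[s[stratum_key]].append(s)
    sample = []
    for group in strata.values():
        k = max(1, round(len(group) * fraction))
        sample.extend(rng.sample(group, k))
    return sample

# Hypothetical records, for illustration only.
students = [
    {"id": 1, "non_native_speaker": True},
    {"id": 2, "non_native_speaker": False},
    {"id": 3, "non_native_speaker": False},
    {"id": 4, "non_native_speaker": True},
    {"id": 5, "non_native_speaker": False},
]
print(stratified_sample(students, "non_native_speaker", fraction=0.5))
```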
Scoring Faculty: • Determine when work will be sampled. • Identify who will score student work (faculty, emeritus faculty, advisory board members, others?). • Establish a time and place to norm scorers for inter-rater reliability on the agreed-upon scoring rubric; a sketch of one agreement check follows. (See Handout 12)
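Once scorers have been normed, one quick way to check whether they are rating consistently is to compute agreement on a shared set of papers. This is a minimal sketch, assuming two raters and a 4-point rubric; the scores are invented, and Cohen's kappa is only one of several agreement statistics a program might choose.

```python
from collections import Counter

def percent_agreement(r1, r2):
    """Share of papers on which two raters gave the same rubric score."""
    return sum(a == b for a, b in zip(r1, r2)) / len(r1)

def cohens_kappa(r1, r2):
    """Agreement corrected for chance, for two raters using the same scale."""
    n = len(r1)
    observed = percent_agreement(r1, r2)
    c1, c2 = Counter(r1), Counter(r2)
    expected = sum(c1[k] * c2[k] for k in set(r1) | set(r2)) / (n * n)
    return (observed - expected) / (1 - expected)

# Invented scores from two raters on ten sample papers (1-4 rubric).
rater_a = [3, 4, 2, 3, 3, 1, 4, 2, 3, 4]
rater_b = [3, 4, 2, 2, 3, 1, 4, 3, 3, 4]
print(percent_agreement(rater_a, rater_b))  # 0.8
print(cohens_kappa(rater_a, rater_b))       # about 0.71
```

Low agreement after a norming session is a signal to revisit the rubric language, not a verdict on individual scorers.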
E. Reporting on Results of Scoring Evidence of Student Learning Office of IR, Director of Assessment, or “designated other”: • Analyzes and represents scoring or testing results that can be aggregated and disaggregated to represent patterns of achievement and to answer the guiding research or study question(s) (a data sketch follows the Brief description below) • Develops a one-page Assessment Brief
The Assessment Brief • Is organized around issues of interest, not the format of the data (the narrative or verbal part of the brief). • Reports results using graphics and comparative formats (the visual part of the brief), such as trends over time or achievement across representative populations.
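Behind the graphics and comparisons in the brief, the underlying step is simply aggregating and disaggregating the scored evidence. Here is a minimal pandas sketch with invented scores, outcome, and population labels; a real brief would pull data the Office of IR has already cleaned.

```python
import pandas as pd

# Invented scoring results; in practice these would come from the
# rubric-scoring spreadsheet or the Office of IR.
scores = pd.DataFrame({
    "outcome":    ["written_comm"] * 6,
    "population": ["non-native", "native", "native",
                   "non-native", "native", "non-native"],
    "score":      [2, 3, 4, 3, 3, 2],   # 4-point rubric
})

# Aggregated view: the overall pattern of achievement on the outcome.
print(scores.groupby("outcome")["score"].agg(["count", "mean"]))

# Disaggregated view: the same evidence broken out by population,
# the kind of comparison an Assessment Brief might chart.
print(scores.groupby(["outcome", "population"])["score"]
            .agg(["count", "mean"]))
```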
F. Establishing Soft Times and Neutral Zones for Faculty and Other Professionals to Interpret Analyzed Results • Identify patterns against criteria and cohorts (if possible) • Tell the story that explains the results: triangulate with other reported data, such as results of student surveys. • Determine what you wish to change, revise, or innovate, and develop a timetable to reassess once changes are implemented. (See Handouts 13-14)
What if we…. Collaboratively use what we learn from this approach to assessment to design the next generation of curricula, pedagogy, instructional design, educational practices, and assignments to help increasingly more students successfully pass through trouble spots or overcome learning obstacles;
and, thereby, collaboratively commit to fostering students’ enduring learning in contexts other than the ones in which they initially learned. (See Handout 15 to identify where you see the need to build your program’s assessment capacity.)
Works Cited • Cho, J. and Davis, A. 2008. Pasadena City College. “How Jay Got His Groove Back and Made Math Meaningful.” http://www.cfkeep.org/html/stitch.php?s=13143081975303&id=18946594390037 • Hess, K. 2008. Developing and Using Learning Progressions as a Schema for Measuring Progress. National Center for Assessment. http://www.nciea.org/publications/CCSSO2_KH08.pdf • Land, R., Meyer, J.H.F., and Smith, J., Eds. 2010. Threshold Concepts and Transformational Learning. Rotterdam: Sense Publishers. • Anderson, L.W., Krathwohl, D.R., Airasian, P.W., and Cruikshank, K.A., Eds. 2000. A Taxonomy for Learning, Teaching, and Assessing: A Revision of Bloom’s Taxonomy of Educational Objectives. Boston, MA: Allyn and Bacon.
Maki, P. 2010. Assessing for Learning: Building a Sustainable Commitment Across the Institution. 2nd ed. Sterling, VA: Stylus Publishing. • National Research Council. 2002. Knowing What Students Know: The Science and Design of Educational Assessment. Washington, DC: National Academy Press. • Yu, C.Y. “Learning Strategies Characteristic of Successful Students.” Cited in Maki, P. 2010, p. 139.