Performance Assessment, Rubrics, & Rating Scales
• Trends
• Definitions
• Advantages & Disadvantages
• Elements for Planning
• Technical Concerns
Deborah Moore, Office of Planning & Institutional Effectiveness
Types of Performance Assessments
Performance Assessment
Who is currently using performance assessments in their courses or programs? What are some examples of these assessment tasks?
Primary Characteristics
• Constructed response
• Reviewed against criteria/continuum (individual or program)
• Design is driven by the assessment question/decision
Why on the Rise?
• Accountability issues are increasing
• Educational reform has been underway
• Growing dissatisfaction with traditional multiple-choice (MC) tests
Exercise 1
• Locate the sample rubrics in your packet.
• Working with a partner, review the different rubrics.
• Describe what you like and what you find difficult about each (BE KIND).
Advantages as Reported by Faculty
• Clarification of goals & objectives
• Narrows the gap between instruction & assessment
• May enrich insights about students’ skills & abilities
• Useful for assessing complex learning
Advantages for Students
• Opportunity for detailed feedback
• Enhanced motivation for learning
• Students process information differently
Disadvantages
• Requires coordination of:
  • Goals
  • Administration
  • Scoring
  • Summary & reports
Disadvantages
• Archival/retrieval demands:
  • Accessibility
  • Maintenance
Disadvantages
• Costs of:
  • Designing
  • Scoring (training/monitoring raters)
  • Archiving
Steps in Developing Performance Assessments
1. Clarify the purpose/reason for the assessment
2. Clarify the performance to be assessed
3. Design tasks
4. Design the rating plan
5. Pilot and revise
Steps in Developing Rubrics
1. Identify the purpose/reason for the rating scale
2. Define clearly what is to be rated
3. Decide which type you will use:
   a. Holistic or analytic
   b. Generic or task-specific
4. Draft the rating scale and have it reviewed
Steps in Developing Rubrics (continued)
5. Pilot your assessment tasks and review the results
6. Apply your rating scales
7. Determine the reliability of the ratings
8. Evaluate the results and revise as needed
Recommendations
Descriptive Rating Scales
• Each rating scale point has a phrase, sentence, or even a paragraph describing what is being rated.
• Generally recommended over graded-category rating scales, which label the points without describing them.
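To make the distinction concrete, here is a minimal sketch contrasting the two scale types for a single criterion. The criterion, wording, and point values are hypothetical, not drawn from the workshop materials.

```python
# Hypothetical "organization" criterion for a writing assessment.

# Graded-category scale: each point carries only a label.
graded_category = {4: "Excellent", 3: "Good", 2: "Fair", 1: "Poor"}

# Descriptive scale: each point describes the observable performance,
# which is what helps raters apply the scale consistently.
descriptive = {
    4: "Ideas follow a logical sequence; transitions connect every section.",
    3: "Ideas are mostly ordered logically; a few transitions are missing.",
    2: "Some logical order is evident, but sections feel disconnected.",
    1: "No discernible organization; ideas appear in random order.",
}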
Portfolio Scoring Workshop
Subject Matter Expertise
Experts like Dr. Edward White join faculty to refine scoring rubrics and monitor the scoring process.
Exercise 2
• Locate the University of South Florida example.
• Identify the various rating strategies involved in the use of this form.
• Identify strengths and weaknesses of this form.
Common Strategy Used
• The instructor assigns an individual grade for an assignment within a course.
• Assignments are forwarded to a program-level assessment team.
• The team randomly selects a set of assignments and applies a separate, program-level rating scheme (see the sketch below).
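The random-selection step is simple to implement and to audit. Below is a minimal sketch, assuming assignments are tracked by ID; the function name, sample size, and IDs are illustrative, not part of any actual campus system.

```python
import random

def select_for_program_review(assignment_ids, sample_size, seed=None):
    """Draw a random subset of assignments for program-level re-rating.

    Seeding the generator makes the draw reproducible, so the team can
    document exactly which assignments were pulled.
    """
    rng = random.Random(seed)
    return rng.sample(list(assignment_ids), sample_size)

# Example: pull 10 of 120 submitted assignments for the assessment team.
chosen = select_for_program_review((f"A{n:03d}" for n in range(120)), 10, seed=2024)
print(chosen)
```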
Exercise 3
• Locate the Rose-Hulman criteria.
• Select one of the criteria.
• In 1-2 sentences, describe an assessment task/scenario for that criterion.
• Develop rating scales for the criterion:
  • List traits
  • Describe distinctions along the continuum of ratings
Example of Consistent & Inconsistent Ratings
Calculating Rater Agreement (3 Raters for 2 Papers)
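The slide's worked table is not reproduced in this transcript, but the calculation itself is straightforward: compare every pair of raters on each paper and count how often they agree. Here is a minimal sketch with hypothetical scores on a 1-6 scale; both the ratings and the exact/adjacent convention are illustrative.

```python
from itertools import combinations

def pairwise_agreement(scores, tolerance=0):
    """Fraction of rater pairs (per paper) whose scores agree within
    `tolerance` points: 0 = exact agreement, 1 = adjacent agreement."""
    agree = total = 0
    for paper in scores:                      # one list of ratings per paper
        for a, b in combinations(paper, 2):   # every pair of raters
            agree += abs(a - b) <= tolerance
            total += 1
    return agree / total

# Hypothetical ratings: 3 raters (columns) x 2 papers (rows).
ratings = [[4, 4, 5],   # paper 1
           [2, 5, 3]]   # paper 2
print(f"exact:    {pairwise_agreement(ratings):.2f}")               # 0.17
print(f"adjacent: {pairwise_agreement(ratings, tolerance=1):.2f}")  # 0.67
```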
Rater Selection and Training
• Identify raters carefully.
• Train raters on the purpose of the assessment and the appropriate use of the rubrics.
• Study rating patterns and do not retain raters who remain inconsistent.
Some Rating Problems
• Leniency/severity
• Response set
• Central tendency
• Idiosyncrasy
• Lack of interest
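Two of these patterns, leniency/severity and central tendency, show up directly in a rater's score distribution and can be screened for before deciding whom to retain. A minimal sketch follows, assuming a papers-by-raters score matrix; the cutoffs (0.5 points from the group mean, half the group spread) are illustrative thresholds, not established standards.

```python
import numpy as np

def screen_raters(X, rater_names):
    """Flag leniency/severity (rater mean far from the group mean) and
    central tendency (rater spread much narrower than the group's).
    X: papers x raters matrix of scores on a common scale."""
    grand_mean, grand_sd = X.mean(), X.std()
    for j, name in enumerate(rater_names):
        mean_gap = X[:, j].mean() - grand_mean
        sd_ratio = X[:, j].std() / grand_sd
        flags = []
        if mean_gap > 0.5:                 # illustrative cutoff
            flags.append("possible leniency")
        elif mean_gap < -0.5:
            flags.append("possible severity")
        if sd_ratio < 0.5:                 # illustrative cutoff
            flags.append("possible central tendency")
        print(f"{name}: mean gap {mean_gap:+.2f}, "
              f"spread ratio {sd_ratio:.2f} -> {', '.join(flags) or 'ok'}")

# Hypothetical scores (1-6 scale): 5 papers rated by 3 raters.
scores = np.array([[5, 3, 4],
                   [6, 4, 4],
                   [5, 2, 3],
                   [6, 3, 4],
                   [5, 1, 4]], dtype=float)
screen_raters(scores, ["Rater A", "Rater B", "Rater C"])
```

With these made-up data, Rater A is flagged for leniency, Rater B for severity, and Rater C for central tendency: the kind of pattern study the previous slide recommends.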
Exercise 4
• Locate the Generalizability Study tables (1-4).
• Reviewing Table 1, describe the plan for rating the performance.
• What kinds of rating problems do you see?
• In Table 2, what seems to be the biggest rating problem?
• In Table 3, what seems to have more impact: additional items or additional raters?
Generalizability Study (GENOVA)
• G study: identifies the sources of error (facets) in the overall design and estimates the error variance for each facet of the measurement design.
• D study: estimates the reliability of ratings under the current design and projects outcomes for alternative designs.
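GENOVA is dedicated software, but for the simplest fully crossed design (every rater scores every paper) the variance components and the phi coefficient quoted on the next slide can be computed directly from the ANOVA mean squares. Here is a minimal sketch with hypothetical scores; real G studies usually involve more facets than this persons-by-raters example.

```python
import numpy as np

def g_study(X):
    """Variance components for a fully crossed persons x raters design
    (one score per cell), estimated via expected mean squares."""
    n_p, n_r = X.shape
    grand = X.mean()
    p_means, r_means = X.mean(axis=1), X.mean(axis=0)
    ms_p = n_r * ((p_means - grand) ** 2).sum() / (n_p - 1)
    ms_r = n_p * ((r_means - grand) ** 2).sum() / (n_r - 1)
    resid = X - p_means[:, None] - r_means[None, :] + grand
    ms_e = (resid ** 2).sum() / ((n_p - 1) * (n_r - 1))
    var_p = max((ms_p - ms_e) / n_r, 0.0)  # persons (true-score) variance
    var_r = max((ms_r - ms_e) / n_p, 0.0)  # rater main effect (severity/leniency)
    var_e = ms_e                           # person x rater interaction + error
    return var_p, var_r, var_e

def d_study_phi(var_p, var_r, var_e, n_raters):
    """Phi (dependability) coefficient projected for n_raters per paper."""
    return var_p / (var_p + (var_r + var_e) / n_raters)

# Hypothetical scores: 6 papers (rows) x 3 raters (columns), 1-6 scale.
X = np.array([[5, 4, 4], [3, 2, 3], [6, 5, 4],
              [2, 2, 1], [4, 4, 5], [5, 3, 4]], dtype=float)
vp, vr, ve = g_study(X)
for n in (1, 2, 3, 5):
    print(f"{n} rater(s): phi = {d_study_phi(vp, vr, ve, n):.2f}")
```

Running the D-study loop shows phi rising as raters are added (roughly .72 with one rater to .88 with three for these made-up data), which is exactly the design question a D study answers: what does it take to reach the desired reliability level?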
Summary
• Interpretation: raters used the rubric in non-systematic ways.
• Reliability (phi) values ranged from .21 to .67 across the teams, well below the desired .75 level.
What Research Says About Current Practice
Summary
• Use is on the rise
• Costly
• Psychometrically challenging
Thank you for your attention.
Deborah Moore, Assessment Specialist
101B Alumni Gym, Office of Planning & Institutional Effectiveness
dlmoor2@email.uky.edu
859/257-7086
http://www.uky.edu/LexCampus/; http://www.uky.edu/OPIE/