Assessment Workshop

Presentation Transcript


  1. Assessment Workshop, College of San Mateo, February 2006

  2. Program Assessment Program assessment is an ongoing process designed to monitor and improve student learning. Faculty: • develop explicit statements of what students should learn. • verify that the program is designed to foster this learning. • collect empirical data that indicate student attainment. • use these data to improve student learning.

  3. Why so much emphasis on assessment? • Accreditation Expectations • Being Learning-Centered • The Bottom Line

  4. ACCJC Expectations • For general education • For assessment

  5. Pop Quiz 1. ACCJC expects institutions to integrate learning outcomes into a. programs b. program review processes c. syllabi d. grading practices e. all of the above

  6. 2. Who should control the assessment of student learning? • Administrators • External consultants • Faculty • Institutional research professionals

  7. Learning-Centered Institutions • Academic program goals and curriculum • How students learn • Course structure and grading • Pedagogy and course delivery • Faculty instructional roles • Assessment • Campus support for learning

  8. The Cohesive Curriculum • Coherence • Synthesizing Experiences • Ongoing Practice of Learned Skills • Increasing Sophistication and Application

  9. Curriculum Alignment Matrix • I = outcomes are introduced at the basic level • D = students are given opportunities to practice, learn more about, and receive feedback to develop more sophistication • M = students demonstrate mastery at a level appropriate for graduation Is this a cohesive curriculum?
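
For illustration, here is a hypothetical alignment matrix for a three-course program with three program outcomes (the courses and outcomes are invented here, not taken from the workshop):

                  Outcome 1   Outcome 2   Outcome 3
    Course 100        I           I           -
    Course 110        D           D           I
    Course 200        M           D           M

Read column by column: a cohesive curriculum introduces, develops, and brings each outcome to mastery somewhere in the sequence. In this invented example, Outcome 2 is practiced repeatedly but never reaches M, which is exactly the kind of gap the matrix is designed to expose.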

  10. Course Planning • Course Outcome • Activity • Assessment

  11. Assessment Steps 1. Define goals and outcomes. 2. Check alignment. 3. Develop a meaningful, manageable, sustainable assessment plan. 4. Collect assessment data. 5. Close the loop. 6. Routinely examine the assessment process.

  12. Never test the depth of the water with both feet.

  13. Elements of an Assessment Plan • Who? • What? • When? How often? • Where? • How?

  14. Quotations from the Wise and Experienced

  15. We don’t have to assess every outcome in every student every year!

  16. Vocabulary • Direct vs. Indirect Assessment • Quantitative vs. Qualitative Assessment • Value-Added vs. Absolute Attainment • Embedded Assessment • Authentic Assessment • Formative vs. Summative Assessment

  17. Articulating Learning Outcomes: • Knowledge • Skills • Values

  18. Outcomes at Different Levels • Course Session Level • Course Level • Program Level • Institutional Level

  19. Program Learning Outcomes: • Focus on what students learn. • Should be widely distributed. • Should be known by all major stakeholders. • Guide course and curriculum planning. • Encourage students to be intentional learners. • Focus assessment efforts.

  20. Mission, Goals, and Outcomes • Mission • Goals • Outcomes

  21. Examples: Mission, Goals, and Outcomes

  22. Is each a mission, goal, or outcome?

  23. Tips to Develop Program Goals and Outcomes

  24. Possible Learning Goals • Institution-Wide Goals • Program-Specific Goals

  25. Bloom’s Taxonomy

  26. Effective Learning Outcomes • Use active verbs to describe behaviors. • Identify the expected depth of processing. • Distinguish between absolute and value-added expectations.

  27. Outcomes for Administrative and Academic Support Units • Processes • Learning Outcomes • Satisfaction Indicators

  28. Effective Outcomes • Consistent with unit and campus mission • Realistic • Few in number • Used by staff to set priorities and make decisions

  29. Assessment Strategies for Administrative and Academic Support Units • Counts • Client Satisfaction • External Evaluations • Learning Outcomes

  30. Implementation Ideas, Insights, and Brainstorms

  31. Assessment Techniques • Direct • Indirect

  32. Properties of Good Assessment Techniques • Valid • Reliable • Actionable • Efficient and cost-effective • Engaging to respondents • Engaging to us • Triangulation

  33. Strategies for Direct Assessment • Published Tests • Locally Developed Tests • Embedded Assessment • Portfolios • Collective Portfolios

  34. Implementation Ideas, Insights, and Brainstorms

  35. Strategies for Indirect Assessment • Surveys • Interviews • Focus Groups

  36. Implementation Ideas, Insights, and Brainstorms

  37. Developing and Applying Rubrics • Holistic rubrics • Analytic rubrics
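
To make the distinction concrete, here is a minimal sketch (with invented criteria, levels, and descriptors, not the workshop's rubric) of an analytic rubric as a data structure: each criterion gets its own rating, and the ratings can be summed. A holistic rubric would instead assign one overall level to the whole product.

    # A minimal sketch of an analytic rubric, assuming a hypothetical 1-4
    # scale and placeholder criteria; real criteria and descriptors would
    # come from faculty deliberation, not from this example.
    ANALYTIC_RUBRIC = {
        "thesis": {1: "absent", 2: "vague", 3: "clear", 4: "clear and arguable"},
        "evidence": {1: "none", 2: "thin", 3: "adequate", 4: "well chosen"},
        "organization": {1: "disordered", 2: "loose", 3: "logical", 4: "purposeful"},
    }

    def analytic_score(ratings):
        """Sum one 1-4 rating per criterion into a total analytic score."""
        assert set(ratings) == set(ANALYTIC_RUBRIC), "rate every criterion"
        assert all(r in (1, 2, 3, 4) for r in ratings.values())
        return sum(ratings.values())

    print(analytic_score({"thesis": 3, "evidence": 4, "organization": 3}))  # 10

Keeping per-criterion ratings, rather than only a total, is what lets the same scoring session feed both grading and program assessment, as the later slides note.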

  38. Rubric Examples

  39. Online Rubrics

  40. Rubric Strengths • Complex products or behaviors can be examined efficiently. • Developing a rubric helps to precisely define faculty expectations. • Well-trained reviewers apply the same criteria and standards. • Rubrics are criterion-referenced, rather than norm-referenced. • Ratings can be done by faculty or others.

  41. Rubrics can be useful for grading, as well as assessment. • Points for grading vary among faculty • Categories are used for assessment • Opportunity to provide formative feedback to students

  42. Using Rubrics in Courses 1. Hand out rubric with assignment. 2. Use rubric for grading. 3. Develop rubric with students. 4. Students apply rubric to examples. 5. Peer feedback using rubric. 6. Self-assessment using rubric.

  43. Generic Rubric: Check for inter-rater reliability to see if it works.
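
The slide does not name a statistic, but a common choice for checking two readers is Cohen's kappa, which corrects raw percent agreement for the agreement expected by chance. Below is a minimal sketch, assuming two readers scoring the same ten papers on a 1-4 rubric scale; all scores are invented.

    from collections import Counter

    def cohens_kappa(rater1, rater2):
        """Cohen's kappa: two raters' agreement, corrected for chance."""
        n = len(rater1)
        assert n and n == len(rater2), "need paired ratings"
        observed = sum(a == b for a, b in zip(rater1, rater2)) / n
        c1, c2 = Counter(rater1), Counter(rater2)
        # Chance agreement: product of each rater's marginal proportions.
        expected = sum((c1[c] / n) * (c2[c] / n) for c in set(c1) | set(c2))
        return (observed - expected) / (1 - expected)

    # Invented scores: two readers rating the same ten papers on a 1-4 scale.
    reader_a = [4, 3, 3, 2, 4, 1, 3, 2, 4, 3]
    reader_b = [4, 3, 2, 2, 4, 1, 3, 3, 4, 3]
    print(round(cohens_kappa(reader_a, reader_b), 2))  # 0.71 for this data

Rules of thumb vary, but kappa in the 0.6-0.8 range is usually read as substantial agreement, while values near zero suggest the generic rubric needs revision, or the readers need calibration, before it "works."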

  44. Creating a Rubric • Adapt an existing rubric • Analytic approach • Expert-systems approach

  45. Managing Group Readings • One reader per document • Two independent readers per document • Paired readers

  46. Before Inviting Colleagues • Develop and pilot test the rubric. • Select exemplars. • Develop a recording system. • Consider pre-programming a spreadsheet for data collection.
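
On the last point, the recording system can be as simple as a pre-built CSV file with one row per reader-paper pair. The sketch below is one possible layout; the file name and criterion names are invented for illustration.

    import csv

    # Hypothetical layout: one row per (reader, paper) pair, one column per
    # rubric criterion, so tallying and reliability checks need no cleanup.
    CRITERIA = ["thesis", "evidence", "organization"]  # invented names

    with open("rubric_scores.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["reader", "paper_id"] + CRITERIA)
        writer.writerow(["A", "paper-001", 3, 4, 3])  # example filled-in row

Pre-building the sheet means readers only type numbers on reading day, and because the columns already match the rubric, the data feed straight into agreement checks like the kappa sketch above.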

  47. Rubric Orientation and Calibration

  48. Implementation Ideas, Insights, and Brainstorms
