
Re-visioning the General Education System


Presentation Transcript


  1. Re-visioning the General Education System Dr. Darby Hiller, Office of Research, Planning & Effectiveness Northwestern Michigan College Michigan Association for Institutional Research November 6-8, 2013 Grand Rapids, Michigan

  2. Ancient History 2000-2005

  3. In the beginning…there was Accreditation • Develop common learning outcomes • Develop methods to assess them • Communications • Critical Thinking • Cultural Perspectives • Artifact Method + Rubrics • Monitoring Report to HLC (2003)

  4. Then IR happened… • CHECK • Rotating one outcome a semester • Larger artifact samples: at least 350 for near-graduates • Big scoring day: 30-50 faculty members • Surveys of student perception: current students, graduates • ACT’s CAAP test • Communications • Critical Thinking • Driving proof of ADJUST • College-Wide Assessment Team becomes Scholarship Action Group • Progress Report to HLC (2005)

  5. Ideal Quality Improvement [cycle diagram: Academic leaders, Academic faculty, IR and Scholarship Action Group]

  6. Artifact Scoring Results • 2013- Quantitative Reasoning Artifact Results • 2012- Critical Thinking Artifact Results • 2012- Communications Artifact Results • 2011- Quantitative Reasoning Artifact Results • 2011- Critical Thinking Artifact Results • 2010- Communications Artifact Results • 2009- Critical Thinking Artifact Results • 2008- Communications Artifact Results • 2007- Critical Thinking Artifact Results • 2006- Communications Artifact Results • Inter-reader reliability • 2005- Artifact Results (Spring 2005) • Inter-reader reliability • 2004- Artifact Results (Fall 2004) • Inter-reader reliability • 2003- Artifact Results (Fall 2003) • Assessment Updates • Assessment Update 2008 • What We Have Learned 2007 • What We Have Learned 2006 • What We Have Learned 2005 • What We Have Learned 2004 • 2004 addendum • Assessment Newsletter - Fall 2003 • Assessment Newsletter - Fall 2002 • Assessment Newsletter - Spring 2002 • Assessment Newsletter - Fall 2001 • CAAP Critical Thinking Test Results • 2009- CAAP After-action report • 2007- CAAP After-action report • 2005- CAAP After-action report • 2003- CAAP After-action report • 2002- CAAP After-action report • 2002- CAAP Report addendum • CAAP Writing Test Results • 2002- CAAP After-action report • Student Perceptions of Learning Survey • 2006- After-action report • 2004- After-action report • 2003- After-action report • 2002- After-action report. Our process led to lots of Assessment Results. Faculty frequently joked about winning the lottery again.
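The reports above repeatedly cite inter-reader reliability for the big scoring days. The slides do not say which statistic was used; as a minimal sketch, simple percent agreement between two scorers could be computed like this (the scorer names and rubric values below are invented for illustration):

```python
# Hypothetical sketch: simple percent agreement between two scorers.
# The actual reliability statistic NMC used is not stated in the slides.

def percent_agreement(scores_a, scores_b):
    """Share of artifacts where two scorers assigned the same rubric level."""
    if len(scores_a) != len(scores_b):
        raise ValueError("score lists must be the same length")
    matches = sum(a == b for a, b in zip(scores_a, scores_b))
    return matches / len(scores_a)

# Example: rubric levels 1-4 assigned by first and second scorers
first_scorer = [3, 2, 4, 3, 1, 3, 2, 4]
second_scorer = [3, 2, 3, 3, 1, 3, 2, 4]
print(f"{percent_agreement(first_scorer, second_scorer):.0%}")  # → 88%
```

A chance-corrected statistic such as Cohen's kappa would be a common next step, since raw agreement overstates reliability when most artifacts land in one or two rubric levels.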

  7. Reality

  8. Results: % of near-grads scoring sufficient or above
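As a minimal sketch of how this headline metric could be computed from rubric scores (the 1-4 scale, the cut score, and the sample scores below are assumptions for illustration, not NMC's actual data):

```python
# Hypothetical sketch: percent of near-graduate artifacts scoring
# "sufficient" or above. Scale and threshold are assumed.

SUFFICIENT = 3  # assumed cut score on a 1-4 rubric

def pct_sufficient(scores, threshold=SUFFICIENT):
    """Percent of sampled artifacts at or above the threshold."""
    return 100 * sum(s >= threshold for s in scores) / len(scores)

near_grad_scores = [4, 3, 2, 3, 3, 1, 4, 3, 2, 3]
print(f"{pct_sufficient(near_grad_scores):.1f}%")  # → 70.0%
```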

  9. Some Adjustment 2005-2009

  10. Adjusting the “Plan” • Cultural Perspectives went away • Not pervasive enough in the curriculum • Research showed grads might not be exposed • Content vs. skill • Became a degree requirement – select a course

  11. Adjustment: Critical Thinking completely redefined

  12. Adjustment: Critical Thinking completely redefined

  13. Adjusting the “Plan” • Added Quantitative Reasoning • With a new scoring method • More one-piece flow • Scoring in the discipline • Instructor as first scorer • Logistical Issues

  14. Adjusting the “Check” • CAAP went away • Assessment method did not match our outcomes • Student Perceptions Survey went away • Indirect perceptions did not match the direct results • Assessment Coordinator • Half-time faculty member joined IR

  15. Revolutionary Adjustment 2010-2013

  16. Current State Value Stream Mapping • WASTE: Compensating Steps, Batched Processes

  17. Who is the Customer? • Higher Learning Commission • VP for Educational Services • Faculty • Students • Community Stakeholder • Other?

  18. Systems Thinking: “The Box Kite” [diagram: Institution, Program, Course, Learner]

  19. Systems Thinking YOU ARE HERE

  20. Systems Thinking [Plan-Do-Check-Adjust diagram: Institution checks aggregated results; Program draws artifacts from courses that support the outcome; Course draws a sample of near-graduates; Learner]

  21. Future State VSM (Excel, One-Piece Flow, Kaizen)

  22. Systems Thinking

  23. Strategic Re-Visioning 2012-2013

  24. Strategic Goal Make it so…that we can use the results of assessment to improve student learning. Gen Ed Re-visioning Team, a sub-team of Curriculum Committee

  25. Incubating the Process • Fall 2013 – Communications • Sample of Near-graduates was selected (N=380) • Full-time faculty in selected courses • First scorers for all students in their class • Adjunct faculty • First scorers for sample of selected students • Send scores and artifacts to IR • N=1110; near-grads are a subset of this group

  26. Incubating the Process • Spring 2014 – Critical Thinking • Same process • May 2014 • General Education Day • Faculty score only the sample of near-graduates’ artifacts as second scorers (and third if needed)

  27. What We’ll Get, Incrementally • Results: over time, learners will know exactly where they are and how far they’ve come

  28. Challenges and Rewards • Perceived changes in faculty workload • Full-time vs. Part-time faculty participation • Data collection – the “Kaizen” technology is not available yet • Scannable rubrics • Electronic worksheets for faculty • Fuel for discussions in the discipline about improving student learning – how do we document the evidence of the follow-up and adjust? • The value proposition for learners – Here is what you’ll get…and we can prove it.

  29. Thank you! What questions do you have? Darby Hiller, dhiller@nmc.edu
