
 “Improving Your Program Assessment Report.” 



Presentation Transcript


1. “Improving Your Program Assessment Report.”
University Assessment Committee: Debra Ballinger and Adam McGlynn

2. Purpose
• …to share what the UAC has learned from reviewing reports across campus.
• …to learn from others' experiences.
• …to work on reports, ask questions, and get real-time help before reports are due October 15.
• …to share how to submit reports and how to seek assistance, if needed.
• …to learn how the evaluation process is conducted.

3. What the UAC has learned
• Departments use inconsistent formatting, which makes reliable reviewing difficult. Some used previous reporting formats. (Cutting and pasting from accreditation reports is not sufficient.)
• Many departments misunderstand the purpose and use of the reports: they are critical to writing university accreditation reports and to making informed decisions about university assessment, curriculum, and student learning. (*New Culture of Assessment)
• There is still a lack of understanding about indirect and direct assessments, and that BOTH are recommended for all departmental assessments.
• Departments don't realize that reviewers don't have backgrounds in the departments being assessed (too much professional jargon, too many unstated assumptions). Reports need to be written for "outside eyes."
• Some departments didn't report student learning outcome goals, or the goals in their reports were not aligned to University SLOs. SLOs are not being assessed; rather, departments are reporting on program assessments or accreditation reports with information unrelated to student learning outcomes.

4. Today and the Future
• Although you may learn what you believe you need to know today to write your report, you may find you have more questions once you are back in the office or work group.
• Good news: additional assistance is available!
• UAC Assessment Consultant Team (ACT) members can help, but they need advance notice before the report-writing deadline.
• To learn more about ACT or to schedule a meeting/session with an ACT member, contact Chris Dudley [Cdudley@esu.edu].

5. Getting started: the Report Template
• Handout: report template
• I. Program Information
• Report on the LAST academic year (2013–2014) only.
• Please include e-mail and phone contact information for the Chair and the Department Assessment Coordinator.
• II. Program-Specific Student Learning Outcomes (Educational Objectives) Assessed During the Last Academic Year
• If all SLOs are not assessed each year, please explain why, and when they are assessed.
• List, in tabular or bullet form, the specific program SLOs and tie each to the University SLOs (a hypothetical example follows).
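For instance, a single entry might read as follows (the SLO wording and the University SLO name here are hypothetical, for illustration only):
• Program SLO 2: Students design and defend an original research proposal → tied to University SLO: Critical Thinking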

6. Report Template, cont.
• III. Direct Measures Used
• Course-embedded assessments (tests, projects, oral presentations, graded homework)
• Rubrics
• Direct observations of internships with a rubric
• Portfolios
• External certification/licensure exams
• Other?
**The key is that assessments capture observable behaviors with defined scoring rubrics.
• Ideally, rubrics have been reviewed and agreed upon by all department faculty.
• Training in the use of assessments (such as internship evaluations) usually occurs.

7. Report Template, cont.
• IV. Indirect Measures Used
• Surveys (e.g., SurveyMonkey)*
• Student opinion polls*
• Faculty course evaluations
• Site supervisor / internship placement personnel feedback and questionnaires
• Other?

8. Using Course Grades as Assessments
• Course grades are a valid assessment measure ONLY if departments clearly explain how those grades are derived and which part of the grade is indicative of achievement of which of the program's SLOs.
• Show the scoring criteria for the portion of the grade tied to each SLO.
• Describe the number and percentage of students passing the item tied to the SLO, not the number and percentage passing the course. For example (hypothetical numbers), report that 21 of 25 students (84%) met the target on the exam question tied to SLO 2, rather than that 90% of students passed the course.
• Why?
• Grading criteria are often neither clear nor the same for each professor teaching a course.
• Grades often comprise a variety of components, some directly related and some not at all related to one or more specific student learning outcomes.
• Grades typically "commingle" different assessments to achieve a final score, and this is not a reliable or valid measure of achievement of any one specific student learning outcome.

9. Using Appendices
• Appendices should be used only in rare circumstances.
• Assessment documentation, results, and student learning outcomes should be included within the report, not as a separate document.
• Use appendices only when absolutely necessary, and refrain from repeated citations of appendices in the report (e.g., "See Appendix 1 for explanation").
• If an item needs to be explained, please do so in the report itself.
• PLEASE REMEMBER: reviewers are volunteer faculty members who, like you, are extremely busy. The time available for reviews is limited, and clarity, simplicity, and completeness are helpful to their review process.

10. Report Template, cont.
• V. Student Performance Outcomes
• Using the table, fill in the assessment name, target or passing scores, and number of students assessed (a hypothetical example follows).
• Results (% who met or did not meet standards)
• Key findings:
• Briefly summarize (in no more than a page) the results and how they compare to the SLOs.
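A completed table row might look like this (the assessment name and all numbers are hypothetical, for illustration only):
• Assessment: Capstone project rubric | Target: score of 3 or higher on a 4-point scale | Students assessed: 25 | Results: 21 of 25 (84%) met the standard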

11. Report Template, cont.
• VII. Describe the Process Used by Program Faculty to Discuss and Interpret Key Findings
• (Be sure to include how, when, and how often faculty met to discuss results.)
• Discuss why faculty believe the results turned out as they did.
• Changes Made as a Result of the Key Findings / Actions Taken
• Are program changes being recommended?
• Changes in protocol?
• Changes in assessments?
• Changes in the timeline for assessment?
• Follow-up from previous report findings and suggestions?
• Other?

12. Report Template, cont.
• IX. Adjustments to / Deviations from the Department Assessment Plan
• Example: did an accreditation team's site visit preclude an assessment or add additional ones?
• Faculty changes and teaching assignments may have interrupted the assessment cycle.
• Other?

13. What the levels mean in ratings
If your report was rated as:
• Level 4 – The program report demonstrates a consistent and significant process for the use of assessment data to improve student learning. This process includes the use of multiple direct and indirect measures to assess student learning. The program has created a "Culture of Assessment" in which assessment has become institutionalized in order to achieve continual improvement of student learning. (*You are a model for other departments!)
• Level 3 – The program has developed an assessment program and has provided substantial evidence that it is employing assessment measures and using the data from these measures to improve student learning. This includes the use of both direct and indirect measures to assess student learning.
• Level 2 – The program has developed an assessment program and has provided some evidence that it is employing assessment measures to improve student learning. However, the assessment report demonstrates an inconsistent or sporadic implementation of its assessment plan. This may include an unbalanced use of direct or indirect measures.
• Level 1 – The program is in the early phases of developing an assessment program and has not yet provided evidence that it is employing assessment measures to improve student learning.
• **Remember: the ratings are based on the report that was submitted, which is the only information reviewers have on which to base their decisions and ratings.

14. Submitting the report
• Reports must be sent electronically to the University Assessment Committee e-mail address: esuac@esu.edu
• They are then uploaded to TracDat, where reviewers access and review them.
• Once completed, reviews are returned to the Department Chair and the Department Assessment Coordinator.
• Eventually, each department will have access to TracDat.

15. Group work
• Leaders circulate to help and respond to questions.

  16. Your questions?
