
Using IDEA for Assessment, Program Review, and SACS



Presentation Transcript


  1. Using IDEA for Assessment, Program Review, and SACS University of Alabama Birmingham September 11, 2012 Shelley A. Chapman, PhD

  2. Plan for this Session • Program Evaluation & Assessment of Student Learning • Group Summary Reports • Aggregate Data File • Benchmarking Reports • Accreditation Guides

  3. What makes IDEA unique? • Focus on Student Learning • Focus on Instructor’s Purpose • Adjustments for Extraneous Influences • Validity and Reliability • Comparison Data • Flexibility

  4. Student Learning Model: 2 Assumptions Assumption 1: Types of learning must reflect the instructor’s purpose.

  5. Student Diagnostic Form Assumption 2: Effectiveness determined by students’ progress on objectives stressed by instructor

  6. Diagnostic Report Overview • Page 1 – Big Picture • How did I do? • Page 2 – Learning Details • What did students learn? • Page 3 – Diagnostic • What can I do differently? • Page 4 – Statistical Detail • Any additional insights?

  7. The Big Picture • If you are comparing Progress on Relevant Objectives from one instructor to another, use the converted average.

  8. Progress on Relevant Objectives • The slide shows a worked example combining progress ratings of 4.3, 4.3, 4.1, 4.2, and 3.6 on five relevant objectives (see the sketch below).
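A minimal sketch of how a Progress on Relevant Objectives score could be computed from per-objective progress ratings, using the numbers recovered from the slide. The double weight for "Essential" objectives is an assumption about the weighting scheme, not something stated on the slide; with equal weights the five ratings simply average.

```python
# Sketch only (not IDEA's official code): average student progress ratings on the
# objectives the instructor marked as relevant. Weighting "Essential" objectives
# twice as heavily as "Important" ones is an assumption.

def progress_on_relevant_objectives(ratings, relevance):
    """ratings: {objective: mean progress rating on a 1-5 scale}
    relevance: {objective: 'Essential' | 'Important' | 'Minor'}"""
    weights = {"Essential": 2, "Important": 1}  # "Minor" objectives are excluded
    total = weight_sum = 0.0
    for obj, rating in ratings.items():
        w = weights.get(relevance.get(obj, "Minor"), 0)
        total += w * rating
        weight_sum += w
    return total / weight_sum if weight_sum else None

# The ratings shown on the slide, all treated as equally weighted ("Important"):
ratings = {"Obj 1": 4.3, "Obj 2": 4.3, "Obj 3": 4.1, "Obj 4": 4.2, "Obj 5": 3.6}
relevance = {obj: "Important" for obj in ratings}
print(round(progress_on_relevant_objectives(ratings, relevance), 1))  # 4.1
```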

  9. Summary Evaluation: Five-Point Scale • Weighted 50% / 25% / 25% across its components (see the sketch below).
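A short sketch of how the 50/25/25 weighting shown on the slide could combine three component scores into one five-point summary. Mapping the components to Progress on Relevant Objectives, "Excellent Teacher," and "Excellent Course" is an assumption here.

```python
# Sketch of a weighted summary on the five-point scale, assuming the 50/25/25 split
# applies to progress, "excellent teacher", and "excellent course" ratings.

def summary_evaluation(progress, excellent_teacher, excellent_course):
    return 0.50 * progress + 0.25 * excellent_teacher + 0.25 * excellent_course

print(round(summary_evaluation(4.1, 4.4, 4.2), 2))  # 4.2 on the five-point scale
```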

  10. Individual Reports to Group Reports

  11. The Group Summary Report How did we do? How might we improve?

  12. Defining Group Summary Reports (GSRs) • Institutional • Departmental • Service/Introductory Courses • Major Field Courses • General Education Program

  13. GSRs Help Address Questions • Longitudinal • Contextual • Curricular • Pedagogical • Student Learning-focused

  14. Adding Questions • Up to 20 questions can be added • Institutional • Departmental • Course-based • All of the above

  15. Local Code • Use this section of the FIF (Faculty Information Form) to code types of data.

  16. Defining Group Summary Reports • Local Code • 8 possible fields • Example: Column one – Delivery Format • 1=Self-paced • 2=Lecture • 3=Studio • 4=Lab • 5=Seminar • 6=Online (Example from Benedictine University)

  17. Example Using Local code Assign Local Code • 1=Day, Tenured • 2=Evening, Tenured • 3=Day, Tenure Track • 4=Evening, Tenure Track • 5=Day, Adjunct • 6=Evening, Adjunct Request Reports • All Day Classes • Local Code=1, 3, & 5 • All Evening Classes • Local Code=2, 4, & 6 • Courses Taught by Adjuncts • Local Code=5 & 6
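A brief sketch of how the local-code groupings from this example could be pulled out of a class-level export when requesting or checking reports. The file name and column name are hypothetical; the code values match the slide's day/evening and appointment-type scheme.

```python
# Sketch of grouping classes by the FIF local code, following the slide's example.
# "fif_classes.csv" and the "local_code" column are placeholders for your own export.
import pandas as pd

classes = pd.read_csv("fif_classes.csv")  # one row per class section

groups = {
    "All Day Classes":            classes[classes["local_code"].isin([1, 3, 5])],
    "All Evening Classes":        classes[classes["local_code"].isin([2, 4, 6])],
    "Courses Taught by Adjuncts": classes[classes["local_code"].isin([5, 6])],
}

for name, subset in groups.items():
    print(name, "-", len(subset), "classes")
```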

  18. Description of Courses Included in this Report (Page 1 of GSR) • Number of Classes Included: Diagnostic Form 42, Short Form 27, Total 69 • Number of Excluded Classes: 0 • Response Rate: 2 classes below a 65% response rate; average response rate 85% • Class Size: average class size 20 (see the sketch below)
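A minimal sketch of how the Page 1 summary figures above could be reproduced from a class-level export, useful for checking a GSR against your own records. The file and column names (form_type, responses, enrolled) are assumptions, not the actual export layout.

```python
# Sketch of the GSR Page 1 summary statistics from a hypothetical class-level export.
import pandas as pd

classes = pd.read_csv("gsr_classes.csv")
classes["response_rate"] = classes["responses"] / classes["enrolled"]

print(classes["form_type"].value_counts())   # Diagnostic Form vs. Short Form counts
print("Total classes included:", len(classes))
print("Classes below 65% response rate:", int((classes["response_rate"] < 0.65).sum()))
print("Average response rate: {:.0%}".format(classes["response_rate"].mean()))
print("Average class size:", round(classes["enrolled"].mean()))
```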

  19. UAB Spring 2012 Page 1 of GSR

  20. Assessment of Learning What are our faculty emphasizing? How do students rate their learning? How do our courses compare with others? How do our students compare with others (self-rated characteristics)? What efforts can we make for improvement? (How can we “close the loop”?)

  21. UAB Core Competencies

  22. Are we targeting “Core Competencies” in the Core Curriculum? IDEA Learning Objectives

  23. Are we targeting “Core Competencies” in the Core Curriculum? IDEA Learning Objectives

  24. What are We Emphasizing? Page 2

  25. UAB Spring 2012

  26. What are We Emphasizing?

  27. How Did Students Rate their Learning on Core Competencies?

  28. Do students’ reports of learning meet our expectations? Objective 1: Gaining factual knowledge (terminology, classifications, methods, trends) Pages 5 and 6

  29. How do students rate their learning? Part 1: Distribution of Converted Scores Compared to the IDEA Database Page 3

  30. Overall Progress Ratings (Courses) Page 3 Percent of Classes at or Above the IDEA database Average

  31. Overall Progress Ratings (Courses) Part 3: Percent of Classes at or Above This Institution’s Average Page 4 (see the sketch below)
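A small sketch of the "percent of classes at or above" comparisons on these two pages: one against a comparison average from the IDEA database and one against the institution's own average. The file, column name, and the 4.0 comparison value are placeholders, not published IDEA norms.

```python
# Sketch of "percent of classes at or above" a comparison average.
import pandas as pd

classes = pd.read_csv("gsr_classes.csv")            # hypothetical export, one row per class
idea_db_average = 4.0                               # placeholder comparison value
institution_average = classes["progress_on_relevant_objectives"].mean()

pct_vs_idea = (classes["progress_on_relevant_objectives"] >= idea_db_average).mean()
pct_vs_inst = (classes["progress_on_relevant_objectives"] >= institution_average).mean()
print("At or above the IDEA database average: {:.0%}".format(pct_vs_idea))
print("At or above this institution's average: {:.0%}".format(pct_vs_inst))
```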

  32. Which teaching methods might we use to improve learning? Page 7 Teaching Methods and Styles

  33. Assessing the QEP for UAB

  34. Relationship of Learning Objectives to Teaching Methods

  35. How do students view course work demands? Page 8B Student Ratings of Course Characteristics

  36. Aggregate Data File • Allows you to: • Use an Excel spreadsheet • Use with SAS or SPSS • Ask other types of questions • Display data in different ways (see the sketch below)
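As an alternative to SAS, SPSS, or Excel, here is a hedged sketch of asking one of those "other types of questions" from the aggregate data file, for example cross-tabbing the emphasis on two objectives the way the pairing slides that follow do. The file name and column names are hypothetical; adapt them to the actual export.

```python
# Sketch of exploring the aggregate data file with pandas.
# "aggregate_data_file.xlsx" and the emphasis columns are assumed names.
import pandas as pd

agg = pd.read_excel("aggregate_data_file.xlsx")   # or pd.read_csv(...) for a CSV export

# Example question: how often do instructors who emphasize oral/written communication
# (objective 8) also emphasize analysis and critical thinking (objective 11)?
pairing = pd.crosstab(agg["objective_8_emphasis"], agg["objective_11_emphasis"])
print(pairing)
```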

  37. Instructors’ Reports on Course Emphases: Selected Pairings: Writing and Oral Communication

  38. Instructors’ Reports on Course Emphases: Selected Pairings: Critical Thinking & Writing

  39. Highlights for Sample U • Remarkably similar profiles across terms • Overall response rates ranged from 66% to 80% • The 1st term in which administration was primarily online achieved a 75% response rate • The transition from paper to online (fall 2009 to fall 2010) does not show major differences in profiles • Sample U faculty focus on 4-5 outcomes as essential/important • Over the last 3 terms, a significant increase on several objectives has been observed: application of course material, oral and written communication skills, & analysis and critical thinking skills (objectives 3, 8, & 11, respectively)

  40. Benchmarking Institutional and Discipline Reports

  41. Benchmarking Reports • Comparison to • 6-10 Peers • Same Carnegie Classification • IDEA database

  42. Comparison Groups * Peer group is based on 6-10 institutions identified by your institution

  43. Benchmarking Reports • The student, rather than the class, is the unit of analysis • Percentage of positive ratings is given rather than averages (see the sketch below)
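A minimal sketch of the benchmarking-style metric described above: each student response, not each class, is one observation, and the result is a percentage of positive ratings. Treating ratings of 4 or 5 on the five-point scale as "positive," along with the file and column names, is an assumption.

```python
# Sketch of a percent-positive metric with the student response as the unit of analysis.
import pandas as pd

responses = pd.read_csv("student_responses.csv")   # hypothetical: one row per student rating
positive = responses["excellent_teacher"].isin([4, 5])  # assumed definition of "positive"
print("Percent positive: {:.0%}".format(positive.mean()))
```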

  44. Response Rates • Student participation at Your University is similar to that of each comparison group • Your University = 79%, Peer = 77%, Carnegie = 79%, National = 75%

  45. Students’ Perceptions
