
Using IDEA for Assessment, Program Review, and Accreditation

Using IDEA for Assessment, Program Review, and Accreditation. Texas A&M University, November 8, 2012. Shelley A. Chapman, PhD. Plan for this session: program evaluation and assessment of student learning; Group Summary Reports; aggregate data file; benchmarking reports.



Presentation Transcript


  1. Using IDEA for Assessment, Program Review, and Accreditation Texas A&M University November 8, 2012 Shelley A. Chapman, PhD

  2. Plan for this Session • Program Evaluation & Assessment of Student Learning • Group Summary Reports • Aggregate Data File • Benchmarking Reports • Accreditation Guides

  3. What makes IDEA unique? • Focus on Student Learning • Focus on Instructor’s Purpose • Adjustments for Extraneous Influences • Validity and Reliability • Comparison Data • Flexibility

  4. Underlying Philosophy of IDEA Teaching effectiveness is determined primarily by students’ progress on the types of learning the instructor targets.

  5. Faculty Information Form

  6. Diagnostic Report Overview • Page 1 – Big Picture • How did I do? • Page 2 – Learning Details • What did students learn? • Page 3 – Diagnostic • What can I do differently? • Page 4 – Statistical Detail • Any additional insights?

  7. The Big Picture (footnote: If you are comparing Progress on Relevant Objectives from one instructor to another, use the converted average.)
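
The footnote's advice can be made concrete with a small sketch. It assumes (as a simplification) that converted averages are T-scores with mean 50 and SD 10 relative to the comparison database; the database mean and SD used below are invented for illustration, not IDEA's published norms.

```python
# Sketch: mapping a raw 5-point average onto a converted (T-score) scale,
# assuming mean 50 and SD 10 relative to the comparison database.
# db_mean and db_sd are made-up values, not actual IDEA norms.

def converted_average(raw_avg, db_mean=3.8, db_sd=0.5):
    """Convert a raw 5-point average to a T-score (mean 50, SD 10)."""
    return 50 + 10 * (raw_avg - db_mean) / db_sd

# Two instructors with raw averages 4.3 and 3.6 on the five-point scale:
print(round(converted_average(4.3)))  # 60: above the database mean
print(round(converted_average(3.6)))  # 46: below the database mean
```

Because the conversion removes the database mean and spread, the converted figures are comparable across instructors in a way raw averages are not.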

  8. Progress on Relevant Objectives [chart: average progress ratings, e.g. 4.3, 4.1, 4.2, 3.6, on the five-point scale]

  9. Summary Evaluation: Five-Point Scale [chart: weighting of 50% / 25% / 25%]

  10. Using Evidence to Improve Student Learning

  11. Individual Reports to Group Reports

  12. The Group Summary Report How did we do? How might we improve?

  13. Defining Group Summary Reports (GSRs) • Institutional • Departmental • Service/Introductory Courses • Major Field Courses • General Education Program

  14. GSRs Help Address Questions • Longitudinal • Contextual • Curricular • Pedagogical • Student Learning-focused

  15. Adding Questions Up to 20 questions can be added • Institutional • Departmental • Course-based • All of the above

  16. Local Code Use this section of the FIF to code types of data.

  17. Defining Group Summary Reports • Local Code • 8 possible fields • Example: Column one – Delivery Format • 1=Self-paced • 2=Lecture • 3=Studio • 4=Lab • 5=Seminar • 6=Online Example from Benedictine University

  18. Example Using Local code Assign Local Code • 1=Day, Tenured • 2=Evening, Tenured • 3=Day, Tenure Track • 4=Evening, Tenure Track • 5=Day, Adjunct • 6=Evening, Adjunct Request Reports • All Day Classes • Local Code=1, 3, & 5 • All Evening Classes • Local Code=2, 4, & 6 • Courses Taught by Adjuncts • Local Code=5 & 6
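
The code-and-filter workflow on the slide above can be sketched in a few lines of Python. The local-code values follow the slide; the course records themselves are invented examples.

```python
# Sketch: filtering course sections by the FIF local code.
# Codes follow the slide: 1=Day/Tenured ... 6=Evening/Adjunct.
# The course records are hypothetical.

courses = [
    {"course": "ENGL 101", "local_code": "1"},  # day, tenured
    {"course": "HIST 210", "local_code": "4"},  # evening, tenure track
    {"course": "MATH 151", "local_code": "5"},  # day, adjunct
    {"course": "BIOL 111", "local_code": "6"},  # evening, adjunct
]

def select(courses, codes):
    """Return course names whose local code is in the requested set."""
    return [c["course"] for c in courses if c["local_code"] in codes]

day_classes     = select(courses, {"1", "3", "5"})  # all day classes
evening_classes = select(courses, {"2", "4", "6"})  # all evening classes
adjunct_classes = select(courses, {"5", "6"})       # taught by adjuncts
print(adjunct_classes)  # ['MATH 151', 'BIOL 111']
```

The same idea extends to all eight local-code fields: each column encodes one dimension, and any report is a filter over the coded records.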

  19. Description of Courses Included in this Report (Page 1 of GSR) • Number of Classes Included: Diagnostic Form 42, Short Form 27, Total 69 • Number of Excluded Classes: 0 • Response Rate: 2 classes below 65%; Average Response Rate 85% • Class Size: Average Class Size 20
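
A minimal sketch of how the page-1 summary figures could be derived from per-class records. The three records are illustrative; the 65% threshold comes from the report description above.

```python
# Sketch: deriving GSR page-1 summary figures from per-class records.
# The class records are invented for illustration.

classes = [
    {"form": "Diagnostic", "enrolled": 25, "responded": 22},
    {"form": "Short",      "enrolled": 18, "responded": 10},
    {"form": "Diagnostic", "enrolled": 17, "responded": 16},
]

rates = [c["responded"] / c["enrolled"] for c in classes]
below_threshold = sum(1 for r in rates if r < 0.65)   # flagged classes
avg_rate = sum(rates) / len(rates)                    # average response rate
avg_size = sum(c["enrolled"] for c in classes) / len(classes)

print(f"Classes below 65% response rate: {below_threshold}")
print(f"Average response rate: {avg_rate:.0%}")
print(f"Average class size: {avg_size:.0f}")
```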

  20. Assessment of Learning What are our faculty emphasizing? How do students rate their learning? How do our courses compare with others? How do our students compare with others (self-rated characteristics)? What efforts can we make for improvement? (How can we “close the loop”?)

  21. Texas A&M University

  22. Are we targeting TAMU SLOs in Core Curriculum? IDEA Learning Objectives

  23. Are we targeting TAMU SLOs in Core Curriculum? IDEA Learning Objectives

  24. What are We Emphasizing? Page 2

  25. What are We Emphasizing?

  26. Do students’ reports of learning meet our expectations? Objective 1: Gaining factual knowledge (terminology, classifications, methods, trends) Pages 5 and 6

  27. How do students rate their learning? Part 1: Distribution of Converted Scores Compared to the IDEA Database Page 3

  28. Overall Progress Ratings (Courses) Page 3 Percent of Classes at or Above the IDEA database Average

  29. Overall Progress Ratings (Courses) Part 3: Percent of Classes at or Above This Institution’s Average Page 4
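
The "percent of classes at or above this institution's average" figure can be computed from class means in a few lines; the class means below are invented for illustration.

```python
# Sketch: percent of classes at or above the institution's own average.
# Class means are hypothetical examples on the five-point scale.

class_means = [4.1, 3.6, 4.4, 3.9, 4.0]

inst_avg = sum(class_means) / len(class_means)
at_or_above = sum(1 for m in class_means if m >= inst_avg)

print(f"{at_or_above / len(class_means):.0%} at or above {inst_avg:.1f}")
```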

  30. Which teaching methods might we use to improve learning? Page 7 Teaching Methods and Styles

  31. Relationship of Learning Objectives to Teaching Methods

  32. How do students view course work demands? Page 8B Student Ratings of Course Characteristics

  33. Aggregate Data File Allows you to • Use Excel Spreadsheet • Use with SAS or SPSS • Ask other types of questions • Display data in different ways

  34. Instructors’ Reports on Course Emphases: Selected Pairings (Writing and Oral Communication)

  35. Instructors’ Reports on Course Emphases: Selected Pairings (Critical Thinking and Writing)
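
Pairings like the two above can be tallied directly from the aggregate data file. The field names and rows below are hypothetical stand-ins for IDEA's actual column layout, which the real file would supply.

```python
# Sketch: counting sections whose instructors emphasize two objectives
# together, from aggregate-data-file rows. Field names and row values
# are hypothetical, not IDEA's real columns.

rows = [
    {"writing": "Essential", "oral": "Important", "critical": "Minor"},
    {"writing": "Important", "oral": "Minor",     "critical": "Essential"},
    {"writing": "Minor",     "oral": "Minor",     "critical": "Essential"},
]

EMPHASIZED = {"Important", "Essential"}

def paired(rows, a, b):
    """Count sections that emphasize both objective a and objective b."""
    return sum(1 for r in rows if r[a] in EMPHASIZED and r[b] in EMPHASIZED)

print(paired(rows, "writing", "oral"))      # writing + oral communication
print(paired(rows, "writing", "critical"))  # writing + critical thinking
```

This is the kind of question a Group Summary Report does not answer directly but the aggregate file makes straightforward, whether in Python, Excel, SAS, or SPSS.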

  36. Benchmarking Institutional and Discipline Reports

  37. Benchmarking Reports • Comparison to • 6-10 Peers • Same Carnegie Classification • IDEA database

  38. Benchmarking Reports • The student, rather than the class, is the unit of analysis • Percentage of positive ratings is given rather than averages
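
A small sketch of this metric: pool individual student ratings across classes and report the positive share, assuming "positive" means a 4 or 5 on the five-point scale. The ratings are invented for illustration.

```python
# Sketch: student-level percent-positive, as in the benchmarking reports.
# Each inner list holds one class's individual student ratings (invented);
# "positive" is assumed to mean 4 or 5 on the five-point scale.

class_ratings = [
    [5, 4, 3, 2],      # class A
    [4, 4, 5, 5, 3],   # class B
]

# Pool students across classes: the student is the unit of analysis.
all_ratings = [r for cls in class_ratings for r in cls]
pct_positive = sum(1 for r in all_ratings if r >= 4) / len(all_ratings)

print(f"{pct_positive:.0%} positive")
```

Note the contrast with the class-level view: averaging the two class means would weight a 4-student class the same as a 5-student class, while pooling weights every student equally.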

  39. Report Summary

  40. Report Summary

  41. Comparison Groups * Peer group is based on 6-10 institutions identified by your institution

  42. Students’ Perceptions: Gen Ed

  43. Background for Specialization

  44. Instructional Objectives Selected by Instructors • Instructors’ Intentions/Focus • Students’ Self-Reported Progress on Learning

  45. IDEA Objective 3 Learning to apply course material (to improve thinking, problem solving, and decisions)
