
Online Course Evaluations


Presentation Transcript


  1. Online Course Evaluations Jackie Charonis Assistant Vice Provost for Student Affairs & Associate University Registrar Collecting, Consolidating, and Leveraging Data

  2. Stanford University • Private research university founded in 1891 • Our Student Body • 6,689 Undergraduates • 8,201 Graduates • Our Faculty • 1,807 Tenure-line faculty • 65 Academic departments and interdisciplinary programs • Over 3,000 classes scheduled each quarter

  3. Student Systems at Stanford • PeopleSoft SA 8.0 – student records database • Resource 25/Schedule 25 3.3 – scheduling software • OnBase - document imaging • What Do You Think – course evaluation software

  4. Developing the System: Our Requirements • Vendor-delivered ASP solution: CollegeNET • Integrated approach: “Stanford” presence and security via our portal • Self-service application collects and displays results • Paper-free process including email reminders & announcements • Existing reports replicated in new online system • Ability to distribute raw data to key users

  5. Our Design Considerations • No changes to existing evaluation forms • One approach for all participating schools • Ease of use (self-service model) • Ease of management (paper-free, hassle-free) • Confidentiality • at least 3 students enrolled in the course or section • combined courses receive one summary of aggregate data • Validity - only one evaluation per enrolled student per course • Accuracy - data validated by third-party statistical consultant
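The confidentiality and validity rules on this slide are concrete enough to express as code. Below is a minimal Python sketch of the two checks; the function names and data structures are hypothetical illustrations, not Stanford's or CollegeNET's actual implementation.

```python
# Hypothetical sketch of the confidentiality and validity rules described
# on the slide; all names and structures are invented for illustration.

MIN_ENROLLMENT = 3  # summaries are suppressed for courses with fewer students

def can_release_summary(enrolled_count: int) -> bool:
    """Confidentiality: release results only when at least 3 students are enrolled."""
    return enrolled_count >= MIN_ENROLLMENT

def record_evaluation(submissions: dict, student_id: str, course_id: str,
                      answers: list) -> bool:
    """Validity: accept only one evaluation per enrolled student per course."""
    key = (student_id, course_id)
    if key in submissions:
        return False  # duplicate submission is rejected
    submissions[key] = answers
    return True
```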

  6. Our Implementation Timeline • Winter 2005-06 Pilot • 2 volunteer departments • 85 classes and no discussion sections • Spring 2005-06 Early Adopters • 25 volunteer departments • 659 classes and 241 discussion sections • Summer 2005-06 No Paper • Autumn 2006-07 All departments (excl. professional schools) • 1758 courses and 721 discussion sections • Autumn Quarter 2007-08: Law School joins

  7. Collecting Data: The Forms • Two forms currently used: a Course Form and a Section Form • Only officially scheduled courses and enrolled students are included • Independent study courses are not included

  8. Collecting Data: Communication • Email notifications to instructors and students when the evaluation period opens • Email reminders • Email notifications to instructors when data is available for viewing
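A paper-free process hinges on automated mail at these three points: period open, reminders, and results availability. As a rough illustration only (the SMTP host, sender address, and message text are placeholders, not the system's actual configuration), the sending loop might look like:

```python
import smtplib
from email.message import EmailMessage

def send_notifications(recipients, subject, body, smtp_host="localhost"):
    """Send one plain-text message per recipient; all addresses are placeholders."""
    with smtplib.SMTP(smtp_host) as smtp:
        for addr in recipients:
            msg = EmailMessage()
            msg["From"] = "course-evals@example.edu"  # hypothetical sender
            msg["To"] = addr
            msg["Subject"] = subject
            msg.set_content(body)
            smtp.send_message(msg)

# e.g., send_notifications(enrolled_students, "Course evaluations are open",
#                          "Please evaluate your courses by the end of finals week.")
```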

  9. Collecting Data: Timing & Incentives • Evaluation Period • Evaluations are open for two weeks at the end of the term, including finals • Results become available after the grading deadline • Grade Withholding • Grades are always viewable in the system by staff and appear on official transcripts • Grades are released daily for viewing by students who have completed all of their evaluations • Grades become available two weeks after the grading deadline for students who do not complete their evaluations
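The grade-withholding incentive reduces to a small piece of date logic: grades appear in the daily release for students who have completed all of their evaluations, and two weeks after the grading deadline for everyone else. A sketch of that rule (function and parameter names are assumptions):

```python
from datetime import date, timedelta

def grades_visible(today: date, grading_deadline: date,
                   completed_all_evals: bool) -> bool:
    """Return True if a student's grades should be viewable on a given day."""
    if today < grading_deadline:
        return False  # nothing is released to students before the deadline
    if completed_all_evals:
        return True   # daily release for students who finished all evaluations
    return today >= grading_deadline + timedelta(weeks=2)
```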

  10. Our Response Rates

  11. Consolidating Data: Basic Reporting • Means by Department, School, and Areas • Histograms by Department, School, and Areas • Individual Course Summaries • Student Comments
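Reports like these are straightforward aggregations over the response data. A toy example in Python with pandas, using invented column names and ratings purely for illustration (the real schema is not shown in the slides):

```python
import pandas as pd

# Invented sample responses standing in for the evaluation data.
responses = pd.DataFrame({
    "school":     ["H&S", "H&S", "Engineering", "Engineering"],
    "department": ["History", "History", "CS", "CS"],
    "course":     ["HIST 101", "HIST 101", "CS 106", "CS 106"],
    "rating":     [4, 5, 3, 4],
})

# Means by school and department, mirroring the first report above.
print(responses.groupby(["school", "department"])["rating"].mean())

# Histogram counts per rating value for a single course summary.
print(responses.loc[responses["course"] == "HIST 101", "rating"]
              .value_counts().sort_index())
```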

  12. Sample Instructor Summary

  13. Sample Mean Summary

  14. Sample Histogram

  15. Leveraging Data: Aspects of Courses • Class size and course level • Class type (e.g., lecture vs. seminar) • Effects of team teaching • Comparisons with grade distributions
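Comparisons such as single- vs. team-taught courses reduce to comparing mean ratings between two groups. One reasonable approach (the data below are invented, and this is not necessarily the method used by Stanford's statistical consultant) is Welch's t-test, which does not assume equal variances:

```python
from scipy import stats

# Invented mean ratings for single- vs. team-taught courses.
single_taught = [4.2, 3.8, 4.5, 4.0, 3.9]
team_taught   = [4.4, 4.1, 4.6, 4.3, 4.0]

# Welch's t-test compares the group means without assuming equal variances.
result = stats.ttest_ind(single_taught, team_taught, equal_var=False)
print(f"t = {result.statistic:.2f}, p = {result.pvalue:.3f}")
```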

  16. Sample Analysis: Single vs. Multiple Instructors

  17. Sample Analysis: Single vs. Crosslisted Courses

  18. Leveraging Data: Aspects of Instructors • Same course but different instructors • Tenured vs. untenured instructors • Male vs. female instructors

  19. Sample Analysis: Faculty vs. Visiting Instructors

  20. Sample Analysis: Gender of Instructors

  21. Leveraging Data: Aspects of Students • Self-reported demographic information • Male vs. female students

  22. Evaluating the Evaluation Process • Did the change in medium affect the results? • Did the change in the timing of the evaluation period adversely affect the results? • Is there meaning behind blank submissions, or submissions of all ‘1’s, all ‘3’s, or all ‘5’s?
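Blank and straight-lined submissions are easy to flag mechanically before asking what they mean. A minimal sketch (the slides do not specify how such submissions are actually handled, so this is illustrative only):

```python
def is_blank(answers: list) -> bool:
    """A submission with no responses at all."""
    return all(a is None for a in answers)

def is_straight_lined(answers: list) -> bool:
    """A submission where every answered item got the same rating
    (e.g., all 1s, all 3s, or all 5s)."""
    given = [a for a in answers if a is not None]
    return len(given) > 1 and len(set(given)) == 1
```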

  23. Examples of Analysis: Paper vs. Electronic

  24. The Future of Course Evaluations • Benchmarking • Evaluating learning rather than teaching • Evaluating the impact of instructor training programs • Developing systems for reviewing qualitative data • Expanding the use of course evaluation data

  25. The Future: Mashing Data

  26. Questions & Answers • Jackie Charonis, Assistant Vice Provost for Student Affairs & Associate University Registrar, charonis@stanford.edu • Paul Casey, Associate Vice President - Sales, pcasey@collegenet.com
