
Evaluation in K-12 Online Learning


Presentation Transcript


  1. Evaluation in K-12 Online Learning: Lessons from the Field

  2. Presenters • Cathy Cavanaugh, University of Florida • Martha Donaldson, Alabama ACCESS • Mickey Revenaugh, Connections Academy • Donna Scribner, VHS Inc. • Moderator: Tom Clark, TA Consulting. Special thanks to Brian Lekander, OII.

  3. Agenda

  4. Evaluating Online Learning: Challenges and Strategies for Success. The latest guide in the Innovations in Education series. Office of Innovation & Improvement, U.S. Department of Education

  5. Why Did OII Prepare This Guide? • Continued skepticism in some quarters about the quality of online learning • Desire for greater accountability in government funding programs • Concern that innovation often outpaces what we know about educational technology • Evaluators of federally funded projects are sometimes unfamiliar with online technologies

  6. Guide Based on Seven Case Studies • Alabama ACCESS • Algebra I Online (Louisiana) • Appleton eSchool (Wisconsin) • Arizona Virtual Academy • Chicago Public Schools Virtual High School • Digital Learning Commons (Washington) • Maryland Public Television’s Thinkport

  7. Who is this Guide For? • Online program administrators who need to think strategically about evaluation – and how they will use it as their program evolves • Evaluators who are not very experienced with online learning and the challenges it presents

  8. What is the Guide’s Focus? • Challenges Commonly Encountered in Evaluating Online Learning • Instructive Examples of Responses to These Challenges

  9. Six Evaluation Challenges Featured • Meeting the Needs of Multiple Stakeholders • Building on the Existing Base of Knowledge • Evaluating Multifaceted Online Resources • Finding Appropriate Comparison Groups • Interpreting the Impact of Program Maturity • Translating Evaluation Findings into Action

  10. Some Things OII Has Learned • Evaluation of online learning is inherently political • Programs will always be pressed for information/results before they would like to be • It’s best to be proactive by anticipating the needs of your stakeholders • You won’t always be able to control the spin on your results • It’s helpful to think of evaluation as an ongoing process

  11. Where Can I Get the OII Guide? • To view online: http://www.ed.gov/about/pubs/intro/innovations.html • From Ed Pubs: Order online at http://www.edpubs.gov or call 1-877-433-7827 (request order number ED004344P)

  12. Quality and Effectiveness in Online Learning: Issues Brief. Research Committee, North American Council for Online Learning (NACOL)

  13. NACOL Research Committee: Quality and Effectiveness in K-12 Online Teaching • Research-Based Practices • Practices Currently Adopted by Online Course Providers • Practices and Policies for K-12 Online Teaching and Learning • Online Professional Development Standards Across North America • Features of Teaching in Virtual Schools

  14. NACOL Research Committee • Online Teacher Support Programs: Mentoring and Coaching Models • Descriptions of mentoring/coaching relationships in vignettes from the perspectives of several virtual high schools • Alabama ACCESS Distance Learning • Colorado Online Learning • Florida Virtual School • Idaho Digital Learning Academy • Mississippi Virtual School • Missouri Virtual Instructional Program • Tennessee: e4TN • Virtual High School Global Consortium

  15. Introduction • Why evaluate online learning? • To demonstrate the value or worth of your program • To improve your program over time • To document participant outcomes related to program goals • To meet stakeholder interests/accountability needs

  16. Introduction • What happens when evaluation is neglected? • Program set up without clear goals • Focus on activities and simple outputs • Desired change in participant outcomes undefined • Data essential to studying success remain undefined and ungathered • Focus on anecdotal evidence and testimonials • Stakeholder information needs neglected • Program unable to demonstrate its value or worth

  17. Preparing to Evaluate • Question posed to panelists (program managers & evaluators): Please think back to the start of the online learning program. What did you do to prepare to evaluate quality, effectiveness, and impact? • Question for current programs: How can you set up your online learning program to study quality & effectiveness over time?

  18. Alabama ACCESS: Look at Your Specific Objectives Preparing to Evaluate To Provide: • Equal Access to High Quality Instruction • An Infrastructure That Delivers Quality Learning Opportunities • Greater Equity for All Alabama Public High School Students Through Cutting-Edge Technology • Wide Range of Courses Available to Relatively Few Alabama Students Today (“Advanced Diploma” Courses, Advanced Placement Courses, Additional Course Offerings, Remediation and Supplemental Resources)

  19. ACCESS—Established Need for Evaluation—Quality, Effectiveness, Impact Preparing to Evaluate • Built Into Program Design • Purpose • Provide Evidence that Goals of Initiative are Met • Gauge Satisfaction with Courses • Recommend Changes in Procedures and Resources to Improve and Strengthen Program and Increase its Positive Impact • Determine Effectiveness of Regional Support Centers

  20. ACCESS—Preparation for Evaluation Preparing to Evaluate • Development of RFP at Beginning of Initiative • Development of Evaluation Plan with ISTE • Identification of: • Questions to be Answered • Data that will be Needed • Data that will not be Available • Evaluation Methods to be Used • Development of Data Collection Instruments

  21. ACCESS—What Worked Financial support from the state. Rapid expansion of the infrastructure. Highly rated Regional Support Centers. Thousands of students take classes that would be otherwise unavailable. Teaching practices change with increased technology integration and student-centered pedagogy. Preparing to Evaluate

  22. ACCESS—Strategy-in-Progress Seek out teachers who have both virtual and face-to-face (F2F) students. Compare outcomes for students in similar classes with the same teacher. Preparing to Evaluate
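A minimal sketch of how such a same-teacher comparison might be run, assuming a hypothetical enrollment table with teacher, delivery-mode, and final-score columns (file and column names invented for illustration):

```python
import pandas as pd
from scipy import stats

# Hypothetical roster: one row per student enrollment, with columns
# teacher_id, mode ("virtual" or "f2f"), and final_score.
roster = pd.read_csv("enrollments.csv")

# Keep only teachers who taught students in both delivery modes.
n_modes = roster.groupby("teacher_id")["mode"].nunique()
dual = roster[roster["teacher_id"].isin(n_modes[n_modes == 2].index)]

# Average outcome per teacher and mode, then pair the two columns by teacher.
means = dual.pivot_table(index="teacher_id", columns="mode",
                         values="final_score", aggfunc="mean")
t, p = stats.ttest_rel(means["virtual"], means["f2f"])
print(f"paired t = {t:.2f}, p = {p:.3f} across {len(means)} teachers")
```

Pairing by teacher controls for instructor effects, which is exactly why the strategy targets teachers with students in both modes.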

  23. ACCESS—Evaluation Issues Availability of data (e.g., lack of common end-of-course tests). Access to data (e.g., Advanced Placement records held by testing vendor). Technological barriers (e.g., firewalls prevent submission of survey responses). Research design challenges. Preparing to Evaluate

  24. VHS Global Consortium Mission To develop and deliver standards-based, student-centered online courses to expand students’ educational opportunities and 21st-century skills, and to offer professional development to teachers to expand the scope and depth of their instructional skills. Preparing to Evaluate

  25. VHS Believes that: Preparing to Evaluate • Student-centered online courses can be designed and delivered to promote a high-quality collaborative learning environment in which student exchange and interaction are a valued component of the instructional process. • Educational opportunity need not be limited by barriers of time, place, or lack of qualified faculty; high-quality education is possible today for all students in all locations. • Online education offers any school with Internet connectivity a wealth of trained, experienced faculty members qualified in numerous disciplines, teaching a wide array of courses designed to meet the needs of all students. An innovative, standards-based curriculum delivered online offers diverse, exciting learning choices for students, and the opportunity and skills to participate in a national and global community.

  26. VHS Believes that: • Online teaching should augment rather than replace traditional classroom teaching. • The Virtual High School's online courses are a proven, flexible solution for schools needing an expanded curriculum, teachers seeking new horizons, parents wanting more involvement with their children's education, and a society grappling with ways to offer opportunity to all its citizens. • The goals of education are advanced best by putting value and service first. • When schools work together in a collaborative network such as VHS, they become part of an abundant and generous educational community that promotes the affordable sharing of professional resources.

  27. VHS Global Consortium Preparing to Evaluate • 1996 – Technology Innovation Grant • 5-year, $7.4 million • US Department of Education • Non-profit; non-degree granting • Consortium of schools • 575+ member schools • 30 states; 39 countries • 11,000+ students

  28. VHS Global Consortium Preparing to Evaluate • Program Evaluation • Outside reviewer • Quality, Growth, Program Goals • Surveys: Superintendents, Principals, VHS Teachers, VHS Students, VHS Site Coordinators • Published Annually • Available to the Public via Website (www.goVHS.org) • Quality Benchmark Indicators (QBIs)

  29. VHS Global Consortium Preparing to Evaluate • QBIs • Growth Indicators measured against Growth Goals • Quality Indicators – tie back to Mission & Beliefs • Quality of Courses • Quality of Professional Development • Quality of Services & Program
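As a rough illustration of the QBI idea (the figures below are invented, not VHS's actual goals or results), a growth report reduces to comparing each measured indicator against its stated goal:

```python
# Invented QBI-style figures, for illustration only.
growth_goals = {"member_schools": 600, "student_enrollments": 12000,
                "course_completion_rate": 0.90}
measured = {"member_schools": 575, "student_enrollments": 11000,
            "course_completion_rate": 0.92}

for indicator, goal in growth_goals.items():
    value = measured[indicator]
    status = "met" if value >= goal else "below goal"
    print(f"{indicator}: {value} vs. goal {goal} -> {status}")
```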

  30.–33. VHS Global Consortium Preparing to Evaluate (four slides of QBI data charts; the graphics were not captured in this transcript)

  34. Connections Academy Preparing to Evaluate • Connections Academy programs are mostly full-time and include K-8 • Unique research challenges: seeking data from younger children and parents; no additional “program ally” such as a site facilitator • Unique research benefits: address the whole learner, gather all demographics, include state test results/NCLB data

  35. Connections Academy Preparing to Evaluate • Built into Connections Academy program: • SIS data within our LMS • Data analysis: Data views • Log: Teacher communication, action • Parent Satisfaction Survey • StarTracker: Embedded feedback on every lesson plus school as a whole • Measurable school and company goals

  36. Lessons Learned/Next Steps What are some lessons learned about effective practices in evaluating online learning? What can we learn from the results of an evaluation? How can we use results to improve the program? To inform stakeholders & decision-makers?

  37. Lessons Learned/Next Steps Some Lessons Learned (OII, 2008) – Evaluations should: • Effectively inform stakeholder groups • Share tools and research methods • Focus on outcomes, not activities • Recruit willing research populations early • Obtain data access, or plan to gather it • Move from formative to summative • Disseminate timely information to internal & external decision-makers

  38. Lessons Learned/Next Steps What can we learn from an evaluation? • Satisfaction measures • Quality & effectiveness measures • Changes in knowledge & skills via participation (participant outcomes) How can we use results to improve the program? To inform stakeholders & decision-makers?

  39. Lessons Learned/Next Steps Question posed to panelists (program managers and evaluators): Based on your experience using or conducting evaluations of online learning programs, please share: 1) What are some lessons you’ve learned about effective evaluation practices? 2) How has evaluation helped you improve an online learning program or demonstrate its worth?

  40. Connections Academy When Evaluation Pays Off • Early MoVIP K-5 results: user satisfaction high, teachers make the difference – validation of the model • Mississippi K-8 pilot: even a small, short pilot can be positively revealing if designed with evaluation in mind • Ongoing Parent Satisfaction Surveys: overall high rates (90%+) persist, and improvements in “iffy” areas are absolutely trackable Lessons & Next Steps

  41. Connections Academy Lessons Learned • No substitute for familiarity: evaluators need to be equipped to really dig into the curriculum and platform • Data transmission is an art: foster friendships between the evaluators and the program data wonks • Positive results are no guarantee: as in the Mississippi example, they can’t make up for lack of support Lessons & Next Steps

  42. Connections Academy Lessons Learned • Transparency takes some getting used to: educators are not accustomed to having their practice so visible • Evaluation is only half the battle: true continuous improvement takes precision and persistence • Patience is a virtue: user satisfaction and academic results may diverge at the beginning, but will converge if students stay Lessons & Next Steps

  43. VHS—Lessons Learned Don’t make decisions on a snapshot in time… use longitudinal data and look for trends. Evaluation criteria derive from objectives; objectives derive from mission. It may be interesting, but does it inform? Lessons & Next Steps
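A sketch of that longitudinal view, with invented yearly figures: fitting even a simple trend across several years tells a different story than any single snapshot would.

```python
import numpy as np

# Invented yearly values of an evaluation metric (e.g., a satisfaction rate).
years = np.array([2004, 2005, 2006, 2007, 2008])
metric = np.array([0.84, 0.86, 0.85, 0.89, 0.91])

# A first-degree fit gives the direction and size of the trend.
slope, intercept = np.polyfit(years, metric, 1)
print(f"trend: {slope:+.3f} per year; the 2006 snapshot alone: {metric[2]:.2f}")
```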

  44. VHS—Lessons Learned (continued) Lessons & Next Steps • Continuous Course Improvement • Student • Professional Development • Need for Teacher Support • Progress • Elluminate Sessions • Faculty Advising on a graduated scale

  45. How has the evaluation helped to improve the ACCESS program? Lessons & Next Steps Year I Finding: Course Changes Needed. Implications/Improvements: • Course Revisions Made • Increased Use of Voice Tools, Addition of Improved Speaking Assignments and Examples, and Use of Headphones in Foreign Language Courses • More Detailed Alignment and Gap Analysis Process • Addition of Course Development/Revision Component

  46. How has the evaluation helped to improve the ACCESS program? Lessons & Next Steps Year I Finding: Need for Better/Increased Communication and Interaction Among Teachers, Students, Facilitators, and Support Center Staff. Implications/Improvements: • Introduction of Regular Faculty Meetings via Web Conferencing • Assignment of an SDE Liaison to Each Support Center Region • Expansion of Contract for Web Conferencing Capability • Additions to Training Agenda • Review of Teacher Pay Issues

  47. How has the evaluation helped to improve the ACCESS program? Lessons & Next Steps Year I Finding: Need for Additional Professional Development and Training Modules. Implications/Improvements: • Development of Additional Training Modules • Modification of Professional Development Plan • Addition of an SDE Staff Member to Coordinate Professional Development • Establishment of Teacher Mentoring Plan • Development of C.A.S.T. Site for Teachers

  48. How has the evaluation helped to improve the ACCESS program? Lessons & Next Steps Year I Findings: Need for Assistance With Scheduling, Registration, and Enrollment Issues; Increased Number of Students Not Prepared/Ready for Assigned Class. Implications/Improvements: • Development of Training Module for Counselors on the Registration Process • Expansion of Meetings with Counselors (Regional and State Meetings) • Decision to Develop a New Student Registration Site • Onsite Visits and Individualized Telephone Calls to Assist With the Process • Further Look at Course Prerequisites

  49. How has the evaluation helped to improve the ACCESS program? Lessons & Next Steps Year I Finding: Need for Assistance With Technical Issues. Implications/Improvements: • Use of SDE and Regional Support Center Helpdesks • Additional School Visits • Addition of Staff at SDE • Equipment/Connectivity Checks by SDE, Support Centers, and the Alabama Supercomputer Authority • Increased Communication With School Staff • Identification of Key Areas of Concern

  50. How has the evaluation helped to improve the ACCESS program? Lessons & Next Steps Year I Finding: Need for Increased Number of Responses on Interviews and Surveys. Implications/Improvements: Identification of Reasons for the Poor Response Rate • Timing • Surveys Used • Filters/Blocks at School Level • Content/Clarity Issues
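A small sketch of how low-response schools might be flagged for that kind of follow-up, assuming a hypothetical survey log with invited and returned counts per school (file and column names invented):

```python
import pandas as pd

# Hypothetical survey log: columns school, invited, returned.
log = pd.read_csv("survey_log.csv")
log["rate"] = log["returned"] / log["invited"]

overall = log["returned"].sum() / log["invited"].sum()
# Schools far below the overall rate are candidates for the timing,
# filter/block, and content/clarity problems the evaluation identified.
flagged = log[log["rate"] < 0.5 * overall]
print(f"overall response rate: {overall:.0%}")
print(flagged[["school", "rate"]].to_string(index=False))
```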
