
Office for Exceptional Children


Presentation Transcript


  1. Special Education Performance Profiles and SPP Compliance Indicator Reviews Office for Exceptional Children

  2. Intended Outcomes Participants will understand: • How the OSEP visit and the SEA Determination have impacted ODE and LEAs • Critical elements of OEC’s SPP Compliance Indicator Reviews and actions they must take

  3. Reasons for Changes • October 2009 – OSEP Verification Visit and Focused Monitoring • March 2010 – OSEP Letter of Findings • June 2010 – SEA Determination

  4. OSEP Report Highlights Identified critical areas in which ODE must improve: • Ensure accurate and reliable data • Expand monitoring system • Monitor spending of Part B funds • Ensure LRE

  5. Other Important Findings OSEP also found ODE noncompliant with SPP Indicators 5, 11, 12, and 13: Indicator 5 – LRE; Indicator 11 – Child Find; Indicator 12 – Transition from Part C to B; Indicator 13 – Postsecondary transition

  6. The Big Picture Indicators 5, 11, 12 and 13 represent the entire life span of a student with a disability while he/she is in school.

  7. SEA Determinations • Meets Requirements 31 states • Needs Assistance 27 states • Needs Intervention (Ohio & DC) • Needs Substantial Intervention 0 states

  8. ODE’s Corrective Action Plan ODE’s Corrective Action Plan (CAP) submitted to OSEP addresses the issues identified and is designed to improve ODE’s federally required system of general supervision of IDEA.

  9. OEC’s Comprehensive System of Monitoring for Continuous Improvement As part of ODE’s CAP: • All LEAs will be reviewed annually at varying levels of intensity • IDEA on-site reviews include fiscal, early childhood and data verification • Selection and scheduling of LEAs for on-site reviews coordinated with PACTS (Federal Program Reviews)

  10. OEC’s Comprehensive Monitoring System for Continuous Improvement [Figure: monitoring methods arranged by level of intensity, with less intensive methods applied to all LEAs and more intensive methods reserved for some or few LEAs]

  11. OEC’s Comprehensive Monitoring System for Continuous Improvement [Same figure as slide 10]

  12. SPP Compliance Indicator Review The purpose of the SPP Compliance Indicator Review is to ensure that LEAs meet SPP targets and are compliant with IDEA in order to improve services and results for students with disabilities.

  13. State Performance Plan (SPP) • Accountability for 20 Indicators • Drives the work of OEC • Progress on indicator targets measured yearly • Results on performance reported to OSEP through Annual Performance Report or APR

  14. SPP Indicators

  15. Review of Data SEAs must review data at least annually for the purpose of identifying noncompliance with IDEA.

  16. Notification of Noncompliance • SEAs must notify LEAs of noncompliance with IDEA in writing • ODE’s written notification = the “Summary Report” (for Compliance Indicator Reviews) • LEAs must correct noncompliance within one year of notification

  17. SPP Compliance Indicator Review • All LEAs receive a Special Education Performance Profile annually • Profile identifies LEAs’ performance on ALL indicators • Includes longitudinal data on ALL targets • LEAs receive a Summary Report that outlines corrective action activities required for compliance on missed indicator targets

  18. Indicators Requiring Action by LEAs • Indicator 4 (Discipline Discrepancy) • Indicators 9 & 10 (Disproportionality) • Indicator 11 (Initial Evaluations) • Indicator 12 (Early Childhood Transition) • Indicator 13 (Secondary Transition Planning) • Indicator 15 (Timely Correction) • Indicator 20 (Timely and Accurate Data) Compliance with submitting surveys: • Indicator 8 (Parent Involvement) • Indicator 14 (Postschool Outcomes)

  19. Special Education Profile Review of Sample Profile

  20. Timeline Sept. 2010 – OEC reviews final 2009-2010 data, discovery of noncompliance Nov. 2010 – Special Education Profiles & Summary Reports (findings issued) Dec. 2010 – Sept. 2011 – Compliance Indicator Reviews: LEAs develop & implement action plans, OEC reviews data to verify correction June 2011 – LEA determinations (based on 2009-2010 data)

  21. Components of Monitoring • Review Data/Student Records • Identification of Noncompliance • Corrective Actions • Verification of Correction (2 prong) • Individual Cases of Noncompliance • Systemic Noncompliance • Verification of Accurate and Timely Reporting • Clearance or Sanctions Applied

  22. Corrective Action Plan • The Corrective Action Plan (CAP) must address individual and systemic issues • Activities must ensure 100% correction • Plan must be submitted within 30 days of written notification

  23. Verification of Correction - 2 Prong Approach • Prong 1 – LEA must correct each individual case of noncompliance; and • Prong 2 – LEA must show that it is correctly implementing the specific regulatory requirements, i.e. it has achieved 100% compliance, based on a review of updated data.
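
For illustration only, here is a minimal sketch of the prong-2 arithmetic described above, assuming a hypothetical list of updated student records with a simple compliance flag; the record layout and field names are invented for this example and are not EMIS or OEC specifications.

```python
# Hypothetical sketch of the prong-2 check: an LEA demonstrates correct
# implementation only when a review of updated records shows 100% compliance.
# The record layout and field names are invented for illustration.

def prong_two_met(updated_records):
    """Return True only if every reviewed updated record is compliant."""
    if not updated_records:
        return False  # no updated data means correction cannot be verified
    compliant = sum(1 for r in updated_records if r["compliant"])
    return compliant / len(updated_records) == 1.0  # anything short of 100% leaves the finding open

# Example: one noncompliant record in the updated sample keeps the LEA
# from clearing the finding.
sample = [{"ssid": "A1", "compliant": True}, {"ssid": "B2", "compliant": False}]
print(prong_two_met(sample))  # False
```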

  24. Verification of Correction Required by OSEP Can OEC verify correction: • When a CAP is submitted? NO • When a CAP is approved? NO • When the CAP activities are completed? NO • When new policies and/or procedures are approved? NO • When OEC has documentation that individual cases have been corrected and LEA practice has changed? YES!!

  25. Verification of Accurate and Timely Reporting • In addition to verifying correction by reviewing updated student records, OEC will also verify that the information in the records matches the data reported in EMIS • Example: Indicator 11 – OEC will compare the dates reported in EMIS to the dates on the consent form and initial evaluation team report
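
As a rough illustration of the cross-check described on this slide, the sketch below compares dates reported to EMIS with the dates on the underlying documents; the field names, record layout, and date values are hypothetical, not actual EMIS elements.

```python
# Hypothetical sketch: check that dates reported to EMIS match the dates on
# the student's consent form and initial evaluation team report.
# Field names, record layout, and dates are invented for illustration.
from datetime import date

def date_mismatches(emis_row, paper_record):
    """Return the fields whose EMIS-reported date differs from the source document."""
    fields = ("consent_date", "etr_completed_date")
    return [f for f in fields if emis_row[f] != paper_record[f]]

emis_row = {"consent_date": date(2010, 9, 1), "etr_completed_date": date(2010, 10, 25)}
record = {"consent_date": date(2010, 9, 1), "etr_completed_date": date(2010, 10, 29)}
print(date_mismatches(emis_row, record))  # ['etr_completed_date'] -> reporting error
```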

  26. Correction Process • Clearance: LEAs have demonstrated they have met the two prongs of correction within one year of the finding • Progressive Sanctions (LEAs have NOT met the two prongs within one year): Required PD/TA from the SST; Revision of CAP to address identified issues; Redirect Part B funds to areas of need; Withhold funds

  27. Data Verification Ensuring Timely & Accurate Reporting

  28. Basic Questions Why do we have to do this? • Compliance with Federal Law – IDEA requires reporting Why EMIS? • Ohio Revised Code defines EMIS as the state data system for student records • Student special ed data can be linked to Report Card, financial, and additional data required for federal reporting

  29. So, now what do I have to do? • Remember some basics: Your check ledger is not an IRS 1040 form and… • Your district software IS NOT EMIS • EMIS deadlines are not negotiable

  30. The EMIS Coordinator • Keeps abreast of EMIS communications • Disseminates any new EMIS information within the district • Monitors general issues and other EMIS reports: • Dec Child Count • Student Disab Not Funded • General Issues • COMMUNICATES with special education staff

  31. EMIS Coordinator and Monitoring • Translate SSIDs to student names • Extract records from district software • Identify dates of district submission to EMIS
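
A minimal sketch of the SSID translation step, assuming the district can export a simple SSID/name roster from its own software; the file layout, column names, and SSID values are hypothetical.

```python
# Hypothetical sketch: translate the SSIDs on a monitoring selection list back
# to student names using a roster exported from district software.
# The CSV layout, column names, and SSID values are invented for illustration.
import csv

def load_roster(path):
    """Build an SSID -> student name lookup from a district roster export."""
    with open(path, newline="") as f:
        return {row["ssid"]: row["student_name"] for row in csv.DictReader(f)}

def translate(ssids, roster):
    """Pair each requested SSID with a name, flagging any not found in the roster."""
    return [(ssid, roster.get(ssid, "NOT FOUND IN ROSTER")) for ssid in ssids]

# Example usage (file path and SSIDs are placeholders):
# roster = load_roster("district_roster_export.csv")
# print(translate(["OH1234567", "OH7654321"], roster))
```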

  32. The Special Education Administrator • Provides SSIDs of monitored records to EMIS staff • Pulls records of students with requested IDs • Examines records to understand reason for non-compliance, if it exists • Ensures requested copies are provided to ODE

  33. Preventing Data Goofs • Reports generated by OEC and sent via “gen issues” • Data that will be used for Indicators 11, 12 and 13 • Meant to HELP districts identify data errors, and ODE to identify additional business rules

  34. Most Frequent Data Errors • Fat finger errors • Missing non-compliance reasons • Missed reporting timelines • Data in district software, but not uploaded to EMIS
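
The sketch below illustrates how a district might screen its own extract for two of these errors before uploading, namely missed timelines with no non-compliance reason recorded and records that exist in district software but not in the EMIS submission; all field names and record layouts are invented for the example.

```python
# Hypothetical pre-submission screen for two common errors: a missed timeline
# with no non-compliance reason recorded, and records sitting in district
# software that never made it into the EMIS upload. Field names are invented.

def missing_reasons(records):
    """SSIDs flagged as having missed a timeline but carrying no reason code."""
    return [r["ssid"] for r in records
            if r.get("timeline_missed") and not r.get("reason_code")]

def not_uploaded(district_records, emis_ssids):
    """SSIDs present in district software but absent from the EMIS extract."""
    uploaded = set(emis_ssids)
    return [r["ssid"] for r in district_records if r["ssid"] not in uploaded]

district = [
    {"ssid": "A1", "timeline_missed": True, "reason_code": None},
    {"ssid": "B2", "timeline_missed": False, "reason_code": None},
]
print(missing_reasons(district))       # ['A1']
print(not_uploaded(district, ["A1"]))  # ['B2']
```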

  35. Compliance Timeline Codes • Describe reasons why an activity (e.g., ETR) was not completed according to the federally mandated timelines • Some codes allow a missed timeline to be removed from non-compliance calculations
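
As an illustration of how such codes affect the arithmetic, the sketch below removes events that carry an exempting code before computing a compliance rate; the code values, field names, and record layout are invented for this example and are not ODE’s actual EMIS codes.

```python
# Hypothetical sketch: some timeline codes document an allowable reason for a
# missed deadline, so those events are removed from the non-compliance
# calculation. Code values and field names are invented for illustration.

EXEMPT_CODES = {"PARENT_DELAY", "STUDENT_MOVED"}  # placeholder code values

def compliance_rate(events):
    """Share of countable events that met the timeline, after exclusions."""
    countable = [e for e in events if e.get("timeline_code") not in EXEMPT_CODES]
    if not countable:
        return None  # nothing left to measure
    met = sum(1 for e in countable if e["met_timeline"])
    return met / len(countable)

events = [
    {"met_timeline": True},
    {"met_timeline": False, "timeline_code": "PARENT_DELAY"},  # excluded from the calculation
    {"met_timeline": False, "timeline_code": "OTHER"},         # still counts as noncompliant
]
print(compliance_rate(events))  # 0.5
```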

  36. Reporting Cut-Offs Why the “prior to June 1” cut-off for reporting events? • SPED staff need time to communicate with EMIS staff • EMIS staff need time to enter and verify data • EMIS & SPED staff may be off during summer months

  37. District Software isn’t EMIS • District records contain much more than ODE needs for reporting • Entering data into DASL, ESIS, or IEP Anywhere is only the first step • Not all systems have automatic weekly uploads

  38. Data Resources Visit www.education.ohio.gov, search using the following keywords: • EMIS Manual • EMIS Newsflash • Data Collection Tool for Students with Disabilities

  39. It’s not just about Compliance

  40. SPP & Continuous Improvement • Profile promotes continuous improvement of results for students with disabilities • Provides longitudinal data on targets for all indicators • SSTs will provide TA/PD to improve performance of children with disabilities

  41. How does this connect to the Ohio Improvement Process? As part of ODE’s State System of Support (SSOS), SST/OIP facilitators must focus District Leadership Teams’ attention on the required aspects of their special education service delivery system by assisting them in reviewing their performance on SPP Indicators.

  42. Does the SPP connect to OIP? • SST/OIP facilitators will assist district leadership teams in review of performance on SPP indicators • All districts can seek technical assistance on SPP indicators from SSTs

  43. Using the Profile for Continuous Improvement • Review SPP data and identify “systemic” issues – targets that have not been met over multiple years • Put in place plans that will correct systemic issues • Monitor progress in correcting systemic issues
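
A minimal sketch of the “systemic issue” screen described above, assuming longitudinal profile data keyed by indicator and year; the data layout and the two-year threshold are assumptions for the example.

```python
# Hypothetical sketch: flag indicators whose target was missed in several
# consecutive years of the profile's longitudinal data. The data layout and
# the threshold are invented for illustration.

def systemic_issues(history, min_years=2):
    """Return indicators whose target was missed in at least `min_years`
    consecutive years, counting back from the most recent year."""
    flagged = []
    for indicator, met_by_year in history.items():
        streak = 0
        for _, met in sorted(met_by_year.items(), reverse=True):
            if met:
                break
            streak += 1
        if streak >= min_years:
            flagged.append(indicator)
    return flagged

history = {
    "Indicator 11": {2008: False, 2009: False, 2010: False},  # missed three years running
    "Indicator 13": {2008: False, 2009: True, 2010: True},    # met the last two years
}
print(systemic_issues(history))  # ['Indicator 11']
```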
