  1. Evaluation in Michigan’s Model Steve Goodman sgoodman@oaisd.org National PBIS Leadership Forum October, 2011 http://miblsi.cenmi.org

2. Mission Statement To develop support systems and sustained implementation of a data-driven, problem-solving model in schools to help students become better readers with the social skills necessary for success.

3. MiBLSi Project Evaluation Team
• Anna Harms, Evaluation Coordinator
• Ed Huth, Data Analyst
• Nicole Matthews, Data Entry
• Jennifer Rollenhagen, Measurement and Evaluation Specialist
Evaluation contributes to the project by:
• Developing and providing resources to enhance local capacity related to measurement and evaluation, consistent with the implementation research. Evaluation supports the competencies and capacity necessary for implementation specialists and local districts to engage in effective data-based decision making as part of an integrated behavior and reading RtI model.
• Reporting on program activities and project outcomes to evaluate and improve the effectiveness and efficiency of the project and to ensure value added to consumers and stakeholders. This is accomplished by implementing Plan-Do-Study-Act cycles.

4. Several Purposes of MiBLSi Assessments
• Audit: for "taking stock" of current strengths/weaknesses and action planning
• Formative evaluation: for improving the program while it is being implemented
• Summative evaluation: for improving future iterations

5. Internal Evaluation (within the project)

6. MiBLSi Value-Added Work System (diagram)
• Stakeholders/Funders provide investments: funding, visibility, political support
• Resources: capital, people, materials, information
• Work systems: providing the RtI practices and the supports for these practices to take place successfully within schools and districts (professional learning, technical assistance, evaluation, financial)
• Consumers (schools, districts, ISDs) receive valued RtI products/services
• Returns: addressing critical issues (discipline/ethnicity) and program directives (State Performance Plan)
• Feedback flows from consumers and stakeholders back to management/coordination

7. Evaluation at the Organizational, Process, and Performer (Worker) Levels (diagram: stakeholders at the organizational level; finance, consumers, evaluation, technical assistance, and professional learning at the process level; individual workers at the performer level)
At each level, measurement takes place at a determined interval. This information is compared to established standards and provided as feedback to the system.
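To make the measure-compare-feedback cycle concrete, here is a minimal Python sketch. The interval, standard values, and measure names (a BoQ cutoff, an ODR rate) are hypothetical placeholders, not MiBLSi's actual standards.

```python
# Minimal sketch of the measure-compare-feedback cycle described above.
# The standards and measurements are hypothetical placeholder values.

STANDARDS = {
    "boq_total": 70,      # hypothetical minimum Benchmarks of Quality score
    "odr_per_100": 5.0,   # hypothetical max major ODRs per 100 students/month
}

def feedback(level, measurements):
    """Compare measurements taken at a scheduled interval to the
    established standards and return feedback for that level."""
    report = {}
    for name, value in measurements.items():
        standard = STANDARDS[name]
        if name == "odr_per_100":
            met = value <= standard   # lower is better for referral rates
        else:
            met = value >= standard   # higher is better for fidelity scores
        report[name] = {"value": value, "standard": standard, "met": met}
    return {"level": level, "report": report}

print(feedback("building", {"boq_total": 76, "odr_per_100": 6.2}))
```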

  8. Levels of Internal Evaluation

  9. Job Model: Measurement and Evaluation Specialist

10. FileMaker Pro Database

11. External Evaluation (outside the project)

12. Collecting information to evaluate implementation effects and using this information for continuous improvement
• MiBLSi Project: fidelity of implementation (state); systems integrity (project); student success (project-wide)
• ISD Leadership Team: fidelity of implementation (across districts); systems integrity (district-ISD); student success
• LEA District Leadership Team: fidelity of implementation (across schools); systems integrity (district-LEA); student success (district-wide)
• Building Leadership Team: fidelity of implementation (across grades); systems integrity (school); student success (school-wide)
• Building Staff: student success/intervention effectiveness
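The cascade above implies that each level rolls up the measures collected one level below it. The sketch below shows one possible roll-up from building-level records to a district-level view; the school names, field names, and scores are invented for illustration and are not the project's actual data model.

```python
# Illustrative roll-up of building-level results into district summaries.
# Records and field names are invented placeholders.

buildings = [
    {"school": "Elementary A", "district": "District 1", "fidelity": 82, "odrs": 120},
    {"school": "Elementary B", "district": "District 1", "fidelity": 67, "odrs": 210},
    {"school": "Middle C",     "district": "District 2", "fidelity": 74, "odrs": 305},
]

def district_summary(records):
    """Aggregate building-level data into the district-level view."""
    summary = {}
    for r in records:
        d = summary.setdefault(r["district"], {"schools": 0, "fidelity": 0, "odrs": 0})
        d["schools"] += 1
        d["fidelity"] += r["fidelity"]
        d["odrs"] += r["odrs"]
    for d in summary.values():
        d["mean_fidelity"] = d.pop("fidelity") / d["schools"]
    return summary

print(district_summary(buildings))
```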

13. Assessments
Elementary Schools:
• Major Discipline Referrals
• PBIS Self-Assessment Survey
• PBIS Team Implementation Checklist
• Benchmarks of Quality (BOQ)
• Schoolwide Evaluation Tool (SET)
• Benchmarks for Advanced Tiers (BAT)
• Dynamic Indicators of Basic Early Literacy Skills (DIBELS)
• Planning and Evaluation Tool (PET) for Effective Schoolwide Reading Programs
• Effective Reading Support Team Implementation Checklist
• Special Education Data Collection Form
• Schoolwide Reading Analysis Support Page
Middle/Junior High Schools:
• Major Discipline Referrals
• PBIS Self-Assessment Survey
• PBIS Team Implementation Checklist
• Benchmarks of Quality (BOQ)
• Schoolwide Evaluation Tool (SET)
• ORF/MAZE through AIMSweb
• School-Wide Evaluation and Planning Tool for Middle School Literacy (SWEPT)
• Middle School Reading Team Implementation Checklist
• Special Education Data Collection Form

  14. Building Level

15. Assist Teams in Using Data for Decision Making
• First year: winter systems review; spring data review
• Second year: fall, winter, and spring data reviews
• Third year: fall, winter, and spring data reviews

16. Assessment Booklet
• Description of assessments
• Data collection schedule
• Data summary
• Data forms and assessment forms

  17. Team Evaluation of Outcome, Process and Systems Data

  18. Assessment Schedule (for Cohort 7 from MiBLSi website)

  19. Video examples for completing and submitting PBIS assessments

20. Improving the Accuracy and Consistency of Recording Office Discipline Referrals

21. Developing Fluency with Discipline Referral Categories
Example Exercise 2: Match each example situation below to the correct problem behavior on the discipline categories answer sheet. Write the letter in the column for Exercise 2.
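A simple way to score such a fluency exercise is to compare each staff member's answers to the key and report percent agreement. The sketch below is hypothetical; the item numbers, category letters, and answer key are invented for illustration.

```python
# Hypothetical scoring of the category-matching exercise: percent agreement
# between a staff member's answers and the answer key.

ANSWER_KEY = {1: "D", 2: "A", 3: "F", 4: "B"}  # invented key for illustration

def percent_agreement(answers, key=ANSWER_KEY):
    """Return the share of exercise items matched to the correct category."""
    correct = sum(1 for item, letter in answers.items() if key.get(item) == letter)
    return 100.0 * correct / len(key)

staff_answers = {1: "D", 2: "A", 3: "C", 4: "B"}
print(f"Agreement: {percent_agreement(staff_answers):.0f}%")  # -> Agreement: 75%
```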

  22. District Level

23. Focus on Implementing with Fidelity Using Benchmarks of Quality (BoQ)/ODR, ’06-’07 and ’07-’08 (chart data: Decrease 14.6%; Increase 8%)
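The percentages on this slide are year-over-year changes in referral counts. A quick way to compute such a change is sketched below; the referral counts are invented placeholders chosen only to reproduce the slide's two percentages.

```python
# Year-over-year percent change in office discipline referrals (ODRs).
# The counts are invented placeholders; only the formula is the point.

def percent_change(before, after):
    """Percent change from one year's count to the next."""
    return 100.0 * (after - before) / before

print(f"{percent_change(1000, 854):+.1f}%")   # -> -14.6%  (a 14.6% decrease)
print(f"{percent_change(1000, 1080):+.1f}%")  # -> +8.0%   (an 8% increase)
```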

  24. District Implementation Tracking Form

  25. Leadership-Implementation Support Team Self-Assessment

26. Lessons Learned
• Teams need to be taught how to analyze and use data
• Emphasis on directing resources to need and removing competing activities
• As we grow, it is even more important to systematically gather accurate data and then act on it for continuous improvement
• More work is needed in developing feedback cycles

  27. “Even if you’re on the right track, you’ll get run over if you just sit there” - Will Rogers
