
Building Stronger Professional Development Systems: Using Data to Improve Implementation


Presentation Transcript


  1. Building Stronger Professional Development Systems: Using Data to Improve Implementation. Jennifer Coffey, Ph.D., OSEP Project Director and SPDG Program Lead; Audrey Desjarlais, Knowledge Mobilization Coordinator, Signetwork

  2. Implementation: Research on Tools; Research as Tool. Sonja K. Schoenwald, Professor of Psychiatry & Behavioral Sciences, Medical University of South Carolina. August 20, 2013, Global Implementation Conference.

  3. Today’s Agenda

  4. AGENDA – Day One Continued

  5. Thank You! Planning Committee Members, Staff, and Facilitators • SPDG Members • Susan Bailey-Anderson, Montana • Don Briere, Connecticut • Kathy Cox, Illinois • Alice Henley, Connecticut • David Merves, Delaware & New Hampshire • Brenda Oas, North Dakota • Mary Steady, New Hampshire RRCP Implementation Core Team • Jeanna Mullins, Mid-South RRC • Kim Hartsell, Southeast RRC • Nancy O’Hara, Mid-South RRC OSEP • Jennifer Coffey • Tina Diamond • David Guardino • Pat Gonzalez • Shedeh Hajghassemali • Terry Jackson • Greg Knollman • Ingrid Oxaal • Corinne Weidenthal • Susan Weigert • Grace Zamora Duran Signetwork Staff • Linda Lynch • Melissa Moseley • Leslie Stephenson • Audrey Desjarlais

  6. Invited Guests OSERS • Michael Yudin, Acting Assistant Secretary, US Department of Education OSEP • Melody Musgrove, Director, OSEP • Ruth Ryder, Deputy Director, OSEP • Larry Wexler, Director, Research to Practice Division Parent Centers/ Regional PTACs • Barbara Buswell, Region 5 • Debra Jennings, Region 1 • Connie Hawkins, Region 2 • Nora Thompson, Region 6

  7. SPDG Participation • 43 States Represented • State teams range from • 1 rep (8 states) CA, CO, MA, MN, NE, PA, VA, WI • 2 reps (4 states) AR, MT, RI, SC • 3 reps (6 states) AL, ID, NC, OR, VT, WY • 4 reps (9 states) AK, CT, IL, IN, KS, KY, MO, NM, TN • 5 reps (8 states) DE, FL, GA, MD, MI, NH, OH, OK, UT • 6 reps (1 state) MS • Partners include representatives from: Parent Organizations, Institutions of Higher Education

  8. Welcome!

  9. Fiscal Year (FY) 2014 • FY 2015 • What you can do to prepare • GPRA/Program Measures • Partners • Current/past data – telling your story • Start sharing with others what a new SPDG could accomplish • IDEA Statute – SPDG • www.Signetwork.org

  10. The Letter and Spirit of the Law • Partnerships • Serving children with disabilities • Fulfilling the purpose of your project

  11. GPRA/Program Measures – Pilot Review • External Review Results for FY 2012 • Program Measure 1: 11 of 22 initiatives (50%) met the targets they set • Program Measure 2: 4 of 22 initiatives (18%) met the targets they set • Program Measure 3: 7 of 22 initiatives (32%) met the targets they set • Program Measure 4: 1 of 1 (100%) met the target set

  12. Reliability for the Pilot • For Program Measure 1, reviewer scores varied more – from complete agreement to a 44% difference. • For Program Measure 2, there were no disagreements for the first pair of reviewers and just one disagreement for the second pair. • For Program Measure 3, there was one disagreement for the first pair of reviewers and four disagreements for the second pair. • For Program Measure 4, there was one disagreement for the first pair and none for the second pair.
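Note on the reliability figures above: agreement between pairs of external reviewers is often summarized as simple percent agreement (matching scores divided by items scored). The sketch below illustrates that calculation only; the reviewer scores, the function name, and the assumption that each pair scored all 22 initiatives on a met/not-met basis are hypothetical and are not taken from the pilot data.

```python
# Minimal sketch of inter-rater percent agreement for one program measure.
# All data below are hypothetical, for illustration only.

def percent_agreement(scores_a, scores_b):
    """Share of initiatives on which two reviewers gave the same score."""
    if len(scores_a) != len(scores_b):
        raise ValueError("Both reviewers must score the same initiatives")
    matches = sum(a == b for a, b in zip(scores_a, scores_b))
    return matches / len(scores_a)

# Hypothetical met/not-met scores for 22 initiatives (1 = target met).
reviewer_1 = [1, 0, 1, 1, 0, 1, 0, 1, 1, 1, 0, 1, 0, 0, 1, 1, 1, 0, 1, 0, 1, 1]
reviewer_2 = [1, 0, 1, 0, 0, 1, 0, 1, 1, 1, 0, 1, 0, 1, 1, 1, 1, 0, 1, 0, 1, 1]

print(f"Percent agreement: {percent_agreement(reviewer_1, reviewer_2):.0%}")
```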

  13. Current External Review • Westat – Data Quality Initiative • Will produce a 1-page document that summarizes each project’s results • OSEP will provide another FAQ document to improve the data coming from projects

  14. This Meeting • Why focus on implementation fidelity? • Implementation Science • We are the resource • Other resources – the Active Implementation (AI) Hub

  15. Introductions – Allison Metz

  16. Small Group Discussions

  17. Discussion Process PURPOSE: To learn from others about tools and approaches for using data to: • define your practice, • improve the fidelity of the practice, and • effectively engage the various drivers to support the use of data

  18. Discussion Process • Pick a topic • Questions to define the ‘what’ and the ‘how’ of the ‘what’ • Questions to improve the fidelity of the ‘what’ through the implementation drivers • Questions about coaching systems to improve the fidelity of the ‘what’

  19. Discussion Process • State Teams – Split up • Groups – 8-10 people • Facilitator – OSEP & RRCP • Listen & Engage • Write your questions • Take Notes • Return to Academy Hall for Allison & Jennifer’s Q&A session – 11:30 AM

  20. Large Group Q & A

  21. What’s Next • 11:45am-1:00pm – Lunch on your own (dining options in your packet) • 1:00-1:30pm – State team time: action planning, meet with your evaluators, or meet with Project Officers

  22. AFTERNOON SESSIONS

  23. DAY TWO – BEGINS @ 8:00am • Project Sharing • U.S. Department of Education Presentation
