Looking at Data

Presentation Transcript


  1. Looking at Data. Presented by The Early Childhood Outcomes Center. Revised January 2013.

  2. Using data for program improvement = EIA: Evidence, Inference, Action. Early Childhood Outcomes Center

  3. Evidence. Evidence refers to the numbers, such as “45% of children in category b.” The numbers are not debatable. Early Childhood Outcomes Center

  4. Inference. How do you interpret the numbers? What can you conclude from them? Does the evidence mean good news? Bad news? News we can’t interpret? To reach an inference, sometimes we analyze the data in other ways (ask for more evidence). Early Childhood Outcomes Center

  5. Inference. Inference is debatable; even reasonable people can reach different conclusions from the same set of numbers. Stakeholder involvement can be helpful in making sense of the evidence. Early Childhood Outcomes Center

  6. Action. Given the inference from the numbers, what should be done? Recommendations or action steps. Action can be debatable, and often is. Another role for stakeholders. Early Childhood Outcomes Center

  7. What can we infer? Poll results A: Candidate I.M. Good 51%, Candidate R.U. Kidding 49% (±3%). Poll results B: Candidate I.M. Good 56%, Candidate R.U. Kidding 44% (±3%). Early Childhood Outcomes Center
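The point of slide 7 is that the same kind of evidence can support different inferences depending on the margin of error. As a rough illustration only (the slide gives no formal rule), here is a minimal Python sketch that treats the ±3% figure as a simple symmetric margin around each candidate's share and checks whether the two intervals overlap:

```python
def intervals_overlap(share_a, share_b, margin=3.0):
    """Return True if the two candidates' intervals overlap (result inconclusive)."""
    low_a, high_a = share_a - margin, share_a + margin
    low_b, high_b = share_b - margin, share_b + margin
    return low_a <= high_b and low_b <= high_a

# Poll A: 51% vs 49% -> intervals 48-54 and 46-52 overlap, so no leader
# can reasonably be inferred. Poll B: 56% vs 44% -> 53-59 and 41-47 do
# not overlap, so inferring a lead for I.M. Good is defensible.
print(intervals_overlap(51, 49))  # True  (no clear inference)
print(intervals_overlap(56, 44))  # False (a lead can be inferred)
```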

  8. Program improvement: Where and how. At the state level: TA, policy. At the regional or local level: supervision, guidance. At the classroom level: spend more time on certain aspects of the curriculum. At the child level: modify intervention. Early Childhood Outcomes Center

  9. Key points. Evidence refers to the numbers, and the numbers by themselves are meaningless. Inference is attached by those who read (interpret) the numbers. You have the opportunity and obligation to attach meaning. Early Childhood Outcomes Center

  10. E – I – A Jeopardy: a game board with three $100, three $200, and three $300 questions. Early Childhood Outcomes Center

  11. Use of Data: Activity. Evidence-Inference-Action. Early Childhood Outcomes Center

  12. Continuous Program Improvement: a cycle linking Plan (vision: program characteristics, child and family outcomes), Implement, Check (collect and analyze data), and Reflect (are we where we want to be?). Early Childhood Outcomes Center

  13. Tweaking the System: the same cycle of Plan (vision: program characteristics, child and family outcomes), Implement, Check (collect and analyze data), and Reflect (are we where we want to be?), annotated with guiding questions: Is there a problem? Why is it happening? What should be done? Is it being done? Is it working? Early Childhood Outcomes Center

  14. Continuous means… the cycle never ends. Early Childhood Outcomes Center

  15. Outcome questions for program improvement, e.g.: Who has good outcomes? Do outcomes vary by region of the state? Level of functioning at entry? Services received? Age at entry to service? Type of services received? Family outcomes? Education level of parent? Early Childhood Outcomes Center

  16. Examples of process questions. Are ALL services high quality? Are ALL children and families receiving ALL the services they should in a timely manner? Are ALL families being supported in being involved in their child’s program? What are the barriers to high quality services? Early Childhood Outcomes Center

  17. Working Assumptions. There are some high quality services and programs being provided across the state. There are some children who are not getting the highest quality services. If we can find ways to improve those services/programs, these children will experience better outcomes. Early Childhood Outcomes Center

  18. Numbers as a tool. Heard on the street: “Why are we reducing children to a number?” So why do we need numbers? Early Childhood Outcomes Center

  19.–22. (Graphic-only slides) Early Childhood Outcomes Center

  23. Examining COS data at one time point. One group: frequency distributions (tables, graphs). Comparing groups: graphs, averages. Early Childhood Outcomes Center
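Slides 24 through 32 are charts, so their data do not appear in the transcript. As a rough sketch of the calculations those charts summarize (a frequency distribution for one group and average ratings for comparing groups), here is a small Python example; the class names and COS ratings below are invented for illustration, echoing the deck's own use of fake data:

```python
from collections import Counter

# Made-up COS ratings (1-7 scale) on Outcome 1 for two imaginary classes.
ratings = {
    "Class 1": [2, 3, 3, 4, 5, 5, 6, 6, 6, 7],
    "Class 2": [1, 2, 2, 3, 3, 4, 4, 5, 6, 6],
}

# Frequency distribution for one group (cf. slides 25-26).
counts = Counter(ratings["Class 1"])
for rating in range(1, 8):
    n = counts.get(rating, 0)
    pct = 100 * n / len(ratings["Class 1"])
    print(f"Rating {rating}: {n} children ({pct:.0f}%)")

# Comparing groups with averages (cf. slides 27-32).
for name, values in ratings.items():
    print(f"{name} mean rating: {sum(values) / len(values):.1f}")
```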

  24. Distribution of COS Ratings in Fall. We are using fake data for illustration. Early Childhood Outcomes Center

  25. Frequency on Outcome 1 - Fall Early Childhood Outcomes Center

  26. Frequency on Outcome 1 - Fall Early Childhood Outcomes Center

  27. Comparison of two classes - Fall Early Childhood Outcomes Center

  28. Frequency on Outcome 1 - Fall Early Childhood Outcomes Center

  29. Frequency on Outcome 1 – Class 1 Early Childhood Outcomes Center

  30. Average Scores on Outcomes by Class – Fall, 2008 Early Childhood Outcomes Center

  31. Average Scores on Outcomes by Class – Fall, 2008 Early Childhood Outcomes Center

  32. Average Scores on Outcomes by Class – Fall, 2008 Early Childhood Outcomes Center

  33. Looking at change over time: extent of change on the rating scale; the OSEP categories; developmental trajectories (maintaining, changing). Early Childhood Outcomes Center

  34. Extent of change on rating scale: Time 1 to Time 2 Early Childhood Outcomes Center

  35. OSEP progress categories. Looking at information across time; reducing the information to fewer categories to allow easier comparisons. Early Childhood Outcomes Center
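The deck does not spell out how entry and exit ratings collapse into the five OSEP progress categories. For readers who want a concrete picture, here is a hedged Python sketch of the commonly used decision rules, assuming the 7-point COS scale with ratings of 6 or 7 treated as age-expected functioning plus the yes/no "acquired new skills" question; consult the official ECO/OSEP guidance before using anything like this for actual reporting:

```python
def osep_progress_category(entry_rating, exit_rating, acquired_new_skills):
    """Illustrative sketch (not an official calculator) assigning one of the
    five OSEP progress categories (a-e) from entry and exit COS ratings
    (1-7) and the yes/no "did the child acquire any new skills?" question."""
    age_expected_at_entry = entry_rating >= 6
    age_expected_at_exit = exit_rating >= 6

    if age_expected_at_entry and age_expected_at_exit:
        return "e"  # maintained age-expected functioning
    if age_expected_at_exit:
        return "d"  # reached age-expected functioning by exit
    if exit_rating > entry_rating:
        return "c"  # moved nearer to age-expected functioning without reaching it
    if acquired_new_skills:
        return "b"  # acquired new skills but did not move nearer to age-expected
    return "a"      # did not improve functioning

print(osep_progress_category(3, 5, True))   # c
print(osep_progress_category(4, 6, True))   # d
print(osep_progress_category(6, 7, True))   # e
```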

  36. Working with data. Different levels of analysis are required for different levels of questions. Aggregation will work for you, but it loses detail about individual children: 50 assessment items on 20 children in 5 classes in Fall and Spring is 50 x 20 x 5 x 2 = 10,000 pieces of information. Early Childhood Outcomes Center

  37. Using assessment data at the classroom level. Looking at the data by child, at a single point in time and over time; looking at data for areas that cut across children, at a single point in time and over time. Early Childhood Outcomes Center

  38. Example: Item Results for 5 Imaginary Children. A = Accomplished; E = Emerging; NY = Not yet. Early Childhood Outcomes Center

  39. Example: COS Outcome Ratings for Class 3c by Child Early Childhood Outcomes Center

  40. Example of an Aggregated Report for a Program: Percentage of Children Scoring 5 or Higher on COS by Class. What do you see in these data? Early Childhood Outcomes Center
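The aggregated report itself is a chart, so its numbers are not in the transcript. As an illustration of how a "percent scoring 5 or higher by class" report can be produced from child-level data, here is a brief pandas sketch; the class labels and ratings are invented for the example:

```python
import pandas as pd

# Made-up child-level COS ratings, used only to show the aggregation step.
data = pd.DataFrame({
    "class": ["3a", "3a", "3a", "3b", "3b", "3b", "3c", "3c", "3c"],
    "outcome1_rating": [4, 5, 6, 3, 4, 5, 6, 6, 7],
})

report = (
    data.assign(scored_5_plus=data["outcome1_rating"] >= 5)
        .groupby("class")["scored_5_plus"]
        .mean()
        .mul(100)
        .round(0)
)
print(report)  # percent of children rated 5 or higher, by class
```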

  41. Outcome questions for program improvement, e.g.: Who has good outcomes? Do outcomes vary by region of the state? Level of functioning at entry? Services received? Age at entry to service? Type of services received? Family outcomes? Education level of parent? Early Childhood Outcomes Center

  42. Looking at Data by Region: Percentage of Children Who Changed Developmental Trajectories After One Year of Service. Possible inference? Early Childhood Outcomes Center

  43. Looking at Data by Age at Entry: Percentage of Children Who Changed Developmental Trajectories After One Year of Service. Possible inference? Early Childhood Outcomes Center
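Slides 42 and 43 break the same measure (percentage of children who changed developmental trajectories) out by region and then by age at entry. As a sketch of how both breakdowns can come from one child-level file, here is a short pandas example; the regions, ages, and trajectory flags are invented:

```python
import pandas as pd

# Invented child-level records with a yes/no "changed trajectory" flag.
children = pd.DataFrame({
    "region":             ["East", "East", "West", "West", "North", "North"],
    "entry_age_mos":      [4, 14, 9, 22, 6, 18],
    "changed_trajectory": [True, False, True, True, False, True],
})

# Same flag, two different breakdowns (cf. slides 42 and 43).
by_region = children.groupby("region")["changed_trajectory"].mean().mul(100)
by_age = (
    children.assign(age_band=pd.cut(children["entry_age_mos"],
                                    bins=[0, 12, 24],
                                    labels=["0-12 months", "13-24 months"]))
            .groupby("age_band", observed=True)["changed_trajectory"]
            .mean()
            .mul(100)
)
print(by_region)  # percent changing trajectory, by region
print(by_age)     # percent changing trajectory, by age at entry
```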

  44. Take Home Message. You will want to look at your data in lots of different ways. You will want to think about the possible inferences. You may need other information to decide among possible inferences. Act on what you have learned. Early Childhood Outcomes Center

  45. Tweaking the System: the same cycle of Plan (vision: program characteristics, child and family outcomes), Implement, Check (collect and analyze data), and Reflect (are we where we want to be?), annotated with guiding questions: Is there a problem? Why is it happening? What should be done? Is it being done? Is it working? Early Childhood Outcomes Center

  46. How will/might these data be used? Federal level: overall funding decisions (accountability); resource allocation (e.g., what kind of TA to fund?); decisions about the effectiveness of the program in individual states. State level: program effectiveness?? Program improvement?? Local level: program improvement?? Early Childhood Outcomes Center
