
Understanding and Closing Achievement Gaps



Presentation Transcript


  1. Understanding and Closing Achievement Gaps August 13, 2012 8:00 a.m. – Noon by Doug Greer & Laurie Smith

  2. Introductions and something good that is working to support your struggling students …

  3. New metrics … Same focus • RELATIONSHIPS • RELEVANCE • RIGOR • RESULTS

  4. What is working to help struggling students? At designated review times during the session, please take a moment to briefly describe your answers to the following questions: • What is working to help struggling math students? • What is working to help readers who struggle? • What is working to help writers who struggle? • What is working to help struggling science students? • What is working to help students who struggle with social studies (content and skills)?

  5. Essential Questions • What are others doing to help struggling learners? Take 10-20 minutes … • What does the new accountability data mean for our schools and our students? • How do we use the tools provided by MDE to improve teaching and learning? • What other data should we consider when closing the achievement gaps?

  6. What other specific questions did you walk in with today?

  7. Recap from July 30 session… • Tuesday, July 31: “Embargoed” notice to district superintendents of Priority and Focus schools • Thursday, August 2: Public release likely of the following: • Ed YES! Report Card (old letter grade) • AYP Status (old pass or fail system) • Top to Bottom Ranking and possibly: • Reward schools (Top 5%, Top improvement, BtO) • Focus schools (largest achievement gap top vs. bottom) • Priority schools (Bottom 5%) Doug Greer 877-702-8600 x4109 DGreer@oaisd.org

  8. ESEA Waiver Basics Principle 2 of 4 – Accountability & Support • Top to Bottom Ranking given to all schools with 30 or more full-academic-year students tested (0 – 99th percentile, where 50th is average) • NEW designation for some schools • Reward schools (Top 5%, Significant Improvement, or Beating the Odds) • Focus schools (10% of schools with the largest achievement gap between the top and bottom) • Priority schools (Bottom 5%, replaces PLA list) • NEW in 2013: AYP Scorecard based on a point system, replacing the “all or nothing” of NCLB.

  9. Understanding the TWO Labels

  10. Z-scores (Standard Deviations)

  11. Z-scores (Standard Deviations) • Z-scores are centered around zero, the “state average” • Positive is ABOVE the state average • Negative is BELOW the state average [Figure: bell curve centered on the state average at z = 0; z-scores of -2, -1, -0.5, 0.5, 1, and 2 correspond roughly to the 2nd, 16th, 31st, 69th, 84th, and 98th percentiles]
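The percentile markers on this slide follow directly from the standard normal curve. As an illustrative sketch (not MDE's actual tooling), Python's `statistics.NormalDist` reproduces the z-score-to-percentile conversion:

```python
from statistics import NormalDist

def score_to_z(score: float, state_mean: float, state_sd: float) -> float:
    """Standardize a raw score against the state average and spread."""
    return (score - state_mean) / state_sd

def z_to_percentile(z: float) -> float:
    """Convert a z-score to a percentile via the normal CDF."""
    return NormalDist().cdf(z) * 100

# The slide's reference points:
for z in (-2, -1, -0.5, 0, 0.5, 1, 2):
    print(f"z = {z:+.1f} -> {z_to_percentile(z):.0f}th percentile")
```

Running this reproduces the slide's markers: z = 0 is the 50th percentile, z = +1 the 84th, z = -2 the 2nd, and so on.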

  12. Table Talk … • In terms of achievement gaps, how well do you think your school (or schools in your district) compare to all schools in the state? • Specifically, which content areas do you feel will have the smallest gaps versus the largest gaps relative to the state average?

  13. 2011-12 Top-to-Bottom Individual School Lookup … Download the MS Excel file from the OAISD Accountability page

  14. What if we have Focus school(s)? • Some schools may be exempt from Focus school designation in year 2 IF they are deemed Good-Getting-Great (G-G-G): • Overall achievement is above 75th percentile • Bottom 30% meets Safe Harbor improvement (or possibly AYP differentiated improvement) • G-G-G schools will be exempt for 2 years, then will need to reconvene a similar deep diagnostic study in year 4. Note: See ESEA Approved Waiver pp. 151-152

  15. What if we have Focus school(s)? • Unlike Priority label, Focus label may only be one year. (Title I set-aside lasts 4 years) NOTE: AYP Scorecard, Top to Bottom Ranking and Reward/Focus/Priority designation for August 2013 determined by Fall MEAP, 2012 and Spring MME, 2013.

  16. What if we have Focus school(s)? Requirement for all Focus schools: • Notification of Focus status by August 21, 2012 via the Annual Ed Report • Quarterly reports to the district board of education • Deep diagnosis of data prior to SIP revision (if Title I by Oct 1) • Professional Dialogue, toolkit available to all (if Title I requires DIF with time range of Oct – Jan.) • Revision of School Improvement Plan with activities focused on the Bottom 30% included (if Title I additional revisions to Cons App, both by Jan 30) • NOTE:  Additional requirements of Title I schools regarding set-asides and specific uses of Title I funds.

  17. What if we have Focus school(s)? • Supports Available: • OAISD work session on August 13 • OAISD follow up session ??? TBD • OAISD work session on “Data Utilization driving Instruction and School Improvement” October 25 • “Defining the Problem (Data → Planning)” work session at OAISD on January 22, 2013 • “SIP Planning Session” at OAISD on March 22, 2013 • Individualized support by OAISD per request • MDE Toolkit available in September, 2012 • Sept. MDE assigns DIF for Title I schools only • MDE Regional meeting on September 11 in GR

  18. Understanding the TWO Labels

  19. Top to Bottom Ranking: 95th. 2 points possible: 2 = Achievement meets the linear trajectory towards 85% proficiency by 2022 (10 years from the 11/12 baseline); 1 = Achievement target NOT met, but Safe Harbor met; 0 = Achievement target NOT met and Safe Harbor NOT met
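The “linear trajectory towards 85% by 2022” is simple arithmetic: divide the distance between the baseline proficiency rate and 85% into ten equal annual steps. A minimal sketch (the 45% baseline school is hypothetical):

```python
def trajectory_targets(baseline_pct: float, years: int = 10, goal_pct: float = 85.0):
    """Annual proficiency targets on a straight line from the
    baseline year to 85% proficient ten years later."""
    step = (goal_pct - baseline_pct) / years
    return [round(baseline_pct + step * y, 1) for y in range(1, years + 1)]

# Hypothetical school starting at 45% proficient in 2011/12:
print(trajectory_targets(45.0))
# -> [49.0, 53.0, 57.0, 61.0, 65.0, 69.0, 73.0, 77.0, 81.0, 85.0]
```

A school earns the 2 points for a year in which its achievement is at or above that year's target.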

  20. STATUS: Lime Green Top to Bottom Ranking: 75th

  21. STATUS: Orange Top to Bottom Ranking: 50th

  22. Normal “Bell-Shaped” Curve [Figure: bell curve marking Below Average, Average, and Above Average regions, with a target for the Bottom 10% and a target for the Top 10%]

  23. [Figure: bell curve whose horizontal axis is Average Scale Score or Average % Correct; the lower tail is marked “% at Level 4 (or Levels 3 & 4, or below a set %)” and the upper tail “% at Level 1 (or Levels 1 & 2, or above a set %)”]

  24. Goal: All students will be proficient in math. • SMART Measurable Objective: All students will increase skills in the area of math on MEAP and local assessments: • The average scale score for all students in math on the MEAP will increase from 622 (10/11) to 628 by the 2013/14 school year (2 points per year) • The percentage of all students reaching Level 1 on the math portion of the MEAP will increase from 28% (2010-11) to 40% by the 2013/14 school year (4% per year) • The percentage of all students at Level 4 on the math portion of the MEAP will decrease from 18% (10/11) to 6% by the 2013/14 school year (4% per year) • The average proficiency across the grade levels on the Winter Benchmark in Delta Math will increase from 74% (2010-11) to 85% by January 2013. • The number of students identified as “At Risk” on the Fall Delta Math screener will be reduced from 58 (2010-11) to 40 by the Fall of 2012.
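The per-year figures in these objectives are plain linear steps from baseline to target over the three school years from 2010/11 to 2013/14. A quick sketch to check that arithmetic:

```python
def annual_step(start: float, end: float, years: int) -> float:
    """Per-year change needed to move from start to end over the span."""
    return (end - start) / years

# The objectives above run from 2010/11 to 2013/14, i.e. three school years:
assert annual_step(622, 628, 3) == 2.0   # average scale score: +2 points/year
assert annual_step(28, 40, 3) == 4.0     # % at Level 1: +4% per year
assert annual_step(18, 6, 3) == -4.0     # % at Level 4: -4% per year
```

The same check applies to any SMART objective written this way: the stated per-year rate should equal the total change divided by the number of years to the target date.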

  25. Goal: All students will be proficient in math. • SMART Measurable Objective: All students will increase skills in the area of math on MEAP and local assessments: • The average percentage correct for all students in math on the MEAP will increase from 52% (10/11) to 61% by the 2013/14 school year (3% per year) • The percentage of all students reaching 80% accuracy on the math portion of the MEAP will increase from 28% (2010-11) to 40% by the 2013/14 school year (4% per year) • The percentage of all students reaching 40% accuracy on the math portion of the MEAP will increase from 82% (10/11) to 94% by the 2013/14 school year (4% per year) • Percent Correct example from 2010/11 • New cut-score proficiency and scale score on the previous slide from 2011/12

  26. Caution about Bottom 30% tool

  27. Caution about Bottom 30% tool

  28. Discussion and Break … Take a break then discuss (or vice versa) the following two questions: • Why should MDE use Full Academic Year (FAY) students (those who have 3 counts in the year tested) to hold schools accountable? • Why should local school districts NOT use FAY student data to set goals to improve instruction?

  29. Top to Bottom Ranking Ranks all schools in the state with at least 30 full academic year students in at least two tested content areas (Reading, Writing, Math, Science and Social Studies weighted equally plus graduation). • Each content area is “normed” in three categories: • 2 years of Achievement (50 – 67%) • 3 – 4 years of Improvement (0 – 25%) • Achievement gaps between top and bottom (25 – 33%) • Graduation rate (10% if applicable) • 2 year Rate (67%) • 4 year slope of improvement (33%)

  30. How Is the Top to Bottom Ranking Calculated? For science, social studies, writing, and all grade-11 tested subjects, the School Content Area Index combines three standardized components: • 1/2 × School Achievement Z-Score (two-year average standardized student scale (Z) score) • 1/4 × School Achievement Trend Z-Score (four-year achievement trend slope) • 1/4 × School Achievement Gap Z-Score (two-year average Bottom 30% minus Top 30% z-score gap). The resulting Content Index is itself standardized into a z-score.
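As a rough sketch of the weighting diagrammed on this slide (an illustration only, not MDE's published code), the content-area index is a weighted sum of the three standardized components:

```python
def content_index(achievement_z: float, trend_z: float, gap_z: float) -> float:
    """Content-area index per the slide's diagram: 1/2 achievement,
    1/4 improvement trend, 1/4 achievement gap (all inputs are z-scores)."""
    return 0.5 * achievement_z + 0.25 * trend_z + 0.25 * gap_z

# Hypothetical school: above-average achievement, flat trend, wide gap.
# A strongly negative gap z-score pulls the index down despite achievement:
print(round(content_index(0.8, 0.0, -1.2), 4))  # 0.1
```

Note how a wide gap (a large negative gap z-score) can offset above-average achievement, which is exactly the mechanism behind the Focus school designation.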

  31. Z-scores (Standard Deviations) • Z-scores are centered around zero, the “state average” • Positive is ABOVE the state average • Negative is BELOW the state average [Figure: bell curve centered on the state average at z = 0; z-scores of -2, -1, -0.5, 0.5, 1, and 2 correspond roughly to the 2nd, 16th, 31st, 69th, 84th, and 98th percentiles]

  32. 2011-12 Top-to-Bottom Individual School Lookup … Download MS Excel file at www.mi.gov/ttb or OAISD Accountability page

  33. When finished with the worksheet, please add to the Google “Chalk Talk” about what works. [Worksheet ranges: -0.4 to 0.4, 0.5 to 2.0, -2.0 to -0.5]


  35. Suppose there are 20 students (most of whom are shown) and the average Z-score of all 20 is 0.28. This average represents the Achievement Score before it is standardized again into a Z-score.

  36. Top 30% of students (n=6) has an average score of 1.62; mid 40% (n=8) has an average score of -0.34; bottom 30% (n=6) has an average score of -1.12. Gap = -1.12 – 1.62 = -2.74, which is then standardized.
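The worked example above can be sketched in a few lines. The helper below is illustrative only, and `scores` is a hypothetical roster constructed to match the slide's group averages:

```python
def achievement_gap(z_scores):
    """Average of the bottom 30% minus average of the top 30% of
    students' achievement z-scores (negative when a gap exists)."""
    ranked = sorted(z_scores)           # lowest to highest
    n30 = round(len(ranked) * 0.30)     # 30% of students: 6 of 20
    bottom = sum(ranked[:n30]) / n30
    top = sum(ranked[-n30:]) / n30
    return bottom - top

# Hypothetical 20-student roster matching the slide's group averages:
scores = [-1.12] * 6 + [-0.34] * 8 + [1.62] * 6
print(round(achievement_gap(scores), 2))  # -2.74
```

As on the slide, this raw gap of -2.74 would then be standardized against all schools' gaps before it enters the ranking.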

  37. Performance Level Change (“growth”)

  38. Levels and Uses of Data • GLOBAL data • District level → School level → Grade level • Best used to study trends of student performance over time (3-5 years) & across different subgroups. • Informs school-wide focus; must drill deeper • STUDENT level data • Use only when timely reports (less than 2 weeks) are available at a more specific diagnostic level. • DIAGNOSTIC levels • Cluster (formerly Strands in HS/GLCEs) • Standards (formerly Expectations in HS/GLCEs) • Learning Targets

  39. Global & Trend Data • Have you seen this new IRIS report? • What are your predictions around what the historic cut scores will look like? • Do you have assumptions about strengths and weaknesses at certain grade levels and content areas?

  40. “Bright spot” (Switch by Chip & Dan Heath) • You may have noticed many of the green lines are stagnant. Did you notice any bright spots with a steady increase and separation from state & county average?

  41. I: Activate & Predict • Surfacing experiences and expectations • Make predictions, recognize assumptions and possible learning II: Record Observations • Analyzing the data in terms of observable facts • Search for patterns and “ah-ha” moments; be specific (avoid judging & inferring)

  42. II: Record Observations • Within the Google Doc Collection: • Dialogue in small groups and record what is observable in the district data at ALL grade levels. • Do NOT judge, conjecture, explain or infer. • Make statements about quantities (e.g. 3rd grade math fluctuated between 57-72%; however, the past three years it has been stagnant around 64%) • Look for connections across grade levels (e.g. a sharp increase was seen in 5th grade math in 2009 (53% → 80%), then the same group of students increased in 7th grade math in 2011 (54% → 76%))
