Utilizing Data Informed Decision-Making to Improve the PSSA Performance of CTE Students PACTA PIL October 18, 2010
Agenda
• 11:15 – 12:45 Overview of the School Improvement Process; Data, Data, Everywhere
• 12:45 – 1:30 LUNCH
• 1:30 – 3:00 Data Analysis
• 3:00 – 3:15 Break
• 3:15 – 5:00 Root Cause Analysis Procedures
• 5:00 – 5:45 Sharing & Reporting Out
• 5:45 – 6:00 Wrap-up and Evaluation
What impacts (has an effect on) heart health? • Family History • Diet • Exercise • Smoking
What are indicators of heart health?
• Family History
• Diet – Weight
• Exercise – Resting heart rate
• Cholesterol
• Triglycerides
• Blood pressure
What are the impacts and indicators of heart disease? If we are to improve the health of our hearts, we need to be aware of both the impacts on heart health and the indicators of heart health. Another example: STEELERS FOOTBALL!
Data-Informed Decision-Making Cycle for __________ IMPROVEMENT
(Data surrounds every step of the cycle.)
Data analysis → Strategic Planning → Resources → Did it work? → back to Data analysis
Remember: Numbers are our friends
Using Data to Improve Learning for All: A Collaborative Inquiry Approach by Nancy Love et al.
Types of Data a la Bernhardt – Indicators & Impacts
• Demographics
• School Processes
• Perceptions
• Student Learning
What are your Impacts and Indicators?
• Identify your Impacts and Indicators by their type
• Blue Dots – Student Learning
• Yellow Dots – Demographics
• Red Dots – School Processes and Programs
• Green Dots – Perceptions
Multiple Measures of Student Learning – Indicators
• Summative Assessments – PSSA, NOCTI, NAEP
• Formative Assessments – Informal teacher observations
• Interim Assessments – 4Sight, Grades
• Diagnostic Assessments – CDT
Multiple Measures of Student Learning 4Sight PSSA NOCTI
Multiple Measures of Student Learning – Over Time
Longitudinal Data
• Analysis of annual performance
• Analysis of performance across the years
• Analysis of cohort groups across the years (8th grade vs. 11th grade)
• PVAAS
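A cohort comparison of this kind can be sketched in a few lines of code. Everything below is illustrative: the student scores and the proficiency cutoff are invented placeholders, not actual PSSA data or the real PSSA cut score.

```python
# A minimal sketch of a cohort comparison (8th grade vs. 11th grade).
PROFICIENT_CUTOFF = 1275  # placeholder cut score, NOT the real PSSA cutoff

# Hypothetical scaled scores for the same three students in both grades.
cohort = {
    "student_a": {"grade8": 1310, "grade11": 1290},
    "student_b": {"grade8": 1180, "grade11": 1280},
    "student_c": {"grade8": 1275, "grade11": 1350},
}

def percent_proficient(scores):
    """Percent of scores at or above the (placeholder) proficiency cutoff."""
    scores = list(scores)
    return 100.0 * sum(1 for s in scores if s >= PROFICIENT_CUTOFF) / len(scores)

grade8_pct = percent_proficient(s["grade8"] for s in cohort.values())
grade11_pct = percent_proficient(s["grade11"] for s in cohort.values())
# Comparing grade8_pct to grade11_pct shows whether this cohort gained ground.
```

In this toy cohort, proficiency rises from 8th to 11th grade; with real data, the same comparison (same students, two test administrations) is what distinguishes cohort growth from simple year-over-year comparisons of different students.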
What are YOUR measures of Student Learning data? (summative, formative, interim, diagnostic)
• PSSA
• NOCTI
• Grades
Multiple Measures of Demographics – Impacts
Typical Data:
• Ethnicity
• IEP
• Economically Disadvantaged
• Gender
• Mobility
• Enrollment
• Attendance
• Teacher Demographics?
Demographics to Disaggregate
Disaggregation is not a problem-solving strategy… it's a problem-finding strategy.
Student Learning AND Demographics
• Are all students performing at the same level? IEP students? LEP students? Economically disadvantaged students?
• Is the achievement gap (between high- and low-poverty students) decreasing or increasing?
• Do students who attend school every day get better grades?
• Are achievement levels higher for those students who stay in a school building for two or more years?
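Questions like these reduce to comparing a subgroup's proficiency rate against everyone else's. A minimal sketch, using invented records (the subgroup flag and outcomes are hypothetical, not real student data):

```python
# Hypothetical records: one dict per student, entirely invented for illustration.
students = [
    {"econ_disadvantaged": True,  "proficient": False},
    {"econ_disadvantaged": True,  "proficient": True},
    {"econ_disadvantaged": False, "proficient": True},
    {"econ_disadvantaged": False, "proficient": True},
]

def proficiency_rate(records):
    """Percent of records marked proficient (True counts as 1 in the sum)."""
    return 100.0 * sum(r["proficient"] for r in records) / len(records)

disadv = [s for s in students if s["econ_disadvantaged"]]
other = [s for s in students if not s["econ_disadvantaged"]]
gap = proficiency_rate(other) - proficiency_rate(disadv)
# gap > 0 means an achievement gap favoring the non-disadvantaged group.
```

The same pattern works for any flag on the record (IEP, LEP, attendance band), which is exactly the "problem-finding" role of disaggregation described above.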
What are your demographic measures?
• Teachers
• Students
• Community
Multiple Measures of School Processes (Programs) – Impacts and Indicators
Typical Data:
• Description of school programs and processes
• How are students identified for programs and services?
Student Learning AND Demographics AND School Processes Are there differences in achievement scores (or in rates of progress) for 11th grade females and males by the type of career program in which they are enrolled?
What are your programs or procedures/processes? • Tutoring • Title I • Grading policy • Enrollment into a CTC • Part-time CTC transportation issues
Multiple Measures of Perceptions – Impacts
Typical Data:
• Perceptions of Learning Environment
• School Climate
• Values and Beliefs
• Observations
Student Learning AND Demographics AND Perceptions Do students of different ethnicities perceive the learning environment differently, and do they score differently on standardized achievement tests consistent with these perceptions?
What are your measures of perception?
• Teachers’
• Parents’
• Sending Districts’
• Students’
Identify your Impacts and Indicators by their type
• Blue Dots – Student Learning
• Yellow Dots – Demographics
• Red Dots – School Processes and Programs
• Green Dots – Perceptions
• What’s missing?
• What additional data should be collected? Examined? Considered?
Data Analysis
Now that the data is gathered (or on the to-be-gathered list), it’s time to analyze the data. Remember… “Numbers are our friends.”
Data Analysis
• “Gathering” your PSSA data using the Feeder Report from eMetric
• What percent of 11th graders (in 2010) scored Below Basic, Basic, Proficient, and Advanced in Reading? In Math? (These are this year’s 12th graders.)
• What about the class of 2010 (PSSA grade 11 in 2009)? The class of 2009 (PSSA grade 11 in 2008)?
• Examine the three-year trend of 11th grade performance in reading and math.
• Observations – just the facts!
• Are more students reaching proficiency?
• Are fewer students below basic?
• Repeat the above looking at:
• Current 9th graders (8th graders in 2010)
• Current 10th graders’ 8th grade PSSA scores (from 2009)
• Current 11th graders’ 8th grade PSSA scores (from 2008)
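The three-year trend check above can be sketched in code. The counts below are invented for illustration; with real data, the performance levels would come from the eMetric Feeder Report rather than a hand-built dictionary.

```python
from collections import Counter

# Hypothetical performance levels for 11th-grade reading across three years.
# Counts are invented; 100 students per year keeps the percents easy to read.
levels_by_year = {
    2008: ["Below Basic"] * 12 + ["Basic"] * 20 + ["Proficient"] * 50 + ["Advanced"] * 18,
    2009: ["Below Basic"] * 10 + ["Basic"] * 18 + ["Proficient"] * 52 + ["Advanced"] * 20,
    2010: ["Below Basic"] * 8  + ["Basic"] * 17 + ["Proficient"] * 53 + ["Advanced"] * 22,
}

def level_percents(levels):
    """Percent of students at each PSSA performance level."""
    counts = Counter(levels)
    total = len(levels)
    return {lvl: 100.0 * counts[lvl] / total
            for lvl in ("Below Basic", "Basic", "Proficient", "Advanced")}

trend = {year: level_percents(levels) for year, levels in levels_by_year.items()}

# "Just the facts": is Below Basic shrinking year over year?
below_basic_shrinking = (trend[2008]["Below Basic"] > trend[2009]["Below Basic"]
                         > trend[2010]["Below Basic"])
```

Running the same computation on each class's 8th-grade scores (the "repeat the above" step) gives the incoming-student baseline to compare against.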
Data Analysis
• Disaggregation is a problem-finding strategy!
• 11th Grade – By Program, By Gender, By Sending District, By Reporting Categories
• 8th Grade – By Program, By Gender, By Sending District, By Reporting Categories
Why all this data? “Root Cause Analysis”
Observation and Reflection
What are you seeing? JUST THE FACTS!
• More females are proficient than males.
• Over the past three years, the percent of students reaching proficiency has increased.
• The percent of students below basic has remained constant over three years.
What are you thinking about the results? What’s ‘causing’ these results?
• Students don’t arrive at ‘my grade level’ as prepared as they should be.
• Support programs are lacking.
Root Cause Analysis (Paul Preuss)
• Definition – the deepest underlying cause, or causes, of positive or negative symptoms within any process which, if dissolved, would result in elimination, or substantial reduction, of the symptom.
• Root cause analysis eliminates patching and wasted effort.
• Root cause analysis conserves scarce resources.
• Root cause analysis induces discussion and reflection.
How do you know you’ve ‘found’ the root cause?
• You run into a dead end asking what caused the proposed root cause.
• Everyone agrees that this is a root cause.
• The cause is logical, makes sense, and provides clarity to the problem.
• The cause is something that you can influence and control.
• If the cause is dissolved, there is realistic hope that the problem can be reduced or prevented in the future.
“School improvement teams and others using root cause analysis often wonder when to stop seeking cause and make the decision that sufficient data and effort have been used to arrive at a reasonable root. This is often a judgment call that will improve with experience. Often, the lack of data and the pressures of time frustrate the effort and force it to halt at a level below the surface symptom, but perhaps not as deep as it must ultimately go.” (Preuss 2003)
Root Cause Analysis – prerequisites • Key Indicators of Student Success • Measures of each indicator • Desired Ideal Condition of the indicator (e.g., 56% proficient or better) • Gap between the desired ideal condition and the present condition • Is this gap a priority issue? • Goal statement • Search for Root Cause • Possible strategies for improvement
Root Cause Processes
• Questioning the Data
• The Diagnostic Tree
• The Five Whys
• Force Field Analysis
• Throughout each process, reflect back on your list of impacts and indicators
Questioning the Data
• “What do you see?”
• “What questions do you have about what you see?”
Questioning the Data a la Dr. Shula:
• What do you see? – JUST THE FACTS
• What are you thinking/feeling/believing about what you see?
• What other data or data analysis might shed more light on the issue?
The Diagnostic Tree
• The “Red Flag” event or priority issue
• Location Level
• Hypotheses Level
Example:
• Red Flag – PSSA Math scores are below AYP target
• Location – incoming 9th graders from X Middle Schools
• Hypotheses – Is this related to Student Demographics? Curriculum? Instruction? System Processes? Organizational Culture? External Factors?
The Five Whys (Why? → Why? → Why? → Why? → Why?)
• Team: Why do we have so many class tardies?
• Students: Because we do not have enough time.
• Team: Why don’t you have enough time to get from one class to another?
• Students: Because 4 minutes isn’t enough time to get from one end of the building to the other and go to a locker or restroom.
• Team: Why only 4 minutes?
• Principal: Because we wanted to reduce the time that students were in the halls.
• Team: Why did we want to reduce the hall time?
• Principal: Because we wanted to reduce disciplinary problems.
• Team: Why did we want to reduce disciplinary problems?
• Principal: We wanted to improve school safety and climate.
Force Field Analysis
• Driving Forces and Restraining Forces
• Driving Forces apply pressure to move in the direction of change.
• Restraining Forces apply pressure to remain in place.
• For change to occur, either the driving forces must be increased or the restraining forces decreased.
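A force-field worksheet can be tallied mechanically once the team assigns each force a strength. A minimal sketch, where the force names and the 1–5 strength scores are entirely hypothetical:

```python
# Hypothetical force-field worksheet: each force has a direction and a
# team-assigned strength from 1 (weak) to 5 (strong). All values invented.
forces = [
    {"name": "Administrative support",  "type": "driving",     "strength": 4},
    {"name": "New interim assessments", "type": "driving",     "strength": 3},
    {"name": "Limited planning time",   "type": "restraining", "strength": 5},
    {"name": "Staff turnover",          "type": "restraining", "strength": 2},
]

driving = sum(f["strength"] for f in forces if f["type"] == "driving")
restraining = sum(f["strength"] for f in forces if f["type"] == "restraining")
net = driving - restraining
# net > 0: driving forces currently outweigh restraining forces;
# otherwise, strengthen drivers or weaken restrainers before pushing the change.
```

The numbers only make the team's judgments visible; the analysis itself lives in the discussion that produces the strength ratings.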
The Final Report – PACTA PIL Program • Brief introduction • Three year analysis of reading and math scores • Strengths • Deficiencies • Root Causes for each CTE Program • Action Plans to address the Root Cause • Timeline for Implementing & Monitoring
Sharing & Reporting Out
• New insights?
• Additional data/information to be gathered and examined?
• New theories?
• Next steps