


Presentation Transcript


  1. Problem Analysis A collaborative project between the Florida Department of Education and the University of South Florida FloridaRtI.usf.edu

  2. Advance Organizer • SBLT Data - Beliefs, Practices, Skills • Review of Problem Identification • Big Ideas/Concepts of Problem Analysis • Hypothesis/Prediction Statement • Assessment & Hypothesis Validation • Examples of Hypothesis Generation and Evaluating

  3. SBLT Data • Did your building’s beliefs change from the first to the second administration? If yes, in what areas did the greatest change occur? • What do you think these changes mean in the context of implementing a PS/RtI model in your building? • What “practices” occurring in your building do you think are most consistent with the PS/RtI model and which ones do you think might be a threat to the implementation of the model? • How consistent are the overall beliefs of your building with your building’s perceptions of the practices occurring? What does the level of consistency mean in terms of implementing a PS/RtI model in your building? • To what extent do you believe that your building possesses the skills to use school-based data to evaluate core (Tier 1) and supplemental (Tier 2) instruction? Based on what your building has learned about using data to make decisions, how consistent are the skills your building possesses with what you are doing in your building (i.e., to what degree does your building evaluate the effectiveness of core and supplemental instruction)?

  4. Problem ID Review [three-tier triangle shown for both Academic Systems and Behavioral Systems; Horner & Sugai] • Intensive, Individual Interventions (1-5%): individual students; assessment-based; high intensity and of longer duration (academic); intense, durable procedures (behavioral) • Targeted Group Interventions (5-10%): some students (at-risk); high efficiency; rapid response • Universal Interventions (80-90%): all students, all settings; preventive, proactive

  5. Problem ID Review In order to identify a problem, you’ve got to start with three pieces of data: • Benchmark level of performance • Target student level of performance • Peer level of performance

  6. Problem ID Review: Individual Student Data [chart: target student vs. peers vs. benchmark]

  7. Problem ID Review: Individual Student Data [chart: target student vs. peers vs. benchmark]

  8. Problem ID Review: Individual Student Data [chart: target student vs. peers vs. benchmark]

  9. Problem ID Review: Building Level Data [chart: % of students at benchmark vs. building-level performance]

  10. Problem ID Review: Building Level Data [chart: % of students referred to office vs. building-level performance, with benchmark]

  11. Data Required for Problem Identification • Replacement Behavior • Current Level of Functioning • Benchmark/Desired Level • Peer Performance • GAP Analysis

  12. Example: ORF • Target student’s current level of performance: 40 WCPM • Benchmark: 92 WCPM • Peer performance: 98 WCPM • GAP analysis: Benchmark/Target student = 92/40 = 2.3x difference (SIGNIFICANT GAP); Benchmark/Peer = 92/98 = <1x difference (NO SIGNIFICANT GAP) • Is instruction effective? Yes; peer performance is at benchmark.
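The GAP analysis above is simple ratio arithmetic. A minimal sketch in code (not part of the original training materials; the 2x cutoff for a "significant gap" is taken from the example on this slide):

```python
# Minimal sketch of the GAP analysis from the ORF example above.
# The 2x cutoff for a "significant gap" follows the slide; districts
# may apply their own criteria.

SIGNIFICANT_RATIO = 2.0

def gap_ratio(benchmark: float, observed: float) -> float:
    """How many times the observed score fits into the benchmark."""
    return benchmark / observed

def is_significant(ratio: float, cutoff: float = SIGNIFICANT_RATIO) -> bool:
    return ratio >= cutoff

# ORF example: target student 40 WCPM, benchmark 92 WCPM, peers 98 WCPM
student_gap = gap_ratio(92, 40)  # 2.3x: significant gap
peer_gap = gap_ratio(92, 98)     # under 1x: no significant gap

print(f"Benchmark/Student: {student_gap:.1f}x (significant: {is_significant(student_gap)})")
print(f"Benchmark/Peer: {peer_gap:.1f}x (significant: {is_significant(peer_gap)})")
```

Because the peers are at benchmark while the target student shows a 2x+ gap, the slide concludes that core instruction is effective and the problem is specific to the student.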

  13. Problem ID

  14. Example: Behavior • Target student’s current level of performance: complies 35% of the time • Benchmark (set by teacher): 75% • Peer performance: 40% • GAP analysis: Benchmark/Target student = 75/35 = 2.1x difference (SIGNIFICANT GAP); Benchmark/Peer = 75/40 = 1.9x difference (SIGNIFICANT GAP); Peer/Target student = 40/35 = 1.1x difference (NO SIGNIFICANT GAP) • Is the behavior program effective? No; peers have a significant gap from the benchmark as well.
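The behavior example adds a third comparison (peers vs. target student) and a different conclusion: when peers also show a significant gap, the core program, not the individual student, is the likely problem. A sketch of that three-way comparison using the slide's numbers (the helper name is this sketch's own, and the "significant" judgments in the comments are the slide's, which treats the ~1.9x peer gap as significant):

```python
# Sketch of the three-way GAP comparison from the behavior example above.
# The ratios match the slide; what counts as "significant" near the 2x
# line is a team judgment call, not a hard rule.

def gap(benchmark: float, observed: float) -> float:
    """How many times the observed level fits into the benchmark."""
    return benchmark / observed

benchmark, student, peers = 75, 35, 40  # % compliance

print(f"Benchmark/Student: {gap(benchmark, student):.1f}x")  # 2.1x: significant
print(f"Benchmark/Peers:   {gap(benchmark, peers):.1f}x")    # 1.9x: significant
print(f"Peers/Student:     {gap(peers, student):.1f}x")      # 1.1x: not significant

# Because peers also show a significant gap from the benchmark, the core
# behavior program, not the individual student, is the likely problem.
```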

  15. Problem ID

  16. Tier One Problem Identification

  17. Tier One Problem Identification 1. Rank from highest to lowest the groups and years for which core instruction is most effective. Be sure to include all 6 possibilities in your response. 2. Which group(s) of students should receive highest priority for monitoring while modifications to core instruction are being made? Justify your decision. 3. Which group(s) of students is most likely to be referred for additional intervention, regardless of any label they might have? Justify your decision. 4. Based on the data from the previous two school years, for which of the three groups of students depicted above, if any, will core instruction potentially be effective at the end of this school year (i.e., 2007-08)? Justify your decision. 5. Assume that modifications were made between the 05/06 and 06/07 school years for all groups of students at all levels of risk. Which group(s) of students at what level(s) of risk made the greatest improvement across the two years? Justify your decision.

  18. Tier One Problem Identification

  19. Tier One Problem Identification 1. Rank from highest to lowest the groups and years for which core instruction is most effective. Be sure to include all 6 possibilities in your response. 2. Which group(s) of students should receive highest priority for monitoring while modifications to core instruction are being made? Justify your decision. 3. Which group(s) of students is most likely to be referred for additional intervention, regardless of any label they might have? Justify your decision. 4. Based on the data from the previous two school years, for which of the three groups of students depicted above, if any, will core instruction potentially be effective at the end of this school year (i.e., 2007-08)? Justify your decision. 5. Assume that modifications were made between the 05/06 and 06/07 school years for all groups of students at all levels of risk. Which group(s) of students at what level(s) of risk made the greatest improvement across the two years? Justify your decision.

  20. Tier One Problem Identification Worksheet Your project ID is: • Last 4 digits of SS# • Last 2 digits of year of birth

  21. Steps in the Problem-Solving Process • PROBLEM IDENTIFICATION • Identify replacement behavior • Data- current level of performance • Data- benchmark level(s) • Data- peer performance • Data- GAP analysis • PROBLEM ANALYSIS • Develop hypotheses • Develop predictions/assessment • INTERVENTION DEVELOPMENT • Develop interventions in those areas for which data are available and hypotheses verified • Proximal/Distal • Implementation support • RESPONSE TO INTERVENTION (RtI) • Frequently collected data • Type of Response- good, questionable, poor

  22. Steps in the Problem-Solving Process: Problem Analysis • PROBLEM ANALYSIS • Develop hypotheses • Develop predictions/assessment

  23. Problem Analysis in Context [cycle diagram along a timeline: Identify the Problem → Analyze the Problem → Design Intervention → Implement Intervention → Monitor Progress → Evaluate Intervention Effectiveness]

  24. The Role of Assessment in Problem Analysis Completing Problem Analysis activities will enable the team to answer: • Why is there a difference between what is expected and what is observed? That is, why is the replacement behavior not occurring? What is the most likely reason? • How do you target the intervention that would have the highest probability of being successful?

  25. Purpose of Assessment in Problem Analysis • Assessment should link to instruction for the purpose of designing an educational intervention • The focus should be on collecting information that will lead us to decisions about: what to teach (curriculum) and how to teach (instruction)

  26. Purpose of Assessment in Problem Analysis • Focus only on gathering information that is directly linked to the defined problem and that will guide you to answering the question “Why is this problem occurring?” • Do not collect information for the sake of collecting information. • Do not collect what you already have. REMEMBER: Our assessment must focus on gathering information that will DIRECTLY impact student gains in their classroom environment.

  27. Determining What Data to Collect [2x2 matrix: known vs. unknown information, crossed with educational relevance and alterability] • Known + less educationally relevant and unalterable: disregarded or low priority (height, eye color) • Known + educationally relevant and alterable: gather this existing information (classroom DIBELS data, ODRs) • Unknown + educationally relevant and alterable: conduct assessments to gather this information (behavior observations, specific skill assessments); these are the assessment questions • Unknown + less educationally relevant and unalterable: don’t go here! (cognitive processing?)

  28. Here’s what we’re gonna do • Look at the information we have • Gather some more we want, but don’t have • Make a few educated guesses (Why is the replacement behavior not occurring?) • If needed, gather more information to fine tune • Decide on the most likely reason(s) why.

  29. Here’s what we’re gonna do • Look at the information we have • Gather some more we want, but don’t have • Make a few educated guesses (Why is the replacement behavior not occurring?) • If needed, gather more information to fine tune • Decide on the most likely reason(s) why.

  30. Steps in Problem Analysis • Fact Finding • Generate ideas about possible causes (hypotheses) • Sort out which possible causes seem most viable and which don’t (validation) • Link the things we’ve learned to intervention

  31. Generate Hypotheses Evidence-Based Knowledge of Content + Specific Knowledge of Current Problem = Good Hypotheses

  32. Domains for Hypotheses

  33. Generate Hypotheses Developing Assumed Causes Developing evidence-based statements about WHY a problem is occurring.

  34. Generate Hypotheses Hypotheses… • Are developed to determine reasons for why the replacement behavior is not occurring • Should be based on research relevant to the target skills • Focus on alterable variables • Should be specific, observable, and measurable • Should lead to intervention

  35. Generate Hypotheses Hypotheses… • Must consider both SKILL and PERFORMANCE deficits: • Skill deficit: the student does not have the skills to perform the task (e.g., lacks grade-level fluency skills; lacks private speech for self-control) • Performance deficit: the student does not perform an existing skill, or performs it at a lower level (e.g., reads slowly because of fear of ridicule by peers for mistakes; peers reinforce bad choices more than the teacher reinforces good choices)

  36. Writing a Hypothesis Statement (What are possible causes?) [flowchart] Identify known information about the identified problem and discard irrelevant information. Do you have enough information to identify possible causes? If NO: gather unknown information with additional RIOT procedures. If YES: make a hypothesis and prediction: The problem is occurring because _________. If ____________ would occur, then the problem would be reduced.

  37. Hypothesis / Prediction Statement The problem is occurring because _________________________________. If ___________________ would occur, then the problem would be reduced.

  38. Prediction Statements • Developed to INFORM ASSESSMENT and decision-making for hypotheses • The purpose is to make explicit what we would expect to see happen if: • The hypothesis is valid and • We intervened successfully to reduce or remove the barrier to learning • Written in if/then or when/then form • Used to develop assessment questions to help validate/not validate hypotheses

  39. Hypotheses Validation Why do problem-solving teams need to validate a hypothesis? If the hypothesis is inaccurate and the wrong intervention is implemented, valuable time could be wasted on an intervention that is not an appropriate instructional match for the student.

  40. Assessment Problem Analysis is the process of gathering information in the domains of instruction, curriculum, environment and the learner (ICEL) through the use of reviews, interviews, observations, and tests (RIOT) in order to evaluate the underlying causes of the problem. That is, to validate hypotheses.

  41. Assessment: How Do We Validate Hypotheses? • Review • Interview • Observe • Test

  42. Assessment Procedures that are used (RIOT): R: Review; I: Interview; O: Observe; T: Test. Assessment domains are not limited to the student (ICEL): I: Instruction; C: Curriculum; E: Environment; L: Learner. Together, RIOT by ICEL frames the evaluation.
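The RIOT procedures crossed with the ICEL domains form a planning grid: each assessment activity is located by the domain being examined and the procedure used to examine it. A small illustrative sketch of that grid (the framework is from the slides; the example entries are hypothetical placeholders, not from the original deck):

```python
# Illustrative RIOT-by-ICEL planning grid. The framework (RIOT procedures
# crossed with ICEL domains) is from the slides; the entries below are
# hypothetical examples of planned assessment activities.

PROCEDURES = ["Review", "Interview", "Observe", "Test"]
DOMAINS = ["Instruction", "Curriculum", "Environment", "Learner"]

# One planned activity per (procedure, domain) cell; cells left out of the
# dict simply have no data collection planned.
plan = {
    ("Review", "Learner"): "Check records for evidence of prerequisite skills",
    ("Interview", "Instruction"): "Ask the teacher how expectations are communicated",
    ("Observe", "Environment"): "Observe peer interactions during independent work",
    ("Test", "Curriculum"): "Probe placement level in the instructional materials",
}

for (procedure, domain), activity in sorted(plan.items()):
    print(f"{procedure} x {domain}: {activity}")
```

Laying the plan out this way makes it easy to spot over-reliance on one procedure (e.g., all tests, no observations) or a domain that is not being examined at all.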

  43. Content of Assessment Domains INSTRUCTION • instructional decision-making regarding selection and use of materials, and placement of students in materials • frequency of interaction/reinforcement • clarity of instructions • communication of expectations and criteria for success (behavioral and academic) • direct instruction with explanations and criteria for success (behavioral and academic) • sequencing of lesson designs to promote success • variety of practice activities (behavioral and academic)

  44. Content Of Assessment Domains CURRICULUM • long range direction for instruction • instructional materials • intent • arrangement of the content/instruction • pace of the steps leading to the outcomes • stated outcomes for the course of study • general learner criteria as identified in the school improvement plan and state benchmarks (behavioral and academic)

  45. Content of Assessment Domains ENVIRONMENT • physical arrangement of the room • furniture/equipment • clear classroom expectations • management plans • peer interaction, expectations, reinforcement, support • schedule • task pressure • home/family supports

  46. Content Of Assessment Domains LEARNER • skills • motivation • health • prior knowledge

  47. Domains for Assessment RIOT by ICEL

  48. RIOT by ICEL

  49. Format for Hypothesis Validation • Hypothesis: Mary is noncompliant because she does not have the skills to complete the work successfully. • Prediction: If we reduce the academic demand or improve her skills, Mary will become more compliant. • Assessment question(s): Is task difficulty appropriate for Mary’s skill level? • Where are the answers?: Review Learner records for evidence of skills; review Curriculum to understand the expectation. • Answers: Review of records and review of curriculum indicates that Mary has the skills to complete the requested tasks. • Validated?: No
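The validation worksheet above can be captured as a simple record that a team fills in as evidence comes back. A hedged sketch (field names mirror the slide's worksheet; the structure itself is illustrative, not prescribed by the deck):

```python
# Sketch of the hypothesis-validation record shown on the slide above.
# Field names mirror the worksheet; nothing here is prescribed by the deck.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class HypothesisValidation:
    hypothesis: str
    prediction: str
    assessment_questions: List[str]
    sources: List[str]              # where the answers come from (RIOT by ICEL)
    answers: str = ""
    validated: Optional[bool] = None  # None until evidence is gathered

mary = HypothesisValidation(
    hypothesis="Mary is noncompliant because she does not have the skills "
               "to complete the work successfully.",
    prediction="If we reduce the academic demand or improve her skills, "
               "Mary will become more compliant.",
    assessment_questions=["Is task difficulty appropriate for Mary's skill level?"],
    sources=["Review Learner records for evidence of skills",
             "Review Curriculum to understand the expectation"],
)

# Evidence comes back: records show Mary has the skills, so this
# hypothesis is not validated and the team revises it.
mary.answers = ("Review of records and curriculum indicates Mary has the "
                "skills to complete the requested tasks.")
mary.validated = False
```

Keeping one record per hypothesis makes the "validate before intervening" step explicit: an intervention is only designed for hypotheses whose `validated` field ends up `True`.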
