Advanced Problem Analysis in Reading: Curriculum Based Evaluation & other functional academic assessments Indian Prairie District 204 Problem Solving October, 2008
Acknowledgements • Kerry Bollman: NASP & Flex West CBE presentations • Sue Gallagher: Flex West CBE presentations • Madi Phillips: NSSED presentations • Heartland AEA 11, Des Moines, Iowa • Ken Howell & Victor Nolet: CBE book • Joe Witt: STEEP Model/1 minute Academic Assessment • Ed Shapiro: Academic Skills Problems • Ed Daly: Functional Analysis of Academics
Objectives • State the fundamental components of functional academic assessment, including CBE • Be introduced to some of the basic skills involved in CBE and advanced problem analysis in reading • Practice the thought process through training exercises
Agenda • Overview of CBE as a thought process 12:00-12:45 • How is this similar to or different from your previous thinking related to CBE? • Jigsaw Activity 12:45-1:45 • In letter group – read sections, create summary with visuals • In number group – jigsaw through each section, teaching the other members of your group • The Flowcharts 1:45-2:30 • Examples Activity 2:30-3:30
Taking a temperature • Medically, temperature is a general indicator of health • Academically, CBM plays the same role as a General Outcome Measure (GOM)
Sometimes you need more information . . . • For some Tier 2 students • For most Tier 3 students • These cases call for more in-depth problem analysis
The Problem Solving Process • Define the Problem: What is the problem and why is it happening? • Develop a Plan: What are we going to do? • Implement the Plan: Carry out the intervention • Evaluate: Did our plan work?
Curriculum Based Evaluation • A process of evaluation and decision making that may use CBM or other derivations of CBA with the goal of maximizing student learning. • Core components – comparison, judgment, and problem-solving - not measurement • Decision-making framework for thinking • A network of curriculum-driven if/then precepts • Howell, Hosp, & Kurns (2008), BPV Chapter 20.
CBE and CBM: How they work together within problem solving (diagram) • CBM identifies the problem as the gap between expected and actual performance • CBE asks why (problem analysis) • An intervention follows, with CBM used to monitor progress and CBE used to ask why again if the intervention falls short
CBE Within a Problem Solving Process What is the problem and why is it happening?
Assessment Guidelines • Must be aligned with the curriculum • Must be easy to use • Must have clearly defined purposes • Should be standardized • Should sample clearly defined content domains • Should assess relevant types of knowledge • Some should collect rate data • Should collect an adequate sample • Should use appropriate scoring rules • Some should be complex and interactive • Howell & Nolet (2000) p. 148
Procedures for Assessing Academic Skills • Structured Teacher Interview & Rating Scales • Direct Classroom Observation • Student Interview • Permanent Product Review • Curriculum-Based Assessment of Academic Skills Ed Shapiro (2004)
Can't Do vs. Won't Do
1. Obtain 3 previously completed assignments, each one on which the student performed well below expectations.
2. Present the first assignment (answers removed) with an incentive. If the student increases the score by 25% or scores 80% or above, move to the next step.
3. Have the student choose reinforcers (teacher approved) that he/she would like to work for in the future.
4. Test by presenting another assignment with a reinforcer to the student.
5. Evaluate outcomes. If the student markedly increases performance when offered incentives, the problem is likely WON'T DO.
6. Create an incentive plan.
Joe Witt & Ray Beck, 2001; Ed Daly, 1999
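A minimal sketch of this decision rule in Python; the function name, percentage scale, and example scores are illustrative assumptions, not part of the published protocol:

```python
# Hypothetical sketch of the Can't Do / Won't Do decision rule above.
# Scores are percentages; the 25% gain and 80% criteria come from step 2.

def likely_wont_do(baseline_pct: float, incentive_pct: float) -> bool:
    """True if performance under an incentive suggests a WON'T DO
    (motivation) problem rather than a CAN'T DO (skill) problem."""
    gained_25_percent = incentive_pct >= baseline_pct * 1.25
    scored_80_or_above = incentive_pct >= 80.0
    return gained_25_percent or scored_80_or_above

# Example: 40% without an incentive, 65% with one -> likely WON'T DO,
# so the next step is an incentive plan rather than skill instruction.
print(likely_wont_do(40.0, 65.0))  # True
```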
Adding Can't Do vs. Won't Do into CBE (flowchart) C. Martin, 2005
• Step 1: Can't Do – assess through review. Low performance? If no, stop.
• Step 2: Won't Do – interview; reassess motivator needs. Performance improved? If yes, stop.
• Step 3: CIE – assess Curriculum, Instruction, Environment. If this explains the problem, provide an appropriate intervention; if not, continue.
• Step 4: SLAs – Survey Level & Specific Level Assessments, leading to an appropriate intervention.
The CBE Process of Inquiry (flowchart) Howell, Hosp, & Kurns (2008), BPV Chapter 20
• Step 1: Fact-Finding & Problem Validation – Can you define the problem? If not, select & conduct screening assessments and summarize results. Can you validate the problem?
• Step 2: Develop Assumed Causes – generate assumed causes (hypotheses)
• Step 3: Validating – plan & conduct assessments, summarize results, and test whether the assumed causes hold (Hypothesis true? Were assumed causes validated?)
• Step 4: Summative Decision Making – Can you plan instruction? If yes, set the goal(s) and design instruction
• Step 5: Formative Decision Making – design & implement instruction, progress monitor, plan evaluation, and make formative decisions
Rules for Developing Assumed Causes • Clarify the purpose • If entitlement, then a student-to-group comparison is needed • If what to teach, then a comparison of current performance to expected performance is needed • If how to teach, then formative data are needed to determine effectiveness • Target relevant information • Alterable variables • Essential characteristics • Think about instruction • Information/data can't focus exclusively on the student • ICEL (Instruction, Curriculum, Environment, Learner) • Formative Howell, Hosp, & Kurns (2008), BPV Chapter 20
Rules for Developing Assumed Causes • Think about the Curriculum • Skill sequences • Proficiency • Response type • Conditions • Think about Types of Knowledge: beyond knowing how • When, why, & under what circumstances the skill should be used • Check the Student's Awareness of Skills • Self-monitoring, self-control, metacognition • Ask the student to rate task difficulty before doing the task • TEST DOWN/TEACH UP • Assess at the expected level first • Then work backward Howell, Hosp, & Kurns (2008), BPV Chapter 20
Rules for Developing Assumed Causes • Pick the Most Likely Targets First • The most likely explanation for the student’s lack of proficiency with a skill, or the most likely solution for the problem, should be checked before going on to those that are more complex or exotic. Howell, Hosp, & Kurns (2008), BPV Chapter 20
READING • Early Reading • Advanced Reading
Review R-CBM at Grade Level (the "Easy Button"): classify the student's performance by Fluency (+ or -) crossed with Accuracy (+ or -), giving four profiles: + / +, + / -, - / +, - / -
Activity • Beginning Reading - "Learning to Read" • Group 1: Read "Phonological Awareness" pp. 378-380 & "Phoneme Segmentation Fluency" p. 385; Review curriculum maps • Group 2: Read "Alphabetic Principle" pp. 380-381 & "Letter Sound Fluency" & "Nonsense Word Fluency" p. 385; Review curriculum maps • Group 3: Read "Accuracy & Fluency" p. 381 & "Word Identification Fluency" & "Oral Reading Fluency" pp. 385-386; Review curriculum maps • Advanced Reading - "Reading to Learn" • Group 1: Read "Content of the Reading-Decoding Strand" & "Content of the Prior/Background Knowledge Strand" pp. 400-401 & "Cloze and Maze", "ORF" pp. 404-405 • Group 2: Read "Content of the Vocabulary Strand" pp. 400-402, "Review of Text-Dependent Grades and Assignments" p. 405, "Vocabulary Matching" p. 406 • Group 3: Read "Content of the Comprehension Strategy Strand" pp. 402-403, "Written and Oral Retell Measures" & "Think-Aloud Interview" pp. 406-407
Early Reading Flowchart (summary of the decision diagram)
• Gather information via RIOT: R – Review curriculum & permanent products; I – Interview the teacher; O – Observe the student while reading; T – Test using CBM
• Are oral reading skills acceptable? If yes, go to Comprehension
• K-2 or older student who decodes few words? Survey early literacy skills for missing skills
• Is oral reading accurate but slow? Do the rereading assessment; if rate increases, build fluency with rereading
• If reading is inaccurate, do the Pencil Tap; if accuracy improves, build self-monitoring
• If not, do an Error Sample & Analysis; if the student does not make more errors on harder passages, provide balanced instruction
• If errors do increase on harder passages, categorize the errors and look for patterns (phonics patterns: evaluate phonics; whole-word errors: correct patterns)
Primary Measures for Early Reading • Phoneme Segmentation Fluency • Letter Sound Fluency • Nonsense Word Fluency • Word Identification Fluency • R-CBM (ORF)
- / - Basic Reading: Survey Level Assessment – test down grade levels using R-CBM until the student scores at or above the 25th percentile
- / - Early Reading Decision Point: Is the student reading at least 40 wrc in 1st-grade material? YES – provide instruction at the instructional level with an emphasis on phonics & fluency. NO – provide intensive phonics and phonemic awareness instruction.
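A minimal sketch of the survey level assessment and this decision point in Python; the norm table and student scores are hypothetical placeholders, since real cutoffs come from local or published R-CBM norms:

```python
# Hypothetical 25th-percentile wrc norms by grade (placeholders only).
NORMS_25TH = {5: 100, 4: 90, 3: 70, 2: 50, 1: 40}

def survey_level(scores, start_grade):
    """Test down from start_grade; return the highest grade at which
    the student reads at or above the 25th percentile, else None."""
    for grade in range(start_grade, 0, -1):
        if scores.get(grade, 0) >= NORMS_25TH[grade]:
            return grade
    return None

scores = {4: 42, 3: 55, 2: 61, 1: 70}  # hypothetical wrc by grade level
print(survey_level(scores, 4))         # 2 -> instructional level

# Early reading decision point: at least 40 wrc in 1st-grade material?
if scores[1] >= 40:
    print("Instruct at instructional level; emphasize phonics & fluency")
else:
    print("Provide intensive phonics and phonemic awareness instruction")
```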
- / - Intervention suggestions • Direct Instruction • Corrective Reading • Horizons • Reading Mastery • REWARDS • Read Well
Early Literacy Skills • Phonemic awareness • Blending • Segmenting • Manipulation • Identifying • Sounds • Rhyming • Concepts of print • Page conventions • Word/sentence/book length & boundaries • Environmental print/logos
Phonological Awareness Assessment – a continuum from less complex to more complex: rhyming & song → sentence segmentation → syllable segmentation & blending → onset-rime segmentation & blending → phoneme blending & segmentation
Intervention suggestions • Earobics • Sounds & Letters • K-PALS • Great Leaps • Road to the Code • Scott Foresman Early Reading Intervention • Lindamood-Bell LiPS Program
- / - Checking for Decoding Skill… • Have the student read a grade level passage aloud • Write down each incorrectly read word on a piece of paper • Have the student attempt to read each incorrectly read word in isolation from your paper • Can the student correctly decode words in isolation?
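A minimal sketch of tallying this check in Python; the word list and isolation results are hypothetical:

```python
# Words misread in context, and whether each was read correctly in isolation.
passage_errors = ["through", "island", "enough", "machine"]
isolation_results = {"through": True, "island": False,
                     "enough": True, "machine": True}

decoded = sum(isolation_results[word] for word in passage_errors)
print(f"Decoded in isolation: {decoded} of {len(passage_errors)}")
# Mostly correct in isolation suggests decoding is intact, so in-context
# errors point toward fluency or self-monitoring rather than phonics.
```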
- / - Analyzing Errors in Reading • Select a passage you estimate that the student will read with about 80-85% accuracy. • Remember: 80% accuracy = 1 error every 5 words! • Try different levels of passage until you find the right fit • You will need at least 50 errors for kids grades 2 & above (25 errors for grade 1) • Passage will need to be 250 words or more
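The sample-size arithmetic above, restated as a quick Python check:

```python
def words_needed(accuracy, target_errors):
    """Passage length needed to yield target_errors at a given accuracy.
    At 80% accuracy the error rate is 20%, i.e., 1 error per 5 words."""
    return round(target_errors / (1.0 - accuracy))

print(words_needed(0.80, 50))  # 250 words -> grades 2 and above
print(words_needed(0.80, 25))  # 125 words -> grade 1
```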
- / - Pattern of Error Types • Compare each error in the passage with the Error Pattern Checklist • Make a mark next to the category in which the error seems to fit • Come up with a total of all errors • Identify the categories in which most errors occur
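A minimal sketch of the tallying step in Python; the category labels are illustrative stand-ins, not the Error Pattern Checklist's actual categories:

```python
from collections import Counter

# Each passage error coded into one (hypothetical) checklist category.
coded_errors = ["vowel team", "silent e", "vowel team", "suffix",
                "vowel team", "silent e", "sight word"]

tally = Counter(coded_errors)
print(tally.most_common())
# [('vowel team', 3), ('silent e', 2), ('suffix', 1), ('sight word', 1)]
# Instruction then targets the highest-frequency categories first.
```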
Basic Reading - / + (slow rate / adequate accuracy): Re-read strategy • Student reads for 2 minutes; note # wrc at the end of the 1st minute • Say, "NOW READ AGAIN AS QUICKLY AND ACCURATELY AS YOU CAN." • Student reads for 1 minute; determine wrc • Compare 1st-read to 2nd-read scores
Basic Reading - / + Decision Point: Did the rate improve by approximately 25%? YES – use a fluency-building intervention (re-reading). NO – recheck phonics needs and Can't Do/Won't Do.
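A minimal sketch of this comparison in Python, using hypothetical wrc scores:

```python
def reread_gain(first_wrc, second_wrc):
    """Relative gain from the 1st-minute read to the timed re-read."""
    return (second_wrc - first_wrc) / first_wrc

first, second = 62, 80                            # hypothetical wrc scores
print(f"Gain: {reread_gain(first, second):.0%}")  # Gain: 29%

if reread_gain(first, second) >= 0.25:
    print("Use a fluency-building (re-reading) intervention")
else:
    print("Recheck phonics needs and Can't Do / Won't Do")
```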
Intervention suggestions - / + • Re-reading techniques • Soar to Success • Read Naturally • PALS • Great Leaps • Six Minute Solution • Quick Reads • Choral reading • Cloze reading
Basic Reading + / - (adequate fluency / poor accuracy): Pencil Tap • Using a passage where the student is approximately 85% accurate, tell the student to try to fix the word every time you tap the table • Count the number of self-corrections the student makes • Compare to the total number of errors
Determine if student has skills to correct errors using the pencil tap test (assisted monitoring)“Whenever you make an error, I’m going to tap the table with my pen. When I tap the table, I want you to fix the error.” • If student can fix errors when you point them out, you know he/she has the decoding skills to read the passage, but needs assistance learning to self-monitor for accuracy. Intervene with strategies for self-monitoring decoding. • If the student cannot fix errors when you point them out, a skill deficit in decoding may be indicated. Further analyze errors to isolate patterns of difficulty, and intervene with targeted decoding strategies.
Basic Reading + / - Decision Point: Did the student self-correct the majority of errors? YES – use a self-monitoring intervention. NO – reassess Can't Do/Won't Do.
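A minimal sketch of the pencil-tap decision in Python, with hypothetical counts:

```python
total_errors = 12    # errors flagged by pen taps (hypothetical)
self_corrected = 9   # errors the student fixed when tapped

if self_corrected / total_errors > 0.5:
    print("Majority self-corrected -> use a self-monitoring intervention")
else:
    print("Few self-corrected -> reassess Can't Do / Won't Do")
```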
Interventions + / - • Design an intervention to increase attention to accuracy • Self-questioning prompts: "Does this make sense?" "Does it match what is on the page?" • Reinforcement for accuracy • Goal setting & progress monitoring • If the student did not make more errors on more difficult passages, use intensified balanced instruction; if the student did, categorize the errors, look for patterns, and correct them
Interventions (cont.) + / - • Spelling through Morphographs • Word sorting/word study • Great Leaps • REWARDS • Making Sense of Phonics • Phonics and Spelling through Grapheme Mapping • Soundabet
Research Behind the Vocabulary-Matching Measures Christine Espin – University of Minnesota Our research team at the University of Minnesota has conducted a series of studies to examine the reliability and validity of vocabulary matching as an indicator of content-area learning. The results of this research show that the vocabulary-matching measure is a valid and reliable indicator of performance and progress in social studies and science. Performance on the vocabulary-matching measure is related to performance on other content-area tasks, including researcher-made content tests, content-area subtests of standardized achievement tests, and teacher-made content measures. In addition, students who grow more on the vocabulary-matching measures score higher on criterion measures of content-area performance. As an aside, our research also shows that students must read the measures themselves (as opposed to having the measures read to them by the examiner) to obtain reliable and valid growth rates. http://www.teachingld.org/expert_connection/cbm.html
Creating Vocabulary-Matching Probes
1. Create a pool of potential vocabulary terms. Develop a pool of important vocabulary terms from the content to be covered over the entire school year (or semester, if the class is offered on a semester basis). Terms can be selected from the classroom textbook, from teacher notes and lectures, or from both sources. Selected terms should be germane to the content being covered. If the textbook is fairly representative of the content being covered, the terms can be created from the glossary of the textbook or from terms in the text that are highlighted or italicized.
2. Develop definitions for each term. For each term, develop a short definition. The easiest method for developing definitions is to use the glossary of the textbook. Other methods are to rely on teachers' notes and lectures or to use a school-based dictionary. Limit the length of each definition to approximately 15 words. Make them clear and unambiguous.
3. Create weekly measures that are similar. For each measure, randomly select 20 terms and definitions from the pool created in steps 1 and 2. In addition, select two definitions that do not match any of the terms. Thus, each probe will have 20 terms and 22 definitions.
One practical way to develop the measures is to write each vocabulary term on the front of an index card with its definition on the back. For each measure, shuffle all of the cards, and randomly select terms and definitions. Place the terms on the left-hand side of the page and the definitions in random order on the right-hand side. Number the terms, leaving a blank space by each term; put letters by each definition. The students write the letter for the correct definition in the blank next to each term.
http://www.teachingld.org/expert_connection/cbm.html
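A minimal sketch of steps 1-3 in Python; the term pool is a placeholder, since a real pool comes from the textbook glossary or teacher notes:

```python
import random
import string

# Placeholder pool: term -> definition (a real pool holds the year's terms).
pool = {f"term_{i}": f"definition of term_{i}" for i in range(1, 101)}

def make_probe(pool, n_terms=20, n_distractors=2, seed=None):
    """Draw 20 terms plus 2 distractor definitions (22 definitions total),
    shuffling the definition column as described above."""
    rng = random.Random(seed)
    drawn = rng.sample(sorted(pool), n_terms + n_distractors)
    terms = drawn[:n_terms]                  # numbered, left-hand column
    definitions = [pool[t] for t in drawn]   # includes 2 unmatched distractors
    rng.shuffle(definitions)                 # lettered, right-hand column
    return (list(enumerate(terms, start=1)),
            list(zip(string.ascii_uppercase, definitions)))

numbered_terms, lettered_defs = make_probe(pool, seed=1)
print(len(numbered_terms), len(lettered_defs))  # 20 22
```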