CECV Intervention Framework Module 6: Evaluation
Purpose of this Module
As a result of participating in this module, you will evaluate the effectiveness of the Intervention Framework in guiding your school through the process of:
• Identifying students with additional learning needs;
• Assessing students with additional learning needs;
• Analysing & interpreting the data collected;
• Designing & carrying out the teaching; and
• Evaluating & monitoring the student’s progress and the effectiveness of the teaching.
Core Principles
1. All students can succeed.
2. Effective schools promote a culture of learning.
3. Effective teachers are critical to student learning success.
4. Teaching and learning is inclusive of all.
5. Inclusive schools actively engage and work in partnership with the wider community.
6. Fairness is not sameness.
7. Effective teaching practices are evidence-based.
“…research seeks to prove, evaluation seeks to improve…” M. Q. Patton (2003)
Focus Questions
1. How do you define evaluation?
2. Why do you evaluate?
3. What do you evaluate?
4. For whom do you evaluate?
5. How do you evaluate?
Educational Evaluation Around the World, Danish Evaluation Institute, 2003
Defining Evaluation
In implementing any change it is necessary to evaluate the effect. In considering implementation of the Intervention Framework, it is necessary to evaluate the effect on individual student outcomes and, more broadly, on teacher practice, teacher knowledge, and school policies and processes.
Defining Evaluation
Evaluation involves assessing the strengths and weaknesses of programs, policies, personnel, products, and organizations to improve their effectiveness.
The American Evaluation Association

Defining Evaluation
Evaluation is the systematic collection and analysis of data needed to make decisions, a process in which most well-run programs engage from the outset.
The American Evaluation Association

Defining Evaluation
Evaluation is about finding answers to questions such as, “Are we doing the right thing?” and “Are we doing things right?”
The American Evaluation Association

Defining Evaluation
Rossi and Freeman (1993) define evaluation as "the systematic application of social research procedures for assessing the conceptualization, design, implementation, and utility of ... programs."
Defining Evaluation
appraise, assess, critique, judge, justify, predict, prioritise, choose, monitor, select, rate, rank, prove, decide, conclude, argue
Source: Anderson & Krathwohl, 2001, as cited in Biggs & Tang, 2007.
Steps in Evaluation
Step 1: Define what you hope to achieve
Step 2: Collect data (pre & post)
Step 3: Analyse the data
Step 4: Formulate conclusions
Step 5: Modify the program
Types of Evaluation
• Process Evaluation: Process evaluations describe and assess the actual program materials and activities.
• Outcome Evaluation: Outcome evaluations study the immediate or direct effects of the program on participants.
• Impact Evaluation: Impact evaluations look beyond the immediate results of policies, instruction, or services to identify longer-term as well as unintended program effects.
Process Evaluation to Inform School Improvement
Phases of the process of improvement (R. Bollen, 1997) mapped to the Intervention Framework:
• Preparation phase / 1. Identification
• Diagnostic phase / 2. Assessment
• Strategic planning phase / 3. Analysis and Interpretation
• Developmental phase / 4. Teaching and Learning
• Evaluation phase / 5. Evaluation
Outcome Evaluation
The ultimate goal of the Intervention Framework process is to improve student outcomes. How do you know whether it did?
• One commonly used way to find out whether the process (i.e. the T&L cycle) improved student outcomes is to ask whether the process caused the expected outcome.
• If the process caused the outcome, then one could argue that the process improved student outcomes.
• On the other hand, if the process did not cause the outcome, then one cannot argue that the process improved student outcomes.
Outcome Evaluation: How to figure this out
Determining whether a process caused the outcome is one of the most difficult problems in evaluation, and not everyone agrees on how to do it. The approach you take depends on how the evaluation will be used, who it is for, what the evaluation users will accept as credible evidence of causality, what resources are available for the evaluation, and how important it is to establish causality with considerable confidence.
Michael Quinn Patton
One way could be to evaluate the teaching programs implemented.
Impact Evaluation
Impact evaluations look beyond the immediate results of policies, instruction, or services to identify longer-term as well as unintended program effects.
Why Evaluate?
It is important to evaluate programs and teaching for many reasons:
• to ensure that the program is not creating any unintended harm;
• to determine if the program is making a positive contribution (improved student outcomes); and
• to improve and learn (i.e. to learn what the positive elements were, how they can be replicated, how challenges can be overcome in the future, and how to make the process sustainable).
Why Evaluate?
The four main reasons evaluation is conducted are: accountability; learning; program management and development; and ethical obligation. (Green and South, 2006)
What & How?
• How does your school evaluate its current programs?
• How would you evaluate whether the child/children progressed as a result of participation in this intervention process?
What & How?
• Is the student progressing satisfactorily against the set goals?
• How will you monitor and interpret the student’s progress against the set goals?
• How will you evaluate the effectiveness of the program/approach?
What & How? Making the results useful (student outcomes)
• How will you use the results to inform future program development for students?
• How will the results be reported so that they can be used by the school to make improvements?
“evaluation seeks to improve”
Effect Sizes
1. What is an effect size?
2. Why use effect sizes?
3. How can schools use effect sizes to evaluate the effectiveness of the intervention?
Effect Sizes (d)
1a. What is an effect size?
An effect size provides a common expression of the magnitude of study outcomes across variables, such as improving reading levels in accuracy and comprehension. An effect size of 1.0 indicates an increase of one standard deviation (1 SD) on the outcome. A one SD increase is typically associated with advancing students’ reading levels by two to three years, or improving the rate of learning by 50%.
Effect Sizes (d)
1b. What is a reasonable effect size?
Cohen (1988) suggests that: d = 0.2 is small, d = 0.5 is medium, d = 0.8 is large.
Whereas the results from Hattie’s meta-analyses could suggest that, when judging educational outcomes: d = 0.2 is small, d = 0.4 is medium, d = 0.6 is large.
Reference: Cohen, J. (1988). Statistical power analysis for the behavioural sciences (2nd ed.). Hillsdale, NJ: L. Erlbaum Assoc.
John Hattie - Visible Learning
What is John Hattie on about, in a nutshell?
• 15 years of research
• 800+ meta-analyses
• 50,000 studies
• 200+ million students
Outcome: What are the major influences on student learning?
Effect Sizes (d): The Formula
Effect size (d) = (Average (post) - Average (pre)) / Average standard deviation (the spread)
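The short sketch below (not part of the CECV materials) shows one way the formula above could be computed for a single group's pre- and post-test scores. The function name, the score values, and the choice of the mean of the pre and post SDs as the "average standard deviation" are illustrative assumptions.

```python
# Illustrative sketch only: computes the effect size (d) defined on the slide
# above for one group's pre- and post-test scores. The "average standard
# deviation" is taken here as the mean of the pre and post SDs; the scores
# are made-up numbers, not CECV data.
from statistics import mean, stdev

def effect_size(pre_scores, post_scores):
    """d = (average post - average pre) / average standard deviation."""
    average_sd = (stdev(pre_scores) + stdev(post_scores)) / 2
    return (mean(post_scores) - mean(pre_scores)) / average_sd

# Example: reading-accuracy scores for the same six students before and after
# a term of teaching (illustrative values only).
pre = [42, 48, 51, 55, 60, 63]
post = [50, 55, 58, 63, 66, 70]
print(round(effect_size(pre, post), 2))  # ~0.94 for these made-up scores
```

Read against the benchmarks on the previous slide, a result of this size would count as a large effect on both Cohen's and Hattie's scales.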
Effect Sizes (d)
2. Why use effect sizes?
• To compare progress over time on the same test.
• To compare results measured on different tests.
• To compare different groups doing the same test (see the sketch below).
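The third use listed above can be sketched in the same way. This hypothetical variant compares an intervention group with a comparison group on the same test, again using the mean of the two groups' SDs as the spread; note that some texts use a pooled SD instead.

```python
# Illustrative sketch only: an effect size comparing two different groups on
# the same test, rather than one group over time. Group names and scores are
# hypothetical; some texts would use a pooled SD rather than the mean of SDs.
from statistics import mean, stdev

def between_group_effect_size(group_a, group_b):
    """d = (mean of group A - mean of group B) / average of the two SDs."""
    average_sd = (stdev(group_a) + stdev(group_b)) / 2
    return (mean(group_a) - mean(group_b)) / average_sd

intervention_group = [58, 63, 66, 70, 72]  # post-test scores (hypothetical)
comparison_group = [52, 55, 59, 61, 64]
print(round(between_group_effect_size(intervention_group, comparison_group), 2))
# ~1.47 for these hypothetical scores
```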
Effect Sizes (d)
3. How can schools use effect sizes?
Discussion in school groups.
What is Self Reflection?
Self Reflection "Test all things; hold fast what is good" I Thessalonians 5:21 43
What is Self Reflection?
Self Reflection is simply a form of self evaluation undertaken by participants in social situations in order to improve the rationality and justice of their own practices, their understanding of these practices, and the situations in which the practices are carried out.
Adapted from Carr and Kemmis, 1986
Self Reflection in Schools
Self reflection is a process in which teachers examine their own educational practice systematically and carefully, using the techniques of action research.
Self Reflection Leads to Improvement?
Follow effective action with quiet reflection. Out of the quiet reflection will come even more effective action. Peter F. Drucker
Teachers account for about 30% of the variance in student achievement. “It is what teachers know, do, and care about, which is very powerful in this learning equation, not just what they know” (p. 2). They must put this knowledge into practice if they are to produce gains in student learning outcomes. Hattie (2003)
Where to from here?
Each school to reflect on existing evaluation processes for:
• intervention programs currently in use in the school;
• teacher performance; and
• student performance.