MIRROR: Evaluating the Effectiveness of Reflective Learning at Work Marina Bratić, Gudrun Wesiak, Angela Fessl
Agenda • Evaluation in MIRROR • Specifying Summative Evaluation Criteria • Evaluation Toolbox and Procedure • Challenges with regard to summative evaluation
I. Evaluation in MIRROR – Introduction • Why summative evaluation? • Methodology for the evaluation • Indicators of reflection and its effects at the individual, inter-individual, and organizational levels • A set of tools and instruments • Individual extensions of the research methodology
I. Evaluation in MIRROR – Evaluating learning effectiveness “in the wild” • Diversity of test beds • Challenging aspects of evaluating learning by reflection • Different types of evaluations in MIRROR: • Formative evaluations • Evaluation with summative aspects • Summative evaluation
I. Evaluation in MIRROR – Approaches • Theory-based approach: the evaluation of MIRROR is guided by the conceptual model of reflective learning at work (CSRL model) • Goal-based approach: assess whether the reflective learning goals and objectives of relevant stakeholders within and beyond the project consortium are met • i* model (Yu & Mylopoulos, 1994) • Adapted evaluation approach of Kirkpatrick (Kirkpatrick & Kirkpatrick, 2006)
II. Specifying Summative Evaluation Criteria – i* Model of Reflective Learning at Work • A: Worker • B: Individual Reflector • C: Individual Team Reflector • D: Collaborative Team Reflector • E: Organisational Reflector
II. Specifying Summative Evaluation Criteria – Kirkpatrick Model: 4 levels of summative evaluation • Level 1: Reaction – To what degree do participants react favourably to our MIRROR apps? • Level 2: Learning – To what degree do participants acquire knowledge, skills, attitudes, confidence, and commitment? • Level 3: Behaviour – To what degree do participants apply what they learn? • Level 4: Results – To what degree do targeted outcomes occur as a result of MIRROR?
II. Specifying Summative Evaluation Criteria – Levels of Evaluation and Evaluation Criteria

Kirkpatrick level | Summative eval. criteria | i* model
Level 4: Results | Business Impact | Work
Level 3: Behaviour | Work-related Criteria | Work
Level 2: Learning | Outcome Criteria; Process Criteria | Learning Outcome; Learning Process (Reflection)
Level 1: Reaction | General Criteria | App Usage
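To show how this mapping could be carried into analysis tooling, here is a minimal sketch that encodes the level–criteria correspondence as a Python data structure; the dictionary layout, key names, and helper function are illustrative assumptions, not part of the MIRROR toolbox.

```python
# Hypothetical encoding of the mapping above: Kirkpatrick level ->
# summative evaluation criteria -> i* model element.
# All names are illustrative assumptions only.
EVALUATION_LEVELS = {
    4: {"kirkpatrick": "Results",
        "criteria": ["Business Impact"],
        "i_star": ["Work"]},
    3: {"kirkpatrick": "Behaviour",
        "criteria": ["Work-related Criteria"],
        "i_star": ["Work"]},
    2: {"kirkpatrick": "Learning",
        "criteria": ["Outcome Criteria", "Process Criteria"],
        "i_star": ["Learning Outcome", "Learning Process (Reflection)"]},
    1: {"kirkpatrick": "Reaction",
        "criteria": ["General Criteria"],
        "i_star": ["App Usage"]},
}

def criteria_for_level(level: int) -> list[str]:
    """Look up the summative evaluation criteria for a Kirkpatrick level."""
    return EVALUATION_LEVELS[level]["criteria"]
```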
II. Specifying Summative Evaluation Criteria – Levels of Evaluation and Evaluation Criteria: Realisation in MIRROR

Evaluation criteria type (processes & outcomes) | Realisation in MIRROR
Work-related criteria | KPI measures; Work improvement
Learning Process | Short Reflection Scale; App-specific reflection questions
Learning Outcome | Log file data of app usage; Self-report of app usage
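As a concrete illustration of the “log file data of app usage” criterion, below is a hedged sketch of how basic usage indicators could be derived from an app log; the CSV schema (user_id, app, timestamp) and column names are assumptions for illustration, not the actual MIRROR log format.

```python
# Sketch of deriving app-usage indicators from log file data.
# The log schema (user_id, app, timestamp) is a hypothetical example.
import pandas as pd

def usage_indicators(log_path: str) -> pd.DataFrame:
    """Aggregate raw usage events into per-user, per-app indicators."""
    log = pd.read_csv(log_path, parse_dates=["timestamp"])
    return (
        log.groupby(["user_id", "app"])
           .agg(events=("timestamp", "size"),
                active_days=("timestamp",
                             lambda ts: ts.dt.normalize().nunique()),
                first_use=("timestamp", "min"),
                last_use=("timestamp", "max"))
           .reset_index()
    )
```

Indicators like these could feed the App Usage criteria alongside the self-reports mentioned above.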
III. Evaluation Toolbox and Procedure • The Evaluation Toolbox specifies research instruments and measures, e.g. questionnaires, log file data, interviews after app usage, observations, self-assessments, KPIs … • The evaluation procedure specifies how the Evaluation Toolbox is used by the developers and test beds to evaluate the success of the MIRROR apps; it ensures • a transparent evaluation process • assistance for the developers and test beds during app testing • a high degree of freedom for developers and test beds • a strong summative evaluation at the end of the project
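Since the toolbox leans on questionnaires such as the Short Reflection Scale, here is a minimal scoring sketch for a Likert-type instrument; the 5-point range, item count, and reverse-coding are assumptions for illustration, not the scale’s published specification.

```python
# Hedged sketch of scoring a Likert-type questionnaire; the 5-point
# range and reverse-coded items are illustrative assumptions.
def scale_score(responses: list[int],
                reverse_items: frozenset[int] = frozenset(),
                scale_max: int = 5) -> float:
    """Mean item score, reverse-coding the items listed by index."""
    coded = [
        (scale_max + 1 - value) if i in reverse_items else value
        for i, value in enumerate(responses)
    ]
    return sum(coded) / len(coded)

# Example: four items on a 1-5 scale, item index 2 reverse-coded
print(scale_score([4, 5, 2, 3], reverse_items=frozenset({2})))  # -> 4.0
```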
IV. Challenges with regard to summative evaluation • Our evaluation approach... • requires time upfront to specify the conceptual model, links between stakeholders, processes, activities, and outcomes • requires clearly articulated goals and objectives • must capture all relevant processes and all important aspects of reflective learning at work • must be able to deal with unexpected results • expects strong commitment from the test beds
Thank you for your attention! Do you have any questions?