Institute for Learning and Research Technology Dr. Grainne Conole
An integrated approach to evaluating learning technologies Grainne Conole, Ed Crewe - University of Bristol Martin Oliver - University College London Jen Harvey - Dublin Institute of Technology
Background • Evaluation Assistant is • an online toolkit to help practitioners carry out evaluations • a six-month project funded by the UK JISC Committee for Awareness, Liaison and Training (JCALT) • a collaboration between • the University of Bristol • University College London • the Dublin Institute of Technology
Toolkits - a definition • Toolkits lie between • Frameworks • provide a theoretical overview or point of reference • less restrictive than toolkits, but also less supportive • Wizards • easy-to-use software ‘black boxes’ • software tools which can make decisions • draw on pre-defined templates • easier to use than a toolkit, but more restrictive
Underlying assumptions • Designed to • be easy to use for practitioners and produce demonstrable benefits • provide guidance, but not be prescriptive • be adaptable and easy to customise to the local context • establish a comprehensive resource of relevant material
Outline • Provides a structured resource that • can be used to plan, scope and co-ordinate an effective evaluation • provides progressively more detailed information on particular topics and links to appropriate resources • allows the user to follow up relevant issues • reduces the time taken to plan an evaluation • can be used iteratively over time
Toolkit architecture • The toolkit consists of three components • Evaluation Planner • helps define and scope the evaluation • Evaluation Advisor • guides the user through the process • Evaluation Presenter • provides support for presenting the results
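The three components and the stages listed on the following slides can be pictured as simple data structures. The sketch below is purely illustrative Python, not the toolkit's own code.

```python
# Illustrative sketch only: models the toolkit's three components and the
# stages listed on the following slides as plain Python data structures.
from dataclasses import dataclass
from typing import List


@dataclass
class Component:
    """One component of the Evaluation Assistant toolkit."""
    name: str
    purpose: str
    stages: List[str]


TOOLKIT = [
    Component("Evaluation Planner", "helps define and scope the evaluation",
              ["About planner", "What are you evaluating?", "Reasons", "Context",
               "Who is it for?", "Devising the question", "Summary"]),
    Component("Evaluation Advisor", "guides the user through the process",
              ["About advisor", "Data capture methods", "Data analysis", "Summary"]),
    Component("Evaluation Presenter", "provides support for presenting the results",
              ["About presenter", "Closing the loop", "Presentation tools"]),
]

if __name__ == "__main__":
    for component in TOOLKIT:
        print(f"{component.name}: {component.purpose} ({len(component.stages)} stages)")
```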
Evaluation planner • Consists of seven stages • About planner • What are you evaluating? • Reasons • Context • Who is it for? • Devising the question • Summary
Who’s the evaluation for? • This stakeholder analysis: • defines the stakeholders for the evaluation • i.e. anyone who may have a stake in the evaluation • looks at the different types of stakeholders • and their associated concerns and issues • maps the stakeholders to the evaluation process
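As an illustration of the mapping step, a stakeholder analysis can be held as a simple table from each stakeholder to their concerns, which is then traced into the evaluation. The groups and concerns below are hypothetical examples, not taken from the toolkit.

```python
# Hypothetical stakeholder analysis: the groups and concerns are examples
# only, chosen to show how each concern can be traced into the evaluation.
stakeholder_concerns = {
    "students":        ["Does the technology support my learning?"],
    "tutors":          ["Does it add to or reduce my workload?"],
    "support staff":   ["Can it be maintained with existing resources?"],
    "senior managers": ["Is it cost-effective for the institution?"],
}

# Map the stakeholders to the evaluation process: every concern should be
# covered by at least one evaluation question.
evaluation_questions = [
    (stakeholder, concern)
    for stakeholder, concerns in stakeholder_concerns.items()
    for concern in concerns
]

for stakeholder, question in evaluation_questions:
    print(f"{stakeholder}: {question}")
```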
Formulating the question(s) • Aligned to the stakeholder analysis • Question(s) refined to: • explore stakeholder concerns • provide comparison • yield measurement • challenge or expose stakeholders’ perspectives
Evaluation advisor • Consists of four stages • About advisor • Data capture methods • Data analysis • Summary
Filtering of choices • At each stage of the advisor • users are asked a series of questions • these help formulate the evaluation • this is linked to a filtering process • leading to a defined choice of data capture methods and data analysis
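One way to picture the filtering process is a progressive narrowing of candidate methods as the user answers each question. The method names and criteria below are assumptions made for illustration, not the Advisor's actual rules.

```python
# Illustrative filter: the method names and their properties are assumed for
# the example; the Advisor's real question set and rules are not shown here.
CAPTURE_METHODS = {
    "interviews":     {"depth": "rich",    "time_cost": "high",   "scale": "small"},
    "focus groups":   {"depth": "rich",    "time_cost": "medium", "scale": "small"},
    "questionnaires": {"depth": "shallow", "time_cost": "low",    "scale": "large"},
    "system logs":    {"depth": "shallow", "time_cost": "low",    "scale": "large"},
    "observation":    {"depth": "rich",    "time_cost": "high",   "scale": "small"},
}


def filter_methods(answers: dict) -> list:
    """Keep only the methods whose properties match every answer given so far."""
    candidates = dict(CAPTURE_METHODS)
    for criterion, wanted in answers.items():
        candidates = {name: props for name, props in candidates.items()
                      if props.get(criterion) == wanted}
    return sorted(candidates)


# Example: the user needs rich data but can only afford a medium time cost.
print(filter_methods({"depth": "rich", "time_cost": "medium"}))  # ['focus groups']
```

The same narrowing idea would apply to the subsequent choice of data analysis techniques.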
Evaluation presenter • Consists of three stages • About presenter • Closing the loop • Presentation tools
Presenting the findings • The final step • closes the loop by relating findings back to the stakeholders • Options for presenting findings include: • Data sets or spreadsheet of costings • Executive summary of activity • Narrative account of the evaluation • Oral presentation or poster of findings • Research reports
Further information • Web site: http://www.ltss.bris.ac.uk/JCALT/ • Email: g.conole@bristol.ac.uk • Project team • University of Bristol • Grainne Conole, Ed Crewe, Martin Belcher • University College London • Martin Oliver • Dublin Institute of Technology • Jen Harvey
Relevant references • Conole, G. & Oliver, M. (1998), A pedagogical framework for embedding C&IT into the curriculum. ALT-J, 6(2), 4-16 • Oliver, M. & Conole, G. (1998), Evaluating Communication and Information Technologies: a toolkit for practitioners. Active Learning, 8, 3-8 • Conole, G., Oliver, M. & Harvey, J. (2000), ‘Scoping study’, a report for the ‘Evaluation Toolkit for Practitioners’ JISC JCALT project, University of Bristol • Oliver, M. & Conole, G. (2000), Assessing and enhancing quality using toolkits. Quality Assurance in Education, 8(1), 32-37 • Conole, G., Oliver, M. & Harvey, J. (2000), ‘Toolkits for practitioners’, ALT-C 2000, Manchester • Conole, G., Oliver, M. & Harvey, J. (2000), ‘Toolkits as an approach to evaluating and using learning materials’, ASCILITE 2000, Coffs Harbour, New South Wales • Conole, G., Oliver, M. & Harvey, J. (2000), ‘An integrated approach to evaluating learning technologies’, IWALT 2000, Palmerston North, New Zealand