
Evaluation and dissemination


Presentation Transcript


  1. Evaluation and dissemination
  • University of Lincoln OER Change Programme, 22-23 March 2012

  2. Evaluating change
  • Change initiatives are:
    • complex
    • non-linear
    • emergent
    • unpredictable
    • long term
  • A different approach to evaluation is needed…

  3. Developmental evaluation
  “Developmental evaluation is about rigorous inquiry for development… using data in a meaningful way to inform innovation in progress” (Gamble 2008)
  • an integral part of the planning and implementation cycle, not a bolt-on activity at the end
  • balances the creative and the critical, the formative and the summative
  • uses a range of methods, drawing on both ‘hard’ and ‘soft’ data

  4. Choosing your methods
  • Developmental evaluation is an approach, not a method – it involves choosing the right method for the right purpose
  • Think about what you are evaluating:
    • ‘simple’ output (sphere of control)
    • ‘complicated’ outcome (sphere of influence)
    • ‘complex’ impact (sphere of interest)

  5. Choosing your methods

  6. Sources of data
  • What counts as ‘evidence’ – for what purpose and for whom?
  • Data from evaluation activities – surveys, focus groups, interviews, case studies, etc.
  • Data generated by the initiative itself – planning documents, decision logs, network maps, influence wheels, meeting notes, emails, etc.
  • Naturally occurring institutional data – NSS, satisfaction surveys, retention and achievement data, etc.

  7. Three ‘take home’ messages
  • Take a broad view of what evaluation is and what counts as evidence – if it helps, do it
  • Build your evaluation in early and often – use your planning tools to help you evaluate and your evaluation tools to help you plan
  • Look at your initiative from multiple perspectives – those inside the initiative, those outside it, and those affected by it
