
Evaluation Research and Engineering Education



Presentation Transcript


  1. Evaluation Research and Engineering Education
  Lesley Jolly, for AaeE (ljolly@bigpond.net.au)
  ERM wiki at www.aaee-scholar.pbwiki.com

  2. What is evaluation research?
  • Periodic assessment of results
  • Appropriateness, effectiveness, efficiency, impact, sustainability
  • Identifies intended and unintended effects
  • Identifies what worked and what didn’t
  • Provides a level of judgement about the overall worth of an intervention

  3. What has it got to teach us?
  • Systematic approach to what we’re doing already
  • Extracting lessons learned
  • Legacy of sustained evaluation frameworks for ongoing data collection
  • Arguments against ‘popularity contest’ course and program evaluation

  4. Step 1: describe the program
  • Aims and objectives
  • Need to be clearly articulated
  • Does everyone involved share the same aims?
  • Program logic diagram
  • Describes how we think the program produces results
  • May also be called a logframe (see the sketch below)
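To make the program logic idea concrete, here is a minimal Python sketch of how the logic chain for the PBL tutor training example might be written down as plain data. The field names and entries are illustrative assumptions, not content from the slides.

    # A minimal sketch of a program logic (logframe) recorded as plain data.
    # Field names and example entries are assumptions for illustration only.
    logframe = {
        "goal": "Tutors scaffold and facilitate student learning in PBL",
        "inputs": ["staff time", "training materials"],
        "activities": ["run tutor training workshops", "observe tutorials"],
        "outputs": ["tutors trained", "tutorials observed"],
        "outcomes": ["tutors guide rather than supply information",
                     "students take more responsibility for their learning"],
    }

    # Each link in the chain (inputs -> activities -> outputs -> outcomes)
    # is a place where the evaluation needs evidence, which is the sense in
    # which program logic identifies what we need to evaluate.
    stages = ["inputs", "activities", "outputs", "outcomes"]
    for earlier, later in zip(stages, stages[1:]):
        print(f"Evaluate the link from {earlier} to {later}:")
        print(f"  {logframe[earlier]} -> {logframe[later]}")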

  5. Program logic identifies what we need to evaluate

  6. Evaluating PBL tutor training

  7. Monitoring facilitates data collection
  • Can be process (outputs) or impact (short- to medium-term outcome) monitoring
  • Need to develop indicators of progress
  • Targets may be built into indicators or kept separate, e.g.
  • In 2010, 50% of 2nd years will have used the new facility for more than 20 hrs
  • OR: percentage of students using the facility (indicator); 50% in 2010 (target)
  (A worked example follows below.)
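As a worked example of the indicator/target split above, here is a small Python sketch that computes the indicator (percentage of students using the facility for more than 20 hours) from hypothetical monitoring records and compares it with the 2010 target of 50%. The student IDs, hours, and function name are invented for illustration.

    # Hypothetical 2010 monitoring records: hours each 2nd-year student
    # logged in the new facility. Data are invented for illustration.
    usage_hours = {"s001": 25.0, "s002": 4.5, "s003": 31.0, "s004": 0.0, "s005": 22.5}

    TARGET_PERCENT = 50.0   # target: 50% of 2nd years in 2010
    MIN_HOURS = 20.0        # "more than 20 hrs" threshold from the slide

    def facility_usage_indicator(hours_by_student, min_hours):
        """Indicator: percentage of students using the facility for more than min_hours."""
        users = sum(1 for h in hours_by_student.values() if h > min_hours)
        return 100.0 * users / len(hours_by_student)

    indicator = facility_usage_indicator(usage_hours, MIN_HOURS)
    print(f"Indicator: {indicator:.1f}% of students used the facility for more than {MIN_HOURS} hrs")
    print("Target met" if indicator >= TARGET_PERCENT else "Target not met")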

  8. PBL Tutor training
  • GOAL: provide timely, well-placed, supportive guidance to encourage tutors to scaffold and facilitate student learning
  • OBJECTIVES: at the end of training the successful tutor will be able to
  • articulate a good understanding of the objectives and methods of PBL
  • guide student learning by providing appropriate support and guidance rather than information
  • contribute to curriculum development within a staff team

  9. Monitoring Tutor Training

  10. Evaluation asks formative and summative questions
  • Are the questions cohesive and logical?
  • Do evaluation questions link to monitoring data?
  • Have ethical issues been addressed?
  • What mechanisms are in place to gather the learnings generated by evaluation?
  • What needs to be retained from this evaluation process in future years?

  11. Vary questions to suit program

  12. PBL Tutor Training
  • Appropriateness
  • Did the training model the target behaviour?
  • Has the purpose of the training been achieved?
  • Indicator 1: changes in tutor behaviour indicating deeper knowledge of and commitment to PBL
  • Indicator 2: changes in student behaviour
  • Efficiency
  • Was the time invested by staff good value?
  • Sustainability
  • What needs to be retained as core material from year to year to retain the benefit of the training?

  13. Making use of evaluation
  Owen, J. (2006). Program Evaluation. Crows Nest: Allen & Unwin.

  14. Dissemination and reporting
  • Findings may be communicated throughout the project to multiple audiences
  • Evidence, conclusions, judgements, recommendations
  • Different occasions call for different styles
  • Oral presentations, interactive workshops, posters, reports, summaries, papers, conference presentations
  • Must be well timed

  15. Developing capacity
  • “process use of evaluation”
  • Taking part develops skills
  • Taking part sensitises staff to issues
  • Improved communication
  • Ongoing 360-degree dissemination
  • Development of local discourse
