
Presentation Transcript


  1. Assessment Design and E-learning. Geoffrey Crisp, ALTC National Teaching Fellow; Director, Centre for Learning and Professional Development, University of Adelaide

  2. Assessment 2.0 examples from ALTC Fellowship http://www.transformingassessment.com

  3. Outline of presentation: • introduction to learning and assessment • e-assessment design • evaluating your assessment questions • future assessment tasks

  4. Typical learning and assessment today? http://www.flickr.com/photos/cristic/359572656/ http://www.pharmtox.utoronto.ca/Assets/Photos/pcl473+classroom+editted.JPG

  5. New learning environments for students and teachers

  6. “Big problem” learning and assessment http://www.nasaimages.org/luna/servlet/detail/nasaNAS~9~9~58363~162207:

  7. Assessment tasks should be worth doing • if students can answer questions by copying from the web, they are being set the wrong questions • if students can answer questions by using Google, they are being set the wrong questions • if students can answer questions by guessing, they are being set the wrong questions • “Why the hell am I doing this course?”

  8. Students’ perception of learning strategies? http://blog.oregonlive.com/pdxgreen/2008/02/chimp.jpg

  9. Different assessment types

  10. Operationalising assessment for current and future learning: • Authentic tasks • Life-long learning • Authentic tools • Meaningful feedback • Self-review and critique • Standards • Learning-oriented assessment

  11. Outline of presentation: • introduction to learning and assessment • e-assessment design • evaluating your assessment questions • future assessment tasks

  12. What role for (virtual) learning management systems? http://4.bp.blogspot.com/_hBiBaUg_1rA/SJTQE0ymK7I/AAAAAAAABxI/OIKhMiaaQ2E/s400/confusing_signs2.jpg http://www.teach-ict.com/ecdl/module_1/workbook15/miniweb/images/stressed.jpg

  13. Question types - LMS

  14. Effective assessment design: Graham Gibbs, David Nicol and David Boud. Nicol, D. (2007). E-assessment by design: using multiple-choice tests to good effect. Journal of Further and Higher Education, 31(1), 53–64

  15. JISC - Reports and papers on (e)-assessment http://www.jisc.ac.uk/whatwedo/programmes/elearning/assessment/digiassess.aspx

  16. Report on Summative E-Assessment Quality (REAQ): MCQ (selected response) • effort is front loaded • start with easier questions and make later questions more difficult • checking assessments with subject matter experts and high performers • identifying ‘weak’ questions and improving or eliminating them http://www.jisc.ac.uk/media/documents/projects/reaqfinalreport.pdf

  17. Report on Summative E-Assessment Quality (REAQ): • reviewing question content to ensure syllabus coverage • assisting academics who may have limited experience of psychometrics • attending to security • using accessibility guidelines http://www.jisc.ac.uk/media/documents/projects/reaqfinalreport.pdf

  18. Why do we assess students? • it encourages (current and future) learning • it provides feedback on learning to both the student and the teacher • it documents competency and skill development • it allows students to be graded or ranked • it validates certification and licence procedures for professional practice • it allows benchmarks to be established for standards • in some cases it hasn’t changed much for decades

  19. Types of assessment responses • convergent type, in which one ‘correct’ response is expected, and divergent type, in which the response depends on opinion or analysis • assessment requiring convergent responses has its origins in mastery-learning models and involves assessment of the learner by the master-teacher • assessment requiring divergent responses is associated with a constructivist view of learning, where the teacher and learner engage collaboratively within Vygotsky’s zone of proximal development

  20. You need to think about: • whether a norm-referenced or criterion-referenced assessment scheme is more appropriate for the particular learning outcomes • whether the process of solving a problem and the product of solving a problem are both assessed, and what is the relative weighting for the two components • whether constructed or selected responses are appropriate

  21. Why use e-assessment for selected responses? • flexibility in test delivery • providing timely feedback • easy reporting and analysis of student responses • construction of questions is straightforward, but designing good assessment items is difficult • can reduce overall workload for academics, but effort is front-loaded

  22. Design for MCQ exams for summative use: • use of question banks • security • guessing

  23. http://www.economicsnetwork.ac.uk/qnbank/

  24. Kryterion - http://www.kryteriononline.com

  25. Guessing answers

  26. Different question types

  27. Guessing answers
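  Slides 25 and 27 present guessing as charts only; as a rough sketch of the underlying arithmetic (not taken from the slides), the expected benefit of blind guessing on a selected-response test, and the classical "correction for guessing" that neutralises it, can be worked out in a few lines. The test length, option count and function names below are illustrative assumptions.

```python
# Illustrative sketch (not from the presentation): expected score from blind
# guessing and the classical "correction for guessing" (formula scoring).

def expected_guess_score(num_items: int, num_options: int) -> float:
    """Expected raw score if a student guesses blindly on every item."""
    return num_items / num_options

def corrected_score(right: int, wrong: int, num_options: int) -> float:
    """Formula scoring: penalise wrong answers so blind guessing averages zero.
    score = R - W / (k - 1), where k is the number of options per item."""
    return right - wrong / (num_options - 1)

# A 40-item test with 4 options per item: blind guessing is worth ~10 marks.
print(expected_guess_score(40, 4))   # 10.0
# A student with 25 right and 15 wrong: corrected score 25 - 15/3 = 20.
print(corrected_score(25, 15, 4))    # 20.0
```

  On a 40-item, 4-option test, blind guessing alone is worth about 10 marks, which is one reason negative marking or certainty-based schemes (next slide) are considered.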

  28. Certainty-Based Marking (CBM). Tony Gardner-Medwin, Physiology (NPP), UCL • CBM rewards thinking: identification of uncertainty or of justification • highlights misconceptions (negative marks hurt!) • engages students more • enhances reliability & validity
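  A minimal sketch of how certainty-based marking can be scored. The three-level mark table (+1/0, +2/−2, +3/−6) is the scheme commonly associated with Gardner-Medwin's CBM work at UCL; treat the exact values as an assumption rather than something quoted on this slide.

```python
# Sketch of Certainty-Based Marking (CBM). The mark table follows the
# three-level scheme commonly attributed to Gardner-Medwin (UCL); the exact
# values are an assumption, not taken from the slide itself.

CBM_MARKS = {
    1: (1, 0),    # low certainty:  +1 if correct,  0 if wrong
    2: (2, -2),   # mid certainty:  +2 if correct, -2 if wrong
    3: (3, -6),   # high certainty: +3 if correct, -6 if wrong
}

def cbm_mark(correct: bool, certainty: int) -> int:
    """Return the CBM mark for one response at the stated certainty level."""
    reward, penalty = CBM_MARKS[certainty]
    return reward if correct else penalty

# Confidently wrong answers hurt, which is what makes misconceptions visible.
print(cbm_mark(correct=True, certainty=3))   #  3
print(cbm_mark(correct=False, certainty=3))  # -6
print(cbm_mark(correct=False, certainty=1))  #  0
```

  The asymmetric penalties are what make the scheme "reward thinking": claiming high certainty only pays off when the student genuinely has good justification.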

  29. Nuggets of knowledge [slide diagram: nuggets of knowledge linked by evidence and inference into networks of understanding; choice and confidence (degree of belief)] • Confidence-based marking places greater demands on justification, stimulating understanding • Thinking about uncertainty and justification stimulates understanding • To understand = to link correctly the facts that bear on an issue

  30. Outline of presentation: • introduction to learning and assessment • e-assessment design • evaluating your assessment questions • future assessment tasks

  31. Report on Summative E-Assessment Quality (REAQ): The design principles for preparing quality assessment tasks in higher education have been well documented (Biggs, 2002; Bull & McKenna, 2003; Case & Swanson, 2001; Dunn, Morgan, Parry & O'Reilly, 2004; James, McInnis & Devlin, 2002; McAlpine, 2002a; PASS-IT). There is also an extensive body of work on validity and reliability testing for assessments, and numerous readily available descriptions for academics on how to apply both psychometric principles and statistical analyses based on probability theories, in the form of Classical Test Theory and Item Response Theory, particularly the Rasch Model (Baker, 2001; Downing, 2003; McAlpine, 2002b; Wright, 1977).

  32. Report on Summative E-Assessment Quality (REAQ): Thus, there is no shortage of literature examples for academics to follow on preparing and analysing selected-response questions; academics and academic developers should be in a position to continuously improve the quality of assessment tasks and student learning outcomes. However, the literature evidence that academics and academic developers generally use these readily available tools and theories is sparse (Knight, 2006).

  33. Analysing student responses: Crisp, G. T. & Palmer, E. J. (2007). Engaging academics with a simplified analysis of their multiple-choice question (MCQ) assessment results. Journal of University Teaching & Learning Practice, 4(2), Article 4

  34. Survey of academics: Academics were familiar with common statistical terms such as mean, median, standard deviation and percentiles. Some were familiar with the different types of terms used to describe validity, but very few were aware of the formal psychometric approaches associated with Classical Test Theory, the Rasch Model or Item Response Theory.

  35. Rasch analysis
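  Slide 35 shows a Rasch analysis chart. For readers unfamiliar with the model, a hedged sketch: under the Rasch (one-parameter IRT) model the probability of a correct response depends only on the difference between a person's ability and the item's difficulty, both expressed in logits. The function below illustrates the formula; it is not the analysis tool used in the study cited above.

```python
import math

# Illustrative sketch of the Rasch (1-parameter IRT) model: the probability of
# a correct response depends only on the gap between person ability (theta)
# and item difficulty (b), both on the same logit scale.

def rasch_probability(theta: float, difficulty: float) -> float:
    """P(correct) = exp(theta - b) / (1 + exp(theta - b))."""
    return 1.0 / (1.0 + math.exp(-(theta - difficulty)))

# A person whose ability matches the item difficulty has a 50% chance of
# success; an item one logit harder drops that to roughly 27%.
print(round(rasch_probability(0.0, 0.0), 2))   # 0.5
print(round(rasch_probability(0.0, 1.0), 2))   # 0.27
```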

  36. Score distribution 50-60%

  37. Facility Index – Classical Test Theory: acceptable range roughly 0.3 to 0.8

  38. Discrimination Index – Classical Test Theory: aim for values above 0.3

  39. How effective are distracters? Rule of thumb: 20-30% each distracter
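  Slides 37-39 quote Classical Test Theory rules of thumb (facility roughly 0.3-0.8, discrimination above 0.3, and 20-30% uptake for each distracter). The sketch below shows one common way to compute these statistics from raw responses; the upper/lower-group discrimination method, the variable names and the toy data are illustrative assumptions, not the simplified analysis described in Crisp & Palmer (2007).

```python
# Illustrative Classical Test Theory checks for one MCQ item, using the
# rule-of-thumb ranges quoted on slides 37-39. The upper/lower-group method
# and the toy data below are assumptions made for this sketch.

from collections import Counter

def facility(item_scores):
    """Facility index: proportion of students answering the item correctly."""
    return sum(item_scores) / len(item_scores)

def discrimination(item_scores, total_scores, fraction=0.27):
    """Upper-lower discrimination: item facility in the top-scoring group
    minus the bottom-scoring group, groups taken from the total-score ranking."""
    ranked = [s for _, s in sorted(zip(total_scores, item_scores))]
    n = max(1, int(len(ranked) * fraction))
    return sum(ranked[-n:]) / n - sum(ranked[:n]) / n

def distracter_shares(choices, key):
    """Share of the incorrect responses attracted by each distracter."""
    wrong = [c for c in choices if c != key]
    counts = Counter(wrong)
    return {option: counts[option] / len(wrong) for option in counts}

# Tiny illustrative data set: 1 = correct on this item, 0 = incorrect.
item_scores  = [1, 0, 1, 1, 0, 1, 0, 1, 1, 0]
total_scores = [34, 12, 28, 31, 15, 25, 10, 30, 22, 14]
choices      = ["A", "C", "A", "A", "B", "A", "D", "A", "A", "C"]

print(facility(item_scores))                    # 0.6 (within 0.3-0.8)
print(discrimination(item_scores, total_scores))
print(distracter_shares(choices, key="A"))      # how the wrong answers split
```

  Items falling outside these ranges are candidates for the "improving or eliminating" step mentioned on slide 16.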

  40. Summary information

  41. Person-item map

  42. Outline of presentation: • introduction to learning and assessment • e-assessment design • evaluating your assessment questions • future assessment tasks

  43. Process of problem solving - IMMEX http://www.immex.ucla.edu

  44. IMMEX output. Kong, S.C., Ogata, H., Arnseth, H.C., Chan, C.K.K., Hirashima, T., Klett, F., Lee, J.H.M., Liu, C.C., Looi, C.K., Milrad, M., Mitrovic, A., Nakabayashi, K., Wong, S.L., Yang, S.J.H. (eds.) (2009). Proceedings of the 17th International Conference on Computers in Education

  45. Role Plays http://www.ucalgary.ca/fp/MGST609/simulation.htm http://www.roleplaysim.org/papers/default.asp?Topic=toc9

  46. What Happens in a Role Play? [cycle diagram] Adopt a role • Issues & problems occur • Interaction & debate • Reflection & learning

  47. Scenario-based learning http://www.pblinteractive.org

  48. Future assessments? • Will we see universal development of immersive and authentic learning and assessment environments? • Will assessments measure approaches to problem solving and student responses in terms of efficiency, ethical considerations and the involvement of others? • Will teachers be able to construct future assessments or will this be a specialty activity?

  49. Bobby Elliot and assessment 1.5 to 2.0
