
Training the OSCE Examiners



Presentation Transcript


  1. Training the OSCE Examiners • Katharine Boursicot • Trudie Roberts

  2. Programme • Principles of OSCEs for examiners • Video marking • Marking live stations • Strategies for enhancing examiner participation in training

  3. Academic principles of OSCEs • The basics • What is an OSCE? • More academic detail • Why use OSCEs? • The role of examiners • Examiners in OSCEs

  4. The basics • For examiners who don’t know about OSCEs • A brief reminder for those who are familiar with OSCEs

  5. What is an OSCE? • Objective • Structured • Clinical • Examination

  6. OSCE test design [Diagram: circuit of OSCE stations]

  7. OSCEs - Objective • All the candidates are presented with the same test

  8. OSCEs - Structured • The marking scheme for each station is structured • Specific skill modalities are tested at each station • History taking • Explanation • Clinical examination • Procedures

  9. OSCEs - Clinical Examination • Test of the performance of clinical skills, not a test of knowledge • The candidates have to demonstrate their skills

  10. More academic detail • Why use OSCEs in clinical assessment? • Improved reliability • Fairer test of candidates' clinical abilities

  11. Why use OSCEs in clinical assessment? • Careful specification of content • Observation of wide sample of activities • Structured interaction between examiner and student • Structured marking schedule • Each student has to perform the same tasks

  12. Characteristics of assessment instruments • Utility = Reliability × Validity × Educational impact × Acceptability × Feasibility • Reference: Van der Vleuten, C. The assessment of professional competence: developments, research and practical implications. Advances in Health Science Education 1996; 1: 41-67.

  13. Test characteristics • Reliability of a test/measure • reproducibility of scores across raters, questions, cases, occasions • capability of differentiating consistently between good and poor students
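
A minimal Python sketch (added for illustration; not part of the original presentation) of the reproducibility idea on this slide: Cronbach's alpha estimated from a hypothetical candidates-by-stations score matrix. The data and the helper function are assumptions for demonstration only.

```python
# Illustrative sketch: internal-consistency reliability (Cronbach's alpha)
# for an OSCE, from a candidates x stations matrix of checklist scores.
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """scores: 2-D array, rows = candidates, columns = stations."""
    k = scores.shape[1]                               # number of stations
    item_var = scores.var(axis=0, ddof=1).sum()       # sum of per-station variances
    total_var = scores.sum(axis=1).var(ddof=1)        # variance of candidates' total scores
    return (k / (k - 1)) * (1 - item_var / total_var)

# Hypothetical checklist totals for 6 candidates across 4 stations
scores = np.array([
    [18, 15, 20, 17],
    [12, 10, 14, 11],
    [16, 14, 18, 15],
    [ 9,  8, 11, 10],
    [19, 17, 21, 18],
    [14, 12, 15, 13],
])
print(f"Cronbach's alpha = {cronbach_alpha(scores):.2f}")
```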

  14. Sampling [Diagram: test samples drawn from the domain of interest]

  15. Reliability • Competencies are highly domain-specific • Broad sampling is required to obtain adequate reliability • across content, i.e. the range of cases/situations • across other potential factors that cause error variance, e.g. testing time, examiners, patients, settings, facilities

  16. OSCE: blueprint

  17. Test characteristics • Validity of a test/measure • the test measures the characteristic (e.g. knowledge, skills) that it is intended to measure

  18. Model of competence (Miller's pyramid, in order of increasing professional authenticity): Knows, Knows how (cognition ~ knowledge); Shows how, Does (behaviour ~ skills/attitudes). Miller GE. The assessment of clinical skills/competence/performance. Academic Medicine (Supplement) 1990; 65: S63-S67.

  19. Validity of testing formats (mapped to Miller's pyramid) • Does: professional practice assessment • Shows how: performance assessment (OSCEs, long/short cases, OSLERs, etc.) • Knows how: problem-solving assessment (EMQs, SEQs) • Knows: knowledge assessment (MCQs)

  20. Test characteristics: Educational impact • Relationship between assessment and learning [Diagram: interactions between curriculum, assessment, teacher and student]

  21. Test characteristics • Feasibility • cost • human resource • physical resources

  22. Test characteristics • Acceptability: tolerable effort, reasonable cost • Acceptable to: doctors, licensing bodies, employers, patients/consumer groups, students, faculty

  23. The role of examiners in OSCEs • General • Types of stations • Standard setting • Practice at marking

  24. The role of examiners in OSCEs • To observe the performance of the student at a particular task • To score according to the marking schedule • To contribute to the good conduct of the examination

  25. The role of examiners in OSCEs • It is NOT to: • Conduct a viva voce • Re-write the station • Interfere with the simulated patient's role • Design their own marking scheme • Teach

  26. Types of OSCE stations • History taking • Explanation • Clinical examination • Procedures

  27. Communication skills • Stations involving patients, simulated patients or volunteers • Content vs process, i.e. what the candidate says vs how the candidate says it

  28. Clinical skills • People • Professional behaviour • Manikins • Describe actions to the examiner

  29. The examiner's role in standard setting • Use your clinical expertise to judge the candidate's performance • Allocate a global judgement on the candidate's performance at that station • Remember the level of the examination

  30. Global scoring • Excellent pass • Very good pass • Clear pass • Borderline • Clear fail

  31. Borderline method [Diagram: a station checklist (items shown as dummy text) with an examiner global rating of Pass / Borderline / Fail; the overall test score distribution is shown alongside the score distribution of the 'Borderline' candidates, from which the passing score is derived]
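
Below is a minimal Python sketch (illustrative only, not from the slides) of the borderline-group calculation pictured above: the station passing score is derived from the checklist scores of candidates whose global rating was "Borderline". The data and the choice of the mean (rather than the median) are assumptions.

```python
# Illustrative sketch: borderline-group standard setting for one OSCE station.
import statistics

# (checklist score, examiner's global rating) for each candidate -- hypothetical data
results = [
    (16, "Clear pass"), (9, "Clear fail"), (12, "Borderline"),
    (18, "Very good pass"), (11, "Borderline"), (14, "Clear pass"),
    (8, "Clear fail"), (13, "Borderline"), (20, "Excellent pass"),
]

# Passing score = mean checklist score of the 'Borderline' group
borderline_scores = [score for score, rating in results if rating == "Borderline"]
passing_score = statistics.mean(borderline_scores)
print(f"Station passing score = {passing_score:.1f}")
```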

  32. Regression-based standard [Diagram: a station checklist (items shown as dummy text) and a plot of checklist score against the overall global rating (1 = Clear fail, 2 = Borderline, 3 = Clear pass, 4 = Very good pass, 5 = Excellent pass); the fitted regression line gives the passing score X at the borderline rating]
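
A matching sketch (again illustrative, with hypothetical data) of the regression-based method pictured above: checklist scores are regressed on the 1-5 global rating and the passing score is read off the fitted line at the "Borderline" rating of 2.

```python
# Illustrative sketch: regression-based standard setting for one OSCE station.
import numpy as np

# Hypothetical data: examiner global rating (1 = Clear fail ... 5 = Excellent pass)
# and the checklist score awarded at the same station.
global_rating   = np.array([1, 2, 2, 3, 3, 3, 4, 4, 5])
checklist_score = np.array([7, 11, 12, 14, 15, 16, 18, 19, 21])

# Fit checklist score as a linear function of the global rating
slope, intercept = np.polyfit(global_rating, checklist_score, deg=1)

# Passing score = predicted checklist score at the 'Borderline' rating (2)
passing_score = slope * 2 + intercept
print(f"Regression-based passing score = {passing_score:.1f}")
```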

  33. Practice at marking • Videos • Live stations • Mini-OSCE

  34. Strategies for enhancing examiner participation • CME • Job plan / part of contract • Specific allocation of SIFT • Experience for post-graduate examinations • Payment
