
Promoting the Development of Clinical Skills throughout the Continuum of Medical Education
University of North Carolina – Chapel Hill School of Medicine, November 9, 2011
Ann C. Jobe, MD, MSN, Executive Director, Clinical Skills Evaluation Collaboration (CSEC)


Presentation Transcript


  1. Promoting the Development of Clinical Skills throughout the Continuum of Medical Education University of North Carolina – Chapel Hill School of Medicine, November 9, 2011

  2. Ann C. Jobe, MD, MSN, Executive Director, Clinical Skills Evaluation Collaboration (CSEC)

  3. Clinical Skills in Practice • The physician-patient encounter is central to the identity of physicians in the US • Clinical skills of trainees and young physicians have been described as deficient since at least the 1970s • Good evidence supports the diagnostic and therapeutic value of the clinical encounter, but… • …technology, fragmented care, reimbursement, and practice culture affect the clinical encounter Weiner A, Nathanson M. JAMA 1976;236:852-855; Verghese A, et al. Ann Intern Med 2011;155:550-553

  4. Clinical Skills in Practice • The clinical encounter is often buried in process measures, such as HEDIS or other guidelines • The ritual value of the clinical encounter is important, and must be balanced against its documented utility • The environment determines most of what and how trainees learn about the clinical examination Weiner A, Nathanson M. JAMA 1976;236:852-855; Verghese A, et al. Ann Intern Med 2011;155:550-553

  5. COMMUNICATION • The essence of the patient-physician relationship • Includes verbal and non-verbal communication, as well as actions and interactions during the physical examination

  6. Communication • It is all about COMMUNICATING with patients, families, and health professionals • It is all about improving communication to improve the quality and safety of health care

  7. Why Assess Communication Skills? • Essential physician competency • (LCME, ACGME, ABMS, USMLE) • Clinical outcomes require effective communication • Public expectations: need for more information and supportive interactions. • Quality measures now incorporate patient-centeredness

  8. Patient-Centered Communication • Exploring the patient’s illness experience • Understanding the patient as a whole person • Picking up on patient cues • Involvement of the patient in problem definition • Involvement of the patient in decision-making • (now >50% expect such involvement) • Finding common ground regarding management • Enhancing the doctor/patient relationship by being responsive to the patient IOM,2001; Street,2008

  9. Communication Skills • Prospective study of 80 medical outpatients with new or previously undiagnosed conditions • Internists asked to list their differential diagnoses and to estimate their confidence in each diagnostic possibility • after the history, • after the physical examination, and • after the laboratory investigation.

  10. Communication Skills • In 61 of 80 cases (76%), the leading diagnosis after taking the history agreed with the diagnosis accepted at the time the record was reviewed • The physical examination led to the diagnosis in 10 patients (12%) • The laboratory investigation led to the diagnosis in 9 patients (11%) • These data support the concept that most diagnoses are made from the medical history
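
As a quick arithmetic check on the proportions above, here is a minimal sketch (in Python, with illustrative variable names) that recomputes the percentages from the counts reported by Peterson et al.: 61, 10, and 9 of 80 cases.

```python
# Recompute the diagnostic-contribution percentages from the counts on the slide:
# 80 outpatients; the leading diagnosis came from the history in 61 cases,
# the physical examination in 10, and the laboratory investigation in 9.
contributions = {
    "history": 61,
    "physical examination": 10,
    "laboratory investigation": 9,
}
total_cases = 80

for source, count in contributions.items():
    print(f"{source}: {count}/{total_cases} = {count / total_cases:.0%}")
# Output: history 76%, physical examination 12%, laboratory investigation 11%,
# matching the figures quoted on the slide.
```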

  11. Communication Skills • Authors suggest that more time should be devoted to improving history-taking skills during clinical training. Peterson MC, Holbrook JH, Hales D, Smith NL, Staker LV. Contributions of the history, physical examination, and laboratory investigation in making medical diagnoses. West J Med 1992;156:163-165

  12. Communication Skills • Numerous publications confirm that poor skills in patient communication are associated with: • Lower levels of patient satisfaction • Higher rates of complaints • Increased risk of malpractice claims • Poorer health outcomes

  13. High level skills in “bedside medicine” – “clinical skills” • Ability to elicit a patient’s story/history • Correct use of evidence-based PE maneuvers in a focused manner based on history • Ability to synthesize information gathered • Ability to communicate and negotiate plans for management are the cornerstone of patient safety and quality of care

  14. Why Does It Matter? • Initiatives focused on improving clinical skills, especially communication, through teaching and assessment will be most successful in improving the quality and outcomes of care provided by health professionals

  15. Comprehensive Program • Overarching Competencies and Objectives • Map for addressing teaching and assessing throughout the continuum of education • Course content • Assessment methodologies

  16. AAMC Recommendations for Clinical Skills Curricula for Undergraduate Medical Education (2008) • Professionalism • The ability to understand the nature of, and demonstrate professional and ethical behavior in, the act of medical care. • Patient Engagement and Communication Skills • The ability to engage and communicate with a patient, develop a student-patient relationship, and communicate with others in the professional setting • Biomedical Knowledge Application Skills • The ability to apply scientific knowledge and method to clinical problem solving.

  17. AAMC Recommendations for Clinical Skills Curricula for Undergraduate Medical Education (2008) • History Taking • The ability to take a clinical history, both focused and comprehensive. • Patient Examination • The ability to perform a mental and physical examination • Clinical Testing • The ability to select, justify and interpret selected clinical tests and imaging • Clinical Procedures • The ability to understand and perform a variety of basic clinical procedures

  18. AAMC Recommendations for Clinical Skills Curricula for Undergraduate Medical Education (2008) • Diagnosis • The ability to diagnose and explain clinical problems in terms of pathogenesis, to develop a basic differential diagnosis, and to learn and demonstrate clinical reasoning and problem identification. • Clinical Information Management • The ability to record, present, research, critique and manage clinical information • Clinical Intervention • The ability to understand and select clinical interventions in the natural history of disease, including basic preventive, curative and palliative strategies

  19. AAMC Recommendations for Clinical Skills Curricula for Undergraduate Medical Education (2008) • Prognosis • The ability to understand and formulate a prognosis about the future events of an individual’s health and illness based upon an understanding of the patient, the natural history of disease, and upon known intervention alternatives. • Personalizing Clinical Care • The ability to provide clinical care within the practical context of a patient’s age, gender, personal preferences, family, health literacy, culture, religious perspective, and economic circumstances

  20. Core Competencies & Assessment • Patient Care/Clinical Skills • Students must be able to provide care that is compassionate, appropriate, and effective for treating health problems and promoting health

  21. Core Competencies & Assessment • Interpersonal & Communication Skills • Students must demonstrate interpersonal and communication skills that facilitate effective interactions with patients and their families and other health professionals

  22. Developing a Comprehensive Program • Types of assessments • Examinees • Timing of assessments

  23. Types of assessments • Formative • Designed to provide feedback to facilitate acquisition of new skills or improvement of performance • Part of continuous professional development • Part of performance and quality improvement

  24. Types of assessments • Summative • “High stakes” • Associated with an important decision – like graduation, licensure, certification or credentialing • Utilized to distinguish between those who are competent and those who are not

  25. Types of assessments • “Snapshot” • One time assessment • Longitudinal • Repeated over various periods of time

  26. Timing of assessments • At planned intervals for promotion decisions • Ongoing for continuous professional development and/or performance improvement • One-time “snapshot” for initial licensure • Repeat assessment for license renewal • For credentialing or granting privileges • Review for re-entry into practice

  27. Program Elements • Depend on PURPOSE of the assessment and • LEVEL of the examinee

  28. Assessing Skills and Performance • What is included in an assessment of skills and performance? • What are some of the assessment methods and how are they assembled? • How do the methods perform against the criteria for good assessment?

  29. Miller’s Pyramid for Assessing Clinical Competence • Knows (Knowledge) • Knows How (Competence) • Shows How (Performance) • Does (Action)

  30. Kirkpatrick Criteria • Results: change in organizational practice; benefits to patients/clients • Behavior: transfer of learning to the workplace; learners apply new knowledge and skills • Learning: change in attitudes/perceptions; change in knowledge/skills • Reaction: customer satisfaction related to participation in educational activities

  31. Simulation • Real patients are replaced with realistic but artificial experiences • Trainee interacts with the re-creations • Judgments are made about their performance

  32. Simulation • Methods can be divided according to how faithful they are to reality • Intermediate fidelity: task specific models, instructor driven models • High fidelity: virtual reality, standardized patients (SPs)

  33. Method: Task Specific Models • Designed around a specific task • Venipuncture model • Animal cadavers • Usually not automated • Relatively inexpensive

  34. Method: Instructor Driven Models • Physical representation • Responses driven by an instructor • Little feedback • Moderate cost

  35. Method: Virtual Reality Simulators • Simple physical representation • Sensing device that informs computer of user actions • Computer models realistic reactions • 3D imaging • Haptics

  36. Method: Standardized Patients • Individuals trained to portray a patient • Scripted and standardized • USMLE Step 2 CS example • Integrated Clinical Encounter: data gathering (SP completes checklists); written communication (doctor rating of the patient note) • Communication & Interpersonal Skills: SP rating • Spoken English: SP rating

  37. Ideal Assessment of Communication Skills • Evidence-based construct • Assessment instrument consists of observable behaviors • Realistic stimuli • SPs trained to use instrument reliably • Appropriate scoring decisions

  38. Putting it Together: Objective Structured Clinical Examination (OSCE) • Multiple stations • Each focused on a specific aspect of competence • Stations might include • Manikins • SPs • ECG or X-ray interpretation • Heart sounds • Animal cadavers • Anastomosis • Laparoscopic vessel ligation • Simulators “In a way the OSCE is not an examination method; rather it is an examination format or framework into which many different types of test methods can be incorporated” Ian Hart, 2001

  39. Putting it Together: OSCE • Stations are usually short: 10-15 minutes • Test is composed of 8-25 stations • Round-robin format • At a bell, examinees rotate to next station • Can accommodate as many examinees as stations • Total score is calculated across all stations
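
To make the last bullet concrete, below is a minimal sketch, assuming an unweighted percent score per station, of how a total OSCE score might be aggregated; the station names and scores are hypothetical, and real examinations may weight stations or apply formal standard-setting.

```python
# Minimal sketch: combining OSCE station scores into a total.
# Station names and scores are hypothetical; an unweighted mean is assumed.
station_scores = {        # percent score (0-100) awarded at each station
    "history taking (SP)": 82,
    "cardiac examination (manikin)": 74,
    "ECG interpretation": 68,
    "counseling (SP)": 90,
}

overall = sum(station_scores.values()) / len(station_scores)
print(f"Overall OSCE score across {len(station_scores)} stations: {overall:.1f}%")
# Overall OSCE score across 4 stations: 78.5%
```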

  40. Work-based Methods • Work-based assessment • Real patient encounters • Trainees are observed • Judgments are made about their performance “When your work speaks for itself, don't interrupt.” Henry Kaiser

  41. Work-based Assessment • Foundation Programme (in the UK) • Two-year program • Bridge between medical school and advanced training • Series of clinical placements • Assessment Purpose • Determine fitness to progress to the next level • Identify trainees in difficulty • Provide feedback • Establish accountability • Three methods • Mini-Clinical Evaluation Exercise (mCEX) • Directly Observed Procedures (DOPs) • Case-Based Discussion (CbD)

  42. Mini-Clinical Evaluation Exercise (mCEX) • Process • List of patient problems • Trainee picks a patient • Assessor observes the encounter • Focused clinical task • Assessor rates: • Hx, PE, Communication, Clinical Judgment, Professionalism, Organization/Efficiency • Assessor provides feedback • Takes 15-20 minutes

  43. Directly Observed Procedures (DOPs) • Process • List of procedures • Trainee picks a patient • Assessor observes the encounter • Procedure • Assessor rates: • Preparation, Sedation, Asepsis, Technical skill, etc. • Assessor provides feedback • Takes 15-20 minutes

  44. Case-Based Discussion (CbD) • Process • List of patient problems • Trainee picks 2 case records • Assessor selects one • Discussion centered on the trainee’s notes • Assessor rates: • Diagnosis, Treatment, Planning, Professionalism, etc. • Assessor provides feedback • Takes 15-20 minutes

  45. Putting it Together: Work-based Assessment • An OSCE “on the hoof” • Multiple encounters are needed • Captured as feasible during clinical training • Multiple examiners are needed • Encounters can be made to conform loosely to a problem list • Ongoing, longitudinal assessments

  46. Criteria for Judging an Assessment • How do simulation and work-based assessment perform against the criteria? • Validity • Reliability • Equivalence • Educational effect • Opportunity for feedback • Feasibility

  47. Validity • What is validity? • Degree to which the inferences based on scores are correct • Does the test measure what it is supposed to measure? • Simulation • Good content coverage • Rare conditions • Errors cause no harm • Good fidelity • Work-based methods • Excellent content coverage • Includes difficult to simulate conditions • High fidelity

  48. Reliability • What is reliability? • If an assessment process is repeated with the same trainees, they should get the same scores • Physician performance varies considerably from patient to patient • The trainee must be observed with several patients • Assessors differ in stringency • The trainee must be evaluated by different examiners
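
The two requirements above (several patients, several examiners) can be illustrated with a toy simulation, not a psychometric model from any real program: a trainee's observed score is treated as true ability plus case-to-case and examiner-to-examiner noise, and the estimate stabilizes as more encounters are averaged. All numbers are assumptions for illustration.

```python
# Toy illustration of the reliability point: averaging over more patients and
# examiners yields a more stable estimate of a trainee's ability.
# All parameters are illustrative assumptions, not values from any real exam.
import random
import statistics

random.seed(0)
TRUE_ABILITY = 75.0   # hypothetical "true" score for one trainee
CASE_SD = 10.0        # patient-to-patient variation in performance
EXAMINER_SD = 5.0     # examiner stringency differences

def observed_mean(n_encounters: int) -> float:
    """Mean score over n encounters, each with a different patient and examiner."""
    return statistics.mean(
        TRUE_ABILITY + random.gauss(0, CASE_SD) + random.gauss(0, EXAMINER_SD)
        for _ in range(n_encounters)
    )

for n in (1, 4, 10):
    estimates = [observed_mean(n) for _ in range(2000)]
    print(f"{n:>2} encounters: spread of the score estimate = {statistics.stdev(estimates):.1f} points")
# The spread shrinks as more patients and examiners are sampled.
```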

  49. Equivalence • What is equivalence? • To compare examinees they must have taken assessments that are equal in difficulty • Fairness • Comparable meaning • Simulation • Different examinees can be given the same items • Security • Statistical techniques help with different versions • Work-based methods • Equivalence is a problem that can be mitigated but not eliminated

  50. Educational Effect “Students respect what you inspect.” • Both simulation and work-based methods signal the importance of working with patients • Drives learning
