
Traditional Expert-Based Information Delivery Systems




Presentation Transcript


  1. Traditional Expert-Based Information Delivery Systems: Using an Expert, Being an Expert

  2. Roles of Experts • Consultation • CME • Review articles • Practice guidelines • Decision analysis

  3. Using an Expert/Being an Expert • Definition of an expert • Subspecialist or primary care clinician with special interest • Anyone/anything you go to for an answer to a question

  4. Using an Expert/Being an Expert • “Never ask the barber whether you need a haircut” • “So many specialists fall into the habit of looking where the light is -- that is, offering solutions only in territory familiar to them. . . Wonderful examples exist of otherwise excellent researchers who are unable and unwilling to recognize evidence contrary to their beliefs.”

  5. Usefulness Score • Work: Low • Significant potential for usefulness • Relevance: Varies • Validity: Expert dependent • If either relevance or validity is zero, usefulness is zero
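
The “zero rule” on this slide follows from the usefulness equation of the Information Mastery framework (sketched here as an assumption, since the slide does not spell it out), in which relevance and validity multiply and work divides:

  Usefulness = (Relevance × Validity) / Work

Because the numerator is a product, a zero for either relevance or validity drives the whole score to zero, no matter how little work the source requires.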

  6. Types of Experts • Content Expert • Clinical Scientist • YODA

  7. Content Expert • Experienced, particularly in diagnosis and procedures, not necessarily therapy • Not trained in clinical epidemiology (validity) • Traditional education favors DOEs (relevance) • May not be current, may rely on anecdotes • Risky extrapolation: information is only as current as the last consultation

  8. Clinical Disagreement Between/Within Experts • Same film: experts disagree with one another 29% of the time • Previous read: experts disagree with their own earlier reading 20% of the time • Studied with venograms, fundi, MRI, angiography, mammograms, pathology (melanoma diagnosis) • March 97 Bandolier on the Web: “Histology as Art Appreciation”

  9. “Never ask a barber . . .” • Chalmers: Recommendations are highly correlated with training and source of income • Management of acute GI bleed • Surgeons: surgery 50%; conservative 15% • Internists: surgery 15%; conservative 50%

  10. Clinical Scientist • Good at evaluating evidence and up to date; does not have to be a content expert • Separation of therapeutics • Medical Librarian, PharmD

  11. YODA: Your Own Data Analyzer • Content expert and clinical scientist • Consider POEMs first, even if this information conflicts with DOEs or clinical experience • When POEMs are not available, use the best DOEs with an open mind • Demonstrate appropriate validity assessments • Not to be confused with YUCKs

  12. YUCK • YOUR • UNSUBSTANTIATED • CLINICAL • KNOW IT ALL

  13. Experts gone wrong: YUCKs

  14. YUCK • Your Unsubstantiated Clinical Know-it-all • Maladaptive • Rigid, Dogmatic • All personality types, but people who see things in Red and Green can fall into the YUCK trap

  15. The Golden Question: “That’s interesting . . . Is there any evidence that . . . ?”

  16. If it’s not a valid POEM, it’s just not necessarily so

  17. Making the Most of a CME Presentation

  18. Dilbert’s Take on CME

  19. Continuing Medical Education • People remember 90% of what they do, 75% of what they say, but only 10% of what they hear • How to make the 10% count

  20. Do We “Get” Something From CME?

  21. Is post-test performance improved? (DOE) • YES • Beware “Chinese-Dinner Memory Dysfunction”

  22. Are patient outcomes improved? (POEM) • No . . . Multiple RCTs have failed to find a benefit from the traditional lecture format (passive) • Maybe . . . with active (hands-on) workshops combined with close follow-up

  23. Usefulness • Validity: Depends on the speaker • Relevance: Depends on the POEM:DOE ratio • Work: Higher than it seems • NBA analogy (only the last two minutes count) • Tracking down the validity of new POEMs

  24. Role of the Speaker • Present a good mix of POEMs highlighted by clinically relevant DOEs • Augment POEMs with clinical experience • Identify the Level of Evidence (LOE) for the listener

  25. Role of the Listener • Identify, before the talk begins: • What you want to learn • Which POEMs you need to know • Actively evaluate information (CME worksheet) • When a change-inducing POEM is presented, validate it: • By questioning the speaker • By cross-checking with other sources

  26. Identifying “Common” POEMs • Will this information have a direct bearing on the health of my patients (is it something they care about)? • Is the problem common to my practice? • Is the intervention feasible? • If true, will it require me to change my current practice?

  27. Newer Models for CME • Practice-based small-group CME • Educational prescriptions • Point-of-care sources • Team-based learning • Audience response systems • CME worksheet • Social media
