“Maintaining Accreditation: Meeting the Challenges of Compliance”

Presentation Transcript


  1. “Maintaining Accreditation: Meeting the Challenges of Compliance” AATOD 20th Anniversary National Conference, October 2004. Mary Cesare-Murphy, PhD, Executive Director, Behavioral Healthcare Accreditation; Megan Marx, MPA, Associate Director, OTP Accreditation Project; Joint Commission on Accreditation of Healthcare Organizations

  2. OTP Surveys Conducted 1/1/04 – 8/31/04 Twenty (20) OTPs received “No Requirement(s) for Improvement.” Forty-seven (47) OTP surveys had Requirement(s) for Improvement, with Evidence of Standards Compliance (ESC) due as follow-up.

  3. 2004 CAMBH Chapters with Non-Compliant Standards (chapter: # of non-compliant standards)
  • RI – Ethics, Rights & Responsibilities: 17
  • LD – Leadership: 7
  • APR – Accreditation Participation Requirements: 7
  • HR – Management of Human Resources: 37
  • PI – Improving Organization Performance: 11
  • IC – Infection Control: 2
  • MM – Medication Management: 11
  • IM – Information Management: 14
  • EC – Environment of Care: 5
  • PC – Provision of Care, Treatment & Services: 45

  4. 2004 OTPs Most Challenging Standards

  5. 2004 OTPs Most Challenging Standards (continued)

  6. OTP Surveys Conducted 2002 – 2003 • One hundred forty-two (142) OTPs received “Accreditation with Recommendations for Improvement.” • One hundred thirty-seven (137) OTPs received “Accreditation with Full Standards Compliance.” • Six (6) OTPs received “Conditional Accreditation.”

  7. Individual-Focused Functions: RI – Rights, Responsibilities & Ethics; PE – Assessment; TX – Care; PF – Education; CC – Continuum. Organization Functions: PI – Improving Organization Performance; LD – Leadership; EC – Management of the Environment of Care; HR – Management of Human Resources; IM – Management of Information; IC – Surveillance, Prevention & Control of Infection; PS – Behavioral Health Promotion.

  8. 2002 – 2003 OTPs Most Challenging Standards

  9. 2002 – 2003 OTPs Most Challenging Standards (continued)

  10. Comparison of overall trends & identification of problem areas within OTPs: • Standards most frequently cited in OTPs consistently came from the Assessment/Provision of Care, Treatment and Services and the Management of Human Resources sections of the standards. • Citations concerning licensed independent practitioners, assessment of patients’ religious or spiritual orientation, and pain management were prevalent in OTP survey findings from both 2002–2003 and 1/1/04 – 8/31/04.

  11. Approach to OTP Education • Accreditation education efforts for OTPs should focus on Assessment/Provision of Care, Treatment and Services and the Management of Human Resources to improve standards compliance. • If funding is awarded, the Joint Commission will offer more topic-specific learning opportunities using user-friendly distance learning formats, in an effort to reach more OTPs.

  12. Periodic Performance Review

  13. Periodic Performance Review (PPR) • Facilitates a more continuous accreditation process by incorporating an additional mid-cycle evaluation. • Provides educational opportunities.

  14. Periodic Performance Review • Is an accreditation participation requirement. • Is completed between the 15th- and 18th-month points in the accreditation cycle. • Findings with an approved plan of action are not subject to citation during a Random Unannounced Survey within the approved timeframes.

  15. Periodic Performance Review • A surveyor on-site cannot overrule an approved plan of action. • During the on-site survey, surveyors will request and review the measures of success identified at the time of the 18-month PPR. • The process includes three options as well as the full PPR.

  16. Characteristics of the Full PPR • Areas of non-compliance are self-assessed by the organization and scored using the JCAHO extranet tool. • Findings are submitted electronically to the Joint Commission via the extranet. • JCAHO staff review plans of action and measures of success and conduct an interactive phone call.

  17. Tips for the PPR • Read the user guide. • Check the applicability table. • If unsure of a standard’s applicability, leave it unscored and discuss it with a standards representative. • Develop separate plans of action and measures of success (when required). • When in doubt, score it out – material for discussion. • Take full advantage of conference call time for questions.

  18. Guidelines for Sampling for PPR • When assessing category “C” Elements of Performance (EPs) these guidelines are recommended: • 30 cases for a population up to 100 (If population is less than 30, sample all) • 50 cases for a population of 101-500 • 70 cases for a population over 500
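
A minimal sketch of the sampling guideline above, expressed in Python. The function name and the example populations are illustrative only, not part of the Joint Commission's materials:

```python
def recommended_sample_size(population: int) -> int:
    """Recommended record sample for assessing category "C" EPs,
    per the PPR sampling guideline: sample all cases when the
    population is under 30; otherwise 30 cases for a population
    up to 100, 50 for 101-500, and 70 for over 500."""
    if population < 30:
        return population  # fewer than 30 cases: sample them all
    if population <= 100:
        return 30
    if population <= 500:
        return 50
    return 70

print(recommended_sample_size(85))   # 30
print(recommended_sample_size(250))  # 50
print(recommended_sample_size(900))  # 70
```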

  19. Plan of Action • For each standard evaluated as “Not compliant,” the organization will: • Describe the planned action for each element of performance (EP) marked as partially or not compliant • Develop a measure of success

  20. Measure of Success (MOS) • A numerical or other quantitative measure, usually related to an audit, that validates that an action was effective and sustained • Submitted via the extranet • Submitted on an electronic form with space limited to a brief indication of the numerical measure – expressed as a percentage
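
As a hypothetical illustration, an MOS usually reduces to a simple audit rate. The helper name and the chart figures below are invented:

```python
def measure_of_success(compliant: int, reviewed: int) -> float:
    """Express an MOS as the percentage of audited records that
    met the corrective action's target."""
    return 100.0 * compliant / reviewed

# e.g., 46 of 50 sampled charts now contain a complete pain assessment
print(f"MOS: {measure_of_success(46, 50):.0f}% compliant")  # MOS: 92% compliant
```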

  21. Benefits of Periodic Performance Review • Employs same tool as used by surveyors • Expands intra-cycle interaction with JCAHO • Supports continuous operational improvement • Assists organization in quest for 100% compliance, 100% of the time

  22. Link Between the Periodic Performance Review and the On-site Survey • At the triennial survey, time will be devoted to reviewing measures of success • The surveyor will ask for data related to each measure of success • Track record requirements remain • Surveyors do not see the organization’s specific performance review or action plans

  23. PPR Option One • Organizations will attest that, after careful consideration with legal counsel, they have decided not to participate in the Full PPR • Organizations will self-assess compliance with the standards and develop plans of action and measures of success (MOS) as applicable • Organizations will not submit PPR data to JCAHO

  24. PPR Option One • Organizations will not be able to use the extranet tool to score compliance, but will be able to view and print all standards and EPs • Organizations will be able to submit standards-related issues for discussion with JCAHO staff during an interactive, scheduled phone call, but no inference relative to compliance will be made

  25. PPR Option Two • Organizations will attest that, after careful consideration with legal counsel, they have decided not to participate in the Full PPR • An on-site survey will take the place of the self-assessment activity • The survey length will be approximately one third that of the usual triennial survey • The organization will submit plans of action and MOS(s) for surveyor-identified areas of non-compliance

  26. PPR Option Two • Conference call with JCAHO will be scheduled to review and approve plans of action and MOS(s) • Organizations will be charged a fee to cover costs of the on-site survey

  27. PPR Option Three • Organizations will attest that, after careful consideration with legal counsel, they have decided not to participate in the Full PPR • A limited on-site survey will be conducted at the midpoint of the accreditation cycle • Following the survey, the organization may elect to participate in a conference call to discuss standards-related issues • At the time of the triennial survey, the surveyor will receive no information relating to the organization’s Option 3 survey findings

  28. Using Data to Improve Program Performance • Planning is the key to preventing performance measurement mistakes • Ask the following questions: • What data should be collected? • Why should the data be collected? • What data are already available? • What are the sources of available data? • How will the data physically be collected? • How will the data be used?

  29. Using Data to Improve Program Performance • Consider the following common mistakes and tips to avoid these errors in your organization: • Mistake 1 – Insufficient planning before collecting data • Tip 1 – Determine which strategic measurement areas are high priorities

  30. Using Data to Improve Program Performance • Mistake 2 – Insufficient resources to support data collection • Tip 2 – Enlist leadership to ensure that adequate resources are available • Mistake 3 – Poor data integrity • Tip 3 – Assess the completeness of the data

  31. Using Data to Improve Program Performance • Mistake 4 – Overly extensive data collection • Tip 4 – Break data collection into manageable projects • Mistake 5 – Data collection “silos” • Tip 5 – Investigate data sources and instruments already in place.

  32. Using Data to Improve Program Performance
  • Facts = Data
  • Data Combinations = Measures
  • Analyzed Measures = Information
  • Applied Information = Improvement
  • Improvement generates knowledge

  33. Using Data to Improve Program Performance • Follow these steps to avoid common pitfalls in data collection: • Review the specific purpose of your outcomes focused improvement project & determine what information, measures and data are necessary to achieve that purpose. • Review the specific information you need, specify performance measures that will generate that information & identify the data that compose these measures.

  34. Using Data to Improve Program Performance • Define indicator data elements. • Determine the sources for all needed data. • Create your data collection instruments. • Determine the most effective data analysis strategies by considering what type of data need to be collected and how they will be used to improve performance. • Document your data collection plan. • Pilot test the data collection tool and analysis strategies.

  35. Using Data to Improve Program Performance The Three “T’s” • TREND – Data over time on indicators • TARGET – Range of performance for each indicator • TOGETHER – Look at indicators in combination (Joint Commission Benchmark, January 2003, pp. 1, 7)
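
A brief sketch of the Three “T’s” applied to a single indicator; the monthly values and target range below are made-up examples:

```python
# TREND: the indicator's data over time (monthly show rate, %)
monthly_show_rate = [78, 81, 79, 84, 86, 88]

# TARGET: the acceptable performance range for this indicator
target_low, target_high = 85, 100

trending_up = monthly_show_rate[-1] > monthly_show_rate[0]
months_in_target = sum(target_low <= v <= target_high for v in monthly_show_rate)

print("Trending up:", trending_up)            # True
print("Months in target:", months_in_target)  # 2

# TOGETHER: interpret this indicator alongside related ones
# (e.g., intake timeliness, dropout rate) before acting on it.
```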

  36. Using Data to Improve Program Performance Types of Measurement • Administrative Measures – Productivity • Comparison Measures – Benchmarking • Process Measures – Access, Satisfaction • Functional Measures – Improvement • Fidelity Measures – Following processes

  37. Using Data to Improve Program Performance Administrative Measures • An administrative measure indicates how well your agency is following its mission, vision, and values – in short, how well your agency is doing. • Productivity or resource utilization is one example.

  38. Using Data to Improve Program Performance Productivity Examples • Direct Service Percentage • Billed Service Percentage • Show Rate/Keep Rate • Percentage of Improvement Rate • Revenue per staff
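
The arithmetic behind these productivity examples is simple rates and ratios. A sketch with invented figures:

```python
# Direct service percentage: direct service hours / total paid hours
direct_service_pct = 100 * 1120 / 1600  # 70.0%

# Show rate (keep rate): kept appointments / scheduled appointments
show_rate = 100 * 412 / 500             # 82.4%

# Revenue per staff: total revenue / full-time-equivalent staff
revenue_per_fte = 2_400_000 / 30        # $80,000

print(f"Direct service: {direct_service_pct:.1f}%")
print(f"Show rate: {show_rate:.1f}%")
print(f"Revenue per FTE: ${revenue_per_fte:,.0f}")
```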

  39. Using Data to Improve Program Performance Comparison Data • Allows you to compare how your agency is doing relative to other agencies. • You can choose any number of areas to compare.

  40. Using Data to Improve Program Performance Process Measures • A process measure looks at how well your processes are meeting your goals or standards. • Examples: • Rate of meeting intake timeliness standards • Show rate for the initial appointment • Show rate for the second appointment after intake

  41. Using Data to Improve Program Performance Process Measure Examples • Emergency Services Use • Length of stay per diagnosis • Time to first appointment • Time between first & second appointment • Percent of consumers receiving a first appointment within 48 hours of request • Keep rate • First appointment, subsequent appointments
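
For example, the 48-hour access measure reduces to comparing each consumer's request and appointment timestamps. A sketch with hypothetical records:

```python
from datetime import datetime, timedelta

# (request time, first appointment time) per consumer; invented data
records = [
    (datetime(2004, 10, 4, 9, 0),  datetime(2004, 10, 5, 14, 0)),
    (datetime(2004, 10, 4, 11, 0), datetime(2004, 10, 8, 10, 0)),
    (datetime(2004, 10, 6, 13, 0), datetime(2004, 10, 7, 9, 0)),
]

within_48h = sum(seen - requested <= timedelta(hours=48)
                 for requested, seen in records)
print(f"{100 * within_48h / len(records):.0f}% seen within 48 hours")  # 67%
```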

  42. Using Data to Improve Program Performance Functional Measures • A functional measure is an outcome measure. • It can be as complicated as a formal, fee-based measurement with national norms, e.g., the Brief Symptom Inventory (BSI) • It can be as simple as a “home-made” measurement using a Likert scale

  43. Using Data to Improve Program Performance Construct a Likert Measure • List the functional elements that are important in the person’s life. • Supervisors and staff with experience with the population can help ensure that the measure will have meaning. • Decide on a rating scale: “0 to 10” is an 11-point scale; “0 to 3” is a four-point scale • Add descriptors to the ratings to help staff know how to score the person: 0 = not present, 5 = some present, 10 = totally present

  44. Using Data to Improve Program Performance Construct a Likert Measure • Train staff how to use the scale • Implement the scale • Chart pre- and post-treatment scores as a comparison outcome measure.
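
A sketch of such a “home-made” Likert measure; the functional elements, descriptors, and scores below are invented for illustration:

```python
# 0-10 scale with anchoring descriptors to guide staff scoring
descriptors = {0: "not present", 5: "some present", 10: "totally present"}

elements = ["stable housing", "medication adherence", "employment"]

# One person's pre- and post-treatment ratings on each element
pre  = {"stable housing": 3, "medication adherence": 4, "employment": 2}
post = {"stable housing": 7, "medication adherence": 8, "employment": 5}

for e in elements:
    print(f"{e}: {pre[e]} -> {post[e]} (change {post[e] - pre[e]:+d})")

mean_change = sum(post[e] - pre[e] for e in elements) / len(elements)
print(f"Mean change: {mean_change:+.1f} points")  # +3.7
```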

  45. Using Data to Improve Program Performance Fidelity Measures • Fidelity is a concept used in formal research • In a treatment setting, “fidelity” measures the extent to which staff have followed your treatment guidelines. • Fidelity measurement is important in establishing a relationship between your treatment methods & functional improvement/outcomes.

  46. Using Data to Improve Program Performance Fidelity: Sample Questions Fidelity can be measured with simple “Yes/No” questions for each part of your treatment protocol: • Were required lab tests current? y/n • Was the practice protocol followed? y/n • Did the physician sign the treatment plan? y/n
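
Scoring such a checklist is just the proportion of “yes” answers across the charts reviewed. A sketch with invented chart data:

```python
# Each chart's yes/no protocol checks (True = yes); hypothetical data
chart_checks = [
    {"labs_current": True,  "protocol_followed": True,  "plan_signed": True},
    {"labs_current": True,  "protocol_followed": False, "plan_signed": True},
    {"labs_current": False, "protocol_followed": True,  "plan_signed": True},
]

total_checks = sum(len(c) for c in chart_checks)
passed = sum(sum(c.values()) for c in chart_checks)
print(f"Fidelity: {100 * passed / total_checks:.0f}%")  # 78%
```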

  47. Using Data to Improve Program Performance Readily Available Outcome Measures • Beck Depression Inventory • Beck Anxiety Inventory • CAP – Children’s Attention Problems, for Attention Deficit Hyperactivity Disorder (ADHD) • Conners Rating Scales (for ADHD) • Yale-Brown Obsessive Compulsive Scale • Michigan Alcoholism Screening Test (MAST, for addiction)

  48. Using Data to Improve Program Performance Selecting the Measures • Organizational Context • Matching measures to your needs • Measure what reflects your vision and mission

  49. Using Data to Improve Program Performance Organizational Context • An organizational culture committed to data-based decision making • Technical & management systems that are interdependent & well integrated • Support from top levels of management

  50. Using Data to Improve Program Performance Measures and Your Mission • Quality – How do you know people are improving, or at least maintaining their functional level? • Coordinated – How can you tell whether people can get needed services? • Responsive – What is your access goal? What is your actual access rate? What is the difference?
