
Hospital e-Scorecard Report 2008: Emergency Department Care



    1. St. Thomas Elgin General Hospital, “To be the BEST Community Hospital in Ontario”: Hospital e-Scorecard Report 2008: Emergency Department Care. The purpose of this presentation template for Hospital e-Scorecard Report 2008 is to provide a tool for communicating your results internally within your hospital. It should be customized to meet your needs. This template includes information on all four quadrants from Hospital e-Scorecard Report 2008 (System Integration and Change, Patient Satisfaction, Clinical Utilization and Outcomes, and Financial Performance and Condition). There are a number of slides that include tables requiring you to insert your own hospital's data and LHIN and Peer Group values; in some cases, there are also separate columns for your LHIN values. These values can be exported from the e-Scorecard into MS Excel and copied into these tables. Provincial aggregate values are provided in the slides.

    2. Hospital e-Scorecard Reports are based on a balanced scorecard framework. The indicators included in these reports are divided among four quadrants critical to the strategic success of any health care organization. These quadrants are based on the original work of Norton and Kaplan, and later adapted by Baker and Pink of the University of Toronto for use in balanced scorecards for Ontario hospitals.

    3. …and achieve dual objectives. The primary objective is to provide information to hospitals to support quality improvement. These reports also provide hospitals with an additional tool for communicating their accountability to their respective communities.

    4. The products for 2008 include:

    5. The e-Scorecard harnesses technology to enhance knowledge transfer…

    6. Hospital e-Scorecard Report 2008 series includes: The project is sponsored by the Ontario Hospital Association and the Ministry of Health and Long-Term Care. For 2008, the data in the e-Scorecards have been updated for Acute Care, Emergency Department Care, Complex Continuing Care, and Rehabilitation. In addition, hospital-specific data for all four quadrants are included in the e-Scorecard for all four sectors (except the Patient and Family Satisfaction quadrant, which is not included in this year's Complex Continuing Care Hospital e-Scorecard Report). Please note: the name of the “Patient Satisfaction” quadrant varies by sector: Acute Care – Patient Satisfaction (P.Sat); Emergency Department Care – Patient Satisfaction (P.Sat); Rehabilitation – Client Perspectives (CP).

    7. … and is based on data primarily from the 2006–2007 fiscal year. OHRS: Ontario Hospital Reporting System; NACRS: National Ambulatory Care Reporting System; DAD: Discharge Abstract Database. SIC survey data are completed by the hospital management staff most appropriate to answer the questions in each section.

    8. One hundred and sixteen (116) hospital corporations participated in at least one quadrant, representing 98% of all reported emergency department visits in Ontario in 2006-2007. Overall, 116 of 125 eligible hospital corporations agreed to participate in the 2008 Emergency Department Care Hospital e-Scorecard Report. Eighty-six (86) corporations (69%) participated in all four quadrants of the report. The provincial, peer group and LHIN results are calculated including data from all hospitals in the province for which data were available.

    9. Updates to Hospital e-Scorecard Report 2008: Emergency Department Care. A condensed version of the System Integration and Change survey was carried out in December 2007; there was no change in indicator methodology with the shortened survey. Performance allocation has been assigned for pediatric Clinical Utilization and Outcomes indicators (X-ray rates for asthma, bronchiolitis and croup). New accounts in the 2006-2007 OHRS standards have resulted in updated indicator definitions for some of the Financial Performance and Condition indicators. OHRS: Ontario Health Reporting Standards.

    10. Methods for Assigning Performance Allocation

    11. The methods for assigning performance allocations are summarized below … System Integration and Change: The current method sets the upper and lower cut points at the 95th percentile and the 5th percentile respectively, capturing roughly 90% of indicator values. Scores falling above and below these cut points are considered above and below average respectively. Performance classifications were assigned based on a hospital's score relative to its hospital type; for this quadrant, teaching and community hospitals were grouped together while small hospitals formed their own group (small vs. community/teaching), because small hospitals face different challenges in carrying out many of the activities reported in the SIC areas.
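
    To make the cut-point method above concrete, here is a minimal Python sketch under stated assumptions: the function name and the indicator scores are hypothetical, and the exact percentile calculation used in the report may differ (see the SIC technical summary). It simply classifies each hospital's score against the 5th and 95th percentile cut points of its comparison group.

    import numpy as np

    def classify_by_percentile(scores, lower_pct=5, upper_pct=95):
        # Cut points at the 5th and 95th percentiles of the comparison group,
        # leaving roughly 90% of indicator values in the "average" band.
        lower_cut = np.percentile(scores, lower_pct)
        upper_cut = np.percentile(scores, upper_pct)
        return [
            "above average" if s > upper_cut
            else "below average" if s < lower_cut
            else "average"
            for s in scores
        ]

    # Hypothetical SIC indicator scores for one hospital-type group
    sic_scores = [62, 70, 75, 81, 88, 90, 93, 95, 97, 100]
    print(classify_by_percentile(sic_scores))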

    12. The methods for assigning performance allocations are summarized below … Patient Satisfaction: A 99.9% confidence interval of a hospital's risk-adjusted score was compared to the provincial performance target for all measures. The provincial performance target is the average of all the hospitals' scores for that measure. Clinical Utilization and Outcomes: Performance allocations are based on 95% confidence intervals of the hospital-specific risk-adjusted values as compared to the provincial average.

    13. The methods for assigning performance allocations are summarized below … Financial Performance and Condition: Performance allocations are not calculated for the indicators in the Financial Performance and Condition quadrant. Please refer to the technical summaries for further details.

    14. Understanding performance allocations for quadrants with patient-level data, as an example … For Clinical Utilization and Outcomes and Patient Satisfaction, the performance allocation indicates whether the hospital's score is statistically different from the provincial average. When dealing with statistics, differences are described based on the amount of confidence one has in the actual score, linked primarily to sample size. This becomes clearer with the example in this slide. For this indicator, the minimum possible score is 0 and the maximum possible score is 100. Two hospital scores are highlighted in the slide: Hospital 1 has an actual score of 88, and that score is based on a large sample size of 300. Because the sample size is large, we have a lot of confidence in the precision of the score, and therefore the confidence interval around the score is relatively narrow (reflected by the dark blue line on either side of the light blue dot). Hospital 2 also has an actual score of 88, but a much smaller sample size of 130. Given the smaller sample size, there is less confidence in the precision of this score and thus the confidence interval around the score is much wider. “Above average” performance is assigned to Hospital 1 because its confidence interval does NOT contain the provincial average. For Hospital 2, however, the bottom end of the confidence interval crosses the provincial average, so we cannot say that Hospital 2's score is statistically different from the provincial average; it therefore receives an “average” performance rating. For Patient Satisfaction, the provincial average is the “provincial performance target” (the average of all hospitals' scores). For more information please see the Patient Satisfaction Technical Summary.
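
    The following Python sketch illustrates the logic described above. The two scores of 88 and the sample sizes of 300 and 130 come from the slide; the provincial average of 85, the standard deviation of 20, and the use of a simple normal-approximation 95% confidence interval are illustrative assumptions rather than the report's actual risk-adjustment or interval method (see the technical summaries for the real calculations).

    import math

    def performance_allocation(score, sample_sd, n, provincial_avg, z=1.96):
        # Approximate 95% confidence interval around the hospital's score
        margin = z * sample_sd / math.sqrt(n)
        lower, upper = score - margin, score + margin
        if lower > provincial_avg:
            return "above average", (round(lower, 1), round(upper, 1))
        if upper < provincial_avg:
            return "below average", (round(lower, 1), round(upper, 1))
        # Interval contains the provincial average: not statistically different
        return "average", (round(lower, 1), round(upper, 1))

    provincial_avg = 85.0   # hypothetical provincial average
    assumed_sd = 20.0       # hypothetical patient-level standard deviation

    print(performance_allocation(88, assumed_sd, 300, provincial_avg))  # narrow interval
    print(performance_allocation(88, assumed_sd, 130, provincial_avg))  # wider interval

    With the larger sample, the interval sits entirely above the provincial average, so the same score of 88 earns an “above average” allocation; with the smaller sample, the wider interval crosses the average and the allocation is “average”, mirroring the Hospital 1 and Hospital 2 example.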

    15. High Performing Hospitals. The purpose of identifying high performers within quadrants is to identify those hospitals that may have useful ideas and practices to share.

    16. Criteria for High Performing Emergency Department hospitals. Please note that the FPC % Management and Operational Support Staff indicator is not included in the high-performing methodology.

    17. High performing Emergency Department hospitals within quadrants … High performing hospitals are listed in alphabetical order.

    18. High Performing Hospitals across quadrants. Criteria: Top performer in 2 or more quadrants and a very low number of indicators with below-average performance in the other quadrants. Hospitals: Alexandra Marine and General Hospital (small hospital, LHIN 2, South West), a top performer in 2 out of 4 quadrants (Patient Satisfaction and Financial Performance and Condition); and Almonte General Hospital (small hospital, LHIN 11, Champlain), a top performer in 3 out of 4 quadrants (Clinical Utilization and Outcomes, Patient Satisfaction and Financial Performance and Condition).

    20. The indicators for System Integration & Change include: Use of Standardized Protocols, Internal Coordination of Care, Clinical Data Collection and Dissemination, Use of Clinical Information Technology, and Healthy Work Environment. Note that a condensed version of the SIC Survey was carried out this year. For the indicators reported this year, there is no change in the indicator methodology with the shortened version of the survey.

    21. Results: System Integration and Change. System-level highlights. The indicator that showed the largest improvement was internal coordination of care, whose provincial average increased from 60% in 2007 to 66% in 2008. The highest scores in this year's results were once again related to the use of standardized protocols in EDs; many hospitals received a score of 100% for this indicator, which is consistent with previous results. 76% of the hospitals indicated that data on adverse events (including medication errors and drug reactions) are collected and compared internally across specialties and/or to past performance for quality improvement, close to an 8% increase from the 2007 results. Data presented are based on results from a survey completed on a voluntary basis by hospital managers in December 2007. Results for the 102 hospitals that completed this year's ED SIC survey are included in the analysis.

    22. Results: System Integration & Change. High Performing Hospitals within the Quadrant. Criteria: Above-average rating on 3 out of 5 indicators (due to the smaller number of indicators in 2008, the criteria have been amended from previous years). High Performers: Carleton Place and District Memorial Hospital, Guelph General Hospital, Hôtel-Dieu Grace Hospital, St. Joseph's Health Centre Toronto, and The Credit Valley Hospital. The purpose of identifying high performers within quadrants is to identify those hospitals that may have useful ideas and practices to share. High performing hospitals are listed in alphabetical order.

    23. 2007 Results: System Integration & Change. How does STEGH compare? Note that these indicators are all scored out of a maximum of 100; a higher score is preferable.

    24. Results: System Integration & Change. How have STEGH results changed over time? As some indicator methodologies were revised in 2007, caution should be taken when comparing 2007 and 2008 results with 2005 data. The indicators revised after 2005 were: “Use of Standardized Protocols”, “Internal Coordination of Care”, “Clinical Data Collection and Dissemination”, and “Use of Clinical Information Technology”. Your hospital's indicator results can be exported from the e-Scorecard into MS Excel and copied into this table.

    26. The indicators for Patient Satisfaction include: Primary indicators: Overall Impressions, Communication, Consideration, and Responsiveness. Other indicators available in the e-Scorecard include: Overall Satisfaction, Coordination of Care and Access, Physical Comfort, Respect for Patient Preferences and Courtesy, Information and Education, Continuity and Transition, Emotional Support, Physician Care, and Nursing Care. The primary indicators are those included in previous years' Executive Summary Reports.

    27. Results: Patient Satisfaction. System-level highlights. Patients in small hospitals report higher scores than patients in larger community and teaching hospitals in the four primary dimensions of patient satisfaction. The South West LHIN has the highest average scores for the communication and overall impressions indicators, while the South East LHIN has the highest average scores for the responsiveness and consideration indicators. Of the primary indicators, the highest average scores across the province are related to overall impressions and the lowest average scores are related to responsiveness. Sample question for overall impressions: Overall, how would you rate the care you received at the hospital? Sample question for responsiveness: Did you have to wait too long to see a doctor? Results for the ninety (90) hospitals that voluntarily participated in the emergency department patient satisfaction survey process in 2006-2007 are included in the analysis.

    28. Results: Patient Satisfaction. High Performing Hospitals within Quadrant. Criteria: Above-average rating on 4 out of 4 indicators. High Performers: Alexandra Marine and General Hospital, Almonte General Hospital, Deep River and District Hospital, Grey Bruce Health Services, Haldimand War Memorial Hospital, Huron Perth Healthcare Alliance, Kemptville District Hospital, Listowel & Wingham Hospitals Alliance, North Wellington Health Care, Perth & Smiths Falls District Hospital, and St. Francis Memorial Hospital. The purpose of identifying high performers within quadrants is to identify those hospitals that may have useful ideas and practices to share. High performing hospitals are listed in alphabetical order.

    29. 2006-2007 Results: Patient Satisfaction. How does STEGH compare? Results for the ninety (90) hospitals that voluntarily participated in the emergency department patient satisfaction survey process in 2006-2007 are included in the analysis. Please note that the provincial values shown are for the Provincial Weighted Average, not the Provincial Performance Target; the Provincial Performance Target is the average of hospital scores and is used for performance allocation. For each of the indicators, a higher score is desirable, as is an above-average performance classification. The maximum score for each indicator is 100. The values for your hospital, peer group and LHIN can be exported into MS Excel from the e-Scorecard and then copied into this table.

    30. Results: Patient Satisfaction. How have STEGH's scores changed year over year? The 2006 results are only available on the e-Scorecard.

    32. The indicators for Clinical Utilization and Outcomes include: Pneumonia: Proportion of Pneumonia Patients with an Inpatient Length of Stay (LOS) of <=2 days. Asthma: Return Visit Rate for Asthma (<=24 hrs) – Adult (20-64 years); Return Visit Rate for Asthma (24-72 hrs) – Adult (20-64 years); Return Visit Rate for Asthma (0-72 hrs) – Pediatric (1-19 years). Ankle and Foot Injury: X-ray Rate for Ankle or Foot Injury Patients – Adult (20-84 years); X-ray Rate for Ankle or Foot Injury Patients – Pediatric (5-19 years); Return X-ray Rate for Ankle or Foot Injury Patients (<=7 days). Pediatric-specific indicators: Chest X-ray Rate for Asthma – Pediatric (1-19 years); Chest X-ray Rate for Bronchiolitis – Pediatric (3-24 months); Chest and Neck X-ray Rate for Croup – Pediatric (3 months – 3 years). The indicators in this quadrant focus on performance related to three clinical conditions: pneumonia, ankle and foot injury, and asthma. Pediatric-specific indicators are also presented for three clinical conditions: asthma, bronchiolitis and croup. Caution should be taken in making comparisons to previous years as some modifications have been made to indicator methodologies (i.e. indicator definitions and risk-adjustment methodologies). Please see the technical summaries for more information.

    33. Results: Clinical Utilization & Outcomes. System-level highlights. Small hospitals had lower rates for ankle x-rays on the initial visit than community and teaching hospitals; however, they have higher return visit x-ray rates. This trend may reflect differences in 24-hour access to a radiology department in small hospitals. Further analysis of return visits showed that the majority (84%) of return visits within small hospitals occurred within the first 24 hours, compared to 33% in teaching hospitals. Community hospitals had higher rates on all pediatric chest x-ray rate indicators compared to small and teaching hospitals. There continues to be large variation in the pediatric chest x-ray rate indicators across the province. For example, in the North West LHIN, the chest x-ray rate for bronchiolitis is 67 per 100, whereas the rate in Champlain is 26 per 100. Central LHIN hospitals had, on average, the lowest rate of return visits for asthma among children compared to other LHINs.

    34. Results: Clinical Utilization & Outcomes. High Performing Hospitals within the Quadrant. Criteria: Above-average rating on 3 out of 10 indicators and no below-average score on any indicator. High Performers: Almonte General Hospital, Blind River District Health Centre, Children's Hospital of Eastern Ontario, Hotel Dieu Hospital Kingston, Kirkland and District Hospital, Mattawa General Hospital, North York General Hospital, and Winchester District Memorial Hospital. The purpose of identifying high performers within quadrants is to identify those hospitals that may have useful ideas and practices to share. High performing hospitals are listed in alphabetical order.

    35. 2006-2007 Results: Clinical Utilization & Outcomes. How does our hospital compare?

    36. 2006-2007 Results: Clinical Utilization & Outcomes. How does our hospital compare?

    37. Results: Clinical Utilization & Outcomes. How has your hospital's performance changed year over year? As the risk adjustment methodologies are revised each year, caution should be taken when comparing numerical data year over year. A lower rate on these indicators is generally considered to be better. For further information on the interpretation of these indicators, please refer to the technical summary report.

    38. Results: Clinical Utilization & Outcomes. How has our hospital's performance changed year over year? As the risk adjustment methodologies are revised each year, caution should be taken when comparing numerical data year over year. A lower rate on these indicators is generally considered to be better. For further information on the interpretation of these indicators, please refer to the technical summary report. For the ankle x-ray indicators, while a relatively low rate is desirable, too low a rate may indicate that the hospital is under-utilizing x-rays and hence under-diagnosing ankle and foot injuries. Hospitals with very low x-ray rates for these indicators should examine whether they are under-utilizing x-rays by reviewing their return x-ray rates.

    40. The indicators of Financial Performance and Condition include: % Total Worked Hours, % Management and Operational Support Staff Hours, % Nursing Worked Hours, and % Registered Nurse (RN) Hours.

    41. Results: Financial Performance and Condition. System-level highlights. In 2006-07, approximately 85% of earned hours of ED staff were spent on activities related to the operation of the emergency department; the remaining 15% can be accounted for by vacation time, maternity leave and other benefits. EDs reported that 16% of their total staff hours were spent on management and operational support activities. Teaching hospitals as a group had the highest values (19%) for this indicator, while small hospitals had the lowest (9.3%). Approximately 84% of earned hours of ED nursing personnel were spent on activities related to client care, with the remaining time being spent on vacations and other benefits. 97% of hours worked by nursing staff in Ontario EDs were provided by registered nurses.

    42. Results: Financial Performance and Condition. High Performing Hospitals within Quadrant. Criteria: Hospitals with scores 0.5 standard deviations (SD) above the provincial average on 3 out of 3 indicators. High Performers: Alexandra Marine and General Hospital, Almonte General Hospital, Hornepayne Community Hospital, Nipigon District Memorial Hospital, Services de santé de Chapleau Health Services, South Huron Hospital, St. Francis Memorial Hospital, St. Michael's Hospital, The Credit Valley Hospital, The Willett Hospital, and Women's College Hospital. Please note: the % Management and Operational Support Staff indicator is not included in the high-performing methodology. The purpose of identifying high performers within quadrants is to identify those hospitals that may have useful ideas and practices to share. High performing hospitals are listed in alphabetical order.
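
    As a rough illustration of this criterion, the Python sketch below uses hypothetical scores; the three indicator names follow the FPC list on slide 40 with % Management and Operational Support Staff excluded, as noted above, and the averaging details are an assumption rather than the report's exact calculation. A hospital is flagged as a high performer only when its score is at least 0.5 standard deviations above the provincial average on every indicator.

    from statistics import mean, stdev

    def is_high_performer(hospital_scores, all_hospital_scores, k=0.5):
        # Require the hospital to be at least k SDs above the provincial
        # average on every indicator considered.
        for indicator, score in hospital_scores.items():
            values = all_hospital_scores[indicator]
            threshold = mean(values) + k * stdev(values)
            if score < threshold:
                return False
        return True

    # Hypothetical provincial distributions for the three indicators
    all_scores = {
        "% Total Worked Hours": [82, 84, 85, 86, 88, 90],
        "% Nursing Worked Hours": [80, 82, 84, 85, 86, 88],
        "% Registered Nurse (RN) Hours": [92, 94, 95, 96, 97, 98],
    }
    hospital = {"% Total Worked Hours": 90,
                "% Nursing Worked Hours": 88,
                "% Registered Nurse (RN) Hours": 98}
    print(is_high_performer(hospital, all_scores))  # True for this hypothetical hospital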

    43. 2006-2007 Results: Financial Performance & Condition. How does our hospital compare? The values for your hospital, peer group and LHIN can be exported into MS Excel from the e-Scorecard and then copied into this table.

    44. Results: Financial Performance and Condition. How have our hospital's scores changed year over year? Note: Due to a change in indicator definitions, fiscal 2005-06 (HR 2007) and 2006-07 (HR 2008) may not be directly compared against previous years.

    45. Results: Financial Performance and Condition. How have our hospital's scores changed year over year? Your hospital's indicator results for 2005 and 2007 can be found in the Emergency Department Reports or exported from the e-Scorecard and then copied into this table. The 2006 results are only available on the e-Scorecard.

    46. How can hospitals use these data for quality improvement? Variation across hospitals points to potential areas for quality improvement. Identifying and understanding causes of variation is the first step in quality improvement. Drilling down to the component level in the e-Scorecard can assist in identifying the root causes for an indicator score. If further “drill down” reinforces an opportunity for improvement, plan an intervention and monitor the indicator outcomes for changes over time. The final slides in this presentation template start the transition from performance measurement to action. The first step is to drill down and determine whether there is a need and opportunity for improvement on processes related to certain indicators. If further exploration supports the need, there are several options for identifying enhanced processes; these may come from the literature, web sites, or other hospitals that score well in a particular area. These, and other measures, may be used as a baseline for determining whether strategies implemented to improve processes are having the desired impact.

    47. Quality Improvement: A few tips on making change. DOs: Examine current systems using data and pictures such as flow diagrams or process maps. Identify possible changes to current systems using a team approach with individuals in your hospital who provide and organize care. Set goals and attempt small tests of change, using trial-and-error learning (i.e. based on Plan-Do-Study-Act cycles). Study the result of the change to find out if: a) change is happening and b) it is leading to improvement. Leverage senior leaders and other quality improvement and measurement supports internal and external to your hospital to make change that will lead to improvement.

    48. Quality Improvement: A few tips on making change. DON'Ts: Respond to problems without knowing enough about them. Attribute problems to people rather than systems or processes. Do more of the same if it's not working. Try to make change by yourself, rather than with a team. Strive for perfect change and perfect measurement, to the detriment of maintaining the status quo. Do a large exhaustive study or plan to make large sweeping changes at once.

    49. Quality Improvement: Future initiatives planned by our hospital. You may already have initiatives planned, or you may wish to use this slide to identify a process that will be followed to respond to the results from the e-Scorecard.

    50. Quality Improvement: What our hospital is already doing to respond… Your hospital may already have strategies implemented to address a particular area of concern. From an accountability perspective, these initiatives are important to communicate both internally and to your community.

    51. Performance measures are merely the first step… “Quality is never an accident; it is always the result of intelligent effort.” John Ruskin
