

  1. Factors Affecting Results of HEDIS/CAHPS® 2.0H Health Plan Member Satisfaction Surveys and Resulting Accreditation Scores: Technical Information to Assist Consumers, Purchasers, and Researchers in Interpreting Results. Prepared by Center for the Study of Services, 733 15th Street, NW, Suite 820, Washington, DC 20005. Phone: 202-347-9612. Paul Kallaur, Project Manager. Spring 2002

  2. Topics
• Introduction: the importance of CAHPS® surveys; what's new for HEDIS 2002
• Collecting CAHPS® Data on the Internet
• Comparing Mail and Phone Surveys
• The Mailing Package: questionnaire and package format; outreach to Spanish-speaking members
• Oversampling to Compensate for Disenrolled Members
• The Impact of Combining vs. Separating HMO and POS Members
• Demographic Adjustments to CAHPS® Data
• Trends in Plan Ratings

  3. The Center for the Study of Services (CSS) is a nonprofit corporation founded in 1974
• CSS's mission is to collect and disseminate performance measurement information on various types of services
• Much of our work is in the health care field, where we have conducted groundbreaking work measuring HMO, hospital, and physician performance
• CSS has performed HEDIS/CAHPS data collection for more than 500 samples in the past three years

  4. This briefing will provide information on how a plan’s decisions regarding sampling and methodology may impact its response rates, performance scores, NCQA accreditation scores, and costs

  5. NCQA is allowing more flexibility in the CAHPS® survey protocol than it has in the past
• NCQA's flexibility can result in plans getting different scores based on protocol differences, not performance differences
• CSS has repeatedly expressed concern to NCQA that flexibility is not desirable
• It is CSS's opinion that new ideas should be tested in controlled experiments and "best" protocols should be mandated for all plans
• One of the goals of this brief is to ensure that all users of HEDIS/CAHPS 2.0H data are aware of the non-performance-related factors that can affect scores

  6. This briefing is based on CSS's analysis of data from the following sources
• 2001 NCQA Quality Compass: summary-level data for 267 adult commercial plans that submitted 2001 CAHPS® data to NCQA and allowed it to be publicly reported
• 2001 Federal Employees Health Benefits Program (FEHBP) plans: full respondent-level data for the 139 unique adult commercial HMO/POS plans that submitted CAHPS® data to the U.S. Office of Personnel Management (OPM) in 2001; participating FEHBP plans were required to submit data
• Center for the Study of Services (CSS): full respondent-level data, demographic data, and sampling information for child, adult, commercial, and Medicaid plans that used CSS as their CAHPS® vendor (over 150 samples in each of the years 1999, 2000, and 2001)

  7. Several key terms will be used throughout the briefing:
• The 10 key measures in the CAHPS® survey consist of:
  • Domains: NCQA defines the following six domains: Getting Needed Care, Getting Care Quickly, How Well Doctors Communicate, Courteous and Helpful Office Staff, Customer Service, and Claims Processing
  • Ratings Questions: the four 0-10 ratings questions are Rating of Health Plan (Q47 on the adult commercial version), Rating of Health Care (Q33), Rating of Specialist (Q12), and Rating of Personal Doctor or Nurse (Q8)
• Ratings questions and questions within each of the domains are aggregated to determine the following two scores, which are calculated according to NCQA guidelines:
  • Global Proportions range in value from 0 to 100 and are reported in NCQA's Quality Compass. For example, the percentage giving a rating of 8, 9, or 10 on one of the four ratings questions is that rating's Global Proportion
  • Composite Scores range in value from 1 to 3 and are used for NCQA accreditation. Survey responses are rescaled and averaged across the several questions constituting a domain to calculate Composite Scores
• Accreditation: NCQA assigns HMO and POS plans one of five possible accreditation levels based on the plan's performance on various standards and measures
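To make the two score types concrete, here is a minimal Python sketch of the arithmetic for a single 0-10 ratings question. It assumes the three-category rescaling described on slide 8 (0-6, 7-8, and 9-10 mapped to 1, 2, and 3); NCQA's full specification (skip handling, rounding, and multi-question domain averaging) is more detailed than shown here.

```python
# Minimal sketch of the two score types for one 0-10 ratings question.
# Assumes the 0-6 / 7-8 / 9-10 rescaling from slide 8; NCQA's full rules
# for skips, rounding, and domain averaging are not reproduced here.

def global_proportion(ratings):
    """Percent of ratings that are 8, 9, or 10 (reported in Quality Compass)."""
    return 100.0 * sum(1 for r in ratings if r >= 8) / len(ratings)

def composite_score(ratings):
    """Mean of ratings rescaled to 1-3 (used toward NCQA accreditation)."""
    rescaled = [1 if r <= 6 else 2 if r <= 8 else 3 for r in ratings]
    return sum(rescaled) / len(rescaled)

ratings = [10, 9, 8, 8, 7, 6, 5, 9, 10, 3]   # hypothetical responses
print(global_proportion(ratings))             # 60.0
print(composite_score(ratings))               # 2.1
```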

  8. It is important to keep in mind that factors that appear to have a minor impact on responses or response rates may have an important effect on how a plan's CAHPS® data are viewed
• Even seemingly minor changes in individual responses can have a substantial impact on Composite Scores and accreditation points
• For example, on the four 0-10 ratings questions, NCQA groups respondents into three categories (0-6, 7-8, or 9-10) for purposes of calculating Composite Scores; if only one percent of respondents were to move from one category into the adjacent one, the Composite Score for that ratings question would change by 0.01
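The 0.01 sensitivity can be verified with a toy calculation (not NCQA's procedure, just the arithmetic): moving one percent of respondents up one category raises the 1-3 rescaled mean by 0.01 × 1 = 0.01.

```python
# Toy check of the one-percent sensitivity claim: move 1% of respondents
# from the middle category (rescaled score 2) to the top category (3).
n = 1000
scores = [2] * n                       # everyone answers in the 7-8 band
baseline = sum(scores) / n             # 2.00
scores[:n // 100] = [3] * (n // 100)   # 1% of respondents move up one category
shifted = sum(scores) / n              # 2.01
print(round(shifted - baseline, 4))    # 0.01 change in the Composite Score
```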

  9. Relatively narrow ranges separate the thresholds used to determine points toward accreditation on many measures

  10. NCQA gave health plans more methodological options for HEDIS 2002
• NCQA's 2001 protocol required a prenotification postcard, two survey mailings, two reminder postcards, and at least six attempts to reach non-respondents by telephone
• Beginning in 2002, health plans were given the choice of either:
  • a mixed methodology consisting of two survey mailings and two reminder postcards, with at least three attempts to reach non-respondents by telephone, or
  • a mail-only methodology consisting of three survey mailings and two reminder postcards, with no telephone follow-up
• All health plans also were given the option of using the Internet as an enhancement
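For reference, the two 2002 options can be summarized as a small data structure, shown below; the field names are ours, not NCQA's.

```python
# Summary of the two HEDIS 2002 data collection options described above.
# Field names are illustrative, not NCQA terminology.
PROTOCOLS_2002 = {
    "mixed": {          # mail plus telephone follow-up
        "survey_mailings": 2,
        "reminder_postcards": 2,
        "min_phone_attempts": 3,
    },
    "mail_only": {      # no telephone follow-up
        "survey_mailings": 3,
        "reminder_postcards": 2,
        "min_phone_attempts": 0,
    },
}
# Under either option, plans may also use the Internet as an enhancement.
```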

  11. Collecting CAHPS® Data on the Internet

  12. It is reasonable to expect many potential benefits from using the Internet to collect CAHPS® 2.0H survey data
• An increase in the overall response rate
• An increase in the likelihood of reaching a demographically different set of members
• A decrease in data collection cost by reducing return postage and data entry
• A reduction in the data collection time frame due to the instant feedback that the Internet can provide
• A simplification of the response process by making skip patterns invisible to respondents

  13. In 2001, NCQA allowed plans to use the Internet as an enhancement for the CAHPS® 2.0H Survey
• The Internet response option was considered an enhancement to the standard survey protocol
• NCQA required vendors to use a defined Internet protocol
• No one had conducted a controlled experiment to assess the impact on response rates or ratings of using the Internet to collect data
• One survey vendor had used the Internet to collect data in 2000

  14. CSS designed and conducted a test to assess the impact of the Internet protocol on survey results
• CSS recruited 8 plans from across the country to participate in the test, consisting of:
  • Seven adult commercial samples
  • One child commercial sample
• Half of each sample was selected randomly and given the option to respond using the Internet
  • The control half (without an Internet option) followed the regular NCQA protocol
  • The test half (with the Internet option) followed NCQA's Internet protocol
• The cover letters for the survey mailings included a web address (www.confidentialsurvey.org) as well as an 8-digit username and 8-digit password
• The pre-notification and reminder postcards did not mention the Internet
• Overall, 2.3% of the eligible sample members responded via the Internet
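A minimal sketch of the split-sample assignment described above, assuming a simple random half-split (CSS's actual randomization procedure is not described in detail here):

```python
import random

def split_sample(sample_frame, seed=2001):
    """Randomly assign half of each plan's sample to the Internet-option arm."""
    members = list(sample_frame)
    random.Random(seed).shuffle(members)   # reproducible random order
    half = len(members) // 2
    test = members[:half]      # cover letter adds web address, username, password
    control = members[half:]   # standard NCQA protocol, no Internet option
    return test, control

test, control = split_sample(range(2000))
print(len(test), len(control))   # 1000 1000
```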

  15. The negative effect of offering members the option of responding on the Internet appeared in seven of the eight samples
[Chart: Response Rates, Internet Option vs. No Internet Option]

  16. The lower response rate was especially evident among 55- to 64-year-olds in the sample

  17. The overall response rate for women was 2.7 percentage points lower among those offered the Internet option
[Chart: Response Rates by Gender (men and women), Internet Option vs. No Internet Option]

  18. In both the CSS test and the data submitted to OPM in 2001, members responding via the Internet rated their plans lower than did members who responded by telephone or mail

  19. CSS test respondents who used the Internet to complete the survey gave lower ratings to their plans across demographic categories
[Chart: Percent Rating Plan 8-10, Internet vs. Mail]

  20. Most Composite Scores were driven down when respondents used the Internet to complete the survey
[Chart: CSS Composite Scores for the ten CAHPS® measures, Internet vs. non-Internet response mode]

  21. Applying the CSS test results to 2001 OPM and NCQA plan scores, a notable percentage of plans would have received fewer points toward NCQA accreditation if they had offered the Internet response option
[Chart: Percent of plans with lower vs. higher NCQA ranking for each of the ten CAHPS® measures]

  22. Based on the CSS test, it appears that offering sample members the option of using the Internet to respond does not achieve the desired benefits
• Overall response rates are decreased
• The demographic distribution of respondents appears to become slightly more representative, but only by decreasing response rates for over-represented groups rather than increasing the response rate for under-represented groups
• The Internet response option may cause scores to fall
• Under the currently prescribed NCQA protocol, the Internet response option's negative effect on response rates means it does not reduce the cost of mail and phone data collection
• Choosing to use the Internet while other plans do not puts a plan's ranking relative to other plans at risk

  23. The Internet might still be a useful tool for collection of CAHPS® 2.0H data
• Modifying the mail protocol might increase the number of Internet responses
  • Example: send a pre-notification letter asking sample members to complete the survey on the Internet, and then wait two weeks before the first survey mailing
• Any protocol using the Internet should be tested before it is fully implemented

  24. Comparing Mail and Phone Surveys

  25. As noted earlier, plans were given the option of collecting CAHPS® data in 2002 using a mail-only methodology or a mixed methodology that includes both mail and phone
• The new mail-only protocol calls for three mailings of the CAHPS® survey instrument
  • NCQA has allowed plans to use three survey mailings as an enhancement to help increase response rates
  • CSS has examined the impact of this protocol using CSS and OPM data files
• The mixed mail and phone protocol calls for two mailings of the CAHPS® survey instrument followed by three attempts to collect response information on the phone
  • The number of phone attempts required by NCQA has been reduced from six in 2001 to three in 2002
• The protocol that a plan selects will affect its:
  • Response rate
  • Plan score
  • Survey cost

  26. Historically, the third mail wave and the first three phone attempts have yielded similar numbers of returned surveys
[Chart: Returned surveys by contact wave, commercial and Medicaid samples]

  27. Regardless of which protocol a plan selects in 2002, an estimated 14.5 percent of its survey responses will be collected after mail waves 1 and 2
• Whether a plan selects the mail-only or the mixed protocol in 2002, it will be required to begin the data collection process with two survey mailings
• The scores that would be obtained after the first two mail waves serve as a baseline for assessing the impact of either a third survey mailing or telephone follow-up
• The chart on the next page shows the average difference between scores received on the first two survey mailings and those that would be obtained on a third mail wave or on the telephone
• The chart on the page after that shows the estimated impact on each overall plan score of either sending a third survey mailing or conducting telephone follow-up
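Since roughly 14.5 percent of responses arrive late under either protocol, the impact of the chosen follow-up mode on an overall score is approximately the late-response share times the difference between late and early scores. A sketch of that blend follows; the example score values are invented for illustration.

```python
# Final score as a response-weighted blend of early (waves 1 & 2) and
# late (3rd mail wave or phone) responses, using the ~14.5% late share.
def blended_score(early, late, late_share=0.145):
    return (1 - late_share) * early + late_share * late

early = 72.0   # hypothetical: percent rating plan 8-10 after waves 1 & 2
print(round(blended_score(early, 78.0) - early, 2))  # phone follow-up: +0.87
print(round(blended_score(early, 70.0) - early, 2))  # 3rd mail wave:   -0.29
```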

  28. Responses collected on the phone tend to be more favorable than responses collected from the third mail wave
[Chart: Average difference from scores received from mail waves 1 & 2]

  29. The net effect on overall scores favors the mixed methodology
[Chart: Average impact on overall plan scores when 3rd mail wave or phone responses are added]

  30. On average, the mixed methodology will result in higher Composite Scores for each composite measure
[Chart: Average impact on Composite Scores when phone responses are added instead of a 3rd mail wave]

  31. Using a mixed mail and phone protocol rather than a mail-only protocol will cause a plan to reach a higher accreditation threshold an estimated 1.46 times across the ten CAHPS® measures
[Chart: Percent of plans reaching a higher threshold as a result of using a mixed methodology rather than a mail-only methodology]

  32. There is some debate as to whether the difference in ratings between mail and phone respondents is due to a mode effect or to selection bias
• A mode effect would mean that a given respondent is more likely to say something favorable about a plan over the phone than on a written survey
• Selection bias would mean that there is something different about the mix of respondents who respond over the phone rather than by mail
• The argument that dissatisfied members are more likely to respond, and thus that greater effort is required to get favorable responses, is somewhat dispelled by the response results from the third survey mailing
• It is difficult to assess fully the impact of selection bias because:
  • Differences may be due to a demographic variable that is not captured in the survey
  • CAHPS® survey data have been collected using a mixed methodology involving staggered mail and phone waves

  33. Comparing ratings from plan members who provided responses both in the mail and on the phone provides an interesting test for mode effect
• When CAHPS® 2.0H data are collected, it is possible for the same respondent to complete the survey both over the phone and in the mail
  • This is unlikely, because sample members are excluded from the phone portion of the survey as soon as the mail survey is received
  • When it does occur, only one survey, the first received, is included in the plan's HEDIS data
• In the 2000 and 2001 survey cycles, CSS captured 603 "double" responses, providing a direct way to test the impact of the data collection mode on survey results
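A sketch of the paired comparison these double responses allow (the data structure and example pairs are hypothetical, not CSS's actual file layout):

```python
# Paired mode-effect test: for members who answered both ways, count how
# many rated the plan higher on the phone, higher in the mail, or the same.
def mode_effect_summary(pairs):
    """pairs: list of (mail_rating, phone_rating) tuples for one question."""
    higher_on_phone = sum(1 for mail, phone in pairs if phone > mail)
    higher_in_mail = sum(1 for mail, phone in pairs if mail > phone)
    ties = len(pairs) - higher_on_phone - higher_in_mail
    return higher_on_phone, higher_in_mail, ties

pairs = [(8, 9), (10, 10), (6, 8), (9, 8), (7, 9)]   # hypothetical ratings
print(mode_effect_summary(pairs))                     # (3, 1, 1)
```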

  34. A sizable percentage of respondents who completed both a telephone and a mail survey gave higher ratings over the telephone than they did in the mail
[Chart: 2000 and 2001 mail versus phone survey scores: Overall Health Plan, Overall Health Care, Overall Specialist, Overall Personal Doctor, Claims Processing (Q35), Customer Service (Q39), Courteous and Helpful Office Staff (Q28), How Well Doctors Communicate (Q30), Getting Care Quickly (Q17), Getting Needed Care (Q06)]

  35. A plan's scores might be favorably affected by its efforts to maximize the number of responses collected on the phone
• There are two straightforward ways a plan can increase the number of phone responses:
  • Provide better phone numbers to survey vendors
  • Make more than the prescribed three attempts to complete the survey on the phone

  36. Plans vary widely in their ability to provide phone numbers for members
• Not surprisingly, the number of responses collected on the phone is highly correlated with the number of plan-provided phone numbers
[Chart: Commercial plans, correlation .67; Medicaid plans, correlation .63]
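The reported correlations are ordinary Pearson coefficients; a self-contained sketch of the computation follows (the per-plan counts below are hypothetical):

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation between two equal-length lists of plan-level counts."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

numbers_provided = [1200, 300, 950, 700, 80]   # hypothetical per-plan counts
phone_responses = [160, 45, 120, 85, 20]
print(round(pearson_r(numbers_provided, phone_responses), 2))  # ~0.99 here
```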

  37. The HEDIS 2001 protocol called for six attempts to collect survey responses over the phone; HEDIS 2002 requires only three attempts
• In 2001, 29.3 percent of all phone responses submitted to OPM were collected on attempts 4, 5, and 6 (T4, T5, and T6 on the figure below)
• Plan scores do not appear to be significantly affected by the attempt on which the survey was completed

  38. Extending the phone portion of the survey by three attempts can have a favorable impact on a plan's scores
[Chart: Average impact on overall plan scores when six versus three phone attempts are used]

  39. Using the HEDIS/CAHPS® 2002 mail-only protocol is less expensive than the mixed methodology
[Chart: Price ratios using CSS's 2002 and 2001 prices]

  40. The Mailing Package

  41. Different vendors format their questionnaire booklets in different ways
• NCQA provides an electronic version of all CAHPS® 2.0H surveys in an 8½" by 11" format, but allows vendors to adapt this format "within certain parameters" concerning font size and general readability
• Although CSS has used many different formats, our default questionnaire design is a 4¼" by 7¾" saddle-stitched booklet
• Our general preference for the small booklet questionnaire format is based primarily on positive feedback that we have received from plans and respondents
  • A number of people have mentioned to us that they find our survey booklets easy to handle, compact, and easy to complete, because the single-column format leaves little room for confusion about the sequence of questions
• Although the larger questionnaire format is easier for optical scanners to read, we have come to believe that the small booklet format is more user-friendly for respondents

  42. The questionnaire booklet size may have an impact on a plan's response rate
• In addition to the anecdotal evidence that the small booklet is preferable, CSS has some empirical evidence suggesting that this format improves mail response rates
• In 2001, both CSS and another NCQA-certified survey organization conducted the adult commercial CAHPS® survey for the same health plan
  • Samples were drawn from the same populations, and the surveys were fielded simultaneously
  • The main difference between the two organizations' fielding of the survey was that CSS used its small booklet format while the other organization used the common large booklet format
• The CSS format returned a higher number of responses in the first two mail waves (464, or 33.8 percent) than the format used by the other organization (412, or 30.0 percent)
  • This difference is statistically significant at the 95 percent confidence level
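The significance claim can be checked with a standard two-proportion z-test. The sample sizes below are inferred from the reported counts and percentages (464 / 0.338 ≈ 1373 per arm), so treat this as an approximate reconstruction:

```python
from math import sqrt

def two_proportion_z(x1, n1, x2, n2):
    """z statistic for the difference between two response rates."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Sample sizes inferred from the text: 464/0.338 and 412/0.300 both give ~1373.
z = two_proportion_z(464, 1373, 412, 1373)
print(round(z, 2))   # ~2.13, above the 1.96 cutoff for 95% confidence
```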

  43. CSS has conducted various tests of the effects of mail package design variants on response rates and plan scores
• Using as a baseline the response rate achieved by two mail waves, each using a standard 8½" by 4½" carrier envelope, we have found that:
  • Stamping "URGENT" on the envelope used for the second wave increases overall response rates by about four percentage points
  • Sending the second wave as Priority Mail increases overall response rates by about six percentage points
  • Placing an official-looking mail verification sticker on the second-wave envelope increases overall response rates by about three percentage points
  • Using a 9" by 12" carrier envelope for the second wave increases overall response rates by about two percentage points

  44. CSS has also tested the impact of other mailing approaches not currently allowed by NCQA for HEDIS/CAHPS® surveys
• Highlights of experiments involving additional follow-up mailings beyond a standard two-survey-mailing protocol:
  • Certified mail: following up with an additional questionnaire mailing sent as certified mail achieved a 14.9 percentage point lift in the response rate; the words "Postal carrier may leave without signature" were typed on the outer envelope to avoid inconveniencing the recipient
  • Cash incentive: an extra wave of First Class mail with $1.00 cash enclosed resulted in a 13.3 percentage point lift in the response rate
• Additional CSS controlled experiments with non-HEDIS mailings indicate that response rates can be improved noticeably by offering respondents either cash incentives or a useful, subject-related publication free of charge
• Extra questions: a CSS controlled experiment showed that adding 45 extra questions to the CAHPS® survey reduced the response rate by only three percentage points, a difference that was not statistically significant

  45. Some plans with a large Hispanic membership offer the option of completing the survey in Spanish, which likely increases their response rates within this population
• In 2001, CSS mailed Spanish surveys to 2,858 sample members and received 157 completed Spanish surveys (5.5%)
• Providing survey materials in both English and Spanish appeared to boost Spanish response rates more than offering respondents the option of calling to request a Spanish questionnaire
• CSS has yet to conduct a controlled experiment to assess the impact of bilingual surveying on overall response rates

  46. Outreach to Spanish-speaking respondents has a positive effect on a plan's ratings
• Respondents to Spanish-language surveys are more likely to give the health plan a positive overall rating than are those who complete the survey in English
• On a 2001 adult Medicaid survey that offered all respondents the option of responding in either language, 90.3 percent of Spanish-language respondents gave the health plan an overall rating of 8, 9, or 10, compared to 70.6 percent of those completing English-language surveys

  47. Oversampling to Compensate for Disenrolled Members

  48. NCQA guidelines allow plans to oversample for their CAHPS® surveys if they so choose
• Plans may wish to consider oversampling for two reasons:
  • To purge disenrolled respondents from the HEDIS data set
  • To ensure a threshold number of responses for each question and avoid "non-report" status

  49. At their own discretion, plans may oversample from 0 to 30 percent in increments of five percent
• Plans that oversample may provide their vendors with an updated sample frame
  • Vendors flag disenrolled sample members, and their response data are no longer reported to NCQA
  • The updated sample frame can be provided at any time prior to data submission to NCQA
  • Disenrollees who left the plan after the first survey mailing are still required to be included in the response data
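A sketch of the purge logic implied by these rules follows; the record fields and function name are ours, not NCQA's, and the exact cutoff comparison is an assumption.

```python
# Purge responses from members the plan flags as disenrolled, keeping
# anyone who left the plan after the first survey mailing, as required.
def purge_disenrollees(responses, disenroll_dates, first_mailing_date):
    """disenroll_dates: {member_id: date} built from the updated sample frame."""
    kept = []
    for record in responses:
        left_on = disenroll_dates.get(record["member_id"])
        if left_on is not None and left_on <= first_mailing_date:
            continue   # disenrolled before fielding began: not reported to NCQA
        kept.append(record)
    return kept
```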

  50. Disenrolled respondents tend to rate plans lower than current enrollees
• 119 CSS plans in 2001 had disenrollees to purge
  • Of the 65 plans with at least 10 disenrollees, 50 (76.9%) had disenrolled respondents who were less satisfied with their health plan than those currently enrolled
  • Of the 25 plans with at least 20 disenrollees, 20 (80.0%) had disenrolled respondents who were less satisfied with their health plan than those currently enrolled
