
Survey Nonresponse: A Decision-Making Approach



  1. Survey Nonresponse: A Decision-Making Approach Roger Tourangeau Joint Program in Survey Methodology, University of Maryland Survey Research Center, University of Michigan

  2. Outline • Falling Response Rates • Many surveys taking countermeasures • Costs are rising, but response rates still falling • What’s Behind It? • Variety of theories • Lack of civic engagement; declining value placed on polls and related activities • Decrease in discretionary time • Fending off unwanted intrusions now routinized • At the individual level, how do people decide? • The leverage-salience model • Belief-sampling as a model for quick judgments • The heuristics approach

  3. Outline (Cont’d) • Does the decline matter? The relation between nonresponse rates and nonresponse error • The impact on surveys • Rising use of incentives: Is there a cost? • Surveys as opportunity/obligation vs. survey as transaction • Many thanks to Bob Groves, from whom I stole many slides (and many ideas)!!

  4. Falling Response Rates

  5. What is Nonresponse? • Unit nonresponse is the failure to obtain survey measures on a sample unit • It occurs after the sampling step of survey • It reflects total failure to obtain survey data (won’t talk about item nonresponse, the failure to obtain an answer to a given item)

  6. Total Nonresponse and Refusal Rates Increasing over Time for BLS Consumer Expenditure Survey (Source: Census Bureau)

  7. Until Recently, Current Population Survey Rates Had Been Stable (Source: Census Bureau)

  8. National Health Interview Survey Nonresponse Trends

  9. Also Apparent for Telephone Surveys: Survey of Consumer Attitudes • Thanks to Rich Curtin for the data

  10. Multi-Country Studies of Cooperation (de Leeuw & de Heer, 2002) • 16 countries (Western Europe and U.S.) • As many as 10 ongoing surveys (mostly central-government sponsored), mid-1980s to late 1990s • Labor force • Consumer expenditure • Health • Travel • On average, 3 percentage point decline per year in cooperation rate

  11. Expected Proportion Noncontacted and Refused Under deLeeuw & deHeer Model by Year

  12. Impact on Costs per Case • Costs have risen as surveys take countermeasures • More extensive use of advance letters (especially in telephone surveys, where they weren’t used before) • More extensive use of incentives (revisit issue of cost impact) • More callbacks • Steeh et al. (2001) present evidence that Michigan’s Survey of Consumer Attitudes used to take around 6 calls per completed interview (mid-1990s) • By 1999, it took 12 calls on average • Seems clear that in the U.S. survey costs are rising far faster than inflation

  13. Theories of Response and Nonresponse: The Sociology of Nonresponse

  14. Three Forms of Nonresponse • Noncontact • Noncooperation • Inability • Unfavorable societal developments on all three fronts

  15. Noncontact: Face-to-Face Surveys • Rise of doorman buildings, locked condos, and gated communities • More than eight million Americans now live in gated communities, and nearly 40 percent of newly built residential developments are gated (Blakely and Snyder, 1997) • New residential arrangements featuring gatekeepers (assisted living, nursing homes, etc.)

  16. Noncontact: Telephone Surveys • Rise of answering machines, Caller-ID, cell phones • By 1995, most U.S. households had answering machines and roughly 40 percent reported they used them to screen their calls (Tuckel and O’Neill, 1995) • By 1996, about 10 percent of all households nationally had Caller-ID.

  17. Inability To Provide Data • Reflects both physical/mental limitations and language barriers • Rising proportion of the population is 65 or older • Concomitant increase in hearing problems, other disabilities • Increase in immigrant populations • 2002: 11.5 percent of the U.S. population was foreign-born • According to Long Form data from Census 2000, 8.1 percent of the population over age five reported that they speak English less than “very well.” • Many surveys now field both Spanish and English questionnaires, but only two-thirds of those who are less than completely fluent in English are Spanish speakers.

  18. General Theories Regarding (Non)Cooperation • Still, the big problem is noncooperation • Some theories are couched in terms of societal trends, others based on person-level characteristics • Nonetheless, although the level of analysis differs, the ultimate causal mechanisms in these theories are similar • Three accounts widely cited • People are too busy • People are too self-absorbed • People are erecting barriers to unwanted intrusions • Both noncontact and noncooperation may be the result

  19. Too Busy • More people are labor force participants (e.g., 66.0% of all civilians, 16+ in U.S. were in the labor force in 2004 vs. 60.2% in 1970) • The change is particularly dramatic for women (from whom respondents are disproportionately drawn): 59.2% of all women 16+ in the U.S. were in labor force in 2004 vs. 43.3% in 1970 • 60% of women who worked at all during 2003 were full-time (vs. 41% in 1970) • Societal trend with individual-level impact: Opportunity costs of survey participation too high

  20. But Are People Really Any Busier? • Fewer adults are married than 25 years ago; also, fewer are parents • More and more people are retired, and they are retiring at younger ages; according to Robinson and Godbey, Americans aged 55-64 gained an average of ~10 hours of free time per week since 1965 • Again, according to Robinson and Godbey’s time diary studies, Americans have gained about 5 hours of free time per week on average since the 1960s • Nonetheless, people feel busier, in part because of relentless multitasking • Based on that perception, they may be more reluctant to give up free time

  21. Too Self-Absorbed • Many survey researchers subscribe to one version or other of the “social capital” hypothesis • Response rates falling for the same reason as declines in voting, other forms of civic participation; people feel less obligated, less interested in helping others

  22. Groves, Singer, and Corning (2000) • Groves, Singer, and Corning (2000) assessed community participation in a face-to-face survey: five items on joining an organization to solve some community problem, writing to officials, doing volunteer work, etc. • Civic duty: “A feeling of obligation to provide help … in the belief that the common good is thereby served.” • “Apparently independent” mail survey request of those completing the face-to-face survey, linked to reported community-involvement attributes • Prepaid $5 incentive experiment

  23. Results • Overall, about a 15 percentage point difference in response rates: 58.0% (262) vs. 43.1% (116); even bigger difference with no incentive

  24. Implications • Could explain why election polls, which traditionally get low response rates, nonetheless generally give accurate results • Those most likely to vote overrepresented in polls; both surveys and elections overrepresent those high in involvement, social capital

  25. Too Many Unwanted Intrusions • Modern life often seems to consist of continuous bombardment with unwanted information, intrusions via every medium • Junk mail • Telemarketing • Spam • Panhandling in big cities • In response, people take countermeasures to limit access • Spam filters • Do Not Call lists, Caller-ID, answering machines • Crackdown on panhandling in NYC and elsewhere • Gated communities, locked apartment buildings, etc.

  26. Unwanted Intrusions—II • May reflect diminished community involvement, sense of busyness • Whatever the cause, contactability and willingness to cooperate may not be distinct phenomena but may both reflect efforts to fend off unwanted contacts

  27. Theories of Response and Nonresponse: The Psychology of Nonresponse

  28. Leverage-Salience Theory of Survey Cooperation • How do these societal trends play out at the individual level? • Leverage-salience theory offers one important account • Leverage: Persons vary in the magnitude and direction (positive or negative) of the influence that various psychological predispositions exert on survey participation in general and on specific design features (topic, sponsor) • Salience: The information about the survey request that a person actually processes varies with the interviewer’s introductory script and with the person’s cognitive associations to the information provided
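One compact way to express the theory (a sketch, not the authors’ formal notation): person $i$’s propensity to participate rises with the salience-weighted sum of attribute leverages,

$$p_i \;\propto\; \sum_j l_{ij}\, s_{ij},$$

where $l_{ij}$ is the leverage (signed importance) of design attribute $j$ for person $i$, and $s_{ij}$ is how salient the request makes that attribute. The balance diagrams on the next two slides depict exactly this: the same attributes carry different weights, and are made differently salient, for different people.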

  29. [Diagram: leverage-salience balance for two people; the same design features (authority of sponsor, incentive, topic, burden) carry different leverage for Person 1 vs. Person 2]

  30. [Diagram: a second configuration of the same features (burden, authority of sponsor, incentive, topic), again contrasting Person 1 and Person 2]

  31. Implication of Leverage-Salience Theory for Nonresponse Bias • People make decisions to cooperate or refuse on different bases • If the salience of design attributes, or the focus of interviewer behavior, varies systematically over contacts, then decisions can be based on different weightings of attributes over contacts • Bias results when the same survey attribute is related both to a survey variable and to the participation decision (the “common cause” model)

  32. Groves, Presser, and Dipko — I • Sample from five frames (four list samples plus RDD): teachers, parents of children under 6 months, people 65+, contributors to fringe candidates • Two IVs: topic (introduction mentions the topic twice) and letter plus incentive (half get an advance letter plus $5) • Done by Maryland SRC plus 12 practicum students • Response rate ~63.0%

  33. Groves, Presser, and Dipko — II • Key result: Outcome of first contact in which topic mentioned • Incentives diminish topic effects for teachers and seniors, but increase them for parents

  34. The Belief-Sampling Model • Model of attitude judgments made on the fly; four key components; leverage-salience a special case • Determine the issue (the pool of beliefs, values, impressions, existing judgments) from which a sample will be drawn • Sample some of these considerations (i.e., think about the issue); the probability of retrieving any given consideration is related to its accessibility (salience may determine accessibility) • Scale (leverage) and integrate • Anchor-and-adjust • Average/weighted average • Map onto the response scale

  35. Formal Model • Integration phase • Reliability over two occasions depends on five parameters • n1: Number of considerations sampled at time 1 • n2: Number of considerations sampled at time 2 • ρ1: Consistency in assigning scale values (scaling consistency) • ρ2: Homogeneity of the pool of considerations (homogeneity) • q: Overlap between the two samples, expressed as a proportion of n2 (overlap)
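A sketch of how these parameters can combine, under simplifying assumptions that are mine rather than the slide’s (the judgment is a simple average of sampled scale values, and each scale value is a stable component plus independent occasion-specific noise): the judgment at occasion $t$ is $J_t = \frac{1}{n_t}\sum_i s_i^{(t)}$, and the implied reliability is

$$\rho(J_1, J_2) \;=\; \frac{\rho_1\!\left[\rho_2 + \frac{q}{n_1}(1-\rho_2)\right]}{\sqrt{\dfrac{1+(n_1-1)\rho_1\rho_2}{n_1}\cdot\dfrac{1+(n_2-1)\rho_1\rho_2}{n_2}}}.$$

As checks: with $n_1=n_2=1$ and $q=1$ (the same single consideration retrieved twice) this reduces to $\rho_1$; with $q=0$ it reduces to $\rho_1\rho_2$; and reliability rises toward 1 as more considerations are sampled and the two samples overlap more.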

  36. An Alternative: Rules/Policies for Dealing with Unwanted Intrusions • Both leverage-salience and belief-sampling assume that people make a decision at the time survey request is made • Maybe their past decisions have hardened into a policy—a set of unconscious/semi-conscious procedures for dealing with unwanted intrusions

  37. Example Policies • Throw out all mail, unless • It’s clearly a bill • It’s clearly a personal letter • It’s from a familiar organization to which you are sympathetic • Hang up/don’t answer telephone, unless • It’s a personal call from someone you know • It’s from a familiar organization to which you are sympathetic • It’s a business call that you are expecting
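As a purely illustrative sketch, such a policy behaves like an ordered set of exception rules checked before a default rejection; everything in the example below (category names, helper function) is hypothetical rather than taken from any survey instrument:

```python
# Illustrative sketch: a screening "policy" as ordered exception rules
# applied before a default rejection. All categories are hypothetical.

def answer_phone(call: dict) -> bool:
    """Return True if the call gets answered, False if screened out."""
    exceptions = [
        lambda c: c["personal_from_known_caller"],         # personal call, known caller
        lambda c: c["familiar_org"] and c["sympathetic"],  # familiar, sympathetic organization
        lambda c: c["expected_business_call"],             # business call you are expecting
    ]
    # Default is "don't answer"; any matching exception overrides it.
    return any(rule(call) for rule in exceptions)

# A cold survey request from an unfamiliar sponsor fails every exception,
# so it is screened out before any real "decision" is made:
survey_request = {"personal_from_known_caller": False, "familiar_org": False,
                  "sympathetic": False, "expected_business_call": False}
print(answer_phone(survey_request))  # False
```

The point of the sketch is that an advance letter or an opening line like “I’m not selling anything” works by flipping one of these flags, not by changing the underlying appeal of the request, which is exactly the argument of the next slide.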

  38. Implications for Nonresponse Bias • Rules could vary by subgroup, changing the composition of samples in predictable ways • Consistent with studies of telephone response, which indicate nonresponse happens very quickly • Important to discover the rules and learn • How to avoid the quick judgment (by making the policy seem inapplicable: “I’m not selling anything”) • How to get people to suspend the rule; people do make exceptions to policies or alter them • Advance letters may help (moving the call into a different category—an expected business call)

  39. Does Nonresponse Matter? Empirical Estimates of the Impact of Nonresponse

  40. Nonresponse Bias • Classic formula for bias in uncorrected mean:
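In standard notation, with $W_N$ the proportion of nonrespondents, $\bar{Y}_R$ the mean among respondents, and $\bar{Y}_N$ the mean among nonrespondents:

$$\mathrm{Bias}(\bar{y}_r) \;=\; W_N\,(\bar{Y}_R - \bar{Y}_N),$$

so the bias grows with both the nonresponse rate and the respondent-nonrespondent difference on the variable of interest.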

  41. Nonresponse Bias — II • Assumes nonresponse is deterministic; two types of people • Those who never respond (WN) • Those who always respond (1 − WN) • Sampling error in the mix, so the nonresponse rate (and error) is not fixed • Alternative: • People have a response propensity (probability that they’ll respond) • Response propensity: probability that they’ll be contacted, agree to cooperate, etc. • As a result, even with the same people in the sample, the outcome might differ

  42. Nonresponse Bias — III • Bias now depends on correlation between p (response propensity) and y (substantive variable)
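Under the propensity view, with $\bar{p}$ the mean response propensity, the standard approximation is

$$\mathrm{Bias}(\bar{y}_r) \;\approx\; \frac{\mathrm{Cov}(p, y)}{\bar{p}} \;=\; \frac{\rho_{py}\,\sigma_p\,\sigma_y}{\bar{p}},$$

so the bias vanishes when propensity and the survey variable are uncorrelated, however low the response rate.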

  43. Empirical Estimates of Bias • Three papers, taken together, suggest that as an empirical matter, nonresponse may not be that big a problem • Keeter et al. compare two surveys that vary in response rates • Merkle and Edelman examine within-precinct error as a function of precinct response rate • Curtin et al. look at what the results would have been if the SCA had stopped interviewing sooner

  44. Keeter, Miller, Kohut, Groves, & Presser • Compared two surveys—standard vs. rigorous—with same questionnaire but different field procedures designed to yield different response rates

  45. Keeter et al.—II • Big weakness—confounds many variables (R rule, advance letter, race and experience of interviewers, etc.) • Big strengths • Examine 91 items from a variety of domains • Big differences in response rates • 36.0% for standard vs. 60.6% for rigorous • Mostly due to differences in contact rates (68.5% vs. 92.0%) rather than cooperation rates (58.1% vs. 73.7% cooperation)

  46. Keeter et al.—III • Key result: Differences in estimates • 14/91 significant • Mean difference over the 91 items around 2 percentage points (Is this big or small? What if this were the unemployment rate?) • Largest difference (9 percentage points) involves interviewer rating of R interest (the rigorous design adds less interested cases) • Other findings • Standard survey doesn’t seem to underrepresent Republicans or conservatives relative to rigorous • Reluctant Rs (those classified as refusals) have higher item nonresponse rates

  47. Merkle and Edelman — I • Examine the relation between within-precinct error (relative to the actual vote) and precinct response rate • Strength: The truth is known, so bias can be measured directly • Weakness: Purely correlational • Key result: Little relation between nonresponse and error

  48. Merkle and Edelman — II [Chart: within-precinct error plotted against precinct response rate]

  49. Curtin, Presser, and Singer • Simulate the effect of changing procedures in the Survey of Consumer Attitudes (e.g., reducing the number of callbacks) • How would estimates change if we dropped cases interviewed after call x? Converted cases? [Chart: percent of significant differences]
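The simulation logic is straightforward to sketch. Assuming a case-level file that records the call attempt on which each interview was completed (the column names and toy values below are hypothetical), a truncated estimate simply drops cases completed after call x:

```python
import pandas as pd

# Hypothetical case-level data: one row per completed interview, with the
# call attempt on which it was completed and a substantive measure.
cases = pd.DataFrame({
    "completed_on_call": [1, 2, 2, 5, 9, 14],
    "sentiment_index":   [88, 95, 76, 102, 81, 69],
})

def truncated_estimate(df: pd.DataFrame, max_calls: int) -> float:
    """Estimate using only cases completed by call <= max_calls,
    mimicking a cheaper design with fewer callbacks."""
    kept = df[df["completed_on_call"] <= max_calls]
    return kept["sentiment_index"].mean()

full_sample = cases["sentiment_index"].mean()
for x in (2, 5, 10):
    shift = truncated_estimate(cases, x) - full_sample
    print(f"stop after call {x}: estimate shifts by {shift:+.1f}")
```

The chart referenced above summarizes how often shifts of this kind reach statistical significance across the survey’s estimates.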

  50. Nonresponse Rates and Nonresponse Bias • Overall picture: Little nonresponse bias • Within a certain range of nonresponse (~25% to ~65%) • Not looking for small effects (e.g., a change of 0.2 percentage points) • Two papers lead to different conclusions • Groves, Presser, & Dipko (2004) • Teitler, Reichman, & Sprachman (2003)
