
Seeking Voices of People: Child Welfare Survey Design and Implementation



Presentation Transcript


  1. Seeking Voices of People: Child Welfare Survey Design and Implementation

    Presenters: Dr. Roger Tourangeau and Dr. Ruth Huebner Moderator: George Gabel February 23, 2012
  2. Webinar Attendee Participation Your Participation Open and hide your control panel Join audio: Choose “Mic & Speakers” to use VoIP Choose “Telephone” and dial using the information provided Submit questions and comments via the Questions panel Note: Today’s presentation is being recorded and will be posted on our website.
  3. Agenda

    Introduction: George Gabel, NRC-CWDT; Issues in Survey Design: Roger Tourangeau, Westat; Case Studies in Survey Design and Implementation in Child Welfare: Ruth Huebner, NRC-CWDT; Questions and Answers: All; Future Topics in Program Evaluation Initiative: George Gabel, NRC-CWDT
  4. Issues in Survey Design

    Roger Tourangeau Westat, Inc.
  5. Outline. Brief discussion of surveys; two inferential challenges: characterizing the population (representation) and characterizing individuals (measurement); representational issues: coverage and nonresponse; measurement issues: flaws in self-report data and in records; testing questions
  6. What is a Survey? A systematic method for gathering information from (a sample of) entities for the purposes of constructing quantitative descriptions of the characteristics (including attitudes and opinions) of the larger population of which the entities are members.
  7. What is a Survey? A systematic method for gathering information from (a sample of) entities for the purposes of constructing quantitative descriptions of the characteristics (including attitudes and opinions) of the larger population of which the entities are members. Procedures are prespecified, standardized, and explicit, to permit replication by others
  8. What is a Survey? A systematic method for gathering information from a sample of entities for the purposes of constructing quantitative descriptions of the characteristics (including attitudes and opinions) of the larger population of which the entities are members. Censuses are considered surveys; samples are useful and informative to the extent they are microcosms of the larger populations; data can be administrative records (or records can be the frame for the survey)
  9. What is a Survey? A systematic method for gathering information from (a sample of) entities for the purposes of constructing quantitative descriptions of the characteristics (including attitudes and opinions) of the larger population of which the entities are members. The product of surveys is numbers --“descriptive statistics” like means and proportions; “analytic statistics” like regression coefficients
  10. Relative Strengths of the Sample Survey Standardized procedures permit replicability Statistical theories justify use to describe large populations
  11. Relative Weaknesses of Sample Surveys Richness of information less than some qualitative methods Standardized measurement sometimes fails in diverse populations Systematic failures to measure sample members can bias conclusions
  12. [Diagram: the inferential chain from data to population. Respondent answers to questions lead, by inference, to the characteristics of a respondent; statistical computing combines these into characteristics of the sample, which lead, by inference, to the characteristics of the population.]
  13. Survey Lifecycle from a Design Perspective. [Diagram: the Representation side runs from Target Population to Sampling Frame, Sample, Respondents, and Postsurvey Adjustments; the Measurement side runs from Construct to Measurement, Response, and Edited Data; both sides converge on the Survey Statistic.]
  14. Survey Lifecycle: Threats to Validity. [Diagram: the same lifecycle annotated with error sources. On the Representation side: Coverage Error between Target Population and Sampling Frame, Sampling Error between Sampling Frame and Sample, Nonresponse Error between Sample and Respondents, and Adjustment Error at Postsurvey Adjustments. On the Measurement side: Validity between Construct and Measurement, Measurement Error between Measurement and Response, and Processing Error between Response and Edited Data. Both sides feed the Survey Statistic.]
  15. Representing the Population. [Diagram: the Representation chain with a child welfare example at each stage. Target Population: children in foster care in a given state. Sampling Frame: agency lists of such foster children. Sample: probability sample of children in foster care. Respondents: persons actually providing answers to an interviewer-administered questionnaire. Postsurvey Adjustments: selection probability, nonresponse, and poststratification weighting, leading to the Survey Statistic.]
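The postsurvey adjustments listed above can be made concrete with a short sketch. This is a minimal illustration in Python, not material from the presentation; the outcome values, selection probabilities, and adjustment factors are invented for demonstration.

```python
import numpy as np

# Hypothetical respondent data (illustrative values only).
y = np.array([1, 0, 1, 1, 0])                        # e.g., 1 = satisfied with services
p_select = np.array([0.02, 0.05, 0.02, 0.01, 0.05])  # selection probabilities
nr_adj = np.array([1.3, 1.1, 1.3, 1.5, 1.1])         # nonresponse adjustment factors
ps_adj = np.array([0.9, 1.0, 0.9, 1.2, 1.0])         # poststratification factors

# Final weight = base weight (1 / selection probability) times the adjustments.
w = (1.0 / p_select) * nr_adj * ps_adj

# Weighted estimate of the population proportion.
print(np.average(y, weights=w))
```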
  16. Two Major Challenges. Coverage error: discrepancy between the target and frame populations. Example: the survey omits children who came into foster care through voluntary placements. Nonresponse: discrepancy between the sample selected for a survey and the sample that completes it (the respondents). Example: the survey underrepresents foster children in group settings (where there are additional gatekeepers)
  17. Coverage of a Target Population by a Frame. [Diagram: the target population and the frame population overlap in the covered population; target cases missing from the frame are undercoverage, and frame cases outside the target population are ineligible units.]
  18. Since the total survey population can be divided into those covered and those not covered by the frame: Ȳ_N = (C/N)·Ȳ_C + (U/N)·Ȳ_U, which can be rewritten to show the bias due to noncoverage: Ȳ_C = Ȳ_N + (U/N)(Ȳ_C - Ȳ_U). That is, undercoverage error for the sample mean is a function of the undercoverage rate and the difference between the means for covered and uncovered cases
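As a quick numeric check of this identity, here is a small sketch; the population sizes and subgroup means below are invented for illustration, not figures from the webinar.

```python
# Noncoverage bias of the covered mean: Ybar_C - Ybar_N = (U/N) * (Ybar_C - Ybar_U)
N, U = 10_000, 1_500          # total population and uncovered cases (hypothetical)
C = N - U                     # covered cases
ybar_c, ybar_u = 0.62, 0.45   # means for covered and uncovered cases (hypothetical)

ybar_n = (C / N) * ybar_c + (U / N) * ybar_u   # full-population mean
bias = (U / N) * (ybar_c - ybar_u)             # bias from the formula above

print(round(ybar_c - ybar_n, 6) == round(bias, 6))  # True: the two expressions agree
```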
  19. [Diagram: frame data and interview data for the sample cases. Respondents have both frame records and interview data, with some item missing data; nonrespondents have frame records only.]
  20. Since the total sample can be divided into respondents and nonrespondents: ȳ_n = (r/n)·ȳ_r + (m/n)·ȳ_m, which can be rewritten to show the bias due to nonresponse: ȳ_r = ȳ_n + (m/n)(ȳ_r - ȳ_m). That is, nonresponse error for the sample mean is a function of the nonresponse rate and the difference between the means for respondents and nonrespondents
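The nonresponse case is the same algebra applied to respondents and nonrespondents; a minimal sketch with made-up numbers shows how quickly the bias grows.

```python
# Nonresponse bias of the respondent mean: ybar_r - ybar_n = (m/n) * (ybar_r - ybar_m)
n, m = 900, 300               # selected sample and nonrespondents (hypothetical)
r = n - m                     # respondents
ybar_r, ybar_m = 0.70, 0.50   # means for respondents and nonrespondents (hypothetical)

bias = (m / n) * (ybar_r - ybar_m)
print(bias)  # about 0.067: the respondent mean overstates the full-sample mean by ~7 points
```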
  21. Two Recommendations Don’t do a survey if you know the sample is going to be unrepresentative because of coverage problems Don’t do a survey if the response rate is likely to be poor and nonresponse is likely to produce selection biases Mode of data collection (telephone vs. face-to-face vs. mail vs. web) likely to affect both types of error
  22. Survey Lifecycle from a Measurement Perspective. Validity: the extent to which the measure reflects the targeted construct (e.g., attitude toward foster care). [Diagram: Construct, Measurement, Response, Edited Data, Survey Statistic.]
  23. Survey Lifecycle from a Measurement Perspective. Measurement Error: the departure of the response to a measure from its true value for the respondent. [Diagram: Construct, Measurement, Response, Edited Data, Survey Statistic.]
  24. The Simplest Response Error Model: Y_i = µ_i + ε_i for the i-th person
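A brief simulation (ours, not the presenters') illustrates what this model implies: zero-mean random response error leaves the estimated mean roughly unbiased but inflates the variance of the observed responses.

```python
import numpy as np

rng = np.random.default_rng(0)
mu = rng.normal(50, 10, size=5_000)   # true values mu_i (hypothetical population)
eps = rng.normal(0, 5, size=mu.size)  # response deviations epsilon_i
y = mu + eps                          # observed responses Y_i = mu_i + eps_i

print(mu.mean(), y.mean())            # means are close: random error adds little bias
print(mu.var(), y.var())              # observed variance exceeds the true variance
```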
  25. Response Error Four major processes involved in answering survey questions --Comprehension --Retrieval --Judgment/Estimation --Reporting (mapping, “editing”) Errors arise if the respondents --Misunderstand the question --Can’t remember (or never knew) the requested information --Make poor judgments --Deliberately misreport (e.g., if question is sensitive)
  26. Errors in Administrative Records Records can be (and usually are) Incomplete Inaccurate Out of date
  27. Survey Lifecycle from a Measurement Perspective. Processing Error: departures of the data from the true value due to coding or editing of responses. [Diagram: Construct, Measurement, Response, Edited Data, Survey Statistic.]
  28. Goals of the Questionnaire Design Process To reduce response error To develop measures that are: Reliable Valid Standard (Comparability) Easy to administer
  29. The Process of Designing Questionnaires Decide what you want to measure Search for existing items or multi-item scales Conduct focus groups To get a sense of how non-experts think and talk about the topic; two to four, each with 8 to 10 participants Draft new questions Format and compile in sequence--put together a draft questionnaire Get feedback from experts Do cognitive interviews — 16 to 20 Revise based on Steps 6 & 7
  30. The Process of Designing Questionnaires (cont’d) 9. Do a field test (n=30 to 50) Look at marginals Get feedback from field interviewers (quantitative vs. qualitative) Monitor and code interviews 10. Do larger tests Dress rehearsal Split ballot Random probes Reinterview, record check
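One way to "look at marginals" from a field test is sketched below with pandas; the item names and response codes are placeholders, not items from the surveys discussed here.

```python
import pandas as pd

# Field-test responses; q1-q3 are hypothetical item names.
pilot = pd.DataFrame({
    "q1": [1, 2, 2, 3, 5, 2, 1, 4, 2, 3],
    "q2": [1, 1, 1, 1, 1, 1, 1, 2, 1, 1],                 # little variation: review wording
    "q3": [3, None, 4, None, None, 2, 5, None, 3, None],  # heavy item nonresponse: review placement
})

for item in pilot.columns:
    print(item)
    print(pilot[item].value_counts(dropna=False).sort_index())  # frequency of each response code
```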
  31. Recommendations. Don't do a survey unless you are using well-validated questions or scales, or you can pretest any new items (expert review, cognitive testing, or field test), preferably both. Don't ask people about things they don't know, can't remember, or won't tell you. Select an appropriate mode; two key dimensions: computer assistance and self-administration
  32. Case Studies in Survey Design and Implementation in Child Welfare

    Ruth A. Huebner, PhD, Consultant, NRC-CWDT
  33. Survey Design Steps for Any Group: Focus Today on Customers/Clients. Decide on the purpose of the survey and the application level: should it be reliable at the state, regional, or county level? What do you want to learn by seeking the voices of people? Families Receiving Family Preservation Services: assess satisfaction with services and understand their experiences, barriers to change, and outcomes. Youth in Foster Care Survey: assess the youth's experience in foster care and their sense of involvement and empowerment. Fathers' Survey: compare fathers' and workers' perceptions and guide development of a father initiative.
  34. Develop Survey Items. Use brainstorming or focus groups with service providers and client representatives to determine what they want to know: what is important to ask? Review the literature and look for other surveys; there may be existing surveys or survey items, or the literature may suggest items to include. Convert topics into survey items: group them together into ideas; develop item roots and scales (5-point or 4-point ratings), checklists, or semantic differential scales.
  35. Refine, Refine, Refine, Refine Each Survey Item. Make every word count and every word clear. Eliminate or simplify items that ask two questions at once; there will be some, and you need to find them (e.g., "How satisfied were you with your and your family's involvement in case planning?"). Test the scale and refine it again and again. Ask people with no knowledge of the survey and members of the intended survey group: Did we ask the right questions? Do the questions make sense? How would you answer this? What does it mean to you? Refine every word! Reduce the reading level to 6th grade at most and shorten the questions. No more than 2 pages long, front and back.
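For an automated check of the sixth-grade target, readability packages such as textstat report standard grade-level scores; this is a minimal sketch assuming textstat is installed (the package is not mentioned in the webinar).

```python
# pip install textstat
import textstat

item = "How satisfied were you with your family's involvement in case planning?"
grade = textstat.flesch_kincaid_grade(item)  # approximate U.S. school grade level
print(grade, "OK" if grade <= 6 else "simplify the wording")
```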
  36. Before You Send Out. Keep in mind that a survey is an intervention. If we truly value our customers in child welfare as partners in decision making, then we need to ask for their ideas and implement the survey with great respect for them. Have legal services review the survey and methodology, and get approval for the research through the appropriate research board. Make sure that someone is available to take calls and questions from respondents; provide a toll-free number to call. Make all reading (cover letter included) at or below a 6th grade reading level.
  37. Family Preservation Survey of Clients. A mailed survey to all Kentucky families that received Family Preservation services between July 1, 2006, and March 1, 2007. A list of service recipients was downloaded from a data collection system and paired with state administrative data for all families receiving services. The list was sent to providers to correct addresses, since they follow up with families and may know of changes. The survey was sent out in three separate mailings: an initial mailing, a second mailing 3 weeks later, and a reminder postcard.
  38. Mailing is Important to Build Trust. The initial mailing for this and all surveys to clients was implemented like this: the cover letter was customized for each family, asked for their help in making services better, and was signed by the Commissioner. The letter and survey were hand folded so that the letter, survey, and return envelope come out as one. The return envelope included a 'real stamp' and a return address to a post office box with the researcher's name, not to the agency. The second mailing included a new cover letter, a replacement survey, and a business reply envelope; these were hand folded and customized as above. The postcard reminder included a number to call for a replacement survey.
  39. Family Preservation Response Rates. Of the 897 surveys mailed out, 200 were returned as undeliverable; 194 completed surveys were received, for a response rate of 27.8% of the 697 delivered. Surveys were anonymous; to build trust, ask demographic questions sparingly. Surveys were pre-coded in transparent ways: county, and type of service (Family Preservation or Family Reunification). Pre-coding can also help to test differences between respondents and non-respondents.
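The reported rate treats undeliverable surveys as ineligible; the arithmetic is simply:

```python
mailed, undeliverable, completed = 897, 200, 194
response_rate = completed / (mailed - undeliverable)  # 194 / 697
print(f"{response_rate:.1%}")                         # 27.8%
```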
  40. Demographics of Respondents. 75% of respondents identified themselves as mothers, 7% as fathers, 10% as grandparents, 4.7% as other relatives, and 2.9% as adoptive parents. Experience with CPS: 10% of clients reported that some or all of their children were currently in state custody; 36.4% reported that some or all of their children had previously been in state custody but were not currently; 53.8% reported that their children had never been in state custody.
  41. Check what the in-home worker did to help your family: (check all that apply)
  42. Please check all the things that made it hard to change in your family (check all that apply)
  43. Think about the changes you made with the in-home worker
  44. My in-home worker met with me … Are Services Enough?
  45. Survey of Youth in Care: Included in Program Improvement Plan Round One. Surveys were developed after focus groups and interviews with youth by teams and tested as described earlier. Surveys were sent to all youth ages 12-21 in state foster homes on January 10, 2005. Names were sent to regional managers, who deleted the names of youth too incapacitated to participate. This step also gave state representatives time to approve the youth's participation in the survey as deemed appropriate by legal services. Letters were sent to foster parents before the survey to describe the survey and ask for their assistance and support. Two mailings were sent about 3 weeks apart, as described earlier.
  46. Youth Involvement (n = 750 youth, ages 12-21; 50% response rate). Regarding meetings about my placement… (Sometimes, Often, Always)
  47. Involvement in Decisions: Surveys of Customers/Clients
  48. I practice these skills in my foster home (n=750 youth 12-21 years)
  49. My foster parents: (n = 750 youth in foster care, 12-21 years)
  50. I Dislike Foster Care Because
  51. Negative comments about foster care: social worker changes; need more allowance; want to drive; more freedom; more opportunities after high school besides college; want to be with birth family; dislike foster care altogether
  52. Fathers' Survey: A Needs Assessment of Fathers Involved in Child Protective Services

    April 2005
  53. Why Fathers in Child Welfare? Although federal and state initiatives encourage engaging fathers, there is little information to guide this effort. This sub-population of our customer/client base is dealing with challenging family conditions. We need to know about their experiences as our customers to improve outcomes for children.
  54. Design and Implementation. CPS and policy specialists, field social workers, university faculty, and a father with experience with the agency designed the survey and field tested it with two fathers prior to finalization. Fathers were identified in active child welfare cases as biological, legal, and/or adoptive fathers during a three-month period.
  55. Survey Response Rate
  56. I was invited to meetings regarding my family.
  57. My ideas about my children were taken seriously.
  58. Survey of CPS Fathers: Contrasting Ideas. Services received: visits with children (51%), family therapy (30%), parenting classes (30%), legal services (23%), drug and alcohol treatment (15%). Services they wished they had: fathers support group (40%), legal services (36%), family therapy (34%), visits to my child (25%), finding housing (24%), parenting classes (23%). Huebner, R. A., White, S., Hartwig, S., Werner, M., & Shewa, D. (2008). Engaging fathers: Needs and satisfaction in child protective services. Administration in Social Work, 32(2), 87-103.
  59. "Your services are great and have helped us tremendously. Our social worker is a great person and cares greatly about her clients and their needs." Thank you for asking for a fathers point of view!!! It seems since 1992 I have been put on a back burner just because I am a father. It's about time fathers start getting rights too!!!. Thank you again. Comments/Positive
  60. Where to Find More Information.
    Groves, R. M., Fowler, F. J., Couper, M. P., Lepkowski, J. M., Singer, E., & Tourangeau, R. (2009). Survey Methodology (2nd ed.). Hoboken, NJ: John Wiley & Sons.
    Huebner, R. A., Robertson, L., Roberts, C., Brock, A., & Geremia, V. (in press). Family preservation: Cost avoidance and CFSR outcomes. Journal of Public Child Welfare.
    Kentucky's Family Preservation Program Evaluation: http://chfs.ky.gov/nr/rdonlyres/1c6c930e-a2d9-4336-8cbf-cda1c2d2d31a/0/fppevaluation_final.pdf
    Huebner, R. A., White, S., Hartwig, S., Werner, M., & Shewa, D. (2008). Engaging fathers: Needs and satisfaction in child protective services. Administration in Social Work, 32(2), 87-103.
    Huebner, R. A., Jones, B., Miller, V. P., Custer, M., & Critchfield, B. (2006). Comprehensive family services and customer satisfaction outcomes. Child Welfare, 85(4), 691-714.
    Huebner, R. A., Jennings, M., & Schaaf, S. (2002, May). Customer satisfaction: An outcome to guide policy and practice. Government News, 24-28. (Invited).
  61. Questions and Answers?

  62. NRC-CWDT Program Evaluation Initiative

    Suggestions for Future Program
  63. Thanks for Joining us!

    Contact: George Gabel - georgegabel@westat.com; Ruth Huebner - ruthhuebner@hotmail.com