
GEM 2009 Adult Population Survey (APS) Review – Yana Litovsky, Jeff Seaman


Presentation Transcript


  1. GEM 2009 Adult Population Survey (APS) Review – Yana Litovsky, Jeff Seaman – Universidad del Desarrollo, Chile – January 2010

  2. Most Important: aps@gemconsortium.org

  3. GEM 2009 Overview • 55 countries. • 183,000+ individual interviews. • Multiple languages. • Each country responds to the RFP, selects its own vendor, and manages its own process. • Common set of procedures and objectives. • Data harmonized to create consistent coding. • Multiple quality checks run on the resulting data files.

  4. Data Processing 2009 • RFPs submitted, checked, and approved. • If a revision was required, teams resubmitted. • Only 4 countries did not require any revisions. • Typical issues: 1) insufficient number of call-backs, 2) insufficient sample size, 3) no accounting for double-counting of fixed-line and mobile-phone respondents. • Teams asked to explain how they would fix problems from previous years: 1) no random selection, 2) over-representation of females. • Teams asked to fill in missing information.

  5. Data Processing 2009 • New and returning teams submitted pilot studies. • Data quality checked before processing. • Checks for missing data, incorrect skip logic, unexpected values, and improper use of the SPSS template and coding. • Open-ended responses coded; education variable recoded into harmonized GEM variables. • Data processed and GEM variables/indices calculated (e.g., TEA); a rough sketch of these checks and calculations follows.
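
As a rough illustration of the kind of checks and calculations described on this slide, the Python/pandas sketch below reports missing-data rates and unexpected codes for key questions and computes a simplified TEA-style rate. The column names and value codes are assumptions for illustration only, not the actual GEM APS variable names.

```python
# Sketch only: column names (age, nascent, new_owner) and value codes are
# illustrative assumptions, not the official GEM APS coding.
import pandas as pd

def check_key_questions(df, key_vars):
    """Report missing rates and unexpected codes for key yes/no questions."""
    rows = []
    for var in key_vars:
        missing = df[var].isna().mean()
        # Assume yes/no items are coded 1/2, with -1/-2 for refused/don't know.
        bad = (df[var].notna() & ~df[var].isin([1, 2, -1, -2])).mean()
        rows.append({"variable": var,
                     "pct_missing": round(100 * missing, 1),
                     "pct_unexpected": round(100 * bad, 1)})
    return pd.DataFrame(rows)

def tea_rate(df):
    """Simplified TEA: share of 18-64 respondents who are nascent
    entrepreneurs or owner-managers of a business under 42 months old."""
    adults = df[(df["age"] >= 18) & (df["age"] <= 64)]
    early_stage = (adults["nascent"] == 1) | (adults["new_owner"] == 1)
    return round(100 * early_stage.mean(), 1)
```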

  6. Data Processing 2009: Positives • Teams communicated well with the coordination team: fast responses to emails, willingness to ask questions and follow advice, and valuable feedback on the APS process and our work. • Most teams submitted data in the proper SPSS template using appropriate variable names and codes. • Almost every new (and some returning) country submitted a pilot sample. This is also strongly suggested for ‘veteran’ teams that change vendors.

  7. Deadlines & Data Submission • Too many countries were late (29 countries submitted the APS over 10 days late, 6 as late as September). This creates much more work. • Over 10 teams did not submit a completed Fieldwork Report. • If fieldwork report data is available, these teams need to submit it as soon as possible. • Some teams did not submit the RFP (methodology overview) in the requested Excel template.

  8. Data Formats • Several countries used incorrect value codes for the OCCU demographic variable. • Most countries had errors in the SPSS template. • A number of teams failed to translate certain parts into English. • Some countries did not create well-distributed income ranges. • Some teams created an EDUC variable that did not correspond easily to the UN or GEM education variable (a recoding sketch follows). • Some open-ended responses were ambiguous, lacked detail, and used local terminology.
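
Where a national education coding does not map directly onto the harmonized variable, an explicit recode table makes the correspondence easy to review. The sketch below is illustrative only; the local codes and the category labels are placeholders, not the official UN or GEM education categories.

```python
# Illustrative recode of a national EDUC coding into harmonized categories.
# Local codes and category labels are placeholders, not official GEM/UN codes.
import pandas as pd

EDUC_MAP = {
    1: "NONE",             # no formal education
    2: "SOME SECONDARY",   # left school before completing secondary
    3: "SECONDARY DEGREE", # completed secondary education
    4: "POST SECONDARY",   # vocational training or some tertiary
    5: "GRAD EXPERIENCE",  # university degree or higher
}

def harmonize_education(df):
    out = df.copy()
    out["harmonized_educ"] = out["EDUC"].map(EDUC_MAP)
    # Flag local codes that do not map cleanly so the team can document them.
    unmapped = out.loc[out["harmonized_educ"].isna() & out["EDUC"].notna(), "EDUC"]
    if not unmapped.empty:
        print("Unmapped EDUC codes:", sorted(unmapped.unique()))
    return out
```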

  9. Data Quality • All submitted datasets were inspected to make sure the results made sense relative to previous years or general expectations. • Some odd frequencies were found, due to miscoded variables. • One country had to resample entirely due to unusually high entrepreneurship rates (TEA). The coordination team worked with the national team to hypothesize the source of the error and decide on a resampling strategy. The resample was successful.

  10. Data Quality Tests • Sample Size – Does the sample meet minimum GEM requirements? • Missing or refused for key questions – What percentage of respondents did not answer critical questions? • Random assignments – Was the process of randomly assigning respondents to sets of questions done correctly? • Incomplete Interviews – What percentage of all interviews started were not completed? • Refused Interviews – What percentage refused to respond to the survey? • Gender Ratio – Does the Female/Male ratio match that of the overall population?
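
A rough sketch of how the first few of these tests could be automated for a submitted file follows. The minimum sample size shown and the field names are assumptions used only for illustration, not GEM rules.

```python
# Sketch of an automated first pass over a submitted dataset. The minimum
# sample size and field names are illustrative assumptions.
import pandas as pd

MIN_SAMPLE = 2000  # assumed minimum; check the current GEM requirement

def quality_report(df, key_vars, n_started, n_refused):
    """df holds completed interviews; n_started and n_refused would come
    from the vendor's fieldwork report."""
    n_completed = len(df)
    return {
        "sample_size_ok": n_completed >= MIN_SAMPLE,
        "pct_missing_key": {v: round(100 * df[v].isna().mean(), 1)
                            for v in key_vars},
        "pct_incomplete": round(100 * (n_started - n_completed) / n_started, 1),
        "pct_refused": round(100 * n_refused / (n_started + n_refused), 1),
    }
```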

  11. Gender Ratio • How well does the resulting sample match the gender distribution of the country for the age range being surveyed (18-64 or 18-99)?
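
A minimal sketch of this comparison, assuming the census share of women within the surveyed age range is known. The column names and the 5-point tolerance are illustrative assumptions, not GEM thresholds.

```python
# Sketch of the gender-ratio check. Column names and the 5-point tolerance
# are assumptions; use the team's actual census figures and GEM guidance.
def gender_ratio_check(df, census_pct_female, age_min=18, age_max=64):
    in_range = df[(df["age"] >= age_min) & (df["age"] <= age_max)]
    sample_pct_female = 100 * (in_range["gender"] == "female").mean()
    print(f"Sample: {sample_pct_female:.1f}% female; "
          f"census: {census_pct_female:.1f}% female")
    return abs(sample_pct_female - census_pct_female) <= 5.0
```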

  12. Random Assignment • Are the random assignments for questions (1g to 1j) and (1k to 1n) done correctly – or is the age and gender distribution significantly different between the two groups?
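
One way to test this is a chi-square comparison of the gender and age-band distributions across the two randomly assigned question blocks. The sketch below assumes a "block" column marking which question set (1g-1j or 1k-1n) each respondent received; all column names are illustrative.

```python
# Sketch of the random-assignment balance check using a chi-square test.
# The "block" indicator and other column names are illustrative assumptions.
import pandas as pd
from scipy.stats import chi2_contingency

def assignment_balanced(df, alpha=0.05):
    df = df.copy()
    df["age_band"] = pd.cut(df["age"], bins=[17, 24, 34, 44, 54, 64],
                            labels=["18-24", "25-34", "35-44", "45-54", "55-64"])
    balanced = True
    for var in ["gender", "age_band"]:
        table = pd.crosstab(df["block"], df[var])
        _, p_value, _, _ = chi2_contingency(table)
        print(f"{var}: p = {p_value:.3f}")
        balanced &= p_value >= alpha  # a small p-value suggests the groups differ
    return balanced
```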

  13. Special Topic (Social Entrepreneurship) • A few teams expressed concern over the length and size of the Special Section this year. • In practice, the Special Topic was long for very few respondents, since most of its questions did not apply to most people. • One country was excused from this section due to restrictions imposed by its sponsor.

  14. Improvements made during the 2009 APS cycle • GEM weights were calculated using regional population distributions for 23 countries (a weighting sketch follows). • APS processing syntax files were provided with the APS results to create greater transparency about our methods and to allow teams to check and process their own data.
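
A minimal post-stratification sketch of regional weighting under assumed column names: the weight for each region-by-gender cell is the cell's population share divided by its sample share. The data layout here is an assumption for illustration, not the exact GEM weighting procedure.

```python
# Minimal post-stratification sketch. Column names and the region-by-gender
# cell structure are assumptions, not the exact GEM weighting scheme.
import pandas as pd

def regional_weights(sample, census):
    """census: one row per (region, gender) with a 'pop_share' column that
    sums to 1 over all cells."""
    samp_share = sample.groupby(["region", "gender"]).size() / len(sample)
    samp_share = samp_share.rename("samp_share").reset_index()
    cells = census.merge(samp_share, on=["region", "gender"])
    cells["weight"] = cells["pop_share"] / cells["samp_share"]
    merged = sample.merge(cells[["region", "gender", "weight"]],
                          on=["region", "gender"], how="left")
    return merged["weight"].set_axis(sample.index)
```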

  15. 2009 Conclusions • The process from RFP submission to the posting of the final processed datasets was smooth, with good communication. • Most problems with the submitted data were small and did not affect data quality. • Still, more attention needs to be paid to the details of data submission. • All required data (including Fieldwork Reports) needs to be submitted, and submitted on deadline.

  16. Suggestions for Improvement • More teams should send us interim data (around 100 respondents) so that problems with data collection or entry can be identified early. • Any deviation from the SPSS template or the demographic variables (e.g., education) should be sent to the data manager before data collection begins.

  17. Remember: aps@gemconsortium.org
