
Data Quality: Treasure in/Treasure Out


Presentation Transcript


1. Data Quality: Treasure in/Treasure Out
Victoria Essenmacher, SPEC Associates
Melanie Hwalek, SPEC Associates
Portions of this presentation were created in partnership with Michel Lahti, University of Southern Maine (email: mlahti@usm.maine.edu)
SPEC Associates, 615 Griswold St., Ste. 1505, Detroit, MI 48226
Phone: (313) 964-0500
Web site: www.specassociates.org
Emails: vessenmacher@specassociates.org, mhwalek@specassociates.org
Presentation for the Michigan Nonprofit Association SuperConference, May 15, 2007

2.
1. Government Performance and Results Act (1993)
Message: What gets measured gets done
2. No Child Left Behind (2002)
Message: Data-based decision-making, for better or for worse
3. Sarbanes-Oxley Act (2002)
Message: Everyone is accountable for the quality of data reporting
4. Panel on the Nonprofit Sector (2005)
Recommendation #5: Provide detailed information about programs, including methods used to evaluate program outcomes

  3. What do we mean by internal quality assurance of data?

4. What do we mean by internal quality assurance of data?
The planning and implementation of practices that ensure “adequate” and consistent quality in the internal conduct of data planning, collection, analysis, and reporting, in order to:
• Provide a system for team members to follow
• Eliminate (or minimize) human error
• Provide “adequate” documentation of the program/project/organization

5. How are issues of internal quality assurance currently addressed in the field of evaluation?
• No broadly accepted guidelines or standards of data quality
• Much variation: quality assurance is often particular to a person or organization
• Can borrow from other fields

6. Why might data quality be important to address in a nonprofit organization?
• Have a known minimum accepted level of practice within your organization
• Understand what data in reports actually mean and how they were derived
• Provide “real” information on which to base decisions

7. Today’s Agenda
• Complete checklist: Organizational Capacity to Produce Defensible Program Data
• Overview of eight steps to data quality
• Quick summary of results: identifying common weak areas
• Exploration and examples for two common weak areas
• Wrap-up Q&A

  8. How does your organization score? Eight components of data quality assurance

9. 1. Planning for program data collection
• Scenario: A program manager did not document the program well. He had to leave the program unexpectedly, and the next person was left not knowing the program design or what program information was being collected or planned. Because it took so much time to find all of the “pieces,” the organization missed a reporting deadline to a funder.
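One lightweight guard against this kind of loss is a machine-readable data dictionary kept alongside the program files, so a successor can see at a glance what is collected and when reports are due. The sketch below is a minimal Python illustration; all field names, deadlines, and values are hypothetical, not from the presentation.

```python
# A minimal, hypothetical data dictionary kept under version control so that
# a successor can see at a glance what the program collects and why.
DATA_DICTIONARY = {
    "participant_id": {
        "type": "str",
        "required": True,
        "description": "Unique ID assigned at enrollment; links all survey rounds.",
    },
    "enrollment_date": {
        "type": "date",
        "required": True,
        "description": "Date the participant enrolled (YYYY-MM-DD).",
    },
    "pre_test_score": {
        "type": "int",
        "required": False,
        "description": "Baseline knowledge score, 0-100; collected at intake.",
    },
}

REPORTING_DEADLINES = {
    "funder_annual_report": "2007-09-30",  # illustrative date only
}

if __name__ == "__main__":
    # Print a human-readable summary a new program manager could read.
    for field, spec in DATA_DICTIONARY.items():
        req = "required" if spec["required"] else "optional"
        print(f"{field} ({spec['type']}, {req}): {spec['description']}")
    for report, due in REPORTING_DEADLINES.items():
        print(f"Deadline: {report} due {due}")
```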

10. 2. Beginning data collection
• Scenario: In the rush of the moment, a decision was made to add a follow-up survey to an evaluation that involved pre- and post-testing. The new survey was sent to the program sites, and a third round of data was collected on program participants. There was no time to “proof” the survey before copying and distributing it. Inadvertently, the unique ID# for each participant was left off the survey tool, meaning the follow-up data could not be matched with the pre-post surveys for proper analysis. The time spent gathering the data was wasted.
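The participant ID is what makes the survey rounds joinable at all. A minimal sketch, assuming pandas is available and using made-up IDs and column names, of the merge that becomes impossible once the ID is missing:

```python
# With a shared participant ID, the three survey rounds merge into one
# analysis file; without it, no reliable match is possible.
import pandas as pd

pre = pd.DataFrame({"participant_id": [101, 102, 103], "pre_score": [40, 55, 62]})
post = pd.DataFrame({"participant_id": [101, 102, 103], "post_score": [58, 60, 80]})
followup = pd.DataFrame({"participant_id": [101, 103], "followup_score": [61, 77]})

# Merge all three rounds on the shared ID; a participant who missed a
# round simply gets NaN for that round rather than corrupting the match.
merged = (pre.merge(post, on="participant_id", how="outer")
             .merge(followup, on="participant_id", how="outer"))
print(merged)
```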

11. 3. Editing data collection
• Scenario: A supervisor found it curious that a part-time telephone interviewer reported exactly the same times on her timesheet every day, unlike the other part-time interviewers. Validation phone calls were made to the persons this interviewer reported interviewing. It was determined that the interviewer was not making the calls and had fabricated the interview data.
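The supervisor spotted this pattern by eye, but a simple screening pass can flag it automatically. The sketch below, with made-up records and field names, flags interviewers whose logged hours never vary and samples completed interviews for validation callbacks:

```python
# Screening pass: flag interviewers with identical hours on every day,
# then pick a random sample of interviews for validation calls.
from collections import defaultdict
import random

timesheets = [
    # (interviewer, date, hours_logged) -- illustrative records
    ("A", "2007-05-01", 4.0), ("A", "2007-05-02", 4.0), ("A", "2007-05-03", 4.0),
    ("B", "2007-05-01", 3.5), ("B", "2007-05-02", 4.25), ("B", "2007-05-03", 2.0),
]

hours_by_interviewer = defaultdict(list)
for interviewer, _date, hours in timesheets:
    hours_by_interviewer[interviewer].append(hours)

for interviewer, hours in hours_by_interviewer.items():
    if len(hours) >= 3 and len(set(hours)) == 1:
        print(f"Flag interviewer {interviewer}: identical hours on all {len(hours)} days")

# Independent of any flags, validate a random sample of completed interviews.
completed_interviews = [("A", "respondent_17"), ("A", "respondent_22"), ("B", "respondent_9")]
for interviewer, respondent in random.sample(completed_interviews, k=2):
    print(f"Place validation call to {respondent} (interviewed by {interviewer})")
```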

12. 4. Data entry and cleaning
• Scenario: Twenty program sites emailed Excel files of enrollment and school report card data to the program manager. A support staff member manually cut and pasted each site’s spreadsheet into a master file so that results could be compared across programs. Looking at the master file, the evaluation manager found that the total number of records was 50 students short of the total number of students the program sites reported serving. There was no way to identify which 50 students were missing without repeating the cut-and-paste process.
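A scripted merge avoids both the manual labor and the untraceable shortfall: tagging each row with its source site lets the per-site counts be reconciled against what sites reported. A sketch, assuming pandas and hypothetical file and column names:

```python
# Scripted alternative to manual cut-and-paste: concatenate all site files,
# tag each row with its origin, and reconcile counts against site reports.
import pandas as pd
from pathlib import Path

reported_counts = {"site_01.csv": 120, "site_02.csv": 95}  # what sites said they served

frames = []
for path in sorted(Path("site_files").glob("site_*.csv")):
    df = pd.read_csv(path)
    df["source_site"] = path.name  # every row stays traceable to its origin
    frames.append(df)

master = pd.concat(frames, ignore_index=True)

# Reconcile: compare rows actually loaded per site with rows reported.
loaded = master["source_site"].value_counts()
for site, expected in reported_counts.items():
    actual = int(loaded.get(site, 0))
    if actual != expected:
        print(f"{site}: loaded {actual} records, site reported {expected}")

master.to_csv("master_file.csv", index=False)
```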

13. 5. Preparation of data for analysis
• Scenario: A report from a community survey indicated that the average age in the urban community was 78, yet the community had no nursing home or retirement community. In the database, missing data were coded as 99, and the 99s had been averaged in as if they were real ages, inflating the result.
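The arithmetic of the error, and the one-line fix of recoding the sentinel to a true missing value before analysis, can be seen in a short sketch (illustrative ages, assuming pandas):

```python
# How a 99 missing-data code inflates an average, and the fix:
# recode the sentinel to NaN before computing statistics.
import numpy as np
import pandas as pd

ages = pd.Series([34, 29, 99, 41, 99, 38, 99, 27])  # 99 = "missing" in the codebook

print("Naive mean (99s treated as ages):", ages.mean())  # misleadingly high
cleaned = ages.replace(99, np.nan)                       # recode sentinel to NaN
print("Mean after recoding 99 to NaN:", cleaned.mean())  # computed on real ages only
```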

14. 6. Data analysis
• Scenario: An organization established an annual customer satisfaction survey. The first year’s results showed very high customer satisfaction, which was reported to the funder. Before the second year’s data were analyzed, the staff person who had analyzed the first survey left and a new staff person was hired. The new staff person analyzed the second year’s survey and found a dramatic decrease in customer satisfaction compared to year one. Sensing something might be wrong with the data, and finding no documentation of the steps taken to analyze the year-one data, the new staff person reanalyzed the first year’s data and got different results from what had already been reported.
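One way to make such an analysis repeatable across staff turnover is to put the entire computation in a single documented function, so next year's analyst runs exactly the same steps. A minimal sketch, where the column name and the satisfaction scale are assumptions for illustration:

```python
# Reproducible-analysis pattern: the whole computation lives in one
# documented function, so a successor reruns the same steps verbatim.
import pandas as pd

def satisfaction_summary(df: pd.DataFrame) -> dict:
    """Compute the customer-satisfaction summary.

    Steps (documented so results can be reproduced):
    1. Drop rows with no satisfaction rating.
    2. Count a respondent as 'satisfied' if rating >= 4 on a 1-5 scale.
    3. Report percent satisfied and the number of valid responses.
    """
    valid = df.dropna(subset=["rating"])
    satisfied = (valid["rating"] >= 4).mean() * 100
    return {"percent_satisfied": round(satisfied, 1), "n_valid": len(valid)}

# Illustrative data for year one; a real run would load the survey file.
year_one = pd.DataFrame({"rating": [5, 4, 4, 3, 5, None, 4]})
print(satisfaction_summary(year_one))
```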

15. 7. Reporting
• Scenario: The widely distributed executive summary of program results released to the funder showed that 90% of the program participants completed the program. The data table in the released full report showed only 70% completed, and the actual output from the statistical analysis confirmed 70%. The clerical staff had inadvertently typed a “9” instead of a “7” in the executive summary, and the program director had to admit the error to the funder.
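Retyping numbers from statistical output into documents is where such errors creep in. A common safeguard is to compute each headline figure once and inject it into every document with templating; the numbers and wording below are illustrative only:

```python
# Guard against transcription errors: compute the figure once and let every
# document draw from the same variable, so they cannot disagree.
completed = 140
enrolled = 200
completion_rate = round(100 * completed / enrolled)  # 70, computed once

summary_template = "Executive summary: {rate}% of participants completed the program."
full_report_template = "Full report, Table 3: completion rate = {rate}%."

print(summary_template.format(rate=completion_rate))
print(full_report_template.format(rate=completion_rate))
```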

16. 8. Data storage and security
• Scenario: Due to a hardware failure, the data files were not accessible. Because the data were not backed up on a regular basis, two years’ worth of program data was lost just before the final analyses were to be run. Final program performance reports to the board of directors and funders could not be produced.
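Even a bare-bones scheduled backup would have prevented this. The sketch below copies a data directory to a timestamped folder and sanity-checks the copy; the paths are hypothetical, and a real setup would also keep an off-site or cloud copy:

```python
# Bare-bones backup sketch: timestamped copy of the data folder plus a
# cheap completeness check. Run on a schedule (e.g., nightly).
import shutil
from datetime import datetime
from pathlib import Path

DATA_DIR = Path("program_data")   # hypothetical source directory
BACKUP_ROOT = Path("backups")     # hypothetical backup destination

def run_backup() -> Path:
    stamp = datetime.now().strftime("%Y%m%d_%H%M%S")
    dest = BACKUP_ROOT / f"program_data_{stamp}"
    shutil.copytree(DATA_DIR, dest)  # recursive copy of the whole data folder
    # Sanity check: the backup should contain as many files as the source.
    src_files = sum(1 for p in DATA_DIR.rglob("*") if p.is_file())
    dst_files = sum(1 for p in dest.rglob("*") if p.is_file())
    assert src_files == dst_files, "Backup incomplete: file counts differ"
    return dest

if __name__ == "__main__":
    print("Backed up to", run_backup())
```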

  17. Wrap Up Q&A
