
Data Quality & Recordkeeping: Avoiding Reporting Pitfalls


Presentation Transcript


  1. Data Quality & Recordkeeping: Avoiding Reporting Pitfalls

  2. Sound Check Audio is available via the internet. Please be sure your speakers are on and your volume is turned up.

  3. Speakers (Risk Management Service): David Cattin, Otis Wilson, David Downey, Cynthia Brown

  4. Introduction

  5. Transparency & Accountability • Efficient • Economical • Effective • Ethical • Equitable RESULTS! Federal Funding Accountability & Transparency Act

  6. What is Transparency? A tool to promote results & accountability • Accurate data • Presented in context • Meaningful information • Accessible & easy for public use

  7. Treatment of Recovery Act Funds Recordkeeping processes and procedures for ARRA funds follow the same guidelines, rules and regulations that apply to other ED grants

  8. REPORT FRAUD AND MISCONDUCT TO THE ED OIG All ARRA grants include “the requirement that each grantee or sub-grantee awarded funds made available under the Recovery Act shall promptly refer to an appropriate inspector general any credible evidence that a principal, employee, agent, contractor, sub-grantee, sub-contractor, or other person has submitted a false claim under the False Claims Act or has committed a criminal or civil violation of laws pertaining to fraud, conflict of interest, bribery, gratuity, or similar misconduct involving those funds.” • Phone: 1-800-MISUSED • Email: OIG.HOTLINE@ED.GOV • Fax: 202-260-0230

  9. Learning Objectives • Describe key reasons data quality is important • Know & apply data quality principles: quality, objectivity, utility and integrity • Assess methodology for data collection • Identify & apply federal record retention requirements • Avoid reporting pitfalls identified by GAO • Implement best practices

  10. Key Principles

  11. Quality • Encompassing term • Incorporates Objectivity, Integrity, Utility • Ensures information is useful, accurate, reliable, and secure • Integral to creation, collection, maintenance • ED reviews before dissemination

  12. Objectivity • Accurate, reliable, unbiased info • Use dependable source & appropriate techniques • Include documentation • Describe any errors that impact quality

  13. Integrity • Security and protection of info • Ensure info is not compromised, corrupted, or falsified

  14. Utility • Usefulness of info to intended users • Stay informed of info needs • Care during review stage to ensure clarity • General info – clear & readable • Admin & Program Data – described & documented • Statistical Data – fill data gaps

  15. Methodology

  16. Assessing Methodology • Essential step in the project’s development • Preparation is key • Anticipate potential problem areas and address them in the design phase • Ensure consistency across the board • Communicate clearly with staff collecting and recording data

  17. Where Methodology Goes Wrong • Problems by design • Problems arising from human error Determine where the two overlap and how to eliminate, reduce, or negate their effects on reporting.

  18. Where Methodology Goes Wrong • Fundamental design flaws • Data collection mechanisms • Documentation • Connection with the target population

  19. Design Concerns • Determine most relevant time to collect data • Define the data to be gathered • Collect only the necessary data • Limit the intervals between sampling • Develop appropriate evaluation tools • Train staff accordingly

  20. Target Population Issues • Follow Family Educational Rights and Privacy Act (FERPA) and human subject research requirements to the letter • Utilize varied data collection strategies • Consider target population demographics • Create a non-threatening, non-intrusive environment to conduct interviews, complete surveys, etc.

  21. Recordkeeping

  22. Record Retention Period Three years from… • Final expenditure report • Quarterly or annual financial report Exceptions… Litigation, real property, record transfer, indirect cost rate proposals, cost allocation plans

  23. Access The Secretary, IG, and Comptroller General have timely, unrestricted access to DOCUMENTS AND PERSONNEL for audits, examinations, excerpts, transcripts, and copies, for as long as the records are retained.

  24. Types of Records • Grant funds • Compliance • Performance

  25. Confidentiality & Privacy • Generally no restrictions on recipient records • Student records – covered by FERPA

  26. Recovery Act Data Quality

  27. Data Quality & ARRA Section 1512 Reports • Section 1512 reporting presents new challenges for data collection and reporting • Under the Federal Funding Accountability and Transparency Act, sub-recipient reporting will become the norm beginning October 1, 2010 www.USAspending.gov

  28. #1 Section 1512 Pitfall: Failure to Report • Quarterly reporting begins the quarter you first receive a grant (even before spending) • Track ARRA funds separately • Report on 100% of sub-recipients and contracts • Keep reporting until all funds are expended & activities are complete • Then mark the last report “FINAL”

  29. What’s Different with ARRA? • Basic expenditure reporting • Reported funds “received” should match G5 draws (see the reconciliation sketch below) • Automate expenditure data – pull from finance system • Keep records of your methodology related to data collection and estimations • Sub-recipient information – subs may input their data and must provide jobs and infrastructure information • Estimating # of jobs created/retained • Increased oversight • Public transparency
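
One way to automate the cross-check between the reported “funds received” figure and G5 drawdowns is a small reconciliation script run before each quarterly submission. The sketch below is illustrative only: the function name, inputs, and tolerance are assumptions rather than anything prescribed by ED or OMB, and the draw amounts would come from your own finance system or G5 activity reports.

    from decimal import Decimal

    def reconcile_receipts(reported_received, g5_draws, tolerance=Decimal("0.01")):
        """Return True when the reported 'funds received' matches cumulative G5 draws."""
        total_drawn = sum(g5_draws, Decimal("0"))
        difference = abs(reported_received - total_drawn)
        if difference > tolerance:
            print(f"Mismatch: reported {reported_received}, drawn {total_drawn} "
                  f"(difference {difference}); investigate before submitting.")
            return False
        return True

    # Hypothetical example: one grant with three quarterly drawdowns to date.
    reconcile_receipts(Decimal("149000.00"),
                       [Decimal("50000.00"), Decimal("50000.00"), Decimal("49000.00")])

Keeping a copy of the script and its inputs each quarter also doubles as documentation of the methodology behind the reported figure.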

  30. Avoiding Sub-recipient Reporting Pitfalls • Collect consistent information from sub-recipients • Provide good instructions and definitions • Collect complete and accurate information from each • Know who should report & review their reports • Validate sub-recipient data • Extract or verify against finance system • Ensure they maintain records • Provide instructions on recordkeeping • Monitor subs’ data collection and internal controls over reporting • Provide guidance on good practices

  31. Be careful what you ask for… What you ask affects whether the answer will make sense • Carefully target data collection to the data you need • Carefully word the questions to get consistent & meaningful responses • Define terms and data elements • Provide the formula to use for calculations

  32. Jobs Calculation Pitfalls • OMB established new guidance in December 2009 – count jobs funded by ARRA • Ensure all sub-recipients use the same methodology • “Reality check” – Does the number of jobs reported by sub-recipients and contractors make sense given their funds/activities? Does the total number reported for the grant make sense given the total amount of the grant?

  33. Public Transparency • Public access to information is a challenge • Present data in a way that is not misleading • Describe expenditures in a meaningful way, explaining the difference between “spent” (i.e., obligated) and “expenditure” (i.e., federal funds have been drawn and disbursed) • Describe activities in a meaningful way • Expect questions – maintain records

  34. What Should Be Transparent? (Graphic by Steve Gunn) • Raw data is insufficient • Present information in context • Analyzed and summarized • Mission-related information • Transparency that meets the goal • Public accountability • Fraud detection and reporting • Public participation and feedback

  35. Public Transparency – Tell the Whole Story • Expenditure reports listing “iPods” drew news reports, public complaints, and a Congressional investigation • The purchases were not about iTunes; the iPods have instructional purposes • The public won’t know that unless reports specify it

  36. More Questionable Reports… Even if not required, it may be a good idea to provide some explanation of the type of service procured on some contracts: • Travel expenses • Restaurant bills • Recreation parks • Tiffany

  37. Pitfall Patrol • The public • The press • Stakeholder groups • The Recovery Accountability & Transparency Board • U.S. Government Accountability Office • ED’s Office of Inspector General

  38. ED OIG Internal Controls Review • Found many agencies should strengthen controls over ARRA reporting requirements • Procedures to collect, review and report data including assignment of individual responsibilities • Clear guidance to sub-recipients & contractors • Methodology for estimating jobs created/saved • Use existing systems to extent possible • Modify existing systems or procedures if they fall short of new needs

  39. ARRA Recordkeeping Findings U.S. GAO and ED OIG are auditing internal controls over data and reporting • Establish written procedures • Establish and execute data quality review procedures • Maintain documentation of guidance and instructions to staff, sub-recipients and contractors • Maintain data used to calculate jobs created/retained

  40. GAO Concerns Raised ARRA Section 1512 reports Nov 2009 findings: • Erroneous and questionable data entry • Lack of data quality review • Issues in calculation of FTEs for jobs created/retained

  41. Erroneous Data Entry • GAO found: • Misidentification of awarding agencies • Implausible dollar amounts • Discrepancies between actual award amounts and reports received • Actions: • Check the CFDA #, the Federal Agency Codes and Treasury Account Symbol (TAS) • “Reality check” – does the data make sense? • Verify data against other sources, e.g., the finance system (see the sketch below)
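
Many of the erroneous entries GAO flagged (misidentified awarding agencies, implausible dollar amounts) can be caught with a simple plausibility check before submission. The sketch below is a minimal illustration, assuming hypothetical field names for a single report record; the specific checks and the agency string are placeholders to adapt to your own data.

    import re

    CFDA_PATTERN = re.compile(r"^\d{2}\.\d{3}$")  # CFDA numbers use a ##.### format

    def sanity_check(record):
        """Return a list of plausibility problems found in one report record."""
        issues = []
        if not CFDA_PATTERN.match(record.get("cfda_number", "")):
            issues.append("CFDA number is not in ##.### format")
        expended = record.get("amount_expended", 0)
        award = record.get("award_amount", 0)
        if expended < 0 or expended > award:
            issues.append("expenditure is negative or exceeds the award amount")
        if record.get("awarding_agency") != "Department of Education":
            issues.append("awarding agency does not match the expected agency")
        return issues

    # Hypothetical record: the expenditure exceeds the award, so one issue is reported.
    print(sanity_check({"cfda_number": "84.394", "amount_expended": 120000,
                        "award_amount": 100000,
                        "awarding_agency": "Department of Education"}))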

  42. Lack of a Quality Review GAO found: • Issues in 10% of recipient reports • 75% of the prime reports were marked as undergoing review by a federal agency • Less than 1% were marked as undergoing a review by the prime recipient Action: The prime recipient is responsible for the accuracy and completeness of sub-recipients’ reports

  43. Properly Calculating FTEs GAO found: • Problems with interpreting OMB guidance • Lack of consistency in applying measurements • Variations in length of time reported • Actions: • Use OMB’s simplified guidance http://www.whitehouse.gov/omb/assets/memoranda_2010/m10-08.pdf (page 13) • ED’s clarifying guidance for summer employment (forthcoming)
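
OMB’s simplified guidance boils the quarterly jobs figure down to a single ratio: hours worked and funded by the Recovery Act during the quarter, divided by the hours in one full-time schedule for that quarter. The sketch below illustrates that ratio only; the 520-hour schedule (40 hours/week for 13 weeks) and the example figures are assumptions for illustration, and your own standard full-time schedule should be substituted.

    def quarterly_fte(arra_funded_hours, full_time_hours_in_quarter=520.0):
        """FTE = Recovery Act-funded hours worked in the quarter, divided by
        the hours in one full-time schedule for that quarter."""
        return round(arra_funded_hours / full_time_hours_in_quarter, 2)

    # Hypothetical example: three staff each charged 260 ARRA-funded hours this quarter.
    print(quarterly_fte(3 * 260))  # prints 1.5

Applying the same schedule and the same formula across all sub-recipients is what keeps the grant-level total a meaningful “reality check.”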

  44. OMB-Identified Common Mistakes • Duplicate awards • Incorrect award IDs • Changes to key elements • ‘Extra’ reporting • Non-reporting • In current quarter • In subsequent quarter

  45. Challenges to Transparency Agencies’ concerns: Is the data correct? Could reports be misinterpreted? Major collection burden? Need technology? Need expertise?

  46. Speakers • Todd Stephenson, Office of Elementary & Secondary Education, Student Achievement and School Accountability Programs • Rebecca Walawender, Office of Special Education & Rehabilitative Services, Individuals with Disabilities Education Act

  47. Questions & Answers For further Recovery Act questions, contact the ED contact listed on the Grant Award Notification or email RMSCommunications@ed.gov

  48. Thank you for participating! Please complete an evaluation; your feedback is important. http://www.ed.gov/policy/gen/leg/recovery/rms-web-conferences.html

  49. Resource Documents: New FFATA Requirements • Beginning October 1, 2010, grantees will start reporting data on sub-recipients (grants and contracts) for all federal grants on USASpending.gov • OMB instructions to the agencies: http://www.whitehouse.gov/omb/assets/open_gov/OpenGovernmentDirective_04062010.pdf

  50. Resource: Section 1512 Reporting Registration Tips from OMB • It is the recipient’s responsibility to ensure that their CCR information is up to date and that their registration is active and will remain active throughout the reporting cycle. CCR registrations must be renewed at least every 12 months or the registration expires, preventing a recipient from submitting a report in FederalReporting.gov until the registration is reinstated. Because reinstating an expired registration requires specific recipient actions to protect the security of the record, processing time, and additional time to update FederalReporting.gov, recipients should check their CCR registration prior to the beginning of each reporting cycle. Maintaining current Points of Contact in their CCR record will ensure that recipients receive CCR renewal reminders. • It is the recipient’s responsibility to ensure that their D-U-N-S number and entity record are accurate and active with Dun & Bradstreet (D&B). While D-U-N-S numbers don’t expire, D&B does conduct routine and continuous data maintenance and outreach to verify operations at a location. When operations cannot be verified for a particular D-U-N-S number, D&B may flag the number “inactive,” which could interfere with successful reporting. It is also important that the information about your entity is accurate when used to report on government websites. To look up, review, or modify a record at D&B, or to request a D-U-N-S number, recipients may use the following two free resources: • Recipients may use the self-service webform at http://fedgov.dnb.com/webform. This application allows a recipient to look up their D-U-N-S number, review the data on file, request changes if necessary, or request a new D-U-N-S number if one does not already exist. • Recipients may call (866) 705-5711 to verify that a D-U-N-S number is active and confirm other details about their entity.
