
Agenda 03/27/2014


Presentation Transcript


  1. Agenda 03/27/2014
  • Review first test.
  • Discuss internal data project.
  • Review characteristics of data quality.
  • Types of data.
  • Data quality.
  • Data governance.
  • Define ETL activities.
  • Discuss database analyst/programmer responsibilities for data evaluation.

  2. Answers to Multiple Choice Questions

  3. Discussed in prior classes...
  • Lots of data.
  • Traditional transaction processing systems.
  • Non-traditional data: call center; click-stream; loyalty card; warranty cards/product registration information; email; Twitter; Facebook.
  • External data from government and commercial entities.
  • General classification of data: transaction data; referential data/master data; metadata.

  4. Data quality
  • What is good quality data?
    • Correct
    • Accurate
    • Consistent
    • Complete
    • Available
    • Accessible
    • Timely
    • Relevant
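
  A few of these characteristics can be checked programmatically. The sketch below tests completeness, consistency, and timeliness against hypothetical customer records; the field names and thresholds are assumptions, not rules from the lecture.

    from datetime import date, timedelta

    # Hypothetical customer records as pulled from a source system.
    records = [
        {"id": 1, "state": "WI", "zip": "53201", "updated": date(2014, 3, 20)},
        {"id": 2, "state": "wisconsin", "zip": "53186", "updated": date(2012, 1, 5)},
        {"id": 3, "state": "WI", "zip": None, "updated": date(2014, 3, 1)},
    ]

    def quality_problems(rec, as_of=date(2014, 3, 27)):
        """Return the data quality rules a record violates."""
        problems = []
        if not rec["zip"]:                                        # complete
            problems.append("incomplete: missing zip")
        if len(rec["state"]) != 2 or not rec["state"].isupper():  # consistent
            problems.append("inconsistent: state is not a 2-letter code")
        if as_of - rec["updated"] > timedelta(days=365):          # timely
            problems.append("not timely: record is over a year old")
        return problems

    for rec in records:
        for problem in quality_problems(rec):
            print(f"record {rec['id']}: {problem}")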

  5. How does data “go bad”? Does all “bad” data have to be fixed?

  6. Data governance
  • Policies, processes and procedures aimed at managing the data in an organization.
  • Usually high-level, cross-department committees that oversee data management across the organization.
  • Responsible for defining what data is necessary to gather.
  • Responsible for defining the source and store of data.
  • Responsible for security policies, processes, and procedures.
  • Responsible for creating the policies, processes and procedures.
  • Responsible for assigning blame.
  • Responsible for enforcing policies.

  7. Data quality in data warehouses
  • Is it more important than data quality in source transaction and reference data?
  • How is better quality data achieved?
    • Automated ETL processes to populate the data warehouse.
    • Spot checking programmatically (sketched below).
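
  One hedged sketch of what "spot checking programmatically" might look like: sample loaded rows and re-verify them against an agreed business rule. The table, columns, and rule are assumptions, with an in-memory SQLite database standing in for the warehouse.

    import random
    import sqlite3

    # In-memory stand-in for the warehouse, loaded with hypothetical
    # fact rows; row 3 deliberately violates the business rule.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE sales_fact (id INT, qty INT, unit_price REAL, total REAL)")
    conn.executemany(
        "INSERT INTO sales_fact VALUES (?, ?, ?, ?)",
        [(1, 2, 9.99, 19.98), (2, 5, 3.00, 15.00), (3, 4, 2.50, 11.00)],
    )

    def spot_check(conn, sample_size=100):
        """Sample fact rows and re-check them against a business rule."""
        rows = conn.execute("SELECT id, qty, unit_price, total FROM sales_fact").fetchall()
        failures = []
        for rec_id, qty, price, total in random.sample(rows, min(sample_size, len(rows))):
            if abs(total - qty * price) > 0.01:   # total must equal qty * unit_price
                failures.append((rec_id, "total <> qty * unit_price"))
        return failures

    for rec_id, reason in spot_check(conn):
        print(f"FAIL row {rec_id}: {reason}")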

  8. Populating the data warehouse
  • Extract
    • Take data from source systems.
    • May require middleware to gather all necessary data.
  • Transformation
    • Put data into consistent format and content.
    • Validate data – check for accuracy and consistency using pre-defined and agreed-upon business rules.
    • Convert data as necessary.
  • Load
    • Use a batch (bulk) update operation that keeps track of what is loaded, where, when and how.
    • Keep a detailed load log to audit updates to the data warehouse.
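
  A minimal extract/transform/load skeleton matching the three steps above, including the load log the slide calls for. The source rows, validation rule, and log format are hypothetical.

    from datetime import datetime

    # Hypothetical rows as they might arrive from a source system.
    SOURCE_ROWS = [
        {"customer_id": "101", "state": " wi "},
        {"customer_id": "102", "state": "IL"},
        {"customer_id": "", "state": "MN"},      # will fail validation
    ]

    def extract():
        """Extract: take data from the source system (stubbed here)."""
        return list(SOURCE_ROWS)

    def transform(rows):
        """Transform: put data into consistent format, then validate."""
        clean = []
        for row in rows:
            row["state"] = row["state"].strip().upper()   # consistent format
            if row["customer_id"]:                        # agreed business rule
                clean.append(row)
        return clean

    def load(rows, target, log):
        """Load: bulk update, logging what was loaded, where and when."""
        target.extend(rows)
        log.append(f"{datetime.now().isoformat()}: loaded {len(rows)} rows into target")

    warehouse, load_log = [], []
    load(transform(extract()), warehouse, load_log)
    print(load_log)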

  9. Data Cleansing
  • Source systems contain “dirty data” that must be cleansed.
  • ETL software contains rudimentary to very sophisticated data cleansing capabilities.
  • Industry-specific data cleansing software is often used; it is important for performing name and address correction.
  • Leading data cleansing vendors include general hardware/software vendors such as IBM, Oracle, SAP, and Microsoft, and specialty vendors such as Informatica, Information Builders (DataMigrator), Harte-Hanks (Trillium), CloverETL, Talend, and BusinessObjects (SAP AG).

  10. Steps in data cleansing
  • Parsing
  • Correcting
  • Standardizing
  • Matching
  • Consolidating

  11. Parsing
  • Parsing locates and identifies individual data elements in the source files and then isolates these data elements in the target files.
  • Examples include parsing the first, middle, and last name; street number and street name; and city and state.

  12. Parsing (example)
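
  A minimal sketch of parsing, assuming names arrive as “Last, First Middle” strings and addresses as simple US-style lines; real parsers handle far messier input.

    import re

    def parse_name(raw):
        """Isolate first, middle, and last name from 'Last, First Middle'."""
        last, _, rest = raw.partition(",")
        parts = rest.split()
        return {
            "first": parts[0] if parts else "",
            "middle": " ".join(parts[1:]),
            "last": last.strip(),
        }

    def parse_street(raw):
        """Isolate the street number and street name from an address line."""
        m = re.match(r"\s*(\d+)\s+(.+)", raw)
        if m:
            return {"number": m.group(1), "street": m.group(2)}
        return {"number": "", "street": raw.strip()}

    print(parse_name("Parker-Lewis, Karen Beth"))
    # {'first': 'Karen', 'middle': 'Beth', 'last': 'Parker-Lewis'}
    print(parse_street("123 N Water St"))
    # {'number': '123', 'street': 'N Water St'}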

  13. Correcting
  • Corrects parsed individual data components using sophisticated data algorithms and secondary data sources.

  14. Correcting (example)
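
  One way to picture correcting: validate a parsed component against a secondary data source. The ZIP-to-city table below is a hypothetical stand-in; commercial tools use postal reference data and far more sophisticated algorithms.

    # Hypothetical secondary data source: authoritative ZIP -> (city, state).
    ZIP_REFERENCE = {
        "53201": ("Milwaukee", "WI"),
        "53186": ("Waukesha", "WI"),
    }

    def correct_city(record):
        """Correct a misspelled city, treating the ZIP code as authoritative."""
        ref = ZIP_REFERENCE.get(record["zip"])
        if ref:
            record["city"], record["state"] = ref
        return record

    print(correct_city({"city": "Milwakee", "state": "WI", "zip": "53201"}))
    # {'city': 'Milwaukee', 'state': 'WI', 'zip': '53201'}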

  15. Standardizing
  • Standardizing applies conversion routines to transform data into its preferred (and consistent) format using both standard and custom business rules.

  16. Standardizing (example)
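
  A sketch of a conversion routine: map common variants onto one preferred format. The abbreviation tables are illustrative business rules, not any official standard.

    # Illustrative conversion rules: variant -> preferred form.
    STREET_SUFFIXES = {"street": "St", "st.": "St", "avenue": "Ave", "ave.": "Ave"}
    STATE_NAMES = {"wisconsin": "WI", "wis.": "WI", "illinois": "IL"}

    def standardize(record):
        """Apply conversion routines so equivalent values compare equal."""
        words = record["street"].split()
        words[-1] = STREET_SUFFIXES.get(words[-1].lower(), words[-1])
        record["street"] = " ".join(words)
        record["state"] = STATE_NAMES.get(record["state"].lower(), record["state"].upper())
        return record

    print(standardize({"street": "123 N Water street", "state": "wisconsin"}))
    # {'street': '123 N Water St', 'state': 'WI'}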

  17. Matching
  • Searching and matching records within and across the parsed, corrected and standardized data, based on predefined business rules, to eliminate duplications.

  18. Matching (example)
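
  A toy matching rule over records that have already been parsed, corrected, and standardized. Production matchers score fuzzy similarity; the exact-key rule and field choices here are assumptions.

    from itertools import combinations

    records = [
        {"id": 1, "last": "Parker-Lewis", "zip": "53201", "phone": "4145550100"},
        {"id": 2, "last": "Parker-Lewis", "zip": "53201", "phone": "4145550100"},
        {"id": 3, "last": "Lewis", "zip": "53186", "phone": "2625550199"},
    ]

    def is_match(a, b):
        """Predefined business rule: same last name, ZIP, and phone."""
        return (a["last"], a["zip"], a["phone"]) == (b["last"], b["zip"], b["phone"])

    matches = [(a["id"], b["id"]) for a, b in combinations(records, 2) if is_match(a, b)]
    print(matches)   # [(1, 2)] -> records 1 and 2 are duplicates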

  19. Consolidating
  • Analyzing and identifying relationships between matched records and consolidating/merging them into ONE representation.

  20. Consolidating (example)
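
  A sketch of the merge itself, using an assumed survivorship rule ("first non-empty value wins") to collapse the matched records into one representation.

    def consolidate(matched):
        """Merge matched duplicate records into ONE representation."""
        merged = {}
        for rec in matched:
            for field, value in rec.items():
                # Survivorship rule: keep the first non-empty value seen.
                if value and not merged.get(field):
                    merged[field] = value
        return merged

    duplicates = [
        {"id": 1, "name": "Karen Parker-Lewis", "email": ""},
        {"id": 2, "name": "K. Parker-Lewis", "email": "kpl@example.com"},
    ]
    print(consolidate(duplicates))
    # {'id': 1, 'name': 'Karen Parker-Lewis', 'email': 'kpl@example.com'}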

  21. Source system view – 3 clients
  • Account# 1238891
  • Policy No. ME309451-2
  • Transaction B498/97
  (Seen system by system, these identifiers look like three different clients.)

  22. The reality – ONE client
  • Account# 1238891
  • Policy No. ME309451-2
  • Transaction B498/97
  (Matching reveals that all three identifiers belong to the same client.)

  23. Consolidating whole groups
  • William Lewis
  • Beth Parker
  • Karen Parker-Lewis
  • William Parker-Lewis, Jr.
  (Four individual records that consolidate into a single household group.)
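
  Consolidating a whole group (householding) can be sketched the same way, under the assumed rule that records sharing a standardized address form one household.

    from collections import defaultdict

    # Hypothetical individual records; the address is already standardized.
    people = [
        {"name": "William Lewis", "address": "123 N Water St"},
        {"name": "Beth Parker", "address": "123 N Water St"},
        {"name": "Karen Parker-Lewis", "address": "123 N Water St"},
        {"name": "William Parker-Lewis, Jr.", "address": "123 N Water St"},
        {"name": "Ann Smith", "address": "9 Oak Ave"},
    ]

    def households(records):
        """Group individuals into households by shared address."""
        groups = defaultdict(list)
        for rec in records:
            groups[rec["address"]].append(rec["name"])
        return dict(groups)

    for address, members in households(people).items():
        print(address, "->", members)
    # 123 N Water St -> ['William Lewis', 'Beth Parker', 'Karen Parker-Lewis',
    #                    'William Parker-Lewis, Jr.']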

  24. ETL Products
  • SQL Server 2012 Integration Services from Microsoft
  • PowerMart/PowerCenter/PowerExchange from Informatica
  • Warehouse Builder from Oracle
  • Teradata Warehouse Builder from Teradata
  • DataMigrator from Information Builders
  • SAS System from SAS Institute
  • Connectivity Solutions from OpenText
  • Ab Initio

  25. ETL Goal: Data is complete, accurate, consistent, and in conformance with the business rules of the organization.
  Questions:
  • Is ETL really necessary?
  • Has the advent of big data changed our need for ETL?
  • ETL vs. ELT
  • Does the use of Hadoop eliminate the need for ETL software?
  • Does it matter if the data is stored in the “cloud”?
