
Evaluators Scott Novak, Ph.D. RTI International


Presentation Transcript


  1. Evaluators Scott Novak, Ph.D., RTI International, July 2011

  2. Overview • Demonstrate how to use the different Dashboard Reports for evaluation • Demonstrate the new Custom Reports • Agenda: Introductions, Reports, Questions

  3. What are the New Reports? • Performance Management Dashboard Interface (Dashboards) • Custom Dashboard Reports (Custom Reports) • SAS/SPSS

  4. Dashboard Reports

  5. Click on any graph to see a report with additional information for that measure.

  6. [Your Grant Number and Name will be here] Blue line is your grant. Gold line is comparison (GFA)

  7. Table will show follow-ups due, follow-ups received, and follow-up rate by month and year

  8. For each outcome, the data table shows the number, percent at intake, percent at 6 months, and the rate of change.
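
As a worked example (assuming the dashboard reports rate of change as the relative difference between the two percentages; the exact formula is not shown on the slide): if 40 percent of clients are abstinent at intake and 60 percent at the 6-month follow-up, the rate of change is (60 - 40) / 40 = 0.50, i.e., a 50 percent relative improvement.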

  9. Employment for the grant is compared to the State

  10. Dashboard and Custom Reports Navigation Tips (cont.) • View reports in HTML, PDF, Excel, or XML by clicking the Earth icon in the upper right corner of the report screen. After clicking a view, choose HTML to return to the Dashboard display • Dashboards only: change the comparisons in each display by clicking the radio buttons in the list to the left of the graphic display • Dashboards include only intakes matched with 6-month follow-up interviews (except the Intake Target Report)

  11. Custom Dashboard Reports

  12. For a specific date range, click the “Specific Date Range” button and then click the “Apply” button

  13. Must select an interview type to run any outcome change report

  14. To run a custom age range, choose “Specific Age Range” and click the “Apply” button. Use the Ctrl key to choose multiple options within a demographic group

  15. Custom age range

  16. The breadcrumb heading allows you to see what selections you have made on previous pages. Run the report by clients who were homeless at intake

  17. Custom Dashboard Reports

  18. To save report, click “Keep this version”, then click “Save as Report View”

  19. To save report, choose “Select My Folders” and type the name of your report. Then click “Ok”.

  20. External Data Sources • Help contextualize performance • Understand trends in factors that influence access/capacity/treatment (e.g., area unemployment, drug epidemics) • National Survey on Drug Use and Health, Treatment Episode Data Set, others • Benchmarking facilitates comparisons with similar organizations based on population, resources, and community

  21. Evaluation Analyses: Bringing It All Together • Preliminary Analyses • Missing Data • Trends • Process and Outcome Analyses • Process: Identify areas of success/improvement • Outcome: Client outcomes • Special Analytic Topics • Analysis of Change over time

  22. Preliminary Analyses • Missing Data • Attrition/Drop-Out • Item Missingness • Problem: Those who are retained may differ from those who drop out or fail to answer questions (see the sketch below)
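
A minimal SAS sketch of this check, assuming a client-level file eval.clients with a completion flag fu6_done (1 = 6-month follow-up completed, 0 = lost) and illustrative baseline variables age and homeless; all names here are hypothetical:

  proc ttest data=eval.clients;
    class fu6_done;                      /* completers vs. drop-outs */
    var age;                             /* continuous baseline measure */
  run;

  proc freq data=eval.clients;
    tables homeless*fu6_done / chisq;    /* categorical baseline measure */
  run;

Significant differences on baseline measures suggest the retained sample is not representative of all clients served.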

  23. Alphabet Soup • MCAR • MAR • NMAR • Ignorable/non-ignorable • GEE • RC/RE • HLM/MLM • PMM • DID

  24. Missing Completely At Random (MCAR) • The likelihood of missing data (item or assessment) is due to “chance” factors; missingness is unrelated to any specific survey item (observable) • Sick on the day of test administration (missed assessment) • Missed an item on the survey because not “paying attention” • No differences on any factor that is not observed (unobservable) in the study but is related to the study outcome (e.g., no differences in the likelihood of being sick between smokers and non-smokers) • Ignorable, in that results will not be biased

  25. Missing At Random (MAR) • The likelihood of missingness (item or assessment) is due to respondent characteristics collected in the study • Smokers are more likely to be absent on the day of data collection, and information on smoking status is collected • People who smoke are less likely to report income, and smokers have lower incomes in general; smoking status is collected in the study • Analyses will be biased unless appropriate procedures are used, perhaps including information on smoking • Non-ignorable unless appropriate procedures are in place
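
One set of appropriate procedures under MAR is multiple imputation. A hedged SAS sketch, assuming a file eval.survey where only income has missing values and smoker and age are fully observed (all names hypothetical); including smoker in the imputation model is what makes the MAR assumption plausible:

  proc mi data=eval.survey out=mi_out nimpute=5 seed=20110701;
    var smoker age income;
    monotone reg(income = smoker age);   /* impute income from smoking status and age */
  run;

  proc reg data=mi_out outest=est covout noprint;
    model income = age;                  /* analysis model, fit to each imputation */
    by _imputation_;
  run;

  proc mianalyze data=est;               /* combine estimates across imputations */
    modeleffects Intercept age;
  run;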

  26. Not Missing At Random (NMAR) • The likelihood of missingness (item or assessment) is due to factors not observed in the study • Smokers are more likely to be missing on the income variable and smokers have lower income, but information on smoking was not collected • An unverifiable assumption: MCAR and MAR can be distinguished by whether missingness is related to an observed covariate, but MAR and NMAR cannot be distinguished because it is purely a hypothesis as to why data are missing • Two procedures: Selection Models and Pattern-Mixture Models • Selection: model the joint response of the outcome and the missingness process • Pattern Mixture: average the effect across the unique patterns of missingness (see the sketch below)
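
A minimal pattern-mixture sketch in SAS, assuming a long-format file eval.long with an outcome y, a time variable, a client ID, and a derived missingness-pattern indicator pattern (all names hypothetical); the change over time is estimated within each pattern and then averaged:

  proc mixed data=eval.long;
    class pattern client_id;
    model y = time pattern time*pattern / solution;   /* pattern-specific slopes */
    random intercept / subject=client_id;
  run;
  /* Average the pattern-specific time slopes, weighted by the observed
     frequency of each pattern, to obtain the overall estimate of change */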

  27. Item-Level Missing Data • Data exist on respondents, but items are missing within the survey • Typically ignored in practice: the affected data are simply excluded • Excluding data reduces the likelihood of detecting significant effects • Can bias results, depending upon the nature of the missing data (see the check below)
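
A quick SAS check of how much item-level missingness is present before deciding how to handle it (data set and variable names hypothetical):

  proc means data=eval.survey n nmiss;
    var income age employ_days;          /* count missing values per item */
  run;

  proc mi data=eval.survey nimpute=0;    /* nimpute=0: report patterns only */
    var income age employ_days;
    ods select misspattern;              /* joint missing-data pattern table */
  run;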

  28. Frequency of Item-Missing Data: Random

  29. Frequency of Item-Missing Data: Non-Random

  30. Missing Data in Longitudinal Studies • Missing: drop-out at any given wave • Pattern: structure of drop-out • Monotone Missing Data: a data row of variables Y1, Y2, ..., Yp (in that order) has a monotone missing pattern when the event that a variable Yj is missing for a particular individual implies that all subsequent variables Yk, k > j, are missing for that individual. Equivalently, when a variable Yj is observed for a particular individual, all previous variables Yk, k < j, are also observed for that individual (this defines an attriter)

  31. Understanding Drop-Out • Drop-out: presenting data only on clients who were successfully followed up may cloud interpretation of the data • Hard-to-treat cases often drop out • If the follow-up rate is near 100%, there is more confidence in the follow-up data • Identifying cases lost to follow-up can help target tracking and recruitment efforts (see the sketch below) • Need to understand the “coding” of drop-out (e.g., administrative discharge)
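
A hedged SAS sketch for identifying which clients are being lost, assuming eval.clients has a lost_fu flag (1 = lost to follow-up) and baseline covariates age, homeless, and severity (all names hypothetical):

  proc logistic data=eval.clients;
    class homeless / param=ref;
    model lost_fu(event='1') = age homeless severity;   /* who drops out? */
  run;

Significant predictors flag the groups that follow-up tracking efforts should target.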

  32. Missing Data Patterns for 3 Observations

  proc format;
    value pattern
      1 = '0 0 0'  2 = 'M 0 0'  3 = '0 M 0'  4 = 'M M 0'
      5 = '0 0 M'  6 = 'M 0 M'  7 = '0 M M'  8 = 'M M M';
    value mono
      0 = 'complete data'
      1 = 'drop after baseline'
      2 = 'drop after eot'
      3 = 'drop after 2m'
      4 = 'missing data at any other pattern';
  run;
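
A sketch of how the PATTERN format above might be applied, assuming a wide file eval.wide with three wave variables y1-y3 (hypothetical), where a missing value means the assessment was missed:

  data patterns;
    set eval.wide;
    /* Encode the 8 possible patterns: 1 = '0 0 0' ... 8 = 'M M M' */
    pattern = 1 + missing(y1) + 2*missing(y2) + 4*missing(y3);
    format pattern pattern.;
  run;

  proc freq data=patterns;
    tables pattern;                      /* frequency of each pattern */
  run;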

  33. Types of Analyses for Drop-Out • Identify patterns of drop-out • Understand the assumptions of the methods used to analyze longitudinal data • GEE: MCAR • RE: MAR • Regression/ANOVA: MCAR • Determine which variables may be related to missingness and attrition (monotone) • Conduct sensitivity (stability) analyses under different assumptions (see the sketch below)
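
A side-by-side SAS sketch of the first two approaches, reusing the hypothetical long-format file eval.long:

  /* GEE (population-averaged); unbiased under MCAR */
  proc genmod data=eval.long;
    class client_id;
    model y = time;
    repeated subject=client_id / type=ar(1);
  run;

  /* Random effects (subject-specific); valid under the weaker MAR assumption */
  proc mixed data=eval.long;
    class client_id;
    model y = time / solution;
    random intercept time / subject=client_id;
  run;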
