Quality Assurance

Presentation Transcript


  1. Quality Assurance. NITRC Enhancement Grantee Meeting, June 18, 2009. Susan Whitfield-Gabrieli & Satrajit Ghosh, RapidArt, MIT

  2. Acknowledgements THANKS! Collaborators: • Alfonso Nieto Castañón • Shay Mozes Data: • Stanford, Yale, MGH, CMU, MIT Funding: • R03 EB008673: PIs: Satrajit Ghosh, Susan Whitfield-Gabrieli, MIT

  3. fMRI QA • Data inspection as well as artifact detection and rejection routines are essential steps to ensure valid imaging results. • Seemingly small differences in data processing may yield large differences in results.

  4. QA in fMRI: Before Quality Assurance (figure)

  5. QA in fMRI: Before QA vs. After QA (figure)

  6. QA: Outline • fMRI quality assurance protocol • QA (bottom up) • QA (top down)

  7. Quality Assurance: Preprocessing. Bottom Up: review data. Pipeline: Raw Images → Artifact Detection → Preprocessing → Review Data. • Check behavior • Create mean functional image • Review time series and movie • Interpolate prior to preprocessing
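
One of the review steps above, creating a mean functional image for visual inspection, can be done directly from the 4D time series. A minimal sketch using nibabel and numpy; the file names are placeholders, not part of the original protocol:

```python
# Sketch: compute a mean functional image for visual QA review.
# Assumes a 4D NIfTI time series; "run1_bold.nii.gz" is a placeholder name.
import nibabel as nib
import numpy as np

img = nib.load("run1_bold.nii.gz")       # 4D volume: x, y, z, time
data = img.get_fdata()
mean_vol = data.mean(axis=3)             # average across the time dimension

mean_img = nib.Nifti1Image(mean_vol, img.affine)
nib.save(mean_img, "mean_run1_bold.nii.gz")
```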

  8. Quality Assurance: Post-Preprocessing. Top Down: review stats. Bottom Up: review functional images. Pipeline: PreProc → Artifact Check → GLM → Artifact Check → RFX → Artifact Check. Artifact Check: • Check registration • Check motion parameters • Generate design matrix template • Check for stimulus-correlated motion • Check global signal correlation with task • Review power spectra • Detect outliers in time series and motion: determine scans to omit, interpolate, or deweight. Data Review: time series, movie. Review Statistics: Mask/ResMS/RPV, Beta/Con/Tmap.
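
Two of the checks listed above, stimulus-correlated motion and global signal correlation with the task, reduce to simple correlations. A minimal numpy sketch with synthetic placeholder arrays; in practice the motion parameters come from realignment output and the task regressor from the design matrix:

```python
# Sketch: quantify stimulus-correlated motion (SCM) and global-signal/task correlation.
# All arrays below are synthetic placeholders standing in for real data.
import numpy as np

rng = np.random.default_rng(0)
n_scans = 160
task = np.sin(np.linspace(0, 8 * np.pi, n_scans))       # placeholder task regressor
motion = rng.normal(size=(n_scans, 6))                   # placeholder realignment parameters
global_signal = rng.normal(size=n_scans)                 # placeholder global mean signal

def corr_with_task(task, signals):
    """Pearson correlation of the task regressor with each column of `signals`."""
    signals = np.atleast_2d(signals.T).T                 # ensure 2D (n_scans x n_cols)
    return np.array([np.corrcoef(task, signals[:, i])[0, 1]
                     for i in range(signals.shape[1])])

scm = corr_with_task(task, motion)                       # one r per motion parameter
r_global = corr_with_task(task, global_signal)[0]
print("stimulus-correlated motion r:", np.round(scm, 3))
print("global signal vs. task r:", round(r_global, 3))
```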

  9. Data Review (figure): global mean; intensity outliers (deviation from mean over time); motion outliers (realignment parameter outliers); combined outliers (thresholds); data exploration.
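
The flagging logic behind a data review panel like this one can be sketched in a few lines of numpy: z-score the scan-to-scan global signal change, threshold the scan-to-scan motion, and combine the two sets of flags. The thresholds and the simulated data below are illustrative assumptions, not RapidArt's defaults:

```python
# Sketch: flag intensity and motion outliers and combine them.
# Thresholds and simulated data are illustrative, not RapidArt defaults.
import numpy as np

def flag_outliers(global_signal, motion, z_thresh=3.0, motion_thresh=1.0):
    """Return indices of intensity outliers, motion outliers, and their union."""
    dg = np.diff(global_signal, prepend=global_signal[0])    # scan-to-scan signal change
    z = (dg - dg.mean()) / dg.std()
    intensity_out = np.where(np.abs(z) > z_thresh)[0]

    dm = np.abs(np.diff(motion, axis=0, prepend=motion[:1])) # scan-to-scan motion
    motion_out = np.where(dm.max(axis=1) > motion_thresh)[0]

    return intensity_out, motion_out, np.union1d(intensity_out, motion_out)

# Simulated run with an intensity spike and a motion jump:
rng = np.random.default_rng(1)
g = rng.normal(size=120); g[79] += 8.0                       # simulated intensity spike
m = rng.normal(scale=0.1, size=(120, 6)); m[95, 0] += 2.0    # simulated head jerk
print(flag_outliers(g, m)[2])                                # combined outlier scans
```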

  10. Including motion parameters as covariates • Eliminates (to first order) all motion-related residual variance. • If motion is correlated with the task, this will also remove your task activation. • Check stimulus-correlated motion (SCM): if there are between-group differences in SCM, use an AnCova.
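
Adding the six realignment parameters as nuisance columns of the design matrix is straightforward. A minimal sketch with synthetic data and ordinary least squares standing in for a full fMRI GLM; a real analysis would use the analysis package's own design-matrix tools:

```python
# Sketch: include motion parameters as nuisance covariates in a GLM.
# Synthetic data; OLS here is a stand-in for the package's GLM machinery.
import numpy as np

rng = np.random.default_rng(0)
n_scans = 160
task = np.sin(np.linspace(0, 8 * np.pi, n_scans))    # placeholder task regressor
motion = rng.normal(size=(n_scans, 6))               # placeholder realignment parameters
y = 0.5 * task + motion @ rng.normal(size=6) + rng.normal(size=n_scans)

# Design matrix: task regressor, six motion covariates, and an intercept.
X = np.column_stack([task, motion, np.ones(n_scans)])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print("task beta with motion covariates:", round(beta[0], 3))
```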

  11. Power Spectra: HPF Cutoff Selection (figure comparing cutoffs of .01 and .02)
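
Inspecting the power spectrum of a global or voxel time series is one way to choose a high-pass cutoff that removes drift without removing task-related power. A minimal numpy FFT sketch; the TR, the synthetic signal, and the candidate cutoffs (taken here in Hz) are assumptions for illustration:

```python
# Sketch: power spectrum of a BOLD time series to guide high-pass cutoff choice.
# TR and the signal are placeholders; candidate cutoffs are assumed to be in Hz.
import numpy as np

TR = 2.0                                         # repetition time in seconds (assumed)
rng = np.random.default_rng(0)
n_scans = 200
t = np.arange(n_scans) * TR
signal = np.sin(2 * np.pi * 0.025 * t) + rng.normal(scale=0.5, size=n_scans)

freqs = np.fft.rfftfreq(n_scans, d=TR)           # frequencies in Hz
power = np.abs(np.fft.rfft(signal - signal.mean())) ** 2

# Compare how much power would be removed by two candidate cutoffs.
for cutoff in (0.01, 0.02):
    frac = power[freqs < cutoff].sum() / power.sum()
    print(f"fraction of power below {cutoff:g} Hz: {frac:.2%}")
```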

  12. Artifact Detection (figure: outlier scans 79 and 95)

  13. Artifact Detection/Rejection. Artifact sources: • Head motion* • Physiological: respiration and cardiac effects • Scanner noise. Solutions: • Review data • Apply artifact detection routines • Omit*, interpolate, or deweight outliers. *To omit a scan, include a single regressor for each scan you want to remove, with a 1 for that scan and zeros elsewhere (see the sketch below). *Note the number of scan omissions per condition and between groups, and correct the analysis for possible confounding effects: AnCova, using the number of outliers as a within-subject covariate.
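
The scan-nulling approach described in the footnote (one regressor per omitted scan, 1 at that scan and 0 elsewhere) can be written as a small helper. A minimal numpy sketch; the scan indices and run length are illustrative:

```python
# Sketch: build scan-nulling regressors, one column per scan to omit,
# with a 1 at that scan and zeros elsewhere, to append to the design matrix.
import numpy as np

def scan_nulling_regressors(n_scans, outlier_indices):
    """Return an (n_scans x n_outliers) array of one-hot nulling regressors."""
    regressors = np.zeros((n_scans, len(outlier_indices)))
    for col, scan in enumerate(outlier_indices):
        regressors[scan, col] = 1.0
    return regressors

# Example: omit scans 79 and 95 from a 160-scan run (illustrative values).
nulling = scan_nulling_regressors(160, [79, 95])
print(nulling.shape, nulling[79, 0], nulling[95, 1])   # (160, 2) 1.0 1.0
```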

  14. BOTTOM UP: AUDITORY RHYMING > REST (figure: outlier scans; T map and ResMS before ART vs. T map and ResMS after ART)

  15. “TOP DOWN” 2nd level, RFX

  16. Group Stats (N = 50), Working Memory Task. Not an obvious problem: frontal and parietal activation for a working memory task.

  17. Group Stats (N=50) 2B Working Memory Task

  18. Find Offending Subjects: 2 of 50 subjects

  19. Artifacts in outlier images (figure: scans 86, 79, 95, 83)

  20. Comparison of Group Stats: Working Memory (2B > X). ORIGINAL vs. FINAL (figure)

  21. Comparison of Group Statistics: Default Network

  22. Method Validation Experiment • Data analyzed: 312 subjects, 3 sessions per subject • Outlier detection based on global signal and movement • Normality: tests on the scan-to-scan change in global BOLD signal after regressing out the task and motion parameters. Normally distributed residuals are a basic assumption of the general linear model; departures from normality would affect the validity of our analyses (the resulting p-values could not be trusted). If all is well, we should expect this global BOLD signal change to be normally distributed, because it is an average of many sources (central limit theorem). • Power: the probability of finding a significant effect if one truly exists. Here it represents the probability of finding a significant (p < .001 uncorrected) activation at any given voxel if in fact the voxel is being modulated by the task (by 1% signal change).
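
The normality check described above can be reproduced in a few lines: regress the task and motion parameters out of the scan-to-scan global signal change, then test the residuals for normality. A sketch with synthetic data and scipy's Shapiro-Wilk test; the choice of test and all data here are assumptions for illustration, not necessarily what the original analysis used:

```python
# Sketch: test whether the scan-to-scan global signal change is normally
# distributed after regressing out task and motion. Synthetic data;
# the Shapiro-Wilk test is an illustrative choice.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_scans = 160
task = np.sin(np.linspace(0, 8 * np.pi, n_scans))        # placeholder task regressor
motion = rng.normal(size=(n_scans, 6))                    # placeholder motion parameters
global_signal = 0.3 * task + rng.normal(size=n_scans)     # placeholder global signal

dg = np.diff(global_signal, prepend=global_signal[0])     # scan-to-scan change

# Regress out task, motion, and an intercept; keep the residuals.
X = np.column_stack([task, motion, np.ones(n_scans)])
beta, *_ = np.linalg.lstsq(X, dg, rcond=None)
residuals = dg - X @ beta

stat, p = stats.shapiro(residuals)                        # Shapiro-Wilk normality test
print(f"Shapiro-Wilk W = {stat:.3f}, p = {p:.3f}")        # small p suggests non-normality
```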

  23. Outlier Experiment • Global signal is not normally distributed: in 48% of the sessions, the scan-to-scan change in average BOLD signal is not normally distributed. This percentage drops to 4% when removing an average of 8 scans per session (those exceeding a z-score threshold of 3).

  24. Removing outliers improves power • The plot shows the average power to detect a task effect (effect size = 1% signal change, alpha = .001). • Before outlier removal the power is .29 (a 29% chance of finding a significant effect at any of these voxels). After removing an average of 8 scans per session (based on a global signal threshold of z = 3), power improves to above .70.
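
Power of this kind can be computed from the noncentral t distribution: the noncentrality parameter is the effect size divided by its standard error, and power is the probability of exceeding the critical t under that alternative. A sketch with scipy; the effect size units, noise standard deviation, and sample size below are assumed values for illustration, not the study's actual parameters:

```python
# Sketch: power to detect a 1% signal-change effect at p < .001 (uncorrected),
# via the noncentral t distribution. All numeric inputs are illustrative assumptions.
from scipy import stats

effect = 1.0            # percent signal change under the alternative (assumed)
noise_sd = 3.0          # residual SD in percent-signal-change units (assumed)
n = 120                 # effective number of observations (assumed)
alpha = 0.001

df = n - 1
se = noise_sd / n ** 0.5
ncp = effect / se                          # noncentrality parameter
t_crit = stats.t.ppf(1 - alpha, df)        # one-sided critical value at alpha
power = 1 - stats.nct.cdf(t_crit, df, ncp)
print(f"power = {power:.2f}")
```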

  25. THANKS! Dissemination (NITRC): • International visiting fMRI fellowships @ MGH • 2-week MMSC @ MGH • SPM8 courses (local/remote) • Visiting programs at MIT. Documentation: • Manuals, demos, tutorials • Scripts
