Radiochemical Methods and Data Evaluation

  1. Radiochemical Methods and Data Evaluation. Wm. Kirk Nemeth, New Jersey Department of Health & Senior Services, Environmental Chemical and Laboratory Services, Radioanalytical Services

  2. WHAT WE’LL COVER TODAY • The analytical process: from sample collection to data reporting, and the associated uncertainties • Sample preparation methods for drinking water samples • QA data: what to look for

  3. SOURCES OF DATA VARIABILITY

  4. UNCERTAINTIES RANDOM: Includes the radioactive decay process itself, random timing uncertainties, variations in collection, sample preparation, positioning of the sample at the detector, etc. The list is nearly endless. SYSTEMATIC: Conceivable sources of inaccuracy that are biased and not subject to random fluctuations, as well as those that may arise from random causes but cannot be, or are not, assessed by statistical methods.

  5. PROPAGATION OF ERRORS • The total error for any analytical scheme includes the errors in every step: sampling, preparation, and measurement. • If the sampling uncertainty is ±50% and the analysis has only a 2% error, your total error is still very large (see the sketch below).
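  Why the larger term dominates: independent relative uncertainties are commonly combined in quadrature. The minimal Python sketch below is not from the original slides; it simply illustrates the 50%/2% point made above.

```python
# Minimal sketch (not from the slides): combining independent relative
# uncertainties in quadrature shows why a 50% sampling uncertainty
# swamps a 2% analytical uncertainty.
import math

def combined_relative_uncertainty(*relative_uncertainties):
    """Combine independent relative uncertainties in quadrature."""
    return math.sqrt(sum(u ** 2 for u in relative_uncertainties))

sampling = 0.50   # 50% sampling uncertainty
analysis = 0.02   # 2% analytical uncertainty
total = combined_relative_uncertainty(sampling, analysis)
print(f"Total relative uncertainty: {total:.1%}")  # ~50.0% -- sampling dominates
```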

  6. DATA QUALITY OBJECTIVES (DQOs) • A statement of the overall level of uncertainty that a decision-maker is willing to accept in results derived from environmental data • The overall level of uncertainty can be established by defining the uncertainty in each step of the analytical process • QA data are key to defining that level of uncertainty

  7. STEPS TO BE DISCUSSED • Sample Collection and Preservation • Methods • Quality Assurance

  8. SAMPLE COLLECTION & PRESERVATION IN THE FIELD • Consult the DEP Field Sampling Manual and the laboratory SOP manual • Collection of radiological samples: typically 1 gallon plastic for all but Radon-222 and Tritium • Preservation (where and how?): HNO3 to pH < 2 is ideal; filtration before or after H+ • Holding times: within 48 hours for gross alpha/beta (includes collection, transport, preparation and counting); analyze within 6 months

  9. SAMPLE PREPARATION METHODS CAVEATS • NJDEP/OQA only certifies for certain preparation methods • You must match the method of preparation to the method of analysis • SDWA samples must use Federally approved methods

  10. Analytical Methods Approved by EPA for Radionuclide Monitoring

  11. NJDHSS PREPARATION METHODS FOR DRINKING WATER • EPA 900.0: Gross Alpha/Beta (evaporation) • EPA 900.1: Gross Alpha (co-precipitation) • EPA 903.0: Radium 226 • NJ Method: Radium 228 • EPA 00-07: Uranium • EPA 913: Radon

  12. Required Detection Limits

  13. DETECTION LIMIT DEFINITIONS • Instrument Detection Limit (IDL): the lowest observable value above instrument background in the absence of sample matrix • Method Detection Limit (MDL): the minimum concentration that can be measured and reported with 99% confidence that the analyte concentration is greater than zero (see the sketch below)
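  In radiochemistry the detection limit is often expressed as a minimum detectable concentration (MDC) using Currie's widely used formulation. The sketch below is an illustrative assumption rather than the method prescribed in the slides; the parameter values are hypothetical.

```python
# Minimal sketch (assumption, not from the slides): Currie's minimum
# detectable concentration (MDC) as commonly applied in radiochemistry.
import math

def mdc_pci_per_liter(background_counts, count_time_min, efficiency,
                      chemical_yield, sample_volume_l):
    """MDC in pCi/L; the factor 2.22 converts dpm to pCi."""
    detectable_counts = 2.71 + 4.65 * math.sqrt(background_counts)
    return detectable_counts / (
        2.22 * efficiency * chemical_yield * sample_volume_l * count_time_min
    )

# Example: 50 background counts over a 100-minute count, 25% efficiency,
# 85% chemical yield, 0.5 L sample aliquot (illustrative values only).
print(f"MDC = {mdc_pci_per_liter(50, 100, 0.25, 0.85, 0.5):.2f} pCi/L")
```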

  14. ISSUES AFFECTING MEASUREMENT CHOICE • Regulatory implications/limitations • Detection limit needs • Potential analytical interferences • Cost • Time • Experience/skill needed to conduct analyses

  15. QA/QC COMPONENTS • Instrument Calibration • Blanks • Duplicates • Spikes • Calibration Verification • Reference Materials

  16. CALIBRATION • EPA approves the use of particular isotopes to create attenuation curves, typically built from 20 or more planchets of varying weight (see the sketch below). • Attenuation standards are typically laboratory-created using NIST-traceable materials. • They should mimic actual samples. • Some methods use internal tracers for calibration. • Samples must be within the weight range dictated by the method.
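  Attenuation (self-absorption) curves are usually fit as detection efficiency versus planchet residue mass. The sketch below assumes a simple exponential model and hypothetical data; it illustrates the idea only and is not the EPA-prescribed fitting procedure.

```python
# Illustrative sketch: fit an exponential self-absorption curve,
# efficiency = a * exp(-b * mass_mg), to attenuation-standard data.
# Model form and data values are assumptions for illustration only.
import numpy as np
from scipy.optimize import curve_fit

def attenuation_model(mass_mg, a, b):
    return a * np.exp(-b * mass_mg)

# Hypothetical planchet residue masses (mg) and measured efficiencies
mass = np.array([10, 25, 50, 75, 100, 150, 200], dtype=float)
eff = np.array([0.32, 0.29, 0.25, 0.22, 0.19, 0.15, 0.12])

(a, b), _ = curve_fit(attenuation_model, mass, eff, p0=(0.35, 0.005))
print(f"efficiency(m) = {a:.3f} * exp(-{b:.4f} * m)")
print(f"predicted efficiency at 120 mg: {attenuation_model(120, a, b):.3f}")
```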

  17. BLANKS • Trip Blank: Deionized water carried from laboratory to sampling location and back to the laboratory. • Instrument Background: typically clean sample holder or planchet is used. • Method Blank: Deionized water containing all reagents carried through sample preparation & measurement procedures

  18. DUPLICATES • Field Duplicate: Extra sample taken from same place, analyzed independently to document sampling precision. • Matrix Duplicate: Intralaboratory split sample used to document method precision in a given matrix
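  Duplicate precision is often summarized as a relative percent difference (RPD); radiochemistry labs may also use a duplicate error ratio that accounts for counting uncertainties. The RPD sketch below is an illustrative assumption, not a prescribed acceptance calculation, and the example values are hypothetical.

```python
# Illustrative sketch (assumption): relative percent difference (RPD)
# between a sample and its duplicate, a common precision check.
def relative_percent_difference(result_1, result_2):
    """RPD = |difference| / mean, expressed as a percentage."""
    mean = (result_1 + result_2) / 2
    return abs(result_1 - result_2) / mean * 100

# Example: gross alpha results of 4.8 and 5.3 pCi/L (illustrative values)
print(f"RPD = {relative_percent_difference(4.8, 5.3):.1f}%")
```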

  19. SPIKES • Spike: Known activity/nuclide addition to deionized water. • Matrix Spike: Known activity/nuclide addition to sample aliquot prior to preparation to document bias in a given matrix. (Matrix interference) • Matrix Spike Duplicate: Intralaboratory split sample with known additions prior to preparation to document precision and bias
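  Spike results are typically evaluated as percent recovery of the known activity added. The sketch below is a hedged illustration with hypothetical numbers, not the laboratory's actual acceptance calculation.

```python
# Illustrative sketch (assumption): matrix spike percent recovery,
# used to document bias (matrix interference) in a given matrix.
def percent_recovery(spiked_result, unspiked_result, spike_added):
    """Recovery of the known activity added to the sample aliquot."""
    return (spiked_result - unspiked_result) / spike_added * 100

# Example: 14.7 pCi/L measured in the spiked aliquot, 4.9 pCi/L in the
# unspiked sample, 10.0 pCi/L of activity added (illustrative values).
print(f"Recovery = {percent_recovery(14.7, 4.9, 10.0):.0f}%")
```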

  20. OTHER QA/QC COMPONENTS • Continuing Calibration Verification • Evaluates instrument drift • Second Source Reference Materials • Different source than used for calibration • Certified Reference Materials • Evaluate method bias • Various Sources: NIST best
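  A continuing calibration verification check usually compares the measured result for a known source against its accepted value and flags drift beyond a tolerance. The sketch below assumes a ±10% limit and hypothetical values; actual acceptance limits depend on the method and laboratory SOP.

```python
# Illustrative sketch (assumption): flag continuing calibration verification
# (CCV) results that drift more than a chosen tolerance from the known value.
def ccv_within_limits(measured, known, tolerance_percent=10.0):
    """Return True if the CCV result is within +/- tolerance_percent of known."""
    drift = abs(measured - known) / known * 100
    return drift <= tolerance_percent

# Example: a 100 dpm check source measured at 93.5 dpm (values assumed)
print(ccv_within_limits(93.5, 100.0))  # True: 6.5% drift, within +/-10%
```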

  21. QA/QC SUMMARY • You cannot do too much QA • Sample data without QA data have limited meaning • Each type of QA sample evaluates a different part of the analytical process • You must match reference materials to the media being analyzed • Labs that perform and report QA data usually produce reliable data

  22. ISSUES TO CONSIDER • Is the lab certified to perform the specific procedure? • Is the lab using the correct preparation and analysis methods for the DQO? • Can the lab achieve the required MDL? • Are the QA data (blanks, duplicates, spikes, reference materials, …) within defined limits?
