

  1. Methods and Applications of Uncertainty and Sensitivity Analysis H. Christopher Frey, Ph.D. Professor Department of Civil, Construction, and Environmental Engineering North Carolina State University Raleigh, NC 27695 Prepared for: Workshop on Climate Change Washington, DC March 7, 2005

  2. Outline • Why are uncertainty and sensitivity analysis needed? • Overview of methods for uncertainty analysis • Model inputs • Empirical data • Expert judgment • Model uncertainty • Scenario uncertainty • Overview of methods for sensitivity analysis • Examples • Technology assessment • Emission Factors and Inventories • Air Quality Modeling • Risk Assessment • Findings • Recommendations

  3. Why are uncertainty and sensitivity analysis needed? • Strategies for answering this question: • what happens when we ignore uncertainty and sensitivity? • what do decision makers want to know that motivates doing uncertainty and sensitivity analysis? • what constitutes best scientific practice? • Program and research managers may not care about all three, but might find at least one to be convincing (and useful)

  4. When is Probabilistic Analysis Needed or Useful? • Consequences of poor or biased estimates are unacceptably high • A (usually conservative) screening level analysis indicates a potential concern, but carries a level of uncertainty • Determining the value of collecting additional information • Uncertainty stems from multiple sources • Significant equity issues are associated with variability • Ranking or prioritizing significance of multiple pathways, pollutants, sites, etc. • Cost of remediation or intervention is high • Scientific credibility is important • Obligation to indicate what is known and how well it is known

  5. When is a Probabilistic Approach Not Needed? • When a (usually conservative) screening level analysis indicates a negligible problem • When the cost of intervention is smaller than the cost of analysis • When safety is an urgent and/or obvious issue • When there is little variability or uncertainty

  6. Myths: Barriers to Use of Methods • Myth: it takes more resources to do uncertainty analysis, we have deadlines, we don’t know what to do with it, let’s just go with what we have… • Hypothesis 1: poorly informed decisions based upon misleading deterministic/point estimates can be very costly, leading to a longer term and larger resource allocation to correct mistakes that could have been avoided or to find better solutions • Hypothesis 2: Uncertainty analysis helps to determine when a robust decision can be made versus when more information is needed first • Hypothesis 3: Uncertainty and sensitivity analysis help identify key weaknesses and focus limited resources to help improve estimates • Hypothesis 4: Doing uncertainty analysis actually reduces overall resource requirements, especially if it is integrated into the process of model development and applications

  7. Role of Modeling in Decision-Making • Modeling should provide insight • Modeling should help inform a decision • Modeling should be in response to clearly defined objectives that are relevant to a decision.

  8. Questions that Decision-Makers and Stakeholders Typically Ask • How well do we know these numbers? • What is the precision of the estimates? • Is there a systematic error (bias) in the estimates? • Are the estimates based upon measurements, modeling, or expert judgment? • How significant are differences between two alternatives? • How significant are apparent trends over time? • How effective are proposed control or management strategies? • What is the key source of uncertainty in these numbers? • How can uncertainty be reduced?

  9. Application of Uncertainty to Decision Making • Risk preference • Risk averse • Risk neutral • Risk seeking • Utility theory • Benefits of quantifying uncertainty: Expected Value of Including Uncertainty • Benefits of reducing uncertainty: Expected Value of Perfect Information (and others)
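
As an illustrative aside, here is a minimal sketch of an Expected Value of Perfect Information (EVPI) calculation, assuming Python with NumPy; the two-action payoff matrix and state probabilities are invented for illustration, not taken from the presentation.

```python
import numpy as np

# Hypothetical two-action decision under an uncertain state.
# States: "low impact" (p=0.6) and "high impact" (p=0.4); payoffs are invented.
p = np.array([0.6, 0.4])
payoff = np.array([[100.0, -50.0],   # action A: payoff in (low, high) state
                   [ 40.0,  30.0]])  # action B

# Best expected payoff when we must commit before the state is known.
ev_without_info = payoff @ p          # expected payoff of each action
best_without = ev_without_info.max()

# With perfect information we pick the best action for each realized state.
best_with = (payoff.max(axis=0) * p).sum()

evpi = best_with - best_without
print(f"EV without info: {best_without:.1f}")       # 40.0
print(f"EV with perfect info: {best_with:.1f}")     # 72.0
print(f"EVPI: {evpi:.1f}")                          # 32.0
```

Here EVPI bounds what it is worth paying to resolve the uncertainty before deciding; the Expected Value of Including Uncertainty compares the probabilistic decision against one based on point estimates.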

  10. Variability and Uncertainty • Variability: refers to the certainty that • different members of a population will have different values (inter-individual variability) • values will vary over time for a given member of the population (intra-individual variability) • Uncertainty: refers to lack of knowledge regarding • True value of a fixed but unknown quantity • True population distribution for variability • Both depend on averaging time

  11. Variability and Uncertainty • Sources of Variability • Stochasticity • Periodicity, seasonality • Mixtures of subpopulations • Variation that could be explained with better models • Variation that could be reduced through control measures

  12. Variability and Uncertainty • Sources of Uncertainty: • Random sampling error for a random sample of data • Measurement errors • Systematic error (bias, lack of accuracy) • Random error (imprecision) • Non-representativeness • Not a random sample, leading to bias in mean (e.g., only measured loads not typical of daily operations) • Direct monitoring versus infrequent sampling versus estimation, averaging time • Omissions • Surrogate data (analogies with similar sources) • Lack of relevant data • Problem and scenario specification • Modeling

  13. Overview of “State of the Science” • Statistical Methods Based Upon Empirical Data • Statistical Methods Based Upon Judgment • Other Quantitative Methods • Qualitative Methods • Sensitivity Analysis • Scenario Uncertainty • Model Uncertainty • Communication • Decision Analysis

  14. Statistical Methods Based Upon Empirical Data • Frequentist, classical • Statistical inference from sample data • Parametric approaches • Parameter estimation • Goodness-of-fit • Nonparametric approaches • Mixture distributions • Censored data • Dependencies, correlations, deconvolution
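
A minimal sketch of the parametric workflow above (parameter estimation plus a goodness-of-fit check), assuming Python with NumPy and SciPy; the lognormal choice and the synthetic data are illustrative only.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
data = rng.lognormal(mean=0.5, sigma=0.8, size=200)  # stand-in for empirical data

# Parameter estimation: fit a lognormal by maximum likelihood
# (location fixed at zero, as is common for strictly positive quantities).
shape, loc, scale = stats.lognorm.fit(data, floc=0)

# Goodness-of-fit: Kolmogorov-Smirnov test against the fitted distribution.
# (Fitting and testing on the same data makes the p-value optimistic.)
ks = stats.kstest(data, "lognorm", args=(shape, loc, scale))
print(f"fitted sigma={shape:.2f}, median={scale:.2f}, KS p-value={ks.pvalue:.3f}")
```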

  15. Statistical Methods Based Upon Empirical Data • Variability and Uncertainty • Sampling distributions for parameters • Analytical solutions • Bootstrap simulation
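
A minimal bootstrap sketch, assuming Python with NumPy: resampling the data with replacement yields a sampling distribution, and hence an uncertainty range, for a statistic such as the mean. The data here are synthetic.

```python
import numpy as np

rng = np.random.default_rng(42)
data = rng.lognormal(mean=1.0, sigma=0.5, size=30)  # small sample of a variable quantity

# Bootstrap: resample with replacement to characterize uncertainty in a
# statistic (here, the mean) due to random sampling error.
B = 5000
boot_means = np.array([
    rng.choice(data, size=data.size, replace=True).mean() for _ in range(B)
])

# 95% percentile confidence interval for the mean
lo, hi = np.percentile(boot_means, [2.5, 97.5])
print(f"sample mean = {data.mean():.2f}, 95% CI = ({lo:.2f}, {hi:.2f})")
```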

  16. Propagating Variability and Uncertainty • Analytical techniques • Exact solutions (limited applicability) • Approximate solutions • Numerical methods • Monte Carlo • Latin Hypercube Sampling • Other sampling methods (e.g., Hammersley, Importance, stochastic response surface method, Fourier Amplitude Sensitivity Test, Sobol’s method, Quasi-Monte Carlo methods, etc.)
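
A minimal propagation sketch, assuming Python with NumPy and SciPy: the same two-input model is evaluated with simple Monte Carlo sampling and with Latin Hypercube Sampling. The emission-factor/activity model and its distributions are invented for illustration.

```python
import numpy as np
from scipy.stats import qmc, norm, triang

n = 10_000

# Simple random (Monte Carlo) sampling of two uncertain inputs.
rng = np.random.default_rng(0)
ef = rng.lognormal(mean=np.log(5.0), sigma=0.3, size=n)  # e.g., emission factor
act = rng.triangular(90, 100, 120, size=n)               # e.g., activity level
emissions_mc = ef * act

# Latin Hypercube Sampling: stratify the unit hypercube, then invert the CDFs.
lhs = qmc.LatinHypercube(d=2, seed=0).random(n)
ef_lhs = np.exp(norm.ppf(lhs[:, 0], loc=np.log(5.0), scale=0.3))
act_lhs = triang.ppf(lhs[:, 1], c=(100 - 90) / (120 - 90), loc=90, scale=30)
emissions_lhs = ef_lhs * act_lhs

print(np.percentile(emissions_mc, [5, 50, 95]))
print(np.percentile(emissions_lhs, [5, 50, 95]))
```

LHS typically stabilizes the output percentiles with fewer samples than simple random sampling, which is why it is a common default in probabilistic assessments.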

  17. Monte Carlo Simulation • Probabilistic approaches are widely used • Monte Carlo simulation (and similar methods) is widely used • Why? • Extremely flexible • Inputs • Models • Relatively straightforward to conceptualize

  18. Tiered Approach to Analysis • Purpose of Analyses (examples) • Screening to prioritize resources • Regulatory decision-making • Research planning • Types of Analyses • Screening level point-estimates • Sensitivity Analysis • One-Dimensional Probabilistic Analysis • Two-Dimensional Probabilistic Analysis • Non-probabilistic approaches
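
A minimal sketch of a two-dimensional probabilistic analysis, assuming Python with NumPy: an outer loop samples uncertainty in distribution parameters, and an inner loop samples variability across a population. All distributions and parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(7)

# Two-dimensional (nested) Monte Carlo: the outer loop samples *uncertainty*
# (the unknown mean and std. dev. of the population), the inner loop samples
# *variability* (individual-to-individual differences).
n_unc, n_var = 200, 1000
percentiles = np.empty((n_unc, 3))
for i in range(n_unc):
    mu = rng.normal(2.0, 0.2)           # uncertain population mean (illustrative)
    sigma = rng.uniform(0.4, 0.6)       # uncertain population std. dev.
    pop = rng.normal(mu, sigma, n_var)  # variability across the population
    percentiles[i] = np.percentile(pop, [5, 50, 95])

# Uncertainty range (5th-95th) about each variability percentile:
for label, col in zip(["5th", "50th", "95th"], percentiles.T):
    lo, hi = np.percentile(col, [5, 95])
    print(f"{label} percentile of variability: {lo:.2f} to {hi:.2f}")
```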

  19. Methods Based Upon Expert Judgment • Expert Elicitation • Heuristics and Biases • Availability • Anchoring and Adjustment • Representativeness • Others (e.g., Motivational, Expert, etc.) • Elicitation Protocols • Motivating the expert • Structuring • Conditioning • Encoding • Verification • Documentation • Individuals and Groups • When Experts Disagree

  20. An Example of Elicitation Protocols: Stanford/SRI Protocol

  21. Key Ongoing Challenges • Expert Judgment vs. Data • Perception that judgment is more biased than analysis of available data • Unless data are exactly representative, they too could be biased • Statistical methods are “objective” in that the results can be reproduced by others, but this does not guarantee absence of bias • A key area for moving forward is to agree on conditions under which expert judgment is an acceptable basis for subjective probability distributions, even for rulemaking situations

  22. Appropriate Use of Expert Judgment in Regulatory Decision Making • There are examples…e.g., • analysis of health effects for EPA standards • Uncertainty in benefit/cost analysis (EPA, OMB) • Probabilistic risk analysis of nuclear facilities • Key components of credible use of expert judgment: • Follow a clear and appropriate protocol for selecting experts and for elicitation • For the conditioning step, consider obtaining input via workshop, but for encoding, work individually with experts – preferably at their location • Document (explain) the basis for each judgment • Compare judgments: identify key similarities and differences • Evaluate the implications of apparent differences with respect to decision objectives – do not “combine” judgments without first doing this • Where possible, allow for iteration

  23. Statistical Methods Based Upon Expert Judgment • Bayesian methods can incorporate expert judgment • Prior distribution • Update with data using likelihood function and Bayes’ Theorem • Create a posterior distribution • Bayesian methods can also deal with various complex situations: • Conditional probabilities (dependencies) • Combining information from multiple sources • Appears to be very flexible • Computationally, can be very complex • Complexity is a barrier to more widespread use
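
A minimal Bayesian-updating sketch, assuming Python with SciPy: an expert's prior, encoded as a conjugate Beta distribution, is updated with binomial data to give a posterior. The prior parameters and the data are invented for illustration.

```python
from scipy import stats

# Conjugate Beta-Binomial update: an expert's prior belief about a failure
# probability is encoded as Beta(a, b), then updated with observed data.
a_prior, b_prior = 2, 18          # expert judgment: failures are rare (~10%)
failures, trials = 4, 20          # new data: 4 failures in 20 trials

# Bayes' Theorem with a Beta prior and binomial likelihood has a closed form:
a_post = a_prior + failures
b_post = b_prior + trials - failures
posterior = stats.beta(a_post, b_post)

print(f"posterior mean = {posterior.mean():.3f}")
print(f"95% credible interval = {posterior.interval(0.95)}")
```

The conjugate case avoids the computational complexity noted on the slide; non-conjugate models typically require numerical methods such as Markov chain Monte Carlo.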

  24. Other Quantitative Methods • Interval Methods • Simple intervals • Probability bounds • Produce “optimally” narrow bounds – cannot be any narrower and still enclose all possible outcomes, including dependencies among inputs • Bounds can be very wide in comparison to confidence intervals
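
A minimal interval-arithmetic sketch in plain Python; full probability-bounds (p-box) analysis is more involved, but even simple intervals show how bounds widen under propagation. The input intervals are illustrative.

```python
# Propagate simple intervals through a sum and a product. The result encloses
# all possible outcomes, regardless of dependence between the inputs.
def add(a, b):
    return (a[0] + b[0], a[1] + b[1])

def mul(a, b):
    products = [a[0] * b[0], a[0] * b[1], a[1] * b[0], a[1] * b[1]]
    return (min(products), max(products))

x = (2.0, 4.0)    # input known only to lie in [2, 4]
y = (0.5, 1.5)    # input known only to lie in [0.5, 1.5]
print(add(x, y))  # (2.5, 5.5)
print(mul(x, y))  # (1.0, 6.0)
```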

  25. Other Quantitative Methods • Fuzzy methods • Representation of vagueness, rather than uncertainty • Approximate/semi-quantitative • Has been applied in many fields • Meta-analysis • Quantitatively combine, synthesize, and summarize data and results from different sources • Requires assessment of homogeneity among studies prior to combining • Produces data with larger sample sizes than the constituent inputs • Can be applied to summary data • If raw data are available, other methods may be preferred

  26. Scenario Uncertainty • A need for formal methods • Creativity, brainstorming, imagination • Key dimensions (e.g., human exposure assessment) • Pollutants • Transport pathways • Exposure routes • Susceptible populations • Averaging time • Geographic extent • Time Periods • Activity Patterns • Which dimensions/combinations matter, which ones don’t? • Uncertainty associated with mis-specification of a scenario – systematic error • Scenario definition should be considered when developing and applying models

  27. Model Uncertainty • Model Boundaries (related to scenario) • Simplifications • Aggregation • Exclusion • Resolution • Structure • Calibration • Validation, Partial validation • Extrapolation

  28. Model Uncertainty • Methods for Dealing with Model Uncertainty • Compare alternative models, but do not combine • Weight predictions of alternative models (e.g., probability trees) • Meta-models that degenerate into alternative models (e.g., Y = a(|x − t|)^n, where n determines linear versus nonlinear behavior and t determines whether there is a threshold)
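
A minimal sketch of the meta-model idea on this slide, assuming Python with NumPy; the parameter values are illustrative.

```python
import numpy as np

# Meta-model Y = a * |x - t|**n that degenerates into alternative structures:
# n = 1 gives a linear model, n != 1 a nonlinear one, and t > 0 introduces a
# threshold at x = t. (A variant using max(x - t, 0) instead of |x - t| would
# force Y = 0 below the threshold.)
def meta_model(x, a=1.0, t=0.0, n=1.0):
    return a * np.abs(x - t) ** n

x = np.linspace(0, 10, 5)
print(meta_model(x, a=2.0, t=0.0, n=1.0))  # linear, no threshold
print(meta_model(x, a=2.0, t=5.0, n=2.0))  # nonlinear with threshold at x = 5
```

Placing uncertainty distributions on t and n then lets a single probabilistic analysis span the competing model structures.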

  29. Weighting vs. Averaging of Models [Figure: two plots of probability density versus the output of interest. In the first, Model A and Model B each receive equal weight, so the combined result spans both models’ ranges. In the second, averaging the two models yields outcomes in an intermediate range that neither model supports.]

  30. Sensitivity Analysis • Objectives of Sensitivity Analysis (examples): • Help identify key sources of variability (to aid management strategy) • Critical control points? • Critical limits? • Help identify key sources of uncertainty (to prioritize additional data collection to reduce uncertainty) • What causes worst/best outcomes? • Evaluate model behavior to assist verification/validation • To assist in process of model development • Local vs. Global Sensitivity Analysis • Model Dependent vs. Model Independent Sensitivity Analysis • Applicability of methods often depends upon characteristics of a model (e.g., nonlinear, thresholds, categorical inputs, etc.)

  31. Examples of Sensitivity Analysis Methods • Mathematical Methods: assess sensitivity of a model output to the range of variation of an input (e.g., Nominal Range Sensitivity Analysis (NRSA), Differential Sensitivity Analysis, Conditional Sensitivity) • Statistical Methods: assess the effect of variance in inputs on the output distribution (e.g., Regression Analysis (RA), Analysis of Variance (ANOVA), Classification and Regression Trees (CART)) • Graphical Methods: represent sensitivity in the form of graphs, charts, or surfaces (e.g., Scatter Plots)

  32. Sensitivity Analysis Methods (Examples) • Nominal Range Sensitivity Analysis • Differential Sensitivity Analysis • Conditional Analysis • Correlation coefficients (sample, rank) • Linear regression (sample, rank, variety of basis functions possible) • Other regression methods • Analysis of Variance (ANOVA) • Categorical and Regression Trees (CART) (a.k.a. Hierarchical Tree-Based Regression) • Sobol’s method • Fourier Amplitude Sensitivity Test (FAST) • Mutual Information Index • Scatter Plots
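
A minimal sketch of correlation-based sensitivity indices from the list above, assuming Python with NumPy and SciPy: sample (Pearson) and rank (Spearman) correlations are computed between each input and the output of a toy model. The model and its input distributions are invented for illustration.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
n = 2000

# Monte Carlo sample of a toy model with three independent inputs.
x1 = rng.normal(1.0, 0.3, n)
x2 = rng.uniform(0.5, 1.5, n)
x3 = rng.lognormal(0.0, 0.2, n)
y = x1**2 + 0.5 * x2 + 0.1 * x3

# Sample (Pearson) and rank (Spearman) correlation sensitivity indices;
# rank correlations are more robust for monotonic nonlinear responses.
for name, x in [("x1", x1), ("x2", x2), ("x3", x3)]:
    r, _ = stats.pearsonr(x, y)
    rho, _ = stats.spearmanr(x, y)
    print(f"{name}: pearson={r:.2f}, spearman={rho:.2f}")
```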

  33. Sensitivity Analysis: Displays/Summaries • Scatter plots • Line plots/conditional analyses • Radar plots • Distributions (for uncertainty or variability in sensitivity) • Summary statistics • Categorical and regression trees • Apportionment of variance

  34. Guidance on Sensitivity Analysis Guidance for Practitioners, with a focus on food safety process risk models (Frey et al., 2004): • When to perform sensitivity analysis • Information needed depending upon objectives • Preparation of existing or new models • Defining the case study/scenarios • Selection of sensitivity analysis methods • Procedures for application of methods • Presentation and interpretation of results

  35. Summary of Evaluation Results for Selected Sensitivity Analysis Methods

  36. Example of Guidance on Selection of Sensitivity Analysis Methods Source: Frey et al., 2004, www.ce.ncsu.edu/risk/

  37. Example of Guidance on Selection of Sensitivity Analysis Methods

  38. Communication • Case Studies (scenarios) • Graphical Methods • Influence Diagrams • Decision Tree • Others • Summary statistics/data • Evaluation of effectiveness of methods for communication (e.g., Bloom et al., 1993; Ibrekk and Morgan, 1987)

  39. Example Case Studies • Technology Assessment • Emission Factors and Inventories • Air Quality Modeling • Risk Assessment

  40. Role of Technology Assessment in Regulatory Processes (Examples) • Assessment of ability of technology to achieve desired regulatory or policy goals (emissions control, safety, efficiency, etc.) • Evaluation of regulatory alternatives (e.g., based on model cost estimates) • Regulatory Impact Analysis – assessment of costs

  41. An Example of Federal Decision Making: Process Technology RD&D

  42. A Probabilistic Framework for Federal Process Technology Decision-Making

  43. Methodology for Probabilistic Technology Assessment • Process simulation of process technologies in probabilistic frameworks • Integrated Environmental Control Model (IECM) and derivatives • Probabilistic capability for ASPEN chemical process simulator • Quantification of uncertainty in model inputs • Statistical analysis • Elicitation of expert judgment • Monte Carlo simulation • Statistical methods for sensitivity analysis • Decision tree approach to comparing technologies and evaluating benefits of additional research

  44. Conceptual Diagram of Probabilistic Modeling [Figure: flow diagram of an engineering performance and cost model of a new process technology, here a coal gasification system (coal handling, gasifier, cyclones and fines recycle, hot gas desulfurization, sulfuric acid plant, gas turbine, HRSG and steam cycle, boiler feedwater treatment). Input uncertainties in performance and cost inputs propagate through the model to output uncertainties in performance, emissions, and cost.]

  45. Comparison of Probabilistic and Point-Estimate Results for an IGCC System

  46. Example of a Probabilistic Comparison of Technology Options Uncertainty in the difference in cost between two technologies, taking into account correlations between them

  47. Example: Engineering Study of Coal-Gasification Systems • DOE/METC Engineers • Briefing Packets: • Part 1: Uncertainty Analysis (9 pages) • Part 2: Process Area Technical Background (Lurgi Gasifier: 12 p., 16 ref.; KRW Gasifier: 19 p., 25 ref.; Desulfurization: 9 p., 19 ref.; Gas Turbine: 23 p., 36 ref.) • Part 3: Questionnaire • Follow-Up

  48. Examples of the Judgments of One Expert • Fines Carryover • Carbon Retention • Air/Coal Ratio

  49. Examples of the Judgments of Multiple Experts

  50. Do Different Judgments Really Matter? • Specific Sources of Disagreement: - Sorbent Loading - Sorbent Attrition • Qualitative Agreement in Several Cases
