
Representing and Analyzing Uncertainty in Large-Scale Complex System Models


Presentation Transcript


  1. Representing and Analyzing Uncertainty in Large-Scale Complex System Models Doug Allaire

  2. Outline • Modeling Uncertainty • Research Objectives • The Aviation Environmental Portfolio Management Tool • Global Sensitivity Analysis • Future Work

  3. Modeling Uncertainty • Uncertainties in modeling are unavoidable • Uncertainty should be properly represented • Estimates and predictions • Risk analysis • Cost-benefit analysis • Furthering model development • Uncertainty in models can be represented with three different methodologies • Probabilistic (aleatory) • Fuzzy sets (epistemic) • Fuzzy Randomness (both)

  4. Large-Scale Complex System Models • Multiple disciplines • Economics, aerodynamics, atmospheric science, … • Many inputs, many outputs • Systems of models of different character • Physics-based, empirical • Many assumptions • Subjective, objective • Computationally intensive • Characterizing, representing, quantifying, and accounting for uncertainty is key • To both development and application of models

  5. Uncertainty in Large-Scale Complex System Models Goals of a formal uncertainty analysis: • Further the development of a model • Identify gaps in functionality that significantly impact the achievement of model requirements, leading to the identification of high-priority areas for further development • Rank inputs based on contributions to output variability to inform future research • Inform decision-making • Provide quantitative evaluation of the performance of the model relative to fidelity requirements for various analysis scenarios • Properly represent different types of uncertainty in the model. This is especially important for complex systems that comprise multiple models from different disciplines, of different character, and with different assumptions

  6. What exists today • Probabilistic and fuzzy random uncertainty models • Rigorous model-independent sensitivity analysis method • Expensive • Several model-dependent sensitivity analysis methods • Inexpensive but difficult to ensure proper use • Limited research into the use of surrogate models in uncertainty analysis • Critical for models with long runtimes • Numerous sampling techniques for improving runtimes of probabilistic analyses

  7. Research Objectives • For specific large-scale, complex system models, identify how to properly represent and analyze uncertainty. • Systematically develop surrogate models for situations where proper representation and analysis of uncertainty is computationally prohibitive. Limitations and uncertainty associated with the use of a surrogate should be identified. • Use appropriate sampling techniques to improve computational costs in uncertainty analysis. Limitations and uncertainty associated with the use of chosen sampling methods should be identified.

  8. Aviation Environmental Portfolio Management Tool (APMT) • Inputs: policy scenarios (certification stringency, market-based measures, land-use controls, sound insulation), market scenarios (demand, fuel prices, fleet), environmental scenarios (CO2 growth), and technology and operational advances (CNS/ATM, NGATS, long-term technology forecasts) • Outputs, reported at global, regional, and airport-local scales: cost-effectiveness ($/kg NOx reduced, $ per person removed from 65 dB DNL, $/kg PM reduced, $/kg CO2 reduced), benefit-cost (health and welfare impacts, change in societal welfare in $), and distributional analyses of who benefits and who pays (consumers, airports, airlines, manufacturers, people impacted by noise and pollution, special groups, geographical regions)

  9. APMT approach (block diagram, flattened in the transcript) • Driven by policy and scenario inputs • Major blocks: AEDT (schedule & fleet, operations, emissions, airport-level noise and emissions tools, global noise assessment, global emissions inventories), EDS (vehicle noise, emissions, and cost design tools with a technology impact forecasting design tools interface), the APMT Partial Equilibrium Block (demand/consumers, supply/carriers, fares, new aircraft), and the APMT Benefits Valuation Block (climate impacts, local air quality impacts, noise impacts) • Outputs: monetized benefits and collected costs

  10. APMT Uncertainty Analysis Goals • AQ1. What are the key assumptions employed within the module? How do these assumptions translate into quantifiable uncertainty in module outputs? • AQ2. What are the key assumptions employed within the module databases? How do these assumptions translate into quantifiable uncertainty in module outputs? • AQ3. How do assumptions/limitations in modeling and databases impact the applicability of the module for certain classes of problems? What are the implications for future development efforts? • AQ4. How do uncertainties in module inputs propagate to uncertainties in module outputs? Further, what are the key inputs that contribute to variability in module outputs? • AQ5. For assumptions, limitations and inputs where effects cannot be quantified, what are the expected influences (qualitatively) on module outputs? • AQ6. How do assessment results translate into guidelines for use?

  11. Answering AQ4 Probabilistically • AQ4. How do uncertainties in module inputs propagate to uncertainties in module outputs? Further, what are the key inputs that contribute to variability in module outputs? • Assign factor distributions • Propagate uncertainty • Apportion output variance to model factors

  12. Assigning Factor Distributions • Example: APMT Aircraft Emissions Module • Distributions are assigned to factors such as EINOx, temperature, pressure, relative humidity, and fuel flow (the slide shows probability density plots; the plotted supports are roughly ±0.24 for EINOx, ±0.11 for temperature, ±0.03 for pressure, ±0.17 for relative humidity, and ±0.05 for fuel flow); a sampling sketch follows
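
As a rough sketch of how such factor distributions could be encoded for sampling (the symmetric triangular shape and the half-widths below are assumptions read off the plots, not APMT's actual specification):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical symmetric triangular distributions on the perturbation of each
# factor; the half-widths are read off the slide's plots and are assumptions,
# not APMT's actual specification.
factor_half_widths = {
    "EINOx": 0.24,
    "temperature": 0.11,
    "pressure": 0.03,
    "relative_humidity": 0.17,
    "fuel_flow": 0.05,
}

def sample_factors(n_samples):
    """Draw perturbation samples for every factor, one column per factor."""
    return np.column_stack([
        rng.triangular(-h, 0.0, h, size=n_samples)
        for h in factor_half_widths.values()
    ])

samples = sample_factors(10_000)  # shape (10000, 5), ready for propagation
```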

  13. Propagate Uncertainty • Assume a model g(T), defined over a finite range of T and zero otherwise (the slide shows a plot of g(T) versus T) • Would like to compute the mean value of g(T) • Trapezoidal rule: the quadrature error goes as O(N^(-2/s)) for N evaluations in s dimensions, so the cost of a fixed accuracy grows rapidly with the number of factors (the curse of dimensionality) [1], [2]

  14. Propagate Uncertainty: Monte Carlo Simulation • Assume the same model; compute the mean value of g(X) using random samples, where X is any factor (temperature, pressure, etc.) • Assume X is a random variable with a probability density function fX(x) (the slide shows a plot of g(X) versus X) • Monte Carlo mean estimate (for any number of factors): the sample average (1/N) Σ g(x_k) over N independent draws x_k from fX • Error goes probabilistically as O(1/√N), independent of the number of factors • Strong law of large numbers (convergence of the estimate) • Central limit theorem (probabilistic error bound); a code sketch follows
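
A minimal sketch of the Monte Carlo mean estimate above; the function g and the normal factor distribution are illustrative placeholders rather than APMT quantities:

```python
import numpy as np

rng = np.random.default_rng(1)

def g(x):
    """Placeholder model output as a function of one sampled factor."""
    return np.exp(-x**2) * np.sin(3.0 * x)

# Draw N independent samples of the factor X from its density fX (a standard
# normal here, standing in for temperature, pressure, etc.) and average g.
N = 100_000
x = rng.normal(loc=0.0, scale=1.0, size=N)
gx = g(x)

mean_estimate = gx.mean()
# Central-limit-theorem standard error: shrinks as 1/sqrt(N) regardless of
# how many factors the model has.
std_error = gx.std(ddof=1) / np.sqrt(N)
print(f"E[g(X)] ~ {mean_estimate:.4f} +/- {1.96 * std_error:.4f} (95% CI)")
```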

  15. Partitioning Output Variance Analysis of Variance (ANOVA) • Assumes a linear statistical model • Requires specification of all 2^S − 1 factor and factor-interaction terms to be comprehensive, where S is the number of factors Vary-all-but-one methods • Calculate factor variance contributions by fixing a factor and performing a Monte Carlo simulation • Requires specification of where to fix a factor • N(S+1) model evaluations Global Sensitivity Analysis • All factors varying • Computes total sensitivity indices for each factor in N(S+1) model evaluations using Monte Carlo simulation • Variance contributions take into account underlying factor distributions

  16. Global Sensitivity Analysis • What is the goal? • ANOVA Decomposition • Partitioning output variance • Monte Carlo Estimates • Total Sensitivity Indices

  17. The goal of Global Sensitivity Analysis • Ranking model factors on the basis of contribution to output variability • Useful for model development • Useful for understanding model outputs • The goal is to partition output variance amongst the factors of the model • Main effects • Interaction effects (pie chart: output variance partitioned into a Factor 1 main effect, a Factor 2 main effect, and a Factor 1/Factor 2 interaction)

  18–22. ANOVA Decomposition • High-Dimensional Model Representation (HDMR) of f(x) [4] • Example HDMR for a function of three parameters: f(x1,x2,x3) = f0 + f1(x1) + f2(x2) + f3(x3) + f12(x1,x2) + f13(x1,x3) + f23(x2,x3) + f123(x1,x2,x3) • f0 is the mean value, the fi(xi) are the main effects, the fij(xi,xj) are the first-order interactions, and f123(x1,x2,x3) is the second-order interaction

  23. Example of ANOVA Decomposition • Consider a function of independent inputs (the specific function shown on the slide is not reproduced in the transcript) • Then the ANOVA decomposition is built term by term from the HDMR above; an illustrative worked example is sketched below
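
Since the slide's own example is not recoverable here, the following is an illustrative stand-in (my choice of function, not necessarily the one presented): the ANOVA decomposition of f(x1, x2) = x1 + x2 + x1·x2 with x1, x2 independent and uniform on [0, 1].

```latex
% Illustrative ANOVA/HDMR decomposition of f(x_1,x_2) = x_1 + x_2 + x_1 x_2
% with x_1, x_2 independent and uniform on [0,1] (not the slide's original example).
\begin{align*}
f_0 &= \int_0^1\!\!\int_0^1 f(x_1,x_2)\,dx_1\,dx_2 = \tfrac{5}{4},\\
f_1(x_1) &= \int_0^1 f(x_1,x_2)\,dx_2 - f_0 = \tfrac{3}{2}x_1 - \tfrac{3}{4},\\
f_2(x_2) &= \int_0^1 f(x_1,x_2)\,dx_1 - f_0 = \tfrac{3}{2}x_2 - \tfrac{3}{4},\\
f_{12}(x_1,x_2) &= f(x_1,x_2) - f_0 - f_1(x_1) - f_2(x_2)
               = \bigl(x_1-\tfrac{1}{2}\bigr)\bigl(x_2-\tfrac{1}{2}\bigr),
\end{align*}
% with partial variances D_1 = D_2 = 3/16, D_{12} = 1/144, and
% total variance D = D_1 + D_2 + D_{12} = 55/144.
```

Each component has zero mean and the components are mutually orthogonal, so the partial variances add up to the total variance, which is exactly the property global sensitivity analysis exploits.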

  24. Calculating Variances • Given a function f(x) that is square integrable over the unit hypercube, the total variance is D = ∫ f(x)^2 dx − f0^2 • Similarly, the partial variances are D_i1…is = ∫ f_i1…is(x_i1, …, x_is)^2 dx_i1 … dx_is • Example for a function of two parameters: D = D1 + D2 + D12

  25. The goal of global sensitivity analysis • The goal is to partition output variance amongst the factors of the model • Main effects • Interaction effects (pie chart: total variance D partitioned into D1, D2, and the Factor 1/Factor 2 interaction contribution D12)

  26–28. Monte Carlo Estimates • The expected value of f(x) may be estimated from the sample mean: f0 ≈ (1/N) Σ f(x_k) • The variance of f(x) may then be estimated from: D ≈ (1/N) Σ f(x_k)^2 − f0^2 • For the one-indexed terms the variance may be estimated from: Di ≈ (1/N) Σ f(x_ik, u_k) f(x_ik, u'_k) − f0^2, where x_ik is the i-th factor in the k-th sample and u_k, u'_k are two independent samples of all the remaining factors [3], [4]

  29. Total Sensitivity Indices • Sensitivity indices: S_i1…is = D_i1…is / D • Total sensitivity indices: STi = 1 − D~i / D, where D~i is the summed variance contribution of all terms that do not involve factor i [3], [4]; a code sketch of the estimator follows
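
A minimal sketch of a pick-and-freeze estimator for the total sensitivity indices in N(S+1) model evaluations, assuming independent factors uniform on [0, 1]; the three-factor model is a hypothetical stand-in, and estimating f0 and D from the baseline sample alone is a simplification of the estimators in [3], [4]:

```python
import numpy as np

rng = np.random.default_rng(2)

def model(x):
    """Hypothetical 3-factor test model on [0,1]^3, standing in for an APMT module."""
    x1, x2, x3 = x[:, 0], x[:, 1], x[:, 2]
    return x1 + 2.0 * x2 + x1 * x2 + 0.5 * x3

def total_sensitivity_indices(f, n_factors, n_samples):
    """Estimate S_Ti = 1 - D_~i / D with a pick-and-freeze Monte Carlo
    estimator, using N*(S+1) model evaluations in total."""
    A = rng.random((n_samples, n_factors))   # baseline sample of all factors
    B = rng.random((n_samples, n_factors))   # independent resample
    fA = f(A)
    f0, D = fA.mean(), fA.var()
    S_total = np.empty(n_factors)
    for i in range(n_factors):
        # Keep every factor except i from A and resample factor i from B;
        # the mean of the product then estimates D_~i + f0^2.
        AB = A.copy()
        AB[:, i] = B[:, i]
        D_not_i = np.mean(fA * f(AB)) - f0**2
        S_total[i] = 1.0 - D_not_i / D
    return S_total

print(np.round(total_sensitivity_indices(model, 3, 200_000), 3))
```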

  30. Total Sensitivity Indices: APMT Results (BVB-Noise Module Analysis and BVB-Climate Module Analysis) • Stacked bar chart of total sensitivity indices for the outputs Integrated Temperature Change, Damage, and Damage NPV • Contributing factors shown include climate sensitivity, RF* short-lived, the damage coefficient, and others; the largest single contributions are roughly 0.67–0.69 per output (full chart values not reproduced)

  31. Future Work: Uncertainty Model Selection • Two parallel tracks, one probabilistic and one fuzzy random, each following the same steps: problem statement (define model, define methods), surrogate requirements and sampling requirements, comparison of methods on test functions, and application to a large-scale model (an APMT module) • The results of the two tracks are then compared

  32. Questions?

  33. References [1] http://www.krellinst.org/UCES/archive/modules/potential/trap/index.html [2] http://www.statisticalengineering.com/curse_of_dimensionality.htm [3] T. Homma and A. Saltelli, "Importance Measures in Global Sensitivity Analysis of Nonlinear Models," Reliability Engineering and System Safety, 52 (1996), pp. 1–17. [4] I.M. Sobol', "Global Sensitivity Indices for Nonlinear Mathematical Models and their Monte Carlo Estimates," Mathematics and Computers in Simulation, 55 (2001), pp. 271–280.

  34. Aviation Environmental Portfolio Management Tool (APMT): Motivation • Aviation benefits and environmental effects result from a complex system of interdependent technologies, operations, policies and market conditions • Community responses, policy and R&D options typically considered in a limited context • only noise, only local air quality, only climate change • only partial economic effects • Actions in one domain may produce unintended negative consequences in another • Tools and processes do not support recommended practice • NPV of benefits-costs is recommended basis for informing policy decisions in U.S., Canada and Europe

  35. ANOVA Decomposition (Cont.) • A high-dimensional model representation is unique if the integral of every component function over any one of its own variables is zero, ∫ f_i1…is(x_i1, …, x_is) dx_ik = 0 for each k in {i1, …, is}, where the factors are independent and integration is over the range of x_ik • It follows that the individual functions are orthogonal • The expected value of each individual function is zero, e.g. E[fi(xi)] = 0 • The expected value of f(x) is then f0

  36. Improving Runtime: Surrogate Modeling • A surrogate is a less expensive, (often) lower-fidelity model that represents the system input/output behavior • Surrogates are often classified into three categories: • Data fit, e.g. response surface models, Kriging models → EDS • Reduced-order models, derived using system mathematical structure • Hierarchical models, e.g. coarser grid, neglected physics, aggregation → AEDT • Why surrogates? • If one analysis takes 1 minute, then a Monte Carlo simulation with 10,000 samples takes 10,000 minutes ≈ 7 days • A global sensitivity analysis requiring a Monte Carlo simulation for each factor would take 7(S+1) days, where S is the number of factors • Relating uncertainty analysis results obtained with a surrogate model to results from the full model is critical (a data-fit sketch follows)
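
A minimal sketch of the data-fit category above: fit a quadratic response surface to a modest number of runs of an expensive model, then use the cheap fit inside Monte Carlo (the model function and factor ranges are hypothetical placeholders, not EDS or AEDT interfaces):

```python
import numpy as np

rng = np.random.default_rng(3)

def expensive_model(x):
    """Placeholder for a long-running analysis (not an actual EDS/AEDT call)."""
    return np.sin(x[:, 0]) + 0.5 * x[:, 1] ** 2 + 0.1 * x[:, 0] * x[:, 1]

# Data fit: run the expensive model a modest number of times, then fit a
# quadratic response surface to those runs by least squares.
n_train = 200
X = rng.uniform(-1.0, 1.0, size=(n_train, 2))
y = expensive_model(X)

def quad_features(x):
    x1, x2 = x[:, 0], x[:, 1]
    return np.column_stack([np.ones(len(x)), x1, x2, x1 * x2, x1**2, x2**2])

coef, *_ = np.linalg.lstsq(quad_features(X), y, rcond=None)

def surrogate(x):
    """Cheap response-surface approximation of the expensive model."""
    return quad_features(x) @ coef

# Use the surrogate inside a large Monte Carlo study; its approximation error
# should be checked against held-out expensive runs before trusting results.
X_mc = rng.uniform(-1.0, 1.0, size=(100_000, 2))
print("surrogate-based mean estimate:", surrogate(X_mc).mean())
```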

  37. Surrogate Modeling Approach for APMT • Adaptive approach for the Aircraft Emissions Module • Monte Carlo simulation of a single day of flights • Build up from a small number of flights to a representative set • Statistical approach for the Aircraft Performance Module • Determine critical factors using expert opinion • Fuel burn per meter • Emissions per meter • Segment-level thrust distribution • … • Group aircraft/engine pairs based on statistical similarities • Select a representative from each group • Important questions to ask • How do these surrogates impact Monte Carlo results? • How do these surrogates impact global sensitivity results?

  38. Improving Runtime: Sampling Methods • Some well-known methods: • Brute-force Monte Carlo methods • Use pseudorandom numbers on [0,1] and the inversion method • The probabilistic error bound is O(1/√N) • Quasi-Monte Carlo methods • Use well-chosen deterministic points rather than random samples • Low-discrepancy sequences (e.g. Halton, Hammersley) • Deterministic error bound, O((log N)^s / N) for s dimensions • Stratification: Latin Hypercube Sampling • Stratify on all factor dimensions simultaneously (see the sketch below) • Leads to lower variance in integral estimates than independent, identically distributed samples • Probabilistic error bound still O(1/√N), with a smaller constant • Things to keep in mind • How bias in sampling impacts analysis metrics • Obtaining more samples if a metric has not converged sufficiently
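
A minimal sketch of Latin hypercube sampling as described above, with the uniform design mapped to a factor distribution by the inversion method (the normal temperature distribution at the end is a hypothetical example):

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(4)

def latin_hypercube(n_samples, n_factors):
    """Latin hypercube design on [0,1)^s: each factor axis is split into
    n_samples equal strata, one point per stratum, with the stratum order
    independently permuted along every factor dimension."""
    u = rng.random((n_samples, n_factors))                     # jitter within strata
    points = (np.arange(n_samples)[:, None] + u) / n_samples   # one point per stratum
    for j in range(n_factors):
        points[:, j] = rng.permutation(points[:, j])
    return points

samples = latin_hypercube(1_000, 5)  # uniform LHS on [0,1)^5

# Map the uniform samples to factor distributions via the inversion method,
# e.g. a (hypothetical) normally distributed temperature factor:
temperature = norm(loc=288.15, scale=2.0).ppf(samples[:, 0])
```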
