
How the CRASH project has addressed the fall 2008 review recommendations




  1. How the CRASH project has addressed the fall 2008 review recommendations Some detail here, with reference to more material in fall 2009 presentations.

  2. Format: Title gives report section • Recommendations summarized in major bullets • What we did in minor bullets

  3. Overall Project p. 1 • Consult systems engineer to develop schedule and resource loaded plan • We consulted a systems engineer and followed his recommendations • Construct a codes flow chart to track progress • The code elements are discussed in the Toth talk. Since tests are committed with new code elements, the Myra talk best shows progress. • UQ analysis should set directions for project • This has been essential in decisions to date and is key to the plans for the next year (Drake talk, Holloway talk) • Consider finding further experimental signatures • In process, mainly under synergistic funding. (Drake talk, Huntington and Doss posters)

  4. Overall Project p. 2 • Exercise a simple version of the entire “UQ Loop” in this year just past • Done (Holloway and Bingham talks, several posters) • Expand the effort on analytic physics • We have done this (discussed in Drake talk) • Create a CRASH Primer to capture analytic work • Now exists (distributed to committee), will be extended • Be flexible in defining experimental variations • Nothing is locked in place now

  5. Codes and Models p. 1 • Document and defend physics and numerical choices • Addressing the physics in the CRASH Primer • Numerics addressed in source code documentation, the code manual, and stand-alone documents for substantial elements (e.g. the gray flux-limited diffusion approach) • Clarify who is writing PDT at TAMU and who is the integrating physicist at UM • Daryl Hawkins (TAMU) and Eric Myra (UM), both full FTEs • Risk mitigation: functional multigroup diffusion at UM • “The rubber meets the road” with the CRASH code • We have four research scientists at UM contributing to the CRASH code development (Drake talk) • We attend to the balance of effort between “UQ” and codes • Maintain focus on project goals • The management team does this (Drake, Powell, Holloway talks)

  6. Codes and Models p. 2 • Start off with 1D models, or otherwise start simply • We did this and do this (Holloway & Bingham talks) • Attempt to quantify the uncertainties between physics and numerics • The project is just now ready to evolve in this direction • Code elements recently put in place will let us do intra-code studies to bound/assess errors due to numerical resolution and model fidelity • Grid convergence studies in space, time, and number of groups • Dimensionality effects (1D/2D/3D) • Solver choice effects (e.g. HLLE vs. Godunov) • Design of these experiments will be very different from current UQ runsets; however, we will measure effects of the above on the outputs that are used in the UQ runsets
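The grid convergence studies mentioned above are typically quantified with Richardson extrapolation: compare one scalar output on three refinement levels to estimate the observed order of accuracy. The sketch below is a generic, self-contained illustration; the model output f(h) is invented and is not a CRASH quantity.

```python
# Minimal sketch of a grid-convergence check via Richardson extrapolation.
# The "solution" f(h) below is a manufactured stand-in, not CRASH output.
import math

def observed_order(f_coarse, f_medium, f_fine, r=2.0):
    """Estimate the observed order of accuracy p from three grid levels
    related by a uniform refinement ratio r."""
    return math.log(abs(f_coarse - f_medium) / abs(f_medium - f_fine)) / math.log(r)

def richardson_extrapolate(f_medium, f_fine, p, r=2.0):
    """Extrapolate toward the zero-grid-spacing limit using order p."""
    return f_fine + (f_fine - f_medium) / (r**p - 1.0)

# Toy second-order behavior: f(h) = 1.0 + 0.5*h**2
f = lambda h: 1.0 + 0.5 * h**2
p = observed_order(f(0.4), f(0.2), f(0.1))
print(round(p, 3))  # observed order, ~2 for this toy scheme
print(round(richardson_extrapolate(f(0.2), f(0.1), p), 6))  # ~1.0, the exact limit
```

The same recipe applies to refinement in time step or number of energy groups; only the refinement ratio and the output being tracked change.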

  7. V, V, & UQ p.1 • Overall comment on this area by us • The review team recommended many actions in the area of testing and related documentation, so many that this could have absorbed all the project resources. Our commitment to testing and documentation is covered in talks by Powell, Toth, and Myra. However, we did not allow this to prevent us from pursuing the recommendations of developing the CRASH code into the “reasonably good, integrated code” or from completing a “UQ loop” in this past year. • Measure and track the status of each software component, with reproducible evidence and drill-down capability • Eric Myra’s talk discusses the verification tests submitted with each software component; Gabor Toth’s talk discusses the testing approach and reproducibility. • As mentioned in Gabor’s talk, we have made some, but very limited, progress in the direction of “drill-down” capability

  8. V, V, & UQ p.2 • Use more precise performance measurements and develop new manufactured solutions • Eric Myra’s talk discusses the quantitative nature of the verification tests. Our emphasis has been on new verification tests based on analytic solutions and grid convergence • We have not yet developed new manufactured solutions • Show the testing path from initial model development to full release of the code • Touched on briefly in Ken Powell’s and Gabor Toth’s talks. Tests committed with new units, and run automatically in daily and weekly tests. Branches tagged for UQ release; UQ runsets make use of a particular tagged branch • Construct a test matrix • This is shown in Myra’s talk
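For readers unfamiliar with the method of manufactured solutions mentioned above, the idea is: choose an analytic solution, derive the forcing term that makes it satisfy the equation, then confirm the solver reproduces it at the expected convergence rate. The 1D Poisson example below is a generic textbook illustration, not a CRASH test.

```python
# Manufactured-solution sketch: pick u(x) = sin(pi x), so -u'' = f requires
# f = pi^2 sin(pi x) on [0,1] with u(0) = u(1) = 0. Solve with second-order
# central differences and check the error drops ~4x when h is halved.
import math

def solve_mms(n):
    """Solve -u'' = pi^2 sin(pi x) on n interior points; return max error."""
    h = 1.0 / (n + 1)
    x = [(i + 1) * h for i in range(n)]
    rhs = [math.pi**2 * math.sin(math.pi * xi) * h * h for xi in x]
    # Thomas algorithm for the tridiagonal system with stencil (-1, 2, -1)
    a, b, c = -1.0, 2.0, -1.0
    cp, dp = [0.0] * n, [0.0] * n
    cp[0], dp[0] = c / b, rhs[0] / b
    for i in range(1, n):
        m = b - a * cp[i - 1]
        cp[i] = c / m
        dp[i] = (rhs[i] - a * dp[i - 1]) / m
    u = [0.0] * n
    u[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        u[i] = dp[i] - cp[i] * u[i + 1]
    return max(abs(u[i] - math.sin(math.pi * x[i])) for i in range(n))

e1, e2 = solve_mms(19), solve_mms(39)  # h = 1/20 and h = 1/40
print(round(e1 / e2, 1))  # ~4.0: second-order convergence confirmed
```

The same bookkeeping, with a symbolically derived source term, extends to time-dependent and multigroup equations.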

  9. V, V, & UQ p.3 • Consider alternative approaches to solution verification in addition to adjoint methods • Adjoint methods are being developed by a professor and student; they are not on the critical path. • One of the posters shows some recent results of this work • Determine what and how NNSA UQ software will be incorporated • While we are using software with some parallel origins (talks by Holloway, Bingham, related posters), we now do not expect to incorporate anything directly • Investigate alternative approaches to Bayesian inference • Bayesian inference is used in constructing a predictive model for the key outputs, including in finding hyperparameters for Gaussian Process models. This model informs physics parameter calibration and provides a measure of discrepancy from reality • Non-Bayesian techniques are also used in our analysis (e.g. MARS and MART for constructing response surfaces) • The overriding issue for the project is to have an approach that we ourselves can implement and work with
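As a deliberately tiny illustration of the Bayesian calibration described above: combine a prior over a physics parameter with a likelihood of the observations given the simulator, and read off the posterior. The "simulator" below is an invented stand-in, not the CRASH code, and real practice would evaluate an emulator (e.g. a Gaussian Process) rather than the code itself.

```python
# Hypothetical sketch of grid-based Bayesian calibration of one parameter
# theta, with a uniform prior and a Gaussian likelihood. The forward model
# is a toy stand-in for a code output.
import math

def simulator(theta, x):
    """Toy forward model standing in for a code output at setting x."""
    return theta * x + 0.1 * x * x

def posterior(observations, sigma, grid):
    """Normalized posterior over a grid of theta values (uniform prior)."""
    logp = [sum(-0.5 * ((y - simulator(theta, x)) / sigma) ** 2
                for x, y in observations)
            for theta in grid]
    m = max(logp)                       # shift for numerical stability
    w = [math.exp(l - m) for l in logp]
    z = sum(w)
    return [wi / z for wi in w]

# Synthetic observations generated with true theta = 2.0
obs = [(x, simulator(2.0, x)) for x in (0.5, 1.0, 1.5, 2.0)]
grid = [1.0 + 0.01 * i for i in range(201)]        # theta in [1, 3]
post = posterior(obs, sigma=0.05, grid=grid)
theta_map = grid[post.index(max(post))]
print(round(theta_map, 2))  # MAP estimate recovers the true value, 2.0
```

The discrepancy term mentioned in the slide would enter as an additional model for the residual between simulator and reality; it is omitted here for brevity.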

  10. V, V, & UQ p.4 • Consider terminology issues • Predictive science is more than prediction. We argue that a scientific or engineering prediction consists of both sensitivity measures and an estimate of output pdfs or ranges corresponding to known input distributions and ranges • Our center’s philosophy centers on improvement in predictions, as measured by improved overlap of output pdfs with measurement uncertainty bands • Predictions are ideally improved by improvements in physics modeling, in numerics, and in physics constants • Statistical analysis tells us where improvements are required • “Validated” is the unachievable ideal in which a code and its input distributions are known sufficiently to predict the output distributions in agreement with reality • Make sure that the long-term effort is a challenge, but one that can be met • We have paid attention to this and it still looks good to us.
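The definition of a prediction given above (output pdfs or ranges induced by known input distributions) can be sketched with plain Monte Carlo propagation. The toy model and input distributions below are invented purely for illustration.

```python
# Propagate assumed input distributions through a toy model to obtain an
# output mean and a 95% range. The model is a stand-in, not a CRASH output.
import random
import statistics

def toy_output(density, velocity):
    """Invented stand-in for a code output of interest."""
    return velocity ** 2 / density

random.seed(0)  # fixed seed for reproducibility
samples = sorted(toy_output(random.gauss(1.0, 0.05),   # density ~ N(1.0, 0.05)
                            random.gauss(10.0, 0.5))   # velocity ~ N(10, 0.5)
                 for _ in range(20000))
mean = statistics.mean(samples)
lo = samples[int(0.025 * len(samples))]
hi = samples[int(0.975 * len(samples))]
print(f"mean={mean:.1f}, 95% range=[{lo:.1f}, {hi:.1f}]")
```

Comparing such an output range against a measurement uncertainty band is the overlap criterion the slide describes.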

  11. Culture change • Apply UQ methods to problems less ambitious than the full CRASH problem • This is what we did (Holloway & Bingham talks) • Articulate achievements in predictive science in ways that can be appreciated by non-experts • We are pursuing this in non-UQ conference presentations
