The Grey Zone
Grey Zone committee: Pier Siebesma, Martin Miller, Andy Brown, Jeanette Onvlee
Royal Netherlands Meteorological Institute / Technical University Delft, The Netherlands
siebesma@knmi.nl
Outline: The Grey Zone · The Project · Massive GPU computing
Most parameterisation issues of clouds and convection revolve around finding the subgrid pdf • or, as a good approximation, the variances and covariances • as a function of the model resolution used (especially in the grey zone). [Figure panels at Δx = Δy = 4 km, 2 km, and 1 km.]
A posteriori coarse graining • Use a reference (LES) model that resolves the desired phenomenon (Δx << l_phen) • with a domain size much larger than the desired phenomenon (L >> l_phen) • Coarse grain the (co)variances across these scales: at l = L everything is subgrid, at l = Δx everything is resolved (a minimal sketch follows below).
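As an illustration of this static coarse graining, here is a minimal numpy sketch that block-averages two fields to a coarser grid and splits their total covariance (the flux) into a resolved and a subgrid part. The field names, grid size and synthetic data are assumptions for illustration only, not the LES output format of the project.

```python
# Minimal a posteriori coarse-graining sketch (assumed field names and sizes,
# not the actual LES output format used in the project).
import numpy as np

def block_average(field, n):
    """Average an (Ny, Nx) field over n x n blocks (n must divide both dims)."""
    ny, nx = field.shape
    return field.reshape(ny // n, n, nx // n, n).mean(axis=(1, 3))

def flux_partition(w, q, n):
    """Split the total covariance <w'q'> into resolved and subgrid parts
    for a coarse-graining factor n (one coarse cell = n x n fine cells)."""
    total = np.mean(w * q) - np.mean(w) * np.mean(q)            # total covariance
    wc, qc = block_average(w, n), block_average(q, n)           # coarse-grained fields
    resolved = np.mean(wc * qc) - np.mean(wc) * np.mean(qc)     # covariance of the coarse fields
    return resolved, total - resolved                           # resolved, subgrid

# Example with synthetic data on a 512 x 512 slab (with dx = 100 m, n = 4 -> l = 400 m)
rng = np.random.default_rng(0)
w = rng.standard_normal((512, 512))
q = 0.5 * w + rng.standard_normal((512, 512))
for n in (1, 4, 16, 64, 512):
    res, sub = flux_partition(w, q, n)
    print(f"l = {n:>3} cells: resolved = {res:.3f}, subgrid = {sub:.3f}")
```

At n = 1 (l = Δx) the flux is entirely resolved, at n = 512 (l = L) it is entirely subgrid, matching the limits stated above.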
Example 1: A posteriori analysis of LES for shallow convection
[Figure: standard deviation of the subgrid and resolved fluxes as a function of the coarse-grained length scale l, with 1000 m marked.]
Dorrestijn, Siebesma, Crommelin and Jonker (2012)
Example 2: A priori analysis for LES for shallow convection • A posteriori (static) coarse graining: coarse graining the output of one high-resolution run • A priori (dynamic) coarse graining: repeating the runs at coarser resolutions. Subgrid flux: up to 800 m the results are similar to those of the a posteriori analysis.
Example 2: A priori analysis for LES for shallow convection • A posteriori coarse graining: statically coarse graining the output of one high-resolution run • A priori coarse graining: repeating the runs at coarser resolutions • The turbulent flux is overestimated at resolutions Δx > 1000 m • because the subgrid (parameterized) fluxes are overestimated • which in turn is because the LES subgrid flux parameterization is designed to work at eddy-resolving scales l < 1000 m and not beyond (see the illustrative closure below).
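One way to see why the overestimation is expected: eddy-diffusivity closures of Smagorinsky type scale the subgrid mixing with the filter width, so the parameterized flux keeps growing once the filter exceeds the energy-containing eddies (roughly 1 km for shallow cumulus). This is only an illustrative closure, not necessarily the one used in the LES models of the intercomparison.

```latex
% Illustrative Smagorinsky-type closure (an assumption, not the exact LES closure):
% the eddy viscosity grows with the filter width \Delta, so the down-gradient
% subgrid flux keeps increasing when \Delta is pushed beyond ~1 km.
\nu_t = (C_s \Delta)^2 \, |S|,
\qquad
\overline{w'\varphi'}_{\mathrm{sgs}} \;\approx\; -\,\frac{\nu_t}{\mathrm{Pr}_t}\,
\frac{\partial \overline{\varphi}}{\partial z}
```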
Example 2: A priori analysis for LES for shallow convection
[Figure: fields from runs at 6.4 km and at 100 m resolution.]
Conclusions • Each phenomenon has its own "grey zone", typically centred around its characteristic spatial scale • Stochastic effects are most important within the grey zone but extend to scales beyond it • Static (a posteriori) coarse graining is instructive but should not be confused with dynamic (a priori) coarse graining.
Proposal (from the WGNE 2010 meeting in Japan) • Project driven by a few expensive experiments (CONTROLS) on a large domain at an ultra-high resolution (Δx = 100–500 m). • Coarse grain the output and diagnostics (fluxes etc.) at resolutions of 0.5, 1, 2, 4, 8, 16 and 32 km (a posteriori coarse graining: COARSE); a small sketch of this step follows below. • Repeat the CONTROLS at 0.5, 1, 2, 4, 8 km, etc. without convection parametrizations (a priori coarse graining: NOPARAMS). • Run at (coarse-grain) resolutions of, say, 0.5, 1, 2, 4 and 8 km with convection parametrizations (a priori coarse graining: PARAMS).
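A tiny sketch of how the COARSE resolution ladder maps onto coarse-graining factors. The control resolution of 500 m and the reuse of the flux_partition routine from the earlier sketch are assumptions for illustration.

```python
# Sketch of the COARSE step: map the project's target resolutions onto
# coarse-graining factors for an assumed control run at dx = 500 m.
dx_control_m = 500.0                      # assumed control resolution
targets_km = [0.5, 1, 2, 4, 8, 16, 32]    # coarse-graining resolutions from the proposal

for l_km in targets_km:
    n = int(round(l_km * 1000.0 / dx_control_m))   # fine cells per coarse cell
    print(f"l = {l_km:>4} km  ->  block size n = {n} x {n} control cells")

# Each factor n can then be fed to a block-averaging routine such as
# flux_partition() in the earlier sketch to produce the COARSE diagnostics.
```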
Aims • Gain systematic insight into, and understanding of, how models behave in the grey zone with and without conventional convection parameterisations • Provide guidance and a benchmark for the design of scale-aware convection parameterisations that can operate in the grey zone.
Case proposal: a cold-air outbreak. Thanks to: Paul Field, Adrian Hill (Met Office), S. de Roode (Technical University Delft, The Netherlands) • The mesoscale community is interested in starting with an extra-tropical case • Cold-air outbreaks are of general interest to various communities • Proposal: the CONSTRAIN cold-air outbreak experiment of 31 January 2010 • Participation of global models and mesoscale models, but also of LES models! • Domain of interest: 750 × 1500 km • Fast transition: ~14 hours.
Case proposal: a cold-air outbreak, three different flavours • Global simulations (at the highest possible resolution, up to 3–10 km) • Mesoscale models (Eulerian) at various resolutions, LAM set-up • Mesoscale/LES models (Lagrangian), idealised with periodic BCs, highest resolution ~200 m.
Attacking the cloud-feedback problem: large interest from all modelling communities.
Pro’s/Cons of the case • Cold air outbreak has been on the wishlist of the NWP community for many years • Excellent opportunity to bring the mesoscale and the LES community together. • It’s only now that high resolution models are able to faithfully reproduce the observed mesoscale structures associated with a cold air outbreak in a turbulence resolving mode. • There is a well observed case available (CONSTRAIN) • Case is complicated because of the (ice) microphysics . • Institutes who could do the Eulerian Case in a turbulent resolving resolution are encouraged to do so. (4000X7500X128 points) see next section • Might not have been the first obvious choice for a grey zone perspective from the point of view of global models (Deep convection might have been a more obvious choice for global modelers, but different cases can be considered in the Grey Zone Project)
Status
Extensive tests have been done for all three flavours over the last year. The case description is now on the web:
http://www.knmi.nl/~samenw/greyzone
http://appconv.metoffice.com/cold_air_outbreak/constrain_case/home.html
Volunteers for coordinating and analysing the subprojects have been identified:
• Global model runs: Axel Seifert (MPI Hamburg)
• Mesoscale model runs (Eulerian): Paul Field / Adrian Hill (Met Office)
• LES/mesoscale model runs (Lagrangian): Stephan de Roode / Pier Siebesma (TU Delft)
Time line and concluding remarks • Submission deadline: April 2013 • Meeting to discuss the results: second half of 2013 • The Grey Zone Project should also consider other cases • Further info: siebesma@knmi.nl; website: www.knmi.nl/samenw/greyzone
3. High-Resolution Computing: or, can we do the Eulerian version of the cold-air outbreak at a turbulence-resolving scale (~100 m)? A rough size estimate follows below.
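A rough back-of-the-envelope estimate of the problem size, using the 750 × 1500 km domain and 128 vertical levels quoted earlier. The number of prognostic fields and the use of single precision are illustrative assumptions.

```python
# Rough size estimate for the Eulerian cold-air outbreak at turbulence-resolving
# resolution. Domain 750 x 1500 km and 128 levels taken from the case description
# (the 4000 x 7500 x 128 grid quoted earlier corresponds to roughly 200 m spacing).
domain_x_km, domain_y_km, nz = 750.0, 1500.0, 128

for dx_m in (200.0, 100.0):
    nx = int(domain_x_km * 1000 / dx_m)
    ny = int(domain_y_km * 1000 / dx_m)
    npoints = nx * ny * nz
    # Assume ~10 prognostic 3D fields in single precision (4 bytes each):
    # an illustrative guess, not a statement about any particular model.
    mem_tb = npoints * 10 * 4 / 1e12
    print(f"dx = {dx_m:>5} m: {nx} x {ny} x {nz} = {npoints:.2e} points, ~{mem_tb:.1f} TB")
```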
Graphics Processing Unit (GPU) for atmospheric simulations (Jerome Schalkwijk, TU Delft)
'Acceleration' approach: GPU-accelerated code.
[Diagram: individual routines (advection, surface, routines 3 to 6) are offloaded from the CPU to the GPU, with all relevant data transferred back and forth. Promised speedup of 50 to 100x.]
GALES: GPU-resident Atmospheric Large-Eddy Simulation. Acceleration plus residency.
[Diagram: all routines (advection, surface, routines 3 to 6) run on the GPU; the large 3D datasets stay on the GPU, so no transfer is needed.]
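A minimal sketch of the difference between the two approaches, using CuPy as a stand-in. The routine, field names and sizes are invented for illustration; this is not the GALES code.

```python
# Contrast between the 'acceleration' and the 'GPU-resident' approach,
# illustrated with CuPy (an assumed stand-in; not the actual GALES code).
import numpy as np
import cupy as cp

nx, ny, nz, nsteps = 256, 256, 128, 100
u_host = np.zeros((nz, ny, nx), dtype=np.float32)   # example prognostic field

def advect(u):                 # placeholder for a GPU routine
    return u + 0.1 * u

# Acceleration approach: copy to the GPU and back around every routine call.
u = u_host
for _ in range(nsteps):
    u_dev = cp.asarray(u)      # host -> device transfer each step
    u_dev = advect(u_dev)
    u = cp.asnumpy(u_dev)      # device -> host transfer each step

# GPU-resident approach: the 3D fields live on the GPU for the whole run.
u_dev = cp.asarray(u_host)     # single upload
for _ in range(nsteps):
    u_dev = advect(u_dev)      # all routines operate on device arrays
u = cp.asnumpy(u_dev)          # single download at the end (e.g. for output)
```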
Increasing the problem size: multi-GPU simulations, embarrassingly parallel.
400 × 400 × 12 km domain; 100 m resolution; 256 GPUs; perfect weak scaling.
Increasing the problem size: multi-GPU simulations, multi-level parallel LES (MPI between GPUs). A minimal halo-exchange sketch follows below.
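A minimal sketch of the multi-level idea: the domain is decomposed over processes (one GPU each), and neighbouring processes exchange halo cells via MPI. mpi4py is used here as an illustrative stand-in; the actual GALES decomposition and calls may differ.

```python
# Minimal 1-D domain-decomposition halo exchange with mpi4py (illustrative only;
# not the GALES implementation). Run with e.g.: mpirun -n 4 python halo.py
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

nx_local = 64                                  # interior points per process
u = np.full(nx_local + 2, float(rank))         # field with one halo cell on each side

left = (rank - 1) % size                       # periodic neighbours
right = (rank + 1) % size

# Send the rightmost interior cell to the right neighbour and receive the left
# neighbour's rightmost interior cell into our left halo (and vice versa).
comm.Sendrecv(sendbuf=u[-2:-1], dest=right, recvbuf=u[0:1], source=left)
comm.Sendrecv(sendbuf=u[1:2], dest=left, recvbuf=u[-1:], source=right)

print(f"rank {rank}: left halo = {u[0]}, right halo = {u[-1]}")
```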
Local workflow: edit simulation settings and interact with the simulation while it runs. Thank you! Schalkwijk, Griffith, Post & Jonker, March 2012.