Visualization at the Leadership Computing Facility

Presentation Transcript


  1. Visualization at the Leadership Computing Facility. Sean Ahern, Scientific Computing, Center for Computational Sciences

  2. Statistical Analysis of Fusion Eddy Formation. Principal Component Analysis allows isolation of rotational modes of eddy evolution.
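
As a rough illustration of the kind of analysis named above (not the actual ORNL code), a PCA-of-snapshots sketch in Python might look like the following; the array shapes and variable names are invented:

```python
# Minimal PCA-of-snapshots sketch (illustrative only): each row is one
# time snapshot of a flattened 2D eddy field, shape (n_times, n_points).
import numpy as np

def principal_modes(snapshots, n_modes=3):
    """Return the leading spatial modes, their time coefficients, and variance fractions."""
    # Center each spatial point over time so PCA captures fluctuations.
    mean_field = snapshots.mean(axis=0)
    fluctuations = snapshots - mean_field

    # SVD of the snapshot matrix: rows of vt are spatial modes,
    # u * s gives the time-varying amplitude of each mode.
    u, s, vt = np.linalg.svd(fluctuations, full_matrices=False)
    modes = vt[:n_modes]                        # (n_modes, n_points)
    amplitudes = u[:, :n_modes] * s[:n_modes]   # (n_times, n_modes)
    variance_fraction = (s[:n_modes] ** 2) / np.sum(s ** 2)
    return modes, amplitudes, variance_fraction

# Hypothetical usage: 200 snapshots of a 64x64 vorticity field.
field = np.random.rand(200, 64 * 64)
modes, amps, var = principal_modes(field)
print("variance captured by first 3 modes:", var.sum())
```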

  3. Visualization of Neutron Density Fields • Joint work between ORNL and Oxford University has developed a model of nuclear matter in the transitional density region between inhomogeneous matter, containing nuclei and nucleons, and homogeneous matter, consisting of uniformly distributed nucleons, in collapsing stars. • We developed custom visualization techniques for analyzing the spatially varying distribution of nucleon densities.
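
The custom techniques themselves are not spelled out on the slide, but a common building block for visualizing a scalar density field is isosurface extraction; the sketch below is a generic example, with a synthetic density grid and an arbitrary threshold:

```python
# Generic isosurface extraction from a 3D scalar field; this is NOT the
# custom ORNL/Oxford technique, just one standard way to turn a density
# grid into renderable geometry. The density array and threshold are made up.
import numpy as np
from skimage import measure

density = np.random.rand(64, 64, 64)   # placeholder 3D nucleon density grid
threshold = 0.5                         # hypothetical density level of interest

# Marching cubes turns the scalar grid into a triangle mesh at the threshold,
# which can then be shaded and rendered by any standard graphics pipeline.
verts, faces, normals, values = measure.marching_cubes(density, level=threshold)
print(f"isosurface: {len(verts)} vertices, {len(faces)} triangles")
```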

  4. Statistics and Visualization for Climate Simulation. Volume visualizations of the simulated time evolution of the land and ocean components of atmospheric CO2 concentrations originating from the ocean surface were produced to help climate scientists examine the influence of climate variability, prescribed atmospheric CO2 levels, and land cover change on terrestrial carbon fluxes during the 20th century. Data are from Phase I of the CCSM LCF Computational Climate Science End Station (CCSES).
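
Volume visualizations like these are typically scripted so that every time step is rendered the same way. A schematic VisIt CLI (Python) loop is shown below; it assumes VisIt's own Python interpreter, and the database name and the variable name co2_concentration are placeholders rather than the actual CCSES data layout:

```python
# Schematic VisIt CLI script for animating a volume plot through time.
# Runs inside VisIt's Python interpreter; file and variable names are invented.
OpenDatabase("climate_run.nc")               # hypothetical time-varying dataset
AddPlot("Volume", "co2_concentration")       # volume-render the CO2 field
DrawPlots()

for state in range(TimeSliderGetNStates()):  # step through every time state
    SetTimeSliderState(state)
    SaveWindow()                             # write one frame per time step
```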

  5. EVEREST Facility • 35 million pixel, 27-tile Powerwall • 30’ x 8’ • 14 NVIDIA 3000G GPUs • Interactive, large-scale, collaborative data analysis • Open source and custom software • DMX, Chromium, PixelBlaster, Blockbuster, etc.

  6. Visualization Architectures for Tera/Petascale Visualization • Largest datasets require use of institutional resources • Reduces data movement issues • Allows exploitation of multiple GPUs • Provides visualization to remote users • Exploited by VisIt, ParaView, EnSight
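
In the client/server model these tools use, a lightweight client runs at the user's desk while a parallel compute engine runs on the institutional resource, so only images travel back to the user. A minimal VisIt CLI sketch of that pattern follows; the host name, processor count, file path, and variable are placeholders, not a real LCF configuration:

```python
# Client/server sketch from a VisIt CLI session: launch a parallel compute
# engine on the remote facility, then open data that lives on that host.
# All names below are hypothetical.
OpenComputeEngine("lcf-login.example.gov", ("-np", "16"))   # remote parallel engine
OpenDatabase("lcf-login.example.gov:/scratch/run/output.silo")
AddPlot("Pseudocolor", "density")
DrawPlots()   # data is processed remotely; only rendered images return to the client
```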

  7. Technology Curve Mismatches Cause Problems • Processor speeds (and FLOPS) are going up (60%/year) • Interconnect speeds are going up • Graphics card performance is going up (100%/year) • Memory-to-memory copy hasn’t kept up (5-10%/year) • Disk access time hasn’t kept up (5-10%/year) • Increasing bottleneck to data processing • I/O can often be the most expensive phase of simulation or post-processing • Increased computational ability won’t solve the issue: we can’t just ride processor speed curves • In some cases, users are not willing to move or post-process their data for visualization. Data access patterns are important!
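
To make the mismatch concrete, compounding the quoted annual rates over a decade (taking the upper end, 10%, for I/O) gives roughly the following gap:

```python
# Back-of-the-envelope illustration of the mismatch described above:
# compound the quoted annual growth rates over ten years and compare.
years = 10
cpu_growth = 1.60 ** years   # ~60%/year processor (FLOPS) improvement
gpu_growth = 2.00 ** years   # ~100%/year graphics performance improvement
io_growth = 1.10 ** years    # ~5-10%/year disk / memory-copy improvement

print(f"CPU speedup over {years} years:  ~{cpu_growth:.0f}x")
print(f"GPU speedup over {years} years:  ~{gpu_growth:.0f}x")
print(f"I/O speedup over {years} years:  ~{io_growth:.1f}x")
print(f"compute-to-I/O gap widens by    ~{cpu_growth / io_growth:.0f}x")
```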

  8. Summary and “Info-vis” techniques • Increasing need to compare datasets • Parameter studies • 2D/3D simulation correlations • Direct visual comparison often inadequate • Different mesh types, codes • High-dimensional datasets are becoming much more common • “Information-driven” interaction • Contour spectrum • Topology graphs/trees • Topology methods • Distance fields • Shape characterization • Information visualization techniques
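
One of the techniques listed above, a distance field, reduces to computing, for every grid point, the distance to the nearest feature of interest. A generic scipy sketch follows; the segmented feature mask is synthetic:

```python
# Simplest form of a distance field: for every voxel, the Euclidean distance
# to the nearest feature voxel. Generic sketch with a synthetic feature mask.
import numpy as np
from scipy import ndimage

feature = np.zeros((64, 64, 64), dtype=bool)
feature[20:30, 20:30, 20:30] = True   # hypothetical segmented feature region

# distance_transform_edt measures distance to the nearest zero voxel,
# so invert the mask to get distance-to-feature everywhere outside it.
distance_field = ndimage.distance_transform_edt(~feature)
print("max distance to the feature:", distance_field.max())
```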

  9. Chromium RenderServer: Render Server to Both Local and Remote Viewers • Remote image delivery • OpenGL and X11 paths • Collaborative • Supports tiles • Leverages: • Chromium • VNC • DMX • Small Business Technology Transfer program – DOE STTR

  10. Up and Coming Technologies… (and some already here) • Office “mini-clusters” • Vendor-integrated vis clusters • Very high resolution displays • Sony 4K projector • Multi-GPU systems • “Cell” processor • “Commodity” constellations with GPUs

  11. Contact • Sean Ahern • Scientific Computing • Center for Computational Sciences • (865) 241-3748 • ahern@ornl.gov
