
Contract Year 1 Review: Computational Environment (CE)



Presentation Transcript


  1. Contract Year 1 Review: Computational Environment (CE)
     Shirley Moore, University of Tennessee-Knoxville
     May 16, 2002

  2. CE Strategy
  • Top priorities:
    • A computational environment that is consistent, well documented, and easy to use across the SRCs
    • Debugging and performance analysis tools that are scalable and easy to use in the SRC environment
  • Focus on:
    • Enabling DoD users to determine what performance they are getting and to improve that performance on SRC platforms
    • Parallelization strategies and programming practices that enhance application portability across platforms
    • Tools and strategies for efficient file management and I/O
    • Use of COTS and/or freely available tools; interaction with tool developers on improvements and new tool features
    • Training on HPC architectures, tools, and methodologies (Core)

  3. CE Efforts for Contract Year 1
  • Recruitment and hiring of the CE onsite, Thomas Cortese
  • Establishment of and interaction with the CE User Advisory Panel
  • Development of a comprehensive CE training curriculum and coordination of CE training
  • Collaborations with tool developers
  • Establishment of contacts and working relationships with SRC systems and user support staff

  4. User Contacts and Assistance
  • Assisted EQM code developer Victor Parr with parallel I/O (a minimal MPI-I/O pattern is sketched after this list) and with use of PAPI (ongoing email and phone contacts)
  • Assisted ARL MSRC user Dale Shires with IBM and SGI architecture and MPI questions (Jan 2002)
  • Assisted ARL MSRC user Marshall Cory with ScaLAPACK questions (Feb 2002)
  • Assisting USNA user Reza Malek-Malani with scheduling a seminar on cluster computing
  • Installed the Vampir-GuideView (VGV) beta version at NAVO MSRC at the request of CWO onsite Tim Campbell (March 2002)
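The parallel I/O assistance above centered on MPI-I/O. As a minimal illustration of the technique (not the EQM code itself), the sketch below has every rank write its own block of a shared file at a disjoint offset using the standard MPI-2 file interface; the file name, buffer size, and data are illustrative.

    /* Minimal MPI-I/O sketch: each rank writes its block of doubles to a
     * disjoint region of one shared file.  File name and sizes are
     * illustrative only. */
    #include <mpi.h>

    #define N 1024                       /* doubles per rank (illustrative) */

    int main(int argc, char **argv)
    {
        int rank, i;
        double buf[N];
        MPI_File fh;
        MPI_Offset offset;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        for (i = 0; i < N; i++)
            buf[i] = rank + i * 1.0e-3;  /* dummy data */

        /* All ranks open the same file collectively. */
        MPI_File_open(MPI_COMM_WORLD, "eqm_output.dat",
                      MPI_MODE_CREATE | MPI_MODE_WRONLY,
                      MPI_INFO_NULL, &fh);

        /* Each rank writes at its own byte offset; the collective form
         * lets the MPI library coalesce and optimize the requests. */
        offset = (MPI_Offset)rank * N * sizeof(double);
        MPI_File_write_at_all(fh, offset, buf, N, MPI_DOUBLE,
                              MPI_STATUS_IGNORE);

        MPI_File_close(&fh);
        MPI_Finalize();
        return 0;
    }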

  5. Tools Introduced
  • Repository in a Box (RIB) toolkit used for ERDC MSRC CTA repositories
  • PAPI cross-platform interface to hardware performance counters
  • Vampir-GuideView (VGV) combined MPI/OpenMP performance analysis tool (beta version for evaluation)
  • MPE Logging/Jumpshot freely available MPI performance analysis tool (under evaluation)
  • TAU for MPI and/or OpenMP program analysis (under evaluation)
  • Vprof basic block profiler (under evaluation)
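MPI trace tools such as Vampir-GuideView and MPE Logging/Jumpshot collect their data by interposing on MPI calls through the standard MPI profiling (PMPI) interface. The sketch below shows the bare technique, not any tool's actual code: a wrapper library redefines MPI_Barrier, records timing, and forwards to the PMPI entry point; the counters and the report printed from the MPI_Finalize wrapper are illustrative.

    /* Sketch of the MPI profiling (PMPI) interface that trace tools
     * build on: the wrapper defines MPI_Barrier, records timing, and
     * forwards to PMPI_Barrier.  Counters and report are illustrative. */
    #include <mpi.h>
    #include <stdio.h>

    static double barrier_time  = 0.0;  /* accumulated time in MPI_Barrier */
    static long   barrier_calls = 0;    /* number of MPI_Barrier calls     */

    int MPI_Barrier(MPI_Comm comm)
    {
        double t0 = MPI_Wtime();
        int rc = PMPI_Barrier(comm);    /* forward to the real routine */
        barrier_time += MPI_Wtime() - t0;
        barrier_calls++;
        return rc;
    }

    int MPI_Finalize(void)
    {
        int rank;
        PMPI_Comm_rank(MPI_COMM_WORLD, &rank);
        printf("rank %d: %ld MPI_Barrier calls, %.3f s total\n",
               rank, barrier_calls, barrier_time);
        return PMPI_Finalize();
    }

Linking an application against such a wrapper library ahead of the MPI library is how these tools attach to codes without source changes.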

  6. CE Training
  • Advanced MPI (advanced), ERDC MSRC, 26-28 Sep 2001, David Cronk (UTK)
  • Introduction to MPI (beginning), ASC MSRC, 16-17 Jan 2002, David Cronk (UTK), 6 attendees, 40 CDs distributed, Evaluation: 4.2/5.0
  • MPI Tips and Tricks (intermediate), Fort Monmouth, 30 Jan – 1 Feb 2002, David Cronk (UTK), 10 attendees
  • Compaq AlphaServer SC System, ERDC MSRC, 31 Jan – 1 Feb 2002, David Ennis (OSC)
  • Advanced MPI (advanced), ARL MSRC, 13-14 Mar 2002, David Cronk (UTK), 13 attendees, Evaluation: 10 Excellent, 3 Good

  7. CE Training (cont.)
  • Cross-Platform Performance Analysis Tools (intermediate), ERDC MSRC, 19-20 Mar 2002, Shirley Moore (UTK), 7 attendees, Evaluation:
  • C Programming, AEDC, 24-29 Mar 2002, David Ennis (OSC)
  • Performance Optimization for Vector Processors, NAVO MSRC, 3-4 Apr 2002, James Giuliani (OSC)
  • Debugging Parallel Code Using TotalView, NAVO MSRC, 7-9 May 2002, David Cronk and Thomas Cortese (UTK)

  8. Conference Presentations
  • David Cronk, “MPI-I/O for EQM Applications”, DoD HPC UGC 2001, Biloxi, MS, June 2001
  • David Cronk, “Metacomputing Support for the SARA3D Structural Acoustics Application”, DoD HPC UGC 2001, Biloxi, MS, June 2001
  • Shirley Moore, David Cronk, Kevin London, and Jack Dongarra, “Review of MPI Performance Analysis Tools”, EuroPVM/MPI 2001, Santorini, Greece, April 2002 (rescheduled from September 2001)
  • Shirley Moore, “A Comparison of Counting and Sampling Modes of Using Performance Monitoring Hardware”, International Conference on Computational Science (ICCS 2002), Amsterdam, April 2002

  9. CE009: A Consistent, Well-Documented Computational Environment
  • Collaborate with SRC systems and user support staff to implement, document, and support a consistent computational environment across the SRCs
  • $207,756; 1 Oct 2001 – 30 Sep 2002
  • PI: Shirley Moore
  • Deliverables:
    • Information in SRC user guides and/or the OKC about all installed components of the computational environment
    • Checklists for testing tool installation (a minimal smoke test of the kind a checklist might include is sketched after this list)
    • Traveling onsite support to SRCs
    • Workshops to evaluate new tool technologies (ready to schedule)
    • Quarterly report on the status of the computational environment at the SRCs
  • Explanation for yellow status: change in plans from commercial to freely available tools due to budget constraints; OKC not yet in production mode
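One way to make the tool-installation checklists concrete is a tiny smoke test run on each platform. The program below is a hypothetical example of such a test, not a CE009 deliverable: it only confirms that the MPI compiler wrapper, library, and launcher work and reports the MPI version and the host of each rank.

    /* Hypothetical installation smoke test: confirms that the MPI
     * compiler wrapper, library, and launcher work on a platform and
     * reports the MPI version and the host running each rank. */
    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int rank, size, ver, subver, len;
        char name[MPI_MAX_PROCESSOR_NAME];

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);
        MPI_Get_version(&ver, &subver);
        MPI_Get_processor_name(name, &len);

        printf("rank %d of %d on %s (MPI %d.%d)\n",
               rank, size, name, ver, subver);

        MPI_Finalize();
        return 0;
    }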

  10. CE010: PAPI Deployment, Evaluation, and Extensions
  • Deploy and support the PAPI cross-platform interface to hardware performance counters on all SRC platforms (a minimal usage sketch follows this list)
  • Investigate accuracy of hardware counter data
  • Implement memory utilization extensions
  • $235,064 (UTK, UTEP, PSC); 1 Oct 2001 – 30 Sep 2002
  • PI: Shirley Moore
  • MSI participation: Patricia Teller, UTEP
  • Deliverables:
    • Installation, testing, and documentation of PAPI and related tools (TAU, Vprof) on all MSRC platforms (Compaq Alpha substrate in progress)
    • Microbenchmarks for measuring accuracy of hardware performance data (analyzing data and devising additional benchmarks)
    • Design of memory utilization extensions and implementation on ASC platforms (in design phase)
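For reference, the sketch below shows the basic low-level PAPI C calls used to count hardware events around a code region. PAPI_TOT_INS and PAPI_FP_OPS are standard PAPI preset event names, though not every platform's counters support them; the measured loop is purely illustrative.

    /* Minimal PAPI sketch: count total instructions and floating-point
     * operations around a compute kernel with the low-level C API.
     * The measured loop is illustrative. */
    #include <papi.h>
    #include <stdio.h>

    int main(void)
    {
        int eventset = PAPI_NULL;
        long long counts[2];
        double sum = 0.0;
        int i;

        if (PAPI_library_init(PAPI_VER_CURRENT) != PAPI_VER_CURRENT) {
            fprintf(stderr, "PAPI initialization failed\n");
            return 1;
        }
        PAPI_create_eventset(&eventset);
        PAPI_add_event(eventset, PAPI_TOT_INS);  /* total instructions */
        PAPI_add_event(eventset, PAPI_FP_OPS);   /* floating-point ops */

        PAPI_start(eventset);
        for (i = 1; i <= 1000000; i++)           /* work to be measured */
            sum += 1.0 / i;
        PAPI_stop(eventset, counts);

        printf("sum = %g, instructions = %lld, fp ops = %lld\n",
               sum, counts[0], counts[1]);
        return 0;
    }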

  11. CE012: Metacomputing Support for SIP Image Processing
  • Use NetSolve to improve the performance and scalability of Tannenbaum’s image segmentation algorithm
  • Previous work: SARA3D
  • Also looking at XPatch
  • $70K; 1 Oct 2001 – 30 Sep 2002
  • PI: Shirley Moore
  • Deliverables:
    • Implementation of portions of Tannenbaum’s algorithm as NetSolve services
    • Implementation of coarse-grained parallelism
    • Persistent storage of intermediate results
    • Deployment on a grid computing testbed (must satisfy security requirements)

  12. CE019: SPMD Collective Communication Model
  • Develop an Open Source Fortran 90 module for the most commonly used SPMD collective communication operations (the underlying MPI pattern is sketched after this list)
  • $74,923; 1 Mar 2002 – 30 Jan 2003
  • PI: Timothy Kaiser, Ph.D., Coherent Cognition
  • Deliverables:
    • An API for SPMD collective communications based on a Fortran 90 module (specification and documentation started)
    • Open Source reference implementation of the module for MPI (near completion)
    • Open Source reference implementation of the module for SHMEM (not started)
    • Test and timing suite (near completion)
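The CE019 deliverable itself is a Fortran 90 module; to keep the sketches in this document in one language, the illustration below shows in C the kind of simplified global operation such a module typically layers over an MPI collective. The global_sum name and the example program are hypothetical and are not part of the CE019 API.

    /* Hypothetical illustration of the kind of simplified collective an
     * SPMD communication module layers over MPI: a global sum that
     * hides the MPI_Allreduce bookkeeping.  Names are not CE019's. */
    #include <mpi.h>
    #include <stdio.h>

    double global_sum(double local, MPI_Comm comm)
    {
        double total;
        MPI_Allreduce(&local, &total, 1, MPI_DOUBLE, MPI_SUM, comm);
        return total;
    }

    int main(int argc, char **argv)
    {
        int rank;
        double part, total;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        part  = rank + 1.0;                 /* each rank's contribution */
        total = global_sum(part, MPI_COMM_WORLD);

        if (rank == 0)
            printf("global sum = %g\n", total);
        MPI_Finalize();
        return 0;
    }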

  13. Core Financial Summary

  14. Staffing
  • Onsite CE position at NAVO MSRC filled in February 2002 by Dr. Thomas Cortese

  15. Summary
  • Strategic priorities determined by the CE User Advisory Panel (ranked from 1 to 4, 1 being highest):
    • Performance evaluation (1.7)
    • Data management and I/O (1.7)
    • Application portability (1.7)
    • Consistent computational environment (2.1)
    • Documentation and user support (1.7)
    • Debugging (2.3)
    • Dynamic monitoring and control of application execution (2.4)
    • Grid or “meta”-computing (2.9)
  • The above priorities are being addressed by CE core support (incl. training) and by current and planned projects
