
Pegasus: Planning for Execution in Grids


Presentation Transcript


  1. Pegasus: Planning for Execution in Grids Ewa Deelman Information Sciences Institute University of Southern California

  2. Pegasus Acknowledgements • Ewa Deelman, Carl Kesselman, Saurabh Khurana, Gaurang Mehta, Sonal Patil, Gurmeet Singh, Mei-Hui Su, Karan Vahi (ISI) • James Blythe, Yolanda Gil (ISI) • http://pegasus.isi.edu • Research funded as part of the NSF GriPhyN, NVO and SCEC projects. Ewa Deelman Information Sciences Institute

  3. Outline • General Scientific Workflow Issues on the Grid • Mapping complex applications onto the Grid • Pegasus • Pegasus Application Portal • LIGO-gravitational-wave physics • Montage-astronomy • Incremental Workflow Refinement • Futures Ewa Deelman Information Sciences Institute

  4. Grid Applications • Increasing in the level of complexity • Use of individual application components • Reuse of individual intermediate data products (files) • Description of Data Products using Metadata Attributes • Execution environment is complex and very dynamic • Resources come and go • Data is replicated • Components can be found at various locations or staged in on demand • Separation between • the application description • the actual execution description Ewa Deelman Information Sciences Institute

  5. Abstract Workflow Generation → Concrete Workflow Generation (figure) Ewa Deelman Information Sciences Institute

  6. Why Automate Workflow Generation? • Usability: Limit User’s necessary Grid knowledge • Monitoring and Directory Service • Replica Location Service • Complexity: • User needs to make choices • Alternative application components • Alternative files • Alternative locations • The user may reach a dead end • Many different interdependencies may occur among components • Solution cost: • Evaluate the alternative solution costs (sketched below) • Performance • Reliability • Resource Usage • Global cost: • minimizing cost within a community or a virtual organization • requires reasoning about individual user’s choices in light of other users’ choices Ewa Deelman Information Sciences Institute
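
To make the cost discussion concrete, here is a minimal sketch of ranking alternative candidate sites by a weighted cost. The criteria, weights, and field names are illustrative assumptions, not the cost model actually used by Pegasus.

```python
# Minimal sketch of ranking alternative mappings by weighted cost.
# The criteria, weights, and candidate fields are illustrative assumptions,
# not the actual cost model used by Pegasus.

from dataclasses import dataclass

@dataclass
class Candidate:
    site: str
    est_runtime: float      # expected runtime in seconds (lower is better)
    est_reliability: float  # probability of success in [0, 1] (higher is better)
    est_load: float         # fraction of site resources already in use (lower is better)

def cost(c: Candidate, w_time=1.0, w_rel=100.0, w_load=10.0) -> float:
    """Smaller cost is better; reliability is rewarded, runtime and load are penalized."""
    return w_time * c.est_runtime - w_rel * c.est_reliability + w_load * c.est_load

candidates = [
    Candidate("isi_condor", est_runtime=3600, est_reliability=0.95, est_load=0.2),
    Candidate("teragrid_ncsa", est_runtime=1800, est_reliability=0.80, est_load=0.7),
]
best = min(candidates, key=cost)
print(f"selected site: {best.site}")
```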

  7. Executable Workflow Construction • Chimera builds an abstract workflow based on VDL descriptions • Pegasus takes the abstract workflow and produces an executable workflow for the Grid • Condor’s DAGMan executes the workflow Ewa Deelman Information Sciences Institute
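
As a rough illustration of where this pipeline ends up, the sketch below turns a toy abstract DAG into a DAGMan-style .dag file (JOB and PARENT/CHILD lines). The job names, submit-file names, and the helper function are assumptions made for illustration; the real files are generated by Pegasus.

```python
# Minimal sketch of the end of the pipeline: turning a toy abstract DAG
# into a DAGMan-style .dag file. Job names, submit-file names, and this
# helper are illustrative; they are not the actual Pegasus output.

abstract_dag = {
    # job -> list of parent jobs it depends on
    "extract": [],
    "analyze": ["extract"],
    "summarize": ["analyze"],
}

def write_dagman(dag: dict, path: str = "workflow.dag") -> None:
    """Emit JOB and PARENT/CHILD lines in the format DAGMan accepts."""
    with open(path, "w") as f:
        for job in dag:
            f.write(f"JOB {job} {job}.sub\n")          # one Condor submit file per job
        for job, parents in dag.items():
            for parent in parents:
                f.write(f"PARENT {parent} CHILD {job}\n")

write_dagman(abstract_dag)
```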

  8. Pegasus: Planning for Execution in Grids • Maps from abstract to concrete workflow • Algorithmic and AI-based techniques • Automatically locates physical locations for both components (transformations) and data • Uses Globus RLS and the Transformation Catalog • Finds appropriate resources to execute • via Globus MDS • Reuses existing data products where applicable • Publishes newly derived data products • Chimera virtual data catalog Ewa Deelman Information Sciences Institute
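
A minimal sketch of the lookups involved in mapping a single abstract job, assuming toy stand-ins for the Replica Location Service, the Transformation Catalog, and MDS; the dictionaries and the candidate_sites helper are illustrative, not the real service interfaces.

```python
# Sketch of the lookups Pegasus performs when mapping a single abstract job.
# The dictionaries below stand in for queries to the Replica Location Service
# (RLS), the Transformation Catalog (TC), and MDS; names and structure are
# illustrative assumptions, not the real service APIs.

replica_catalog = {           # logical file name -> sites holding a copy (RLS)
    "f.input": ["isi", "ncsa"],
}
transformation_catalog = {    # transformation -> sites where it is installed (TC)
    "preprocess": ["isi", "sdsc"],
}
available_sites = ["isi", "sdsc", "ncsa"]   # sites currently advertised via MDS

def candidate_sites(transformation: str, inputs: list[str]) -> list[str]:
    """Sites that are up, have the executable, and preferably already hold the inputs."""
    sites = set(available_sites) & set(transformation_catalog.get(transformation, []))
    # Any of these sites can work; prefer ones that already hold the input data.
    with_data = [s for s in sites if all(s in replica_catalog.get(f, []) for f in inputs)]
    return with_data or sorted(sites)

print(candidate_sites("preprocess", ["f.input"]))   # -> ['isi']
```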

  9. Chimera is developed at ANL by I. Foster, M. Wilde, and J. Voeckler Ewa Deelman Information Sciences Institute

  10. Example Workflow Reduction • Original abstract workflow • If “b” already exists (as determined by query to the RLS), the workflow can be reduced Ewa Deelman Information Sciences Institute
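
A minimal sketch of the reduction, assuming a toy workflow representation: any job whose outputs are already registered (as an RLS query would reveal) is dropped. A full reduction would also prune ancestors whose outputs are no longer needed by any remaining job; that step is omitted here.

```python
# Minimal sketch of the reduction step: jobs whose outputs are already
# registered (e.g. found by an RLS query) do not need to run again.
# The workflow structure and the single-pass pruning rule are simplifications.

jobs = {
    # job -> (input files, output files)
    "gen_b": (["a"], ["b"]),
    "gen_c": (["b"], ["c"]),
    "gen_d": (["c"], ["d"]),
}
already_available = {"a", "b"}   # result of querying the RLS for each logical file

def reduce_workflow(jobs: dict, available: set) -> dict:
    """Drop jobs whose outputs all exist; existing data is reused instead."""
    reduced = {}
    for name, (inputs, outputs) in jobs.items():
        if all(f in available for f in outputs):
            continue                      # nothing to compute, reuse existing data
        reduced[name] = (inputs, outputs)
    return reduced

print(sorted(reduce_workflow(jobs, already_available)))   # -> ['gen_c', 'gen_d']
```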

  11. Mapping from abstract to concrete • Query RLS, MDS, and TC, schedule computation and data movement Ewa Deelman Information Sciences Institute

  12. Ewa Deelman Information Sciences Institute

  13. LIGO Scientific Collaboration • Continuous gravitational waves are expected to be produced by a variety of celestial objects • Only a small fraction of potential sources are known • Need to perform blind searches, scanning the regions of the sky where we have no a priori information of the presence of a source • Wide area, wide frequency searches • Search is performed for potential sources of continuous periodic waves near the Galactic Center and the galactic core • The search is very compute and data intensive • LSC used the occasion of SC2003 to initiate a month-long production run with science data collected during 8 weeks in the Spring of 2003 Ewa Deelman Information Sciences Institute

  14. Additional resources used: Grid3 iVDGL resources Ewa Deelman Information Sciences Institute

  15. LIGO Acknowledgements • Bruce Allen, Scott Koranda, Brian Moe, Xavier Siemens, University of Wisconsin Milwaukee, USA • Stuart Anderson, Kent Blackburn, Albert Lazzarini, Dan Kozak, Hari Pulapaka, Peter Shawhan, Caltech, USA • Steffen Grunewald, Yousuke Itoh, Maria Alessandra Papa, Albert Einstein Institute, Germany • Many Others involved in the Testbed • www.ligo.caltech.edu • www.lsc-group.phys.uwm.edu/lscdatagrid/ • http://pandora.aei.mpg.de/merlin/ • LIGO Laboratory operates under NSF cooperative agreement PHY-0107417 Ewa Deelman Information Sciences Institute

  16. Montage • Montage (NASA and NVO) • Deliver science-grade custom mosaics on demand • Produce mosaics from a wide range of data sources (possibly in different spectra) • User-specified parameters of projection, coordinates, size, rotation and spatial sampling. Mosaic created by Pegasus-based Montage from a run of the M101 galaxy images on the Teragrid. Ewa Deelman Information Sciences Institute
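
The mosaic request itself is just a handful of user-specified parameters. Below is a hypothetical request description illustrating them; the field names and values are assumptions, not the actual Montage or portal interface.

```python
# Hypothetical mosaic request, illustrating the user-specified parameters
# the slide lists (projection, coordinates, size, rotation, spatial sampling).
# The field names and values are assumptions, not the actual Montage/portal API.

m101_request = {
    "object": "M101",              # center of the mosaic (could also be RA/Dec)
    "survey": "2MASS",             # data source; Montage can combine several
    "band": "J",
    "projection": "TAN",           # gnomonic projection
    "width_deg": 1.0,              # mosaic size on the sky
    "height_deg": 1.0,
    "rotation_deg": 0.0,
    "pixel_scale_arcsec": 1.0,     # spatial sampling
}

# A portal would turn a request like this into an abstract Montage workflow
# (reprojection, background matching, co-addition) for Pegasus to map.
print(m101_request["object"], m101_request["pixel_scale_arcsec"])
```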

  17. Small Montage Workflow ~1200 nodes Ewa Deelman Information Sciences Institute

  18. Montage Acknowledgments • Bruce Berriman, John Good, Anastasia Laity, Caltech/IPAC • Joseph C. Jacob, Daniel S. Katz, JPL • http://montage.ipac.caltech.edu/ • Testbed for Montage: Condor pools at USC/ISI, UW Madison, and Teragrid resources at NCSA, PSC, and SDSC. Montage is funded by the National Aeronautics and Space Administration's Earth Science Technology Office, Computational Technologies Project, under Cooperative Agreement Number NCC5-626 between NASA and the California Institute of Technology. Ewa Deelman Information Sciences Institute

  19. Other Applications Using Chimera and Pegasus • Other GriPhyN applications: • High-energy physics: Atlas, CMS (many) • Astronomy: SDSS (Fermi Lab, ANL) • Astronomy: • Galaxy Morphology (NCSA, JHU, Fermi, many others, NVO-funded) • Biology • BLAST (ANL, PDQ-funded) • Neuroscience • Tomography (SDSC, NIH-funded) Ewa Deelman Information Sciences Institute

  20. Current System Ewa Deelman Information Sciences Institute

  21. Workflow Refinement and Execution (figure) • Workflow refinement proceeds through levels of abstraction over time: user’s request → application-level knowledge → relevant components → logical tasks → full abstract workflow → tasks bound to resources by a task matchmaker and sent for execution • Policy information and workflow repair feed into the refinement • At any point in time some tasks have already executed and others have not yet executed (partial execution) Ewa Deelman Information Sciences Institute

  22. Incremental Refinement • Partition Abstract workflow into partial workflows Ewa Deelman Information Sciences Institute
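
One simple way to partition is by topological level, so each partial workflow depends only on earlier partitions and can be mapped to resources just before it runs. The sketch below illustrates this; the slide does not prescribe a particular partitioning scheme, so the level-based rule here is an assumption.

```python
# Minimal sketch of partitioning an abstract workflow into partial workflows
# by topological level. Level-based partitioning is one simple strategy;
# the actual partitioning used by Pegasus may differ.

from collections import defaultdict

dag = {
    # job -> parents
    "a": [], "b": [], "c": ["a"], "d": ["a", "b"], "e": ["c", "d"],
}

def partition_by_level(dag: dict) -> list[list[str]]:
    """Group jobs so that every job's parents fall in an earlier partition."""
    level = {}
    def depth(job):                     # longest chain of parents above this job
        if job not in level:
            level[job] = 1 + max((depth(p) for p in dag[job]), default=-1)
        return level[job]
    groups = defaultdict(list)
    for job in dag:
        groups[depth(job)].append(job)
    return [groups[i] for i in sorted(groups)]

# Each partial workflow can be mapped just before it runs, so later
# partitions see up-to-date Grid information.
print(partition_by_level(dag))   # -> [['a', 'b'], ['c', 'd'], ['e']]
```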

  23. Meta-DAGMan Ewa Deelman Information Sciences Institute

  24. Future Directions • Incorporate AI-planning technologies in production software (Virtual Data Toolkit) • Investigate various scheduling techniques • Investigate fault tolerance issues • Selecting resources based on their reliability • Responding to failures • http://pegasus.isi.edu Ewa Deelman Information Sciences Institute
