
Grid Computing


Presentation Transcript


  1. Grid Computing. Oxana Smirnova, NDGF / Lund University. R-ECFA meeting in Sweden, Uppsala, May 9, 2008

  2. Computing challenges at LHC

  3. “Full chain” of HEP data processing: Event generation (Pythia) → Detector simulation (Geant4) → Hit digitization → Reconstruction → Analysis data preparation → Analysis, results (ROOT). Slide adapted from Ch. Collins-Tooth and J. R. Catmore
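
The first step in this chain, event generation, can be tried standalone on a single machine. Below is a minimal sketch using the Pythia 8 Python bindings, assuming the optional pythia8 module is built and installed; the process selection, beam energy and event count are illustrative placeholders, not ATLAS production settings.

```python
# Minimal event-generation sketch with the Pythia 8 Python bindings.
# Assumes the optional pythia8 module is installed; all settings below
# are illustrative, not production values.
import pythia8

pythia = pythia8.Pythia()
pythia.readString("HardQCD:all = on")    # generic QCD 2->2 hard processes
pythia.readString("Beams:eCM = 14000.")  # LHC design energy in GeV
pythia.init()

n_events = 100
charged_multiplicities = []
for _ in range(n_events):
    if not pythia.next():                # generate one event; skip failures
        continue
    n_charged = 0
    for i in range(pythia.event.size()): # loop over the event record
        p = pythia.event[i]
        if p.isFinal() and p.isCharged():
            n_charged += 1
    charged_multiplicities.append(n_charged)

print("events generated:", len(charged_multiplicities))
if charged_multiplicities:
    print("mean charged multiplicity:",
          sum(charged_multiplicities) / len(charged_multiplicities))
```

In production, the generated events would then be fed through Geant4 detector simulation, digitization and reconstruction inside the experiment's own software framework, as the chain above indicates.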

  4. ATLAS Monte Carlo data production flow (10 million events) • Very different tasks/algorithms (ATLAS experiment in this example) • A single “job” lasts from 10 minutes to 1 day • Most tasks require large amounts of input data and produce large output data

  5. LHC computing specifics • Data-intensive tasks • Large datasets, large files • Lengthy processing times • Large memory consumption • High throughput is necessary • Very distributed computing and storage resources • CERN can host only a small fraction of the needed resources and services • Distributed computing resources of modest size • Produced and processed data are hence distributed, too • Coordination, synchronization, data integrity and authorization remain major open issues

  6. Software for HEP experiments

  7. Grid is a result of IT progress Graph from “The Triumph of the Light”, G. Stix, Sci. Am. January 2001

  8. Some Grid projects; originally by Vicky White, FNAL

  9. Grids in LHC experiments • Almost all Monte Carlo and data processing today is done via the Grid • There are 20+ Grid flavors out there • Almost all are tailored for a specific application and/or specific hardware • LHC experiments make use of 3 Grid middleware flavors: gLite, ARC and OSG • All experiments develop their own higher-level Grid middleware layers: ALICE – AliEn; ATLAS – PANDA and DDM; LHCb – DIRAC; CMS – ProdAgent and PhEDEx
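
As an illustration of how work reaches one of these middleware layers, the sketch below writes a small job description in xRSL, the job-description language used by the ARC middleware, and notes the client command that would submit it. The executable, file names and resource requests are hypothetical placeholders, not an actual experiment workflow.

```python
# Sketch: preparing an ARC (NorduGrid) job description in xRSL from Python.
# All names and resource requests below are hypothetical placeholders.
job_description = """&
 (jobName = "mc_simulation_example")
 (executable = "run_simulation.sh")
 (inputFiles = ("run_simulation.sh" "") ("generator.conf" ""))
 (outputFiles = ("hits.root" ""))
 (cpuTime = "12 hours")
 (memory = "2000")
 (stdout = "job.out")
 (stderr = "job.err")
"""

with open("job.xrsl", "w") as f:
    f.write(job_description)

# Submission is then done with the ARC client tools, e.g.
#   arcsub job.xrsl
# after which arcstat follows the job and arcget retrieves its output.
```

The experiments' higher-level layers such as PANDA or DIRAC generate and manage such job submissions automatically on the users' behalf, on top of whichever middleware a site runs.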

  10. ATLAS Experiment at CERN - Multi-Grid Infrastructure. Graphics from a slide by A. Vaniachine

  11. Nordic DataGrid Facility

  12. Swedish contribution: SweGrid • Co-funded by the Swedish Research Council and the Knut and Alice Wallenberg Foundation • One technician per center • Middleware: ARC, gLite • 1/3 allocated to LHC Computing

  13. SweGrid and NDGF usage (plots: SweGrid usage; ATLAS production in 2007)

  14. Swedish contribution to LHC-related Grid R&D • NorduGrid (Lund, Uppsala, Umeå, Linköping, Stockholm and others) • Produces the ARC middleware; 3 core developers are in Sweden • SweGrid: tools for Grid accounting, scheduling and distributed databases • Used by NDGF and other projects • NDGF: interoperability solutions • EU KnowARC (Lund, Uppsala + 7 partners) • A 3 MEUR, 3-year project developing the next-generation ARC • The project’s technical coordinator is in Lund • EU EGEE (Umeå, Linköping, Stockholm)

  15. Summary and outlook • Grid technology is vital for the success of the LHC • Sweden contributes very substantially with hardware, operational support and R&D • Very high efficiency • Sweden signed the MoU with the LHC Computing Grid in March 2008 • A pledge of long-term computing service for the LHC • SweGrid2 is coming • A major upgrade of SweGrid resources • The Research Council granted 22.4 MSEK for investments and operation in 2007-2008 • 43 MSEK more are being requested for 2009-2011 • Includes not just Tier1 but also Tier2 and Tier3 support
