
eScience meets eFrameworks

eScience meets eFrameworks. 28th April 2006, NeSC, Edinburgh. AstroGrid overview: Virtual Observatory infrastructure.


Presentation Transcript


  1. eScience meets eFrameworks 28th April 2006 NeSC, Edinburgh

  2. AstroGrid overview
     • Virtual Observatory infrastructure
     • “The VO vision can be summed up as the desire to make all archives speak the same language − all searchable and analysable by the same tools, all data sources accessible through a common interface, all data held in distributed databases that appear as one.” (Andy Lawrence, 09/2003)
     • AstroGrid approach: build infrastructure first
     • VObs  Web
     • Mix-and-match, plug-and-play, …

  3. AstroGrid project
     • Duration: Sept 2001 – Dec 2007
     • Funding: £7.7M (PPARC)
     • Personnel: ~26 (23.4 FTE)
     • Goal: Develop VObs infrastructure; deploy UK VObs
     • Scope: Astrophysics, Solar, STP, …; Optical, X-Ray, Radio, …
     • AG1 Phase A: Sept 2001 – Dec 2002: Analysis, R&D, Architecture
     • AG1 Phase B: Jan 2003 – Dec 2004: Build, test & deliver V1.0
     • AstroGrid-2: Jan 2005 – Dec 2007: Develop & deploy

  4. AstroGrid consortium
     • Edinburgh • Leicester • Cambridge • MSSL • JBO • RAL • QUB
     • From 2005, AstroGrid-2 adds: • Bristol • Exeter • Leeds • Portsmouth

  5. Wider involvement
     • Europe
       • Euro-VO: http://euro-vo.org
       • VOTech project: http://www.eurovotech.org
         • FP6 funded: €6M: 2005-7
         • Follows on from AVO (FP5 funded)
     • International
       • IVOA: http://ivoa.net
     • Closer collaborations
       • RVO • JVO • SAAO • NVO

  6. Project Approach
     • Agile: iterative & incremental development
     • Open project:
       • Static site: http://www.astrogrid.org
       • Wiki: http://wiki.astrogrid.org
       • Forum: http://forum.astrogrid.org
       • Jabber via jabber.astrogrid.org
     • Open Source code
       • Academic Free License: http://www.opensource.org/licenses/academic.php
       • AstroGrid releases: http://software.astrogrid.org

  7. AstroGrid infrastructure

  8. Deployments 1

  9. Deployments 2

  10. Deployments 3

  11. How do the VO components that implement these interfaces relate to each other?

  12. DSA: Architecture
     [Diagram: the DSA service fronts data with an ADQL translator and back-end plugins (JDBC/database, FITS, etc.), exposing results as VOTable through SkyNode, SIAP/SSAP and CEA web-service interfaces.]
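The plugin idea behind the DSA architecture can be sketched in miniature: one ADQL front end dispatching queries to back-end-specific translators. A caveat on grounding: the function and plugin names below are invented for illustration, not AstroGrid's actual API, and the query uses ADQL 2.0-style geometry functions.

```python
# Illustrative sketch only: a single ADQL entry point with pluggable
# back-end translators, mirroring the DSA plugin diagram above.
# Function and backend names are hypothetical, not AstroGrid's API.

def translate_for_backend(adql: str, backend: str) -> str:
    """Dispatch an ADQL query to a backend-specific translator plugin."""
    plugins = {
        # A relational (JDBC) backend might pass the query through largely intact.
        "jdbc": lambda q: q,
        # A FITS backend would instead drive a table-scanning engine.
        "fits": lambda q: "# scan FITS tables matching: " + q,
    }
    return plugins[backend](adql)

# An ADQL cone search (ADQL is the IVOA's SQL dialect for astronomy):
adql = ("SELECT ra, dec FROM catalogue WHERE "
        "1 = CONTAINS(POINT('ICRS', ra, dec), CIRCLE('ICRS', 180.0, 2.0, 0.1))")
print(translate_for_backend(adql, "jdbc"))
```

The design point is that clients speak one query language while each archive keeps its native storage format behind a plugin.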

  13. Registry
     We can access data, but how do we know where it can be found?
     • Knowledge of the location and nature of resources is critical for all of this to work
     • The AstroGrid infrastructure supplies a Registry (IVOA compliant, of course!) which contains this information
     • The Registry is searchable by humans and by machines.
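The machine-searchable side of the Registry can be illustrated with an in-memory sketch. The resource records and IVOA-style identifiers below are invented for illustration; the real registry holds structured XML resource documents rather than Python dictionaries.

```python
# Sketch of registry search: records describe what a resource is and
# where it lives, and machines filter on those fields.
# Identifiers and field names here are illustrative, not real records.
registry = [
    {"id": "ivo://example.org/dsa",     "type": "catalog", "subject": "x-ray"},
    {"id": "ivo://example.org/myspace", "type": "storage", "subject": "any"},
]

def find_resources(records, **criteria):
    """Return the records whose fields match every given criterion."""
    return [r for r in records
            if all(r.get(k) == v for k, v in criteria.items())]

print(find_resources(registry, type="catalog"))
```

A workflow engine can use the same query path as a human portal, which is what makes the registry the discovery backbone of the infrastructure.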

  14. Common Execution Architecture
     Why would we want CEA/UWS?
     • For the same reason we need universal access to data
     • AstroGrid supports the concept of workflow: essentially, remote execution of special scripts or jobs
     • Allows data access and processing to be undertaken at the source or where the compute power lies: the Grid concept!
     • The AG infrastructure helps make the details transparent, keeping the focus on the science.
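The CEA idea of describing a job declaratively and executing it wherever the data or compute lives can be sketched as follows. Everything here is a hypothetical stand-in: the application identifier, the field names, and the `submit` function are invented, not the real CEA/UWS interface.

```python
# Sketch of the CEA concept: a job is a declarative description naming a
# registered application and its inputs/outputs; the service, not the
# desktop, runs the work. All names below are invented placeholders.
job = {
    "application": "ivo://example.org/source-extractor",   # a registered tool
    "inputs":  {"image": "myspace:/results/field1.fits"},
    "outputs": {"catalogue": "myspace:/results/field1.cat"},
}

def submit(job):
    """Stand-in for remote submission: validate the job and return an id."""
    assert "application" in job and "inputs" in job
    return "job-0001"

print(submit(job))
```

Because inputs and outputs are named locations rather than local files, the same job description can run next to the archive or on a compute grid unchanged.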

  15. Workflow

  16. Compute grid
     [Diagram: JES uses the Registry to find applications and drives them through Common Execution Connectors, whether the application is a local .exe file, an .exe file on the grid, an HTTP service, or a Java class.]

  17. MySpace
     If all this happens remotely, where do the results end up?
     • Distributed data access and processing requires distributed data storage
     • MySpace is the AstroGrid solution
       • VOSpace/VOStore under review at IVOA
     • Distributed, virtual data storage that appears to the user as a homogeneous collection of files
     • Database tables coming soon!
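The MySpace idea of one logical file tree backed by scattered physical stores can be sketched as a simple mapping. The logical paths and store URLs below are invented examples; the real service resolves names through its own protocol.

```python
# Sketch of virtual storage: the user sees one homogeneous tree of
# logical paths, while the bytes live on different physical stores.
# Paths and store URLs are illustrative, not real endpoints.
myspace = {
    "/results/field1.fits": "http://store-a.example.org/blocks/d41d8",
    "/results/field1.cat":  "http://store-b.example.org/blocks/98f1a",
}

def resolve(logical_path):
    """Map a user's logical path to wherever the bytes actually are."""
    return myspace[logical_path]

print(resolve("/results/field1.fits"))
```

Jobs and services pass logical names around, so results written by a remote process are immediately visible in the user's one virtual tree.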

  18. Distributed Storage
     [Diagram: a desktop client works through MySpace, which brokers data flow between DSA services and remote processes.]

  19. MySpace

  20. AstroGrid Actors
     • Astronomers (incl. PIs, management structure, …)
     • Instrument operators, …
     • Data curators
     • Publishers
     • Administrators
     • Systems managers
     • Developers
     • Standards definers, …

  21. AstroGrid and eFrameworks: Issues
     • Security
       • Shibboleth authentication: single sign-on
       • Authorisation: common policy requirements?
     • Service discovery
       • Resource registries: need for commonality?
       • Will others need to get at astro resources? Common identifiers?

  22. AstroGrid and eFrameworks: Other Issues
     • Service execution
     • Workflow
     • Shared storage
     • Accounting
     • Naming! e.g. “VO”
     • Bottom line: standards & interoperability
       • How do we get W3C, GGF, IVOA, IE & project standards to work?
