
iGrid2005 Cyber-infrastructure



Presentation Transcript


  1. iGrid2005 Cyber-infrastructure Paola Grosso, GigaPort project, UvA

  2. Outline The question: iGrid showed impressive science that used a custom-built network. What happened behind the scenes to make it happen? With some background information: What is iGrid and how it has evolved. What is this optical networking about. The answer: Where, who and how the iGrid 2005 infrastructure took shape.

  3. What is iGrid? The official web site www.igrid2005.org contains the mission statement: “the 4th community-driven biennial International Grid event is a coordinated effort to accelerate the use of multi-10Gb international and national networks, to advance scientific research, and to educate decision makers, academicians and industry researchers on the benefits of these hybrid networks.” Three key points: • community driven • multi-10Gb networks • hybrid networks

  4. History of previous iGrids The themes were already there from the beginning… iGrid1998: Empowering Global Research Community Networking. Applications and technologies depend on end-to-end delivery of multi-tens-of-megabits bandwidth with quality of service control, and need the capabilities of emerging Internet protocols for resource control and reservation. iGrid2000: An International Grid Application Research Demonstration at INET2000. Demonstrate how the power of today's research networks enables access to remote computing resources, distribution of digital media, and collaboration with distant colleagues. iGrid2002: The International Virtual Laboratory. Demonstrate application demand for increased bandwidth.

  5. Lambda networking The iGrid2005 cyber-infrastructure provided a lambda networking facility to demonstrators. In the scientific arena, lambda networking refers to: • the use of different light wavelengths (i.e. light paths) to provide independent services over the same strand of optical fiber • the creation of dedicated, application-specific paths. Main lambda networking characteristics of the iGrid setup: • broad international connectivity • large available bandwidth • (user-driven) light path provisioning • reconfigurable and flexible setup
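A minimal Python sketch, not taken from the slides, can illustrate the lambda networking idea: one fiber strand carries several independent wavelength channels, and provisioning a light path means assigning a free wavelength to a single application. The class, wavelength values and demo names below are hypothetical.

```python
# Illustrative sketch of lambda networking: one fiber strand, several
# independent wavelength channels, each assignable as a dedicated light path.
# All names and values here are hypothetical.
from dataclasses import dataclass, field

@dataclass
class Fiber:
    name: str
    wavelengths_nm: list[float]                      # channels on this strand
    allocations: dict[float, str] = field(default_factory=dict)

    def provision_lightpath(self, demo: str) -> float:
        """Assign a free wavelength to a demo, giving it a dedicated path."""
        for wl in self.wavelengths_nm:
            if wl not in self.allocations:
                self.allocations[wl] = demo
                return wl
        raise RuntimeError("no free wavelength left on this fiber")

# Two demos share the same strand but get independent channels.
fiber = Fiber("CalIT2-StarLight", wavelengths_nm=[1550.12, 1550.92, 1551.72])
print(fiber.provision_lightpath("visualization-demo"))   # 1550.12
print(fiber.provision_lightpath("data-transfer-demo"))   # 1550.92
```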

  6. Where and when? The event took place in the CalIT2 building on the UCSD campus in San Diego, on 26-29 September 2005. Challenge: the building inauguration had not yet taken place, so the network was built while the building itself was being finished.

  7. What and who? There were two main activities: demonstrations and symposium sessions. Over 300 participants …plus the committee members. Demonstrations A global effort: - 49 demonstrations; - 12 countries as main demo contacts; - 20 participating countries; - 4 continents. Symposium In the auditorium: - 6 keynote speakers; - 12 panel sessions; - 3 master classes.

  8. Demonstration types A closer look at the demonstration types: • Data Services: 7 demos • E-Science: 4 demos • Lambda Services: 10 demos • Scientific Instrument Services: 3 demos • Supercomputing Services: 3 demos • Video Streaming Services: 5 demos • Visualization Services: 17 demos

  9. How? … thanks to the effort of: • 16 sponsors • 38 organizing institutions • 15 organizing committee members • 10 subcommittees On the cyber-infrastructure side: • Cyber-infrastructure CalIT2 Co-Chairs and Committee members • Cyber-infrastructure Int’l/National Co-Chairs and Committee members

  10. Demos requirements The guiding principle: ask what they want, and sometimes tell them what they need. A questionnaire gathered the demos’ needs for: • On-site computers, data storage and visualization displays • Remote computers and storage • Software • Special-purpose equipment • Audio • Networking topology

  11. Demo stations The demos were distributed across 4 spaces: TeraScale Room - 3 demo stations: Rice (2-panel display), Goodhue (2-panel display), Quin (4-panel display). Cave Room - 3 demo stations: Couts (C-Wall), Spreckels (100 Mpixel), Bushyhead (3D auto-stereo). Multipurpose Room - 2 demo stations: Sessions (stereo projection), Bandini (side-by-side projection), plus Research Channel. Auditorium - 2 demo stations: Swing (Sony 4K projection), Harrison (side-by-side projection).

  12. Onsite resources Two of the jewels: the Tiled Display, an 11x5 array of NEC 20” 1600x1200 LCD panels, and the Sony 4K projection.

  13. Onsite resources (II) Another way to look at it: 24 10GE ports: • 5 interfaces for common infrastructure equipment (3 for 10GE nodes, 2 for the HP switch driving the Tiled Display in Spreckels) • 19 interfaces for demonstrator equipment, network switches and nodes. 11 1GE fiber ports: • 11 to demonstrator equipment, network switches and nodes. 53 1GE copper ports: • 19 for common infrastructure equipment • 34 for demonstrator equipment

  14. SunLight To satisfy the needs of the demos: SunLight, the optical exchange built for iGrid at CalIT2. Ingredients: • Lots (lots!) of planning • Committee members met several times before the workshop • Network equipment donated by vendors: Cisco, Force10 and Nortel primarily • Setup in the weeks preceding the workshop • Circuit delivery and installation

  15. SunLight (II) [Diagram of the SunLight setup: Nortel OME 6500 optical switch and Force10 E1200 Ethernet switch with connections to local hosts; Nortel HDXc and Cisco ONS 15454 optical switches connecting to outside resources; Cisco 7609, Cisco 6509 and HP Ethernet switches with connections to local hosts, the CAVEwave, the Tiled Display and the SDSC T320.]

  16. External connectivity From SunLight: 10 x 10 Gbps = 100 Gbps available to the demonstrators. (Side note: during iGrid2002 it was 1 x 10GE.) Some paths to be mentioned: the CaveWave link to Chicago, used for many of the visualization demos. Layer 1 circuits - a few. Layer 2 circuits - the majority. Layer 3 circuits - for the routed connectivity.

  17. Layer1/2 int’l connectivity

  18. Layer1/2 int’l connectivity (II) An international effort to reach the demonstrators’ countries: Asia - China, Korea, Japan, Taiwan. North America - Canada, Mexico, US. Europe - Czech Republic, Netherlands, Poland, Spain, UK. A central role was played by the various optical exchanges: PacificWave in Seattle, KRLight in Seoul, T-LEX in Tokyo, StarLight/TransLight in Chicago, MANLAN in New York, NetherLight in Amsterdam, UKLight in London, CZLight in Prague, NorthernLight in Stockholm … all part of the GLIF. The GLIF meeting followed iGrid.

  19. Layer3 infrastructure

  20. Routing Did I hear well?… Not surprising: routing is a component of hybrid networks. Routing was needed for: Internet connectivity to demonstrators, via commodity peering from UCSD and connections to major NRENs; demos using Layer 3 paths via NRENs; routing in SunLight to direct multiple demos to shared resources, for example the Tiled Display.

  21. The NOC Committee members and vendor engineers provided the NOC support during the workshop. The NOC: • set up the infrastructure: racking, pulling fibers • configured the equipment • provided continuing support to the demonstrators. The biggest challenges: • automatic versus manual configuration • scheduling of common links. Missing: the user/application _really_ configuring the light paths. Not all demos were “NOC-independent” after the kick-off.
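The scheduling of common links can be pictured with a small sketch of the bookkeeping involved; this is purely illustrative, not the tool the NOC actually used, and the link name, demo names and time slots are made up.

```python
# Hypothetical sketch of scheduling demos on a shared link: a reservation is
# rejected if it overlaps an existing one on the same circuit.
from datetime import datetime

reservations: dict[str, list[tuple[datetime, datetime, str]]] = {}

def reserve(link: str, start: datetime, end: datetime, demo: str) -> bool:
    """Book a time slot on a shared link unless it collides with another demo."""
    slots = reservations.setdefault(link, [])
    for s, e, other in slots:
        if start < e and s < end:                    # time intervals overlap
            print(f"{demo} conflicts with {other} on {link}")
            return False
    slots.append((start, end, demo))
    return True

print(reserve("CAVEwave", datetime(2005, 9, 27, 9),  datetime(2005, 9, 27, 11), "demo A"))  # True
print(reserve("CAVEwave", datetime(2005, 9, 27, 10), datetime(2005, 9, 27, 12), "demo B"))  # False
```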

  22. Light paths What is in a name? For every demonstrator, “light path” meant something different: • an optical path without L2 or L3 services; • an L2 path over completely dedicated circuits, with a possible need for scheduling; • an L2 path over a shared link (coexisting demos); • a mix of L3 and L2 features. For each demo the NOC needed to do the “translation” among the various meanings.
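That “translation” could be captured in code roughly as follows; only the four categories come from the slide, while the request attributes and the classification rules are assumptions made for illustration.

```python
# Hypothetical classification of a demo's light-path request into the four
# meanings listed on the slide; only the categories come from the slide.
from enum import Enum, auto

class LightPathKind(Enum):
    OPTICAL_ONLY = auto()    # optical path without L2 or L3 services
    DEDICATED_L2 = auto()    # L2 path over dedicated circuits, may need scheduling
    SHARED_L2 = auto()       # L2 path over a shared link (coexisting demos)
    MIXED_L2_L3 = auto()     # mix of L3 and L2 features

def classify(wants_routing: bool, wants_ethernet: bool, dedicated_circuit: bool) -> LightPathKind:
    """Map what a demo asks for onto one of the four kinds."""
    if wants_routing and wants_ethernet:
        return LightPathKind.MIXED_L2_L3
    if wants_ethernet:
        return LightPathKind.DEDICATED_L2 if dedicated_circuit else LightPathKind.SHARED_L2
    return LightPathKind.OPTICAL_ONLY

print(classify(wants_routing=False, wants_ethernet=True, dedicated_circuit=False))
# LightPathKind.SHARED_L2
```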

  23. “Dutch” lightpaths An easy way to see this: 4 demos with a Dutch label, NL101, NL102, NL103 and NL104. NL101/2: VM Turntable, and Token-based network element access control and path selection. NL103: IPv4 link-local addressing for optical networks. NL104: Dead cat demo. [Diagram: paths from Amsterdam (AMS) to San Diego (SAN) via Chicago (CHI), New York (NY) and Seattle (SEA): the routed Internet, and VLAN NL103 carried over the IRNC and CaveWave links. Effort: low for the routed path (routing does it all, but performance needs to be tuned), medium to high for the VLANs (VLAN configuration is difficult when L2 is multi-domain).]
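Why the multi-domain VLANs took more effort than the routed path can be sketched as follows: an end-to-end L2 path only works once every domain along it carries the same VLAN. The domain names, the VLAN ID and the data structure below are hypothetical.

```python
# Illustrative only: an end-to-end L2 path needs the VLAN configured in every
# domain it crosses; a single missing domain keeps the whole path down.
# Domain names and the VLAN ID are hypothetical.
path = ["NetherLight (AMS)", "StarLight (CHI)", "SunLight (SAN)"]
configured = {
    "NetherLight (AMS)": {103},
    "StarLight (CHI)": set(),        # not configured yet -> path is down
    "SunLight (SAN)": {103},
}

def vlan_end_to_end(vlan: int) -> bool:
    """True only if every domain on the path carries the VLAN."""
    return all(vlan in configured[domain] for domain in path)

print(vlan_end_to_end(103))   # False until the middle domain adds VLAN 103
```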

  24. Just after iGrid: SC05 Using the experience gained in September, many tried again.

  25. Lessons learned • It was a lot of work, but the achievements were rewarding. • Global lambdas are a reality and a need. • The community is focusing on the tools for automatic engineering and setup needed on hybrid networks. An article on the topic has been submitted: The network infrastructure at iGrid2005: lambda networking in action - Paola Grosso, Pieter de Boer and Linda Winkler.
