
SCARIe FABRIC: A pilot study of distributed correlation


Presentation Transcript


1. SCARIe FABRIC: A pilot study of distributed correlation
Huib Jan van Langevelde, Ruud Oerlemans, Nico Kruithof, Sergei Pogrebenko and many others…

2. What correlators do…
• Synthesis imaging simulates a very large telescope by measuring Fourier components of the sky brightness on each baseline pair
• Sensitivity is proportional to √bandwidth, so the available recording bandwidth is used optimally by sampling 2 bits (4 levels) at the Nyquist rate
• The correlator calculates ½N(N-1) baseline outputs after compensating for the geometry of the array
• It integrates the output signal to something relatively slow and samples it with the required delay/frequency resolution
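The numbers behind these bullets are easy to reproduce. A minimal sketch, assuming the 16-station, 16 × 16 MHz sub-band configuration quoted on the next slide (those values are assumptions taken from that slide, not fixed here):

```python
# Back-of-the-envelope numbers for the bullets above. The 16-station,
# 16 x 16 MHz sub-band configuration is an assumption (it matches the
# MkIV figures on the next slide), not something fixed by this slide.

def n_baselines(n_stations: int) -> int:
    """Number of baseline pairs the correlator has to form: N(N-1)/2."""
    return n_stations * (n_stations - 1) // 2

def station_data_rate_bps(n_subbands: int, bandwidth_hz: float,
                          bits_per_sample: int = 2) -> float:
    """2-bit (4-level) sampling at the Nyquist rate: 2*B samples/s per sub-band."""
    return n_subbands * 2 * bandwidth_hz * bits_per_sample

print(n_baselines(16))                         # 120 baseline pairs
print(station_data_rate_bps(16, 16e6) / 1e9)   # ~1.0 Gb/s per station
```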

3. EVN MkIV data processor at JIVE
• Implements this in custom silicon
• 16 station inputs, originally from tapes, now hard disks and fibres
• Input data is 1 Gb/s max: 1- or 2-bit sampled, up to 16 sub-bands, format includes time codes
• A "super computer" of 1024 chips, 256 complex correlations each, at a 32 MHz clock
• Around 100 T-operations/sec (2 bit only! and it depends a bit on how you count)
Should the next correlator also use special hardware?
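The quoted operation rate can be sanity-checked from the chip count above. The sketch below assumes eight elementary operations per 2-bit complex multiply-accumulate, which is an illustrative counting convention rather than a MkIV specification (as the slide says, the total depends a bit on how you count):

```python
# Sanity check of the "around 100 T-operations/sec" figure quoted above.
# Counting one 2-bit complex multiply-accumulate as 8 elementary operations
# is an assumed convention for illustration, not a MkIV specification.
chips = 1024
correlations_per_chip = 256
clock_hz = 32e6
ops_per_cmac = 8   # 4 multiplies + 4 additions (assumption)

ops_per_second = chips * correlations_per_chip * clock_hz * ops_per_cmac
print(f"~{ops_per_second / 1e12:.0f} T-operations/sec")
# ~67 T-ops/s with this counting; the quoted ~100 depends on how operations are counted.
```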

4. Next generation…
• Can it be implemented on standard computing? It is time critical: the correlator must keep up with the input (example: LOFAR on BlueGene)
• Higher precision and new applications: better sensitivity, interference mitigation, spacecraft navigation
• Can CPU cycles be found on the Grid? From 16 antennas @ 1 Gb/s (eVLBI), and growing, to 1000s of antennas at 100 Gb/s (SKA)
• Pilot projects FABRIC & SCARIe: connectivity, workflow, real-time resource allocation
(Figure labels: LOFAR central processor, FABRIC eVLBI, SKA inner core (5 km))
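To make the growth concrete, a rough sketch of the scaling factors; the 1000-station, 100 Gb/s "SKA-scale" values are assumed round numbers taken from the bullet above:

```python
# Rough scaling factors behind "from 16 antennas @ 1 Gb/s to 1000s at 100 Gb/s".
# The 1000-station figure is an assumed round number for the SKA-scale case.
stations_now, rate_now = 16, 1         # eVLBI today: stations, Gb/s per station
stations_ska, rate_ska = 1000, 100     # SKA-scale round numbers (assumed)

baseline_growth = (stations_ska * (stations_ska - 1)) / (stations_now * (stations_now - 1))
rate_growth = rate_ska / rate_now

print(f"baseline products grow by ~{baseline_growth:.0f}x")                # roughly 4,000x
print(f"correlation work grows by ~{baseline_growth * rate_growth:.0f}x")  # roughly 400,000x
```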

5. Tflops, Pflops…
• Going from 2-bit operations to floating point results in enormous computing tasks
• Very few operations per bit of input; some of the work could be associated with the telescopes
• Rough estimate based on XF correlation (sketched below); the SKA is not even included here…
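A rough XF-style estimate of what this means in floating point. The station count, sample rate and lag count below are assumptions chosen to match the 16-station, 1 Gb/s eVLBI case from the earlier slides, not values given here:

```python
# Rough XF-style estimate of the floating-point rate for software correlation.
# Station count, sample rate and lag count are assumptions (chosen to match the
# 16-station, 1 Gb/s eVLBI case earlier in the talk), not slide values.
n_stations = 16
sample_rate = 512e6          # samples/s per station (2-bit samples at ~1 Gb/s)
n_lags = 256                 # delay channels per baseline product
flops_per_mac = 2            # one multiply + one add

n_products = n_stations * (n_stations - 1) // 2 + n_stations   # cross + auto
flops = n_products * n_lags * sample_rate * flops_per_mac
print(f"~{flops / 1e12:.0f} Tflop/s")   # ~36 Tflop/s, before FFTs, polarization products, etc.
```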

6. SCARIe FABRIC
• EC funded project EXPReS (03/2006), to turn eVLBI into an operational system
  • Plus a Joint Research Activity, FABRIC: Future Arrays of Broadband Radio-telescopes on Internet Computing
  • One work-package on 4 Gb/s data acquisition and transport (Jodrell Bank, Metsahovi, Onsala, Bonn, ASTRON)
  • One work-package on distributed correlation (JIVE, PSNC Poznan)
• Dutch NWO funded project SCARIe (10/2006): Software Correlator Architecture Research and Implementation for eVLBI
  • Collaboration with SARA and UvA
  • Uses the Dutch Grid with configurable high connectivity: StarPlane
  • Software correlation with data originating from JIVE
• Complementary projects with matching funding
  • International and national expertise from other partners
  • Total of 9 man years at JIVE, plus some matching from staff, plus a similar amount at partners

7. Aim of the project
• Research the possibility of distributed correlation, using the Grid for the CPU cycles: can it be employed for the next generation of VLBI correlation?
• Exercise the advantages of software correlation: floating point accuracy and special filtering
• Explore (push) the boundaries of the Grid paradigm: "real time" applications, data transfer limitations
• To lead to a modest size demo, with some possible real applications:
  • Monitoring EVN network performance
  • A continuously available eVLBI network with a few telescopes
  • Monitoring transient sources
  • Astrometry, possibly of spectral line sources
  • Special correlator modes: spacecraft navigation, pulsar gating
  • A test bed for broadband eVLBI research
Something to try on the roadmap for the next generation correlator, even if you do not believe it is the solution…

8. Previous experience with software correlation
• Builds on previous experience at JIVE:
  • Regular and automated network performance tests, using the Japanese software correlator from NICT
  • Huygens extreme narrow band correlation: a home grown "super FX" with sub-Hz resolution

9. Work packages
• Grid resource allocation
  • Grid workflow management: a tool to allocate correlator resources and schedule correlation
  • Data flow from telescopes to the appropriate correlator resources
  • Expertise from the Poznan group in Virtual Laboratories
  • Will this application fit on the Grid? It is very data intensive, and time-critical if not real-time
• Software correlation
  • Correlator algorithm design: high precision correlation on standard computing
  • Scalable to cluster computers, portable for grid computers and interfaced to standard middleware
  • Interactive visualization and output definition
  • Collect & merge data in the EVN archive: standard format and proprietary rights

10. Basic idea
• Use the Grid for correlation: CPU cycles on compute nodes
• The network could act as the crossbar switch?
• Correlation will be asynchronous, based on floating point arithmetic
• Portable code, standard environment

11. Workflow Management
• Must interact with normal VLBI schedules
• Divide the data, route it to compute nodes, set up the correlation (see the sketch below)
• Dynamic resource allocation: keep up with the incoming data!
Effort from Poznan, based on their Virtual Lab.
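A minimal sketch of the "keep up with incoming data" constraint the workflow manager has to satisfy; the per-node throughput is an assumed placeholder, not a benchmark result:

```python
import math

# Minimal sketch of the "keep up with incoming data" constraint; the per-node
# throughput below is an assumed placeholder, not a benchmark result.
node_speed = 0.25   # fraction of real time a single compute node can correlate

# To keep up with the incoming data stream, the workflow manager must keep at
# least this many nodes busy at any moment (more, once data transport and node
# failures are taken into account):
nodes_needed = math.ceil(1.0 / node_speed)
print(nodes_needed)  # 4 nodes for this assumed node speed
```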

12. Topology
• Slice in time: every node gets an interval, a "new correlator" for every time slice
  • Employ cluster computers at the nodes
  • Minimizes total data transport; bottleneck at the compute node, but connectivity at Grid nodes is probably good anyway
  • Scales perfectly: easily estimated how many nodes are needed, and works with heterogeneous nodes
  • But leaves the sorting to the compute nodes; memory access may limit effectiveness
• Slice in baseline: assign a product (or a range of products) to a certain node, e.g. two data streams meet in some place
  • Transport bottleneck at the sources (telescopes); maybe curable with a multicast transport mechanism which forks at network nodes
  • Some advantage when there are local nodes at the telescopes
  • Does not scale very simply: simple schemes exist for ½N² nodes, and the output needs re-sorting
  • But it reduces the compute problem per node
Using the network as the cross-bar switch. (A sketch of both slicings follows below.)
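A sketch of the two slicings written as node-assignment functions; node counts and granularity are illustrative assumptions:

```python
# Sketch of the two ways of slicing the correlation across Grid nodes.
# Node counts and the slicing granularity are illustrative assumptions.

def node_for_time_slice(slice_index: int, n_nodes: int) -> int:
    """Slice in time: consecutive time intervals go to nodes round-robin;
    every node sees all stations, but only a fraction of the time."""
    return slice_index % n_nodes

def node_for_baseline(i: int, j: int, n_stations: int) -> int:
    """Slice in baseline: each station pair (i, j) is pinned to one node;
    this is the simple scheme with one node per product (~N^2/2 nodes)."""
    assert i < j < n_stations
    # enumerate pairs (0,1), (0,2), ..., (N-2,N-1)
    return i * n_stations - i * (i + 1) // 2 + (j - i - 1)

# With 4 stations the baseline slicing needs 6 nodes, one per pair:
print([node_for_baseline(i, j, 4) for i in range(4) for j in range(i + 1, 4)])
# [0, 1, 2, 3, 4, 5]
```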

13. Work packages
• Grid resource allocation
  • Grid workflow management: a tool to allocate correlator resources and schedule correlation
  • Data flow from telescopes to the appropriate correlator resources
  • Expertise from the Poznan group in Virtual Laboratories
  • Will this application fit on the Grid? It is very data intensive, and time-critical if not real-time
• Software correlation
  • Correlator algorithm design: high precision correlation on standard computing
  • Scalable to cluster computers, portable for grid computers and interfaced to standard middleware
  • Interactive visualization and output definition
  • Collect & merge data in the EVN archive: standard format and proprietary rights

14. Broadband software correlation (the same chain runs for Station 1 … Station N; EVN Mk4 hardware equivalents in brackets)
• Raw data: BW = 16 MHz, Mk4 format on Mk5 disk
• From Mk5 to linux disk: raw data, 16 MHz, Mk4 format on linux disk
• Channel extraction [DIM, TRM, CRM] → extracted data
• Pre-calculated delay tables [SU]
• Delay corrections [DCM, DMM, FR] → delay corrected data
• Correlation [correlator chip] → SFXC data product
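A minimal single-baseline sketch of the chain above, assuming the samples have already been unpacked to floating point; the function names and parameters are illustrative and are not the actual SFXC interfaces:

```python
import numpy as np

# Minimal single-baseline sketch of the chain above (unpacked floating point
# samples in, one visibility spectrum out). Function names and parameters are
# illustrative; they are not the actual SFXC interfaces.

def delay_correct(samples: np.ndarray, delay_samples: int) -> np.ndarray:
    """Apply the pre-calculated (integer-sample) delay by shifting the stream."""
    return np.roll(samples, -delay_samples)

def correlate_fx(x: np.ndarray, y: np.ndarray, n_channels: int) -> np.ndarray:
    """FX correlation: FFT short segments, cross-multiply, integrate."""
    n_fft = 2 * n_channels
    n_seg = min(len(x), len(y)) // n_fft
    acc = np.zeros(n_channels, dtype=complex)
    for k in range(n_seg):
        X = np.fft.rfft(x[k * n_fft:(k + 1) * n_fft])[:n_channels]
        Y = np.fft.rfft(y[k * n_fft:(k + 1) * n_fft])[:n_channels]
        acc += X * np.conj(Y)
    return acc / n_seg

# Toy usage: two noisy copies of the same signal, offset by 5 samples.
rng = np.random.default_rng(0)
sig = rng.standard_normal(1 << 14)
a, b = sig, np.roll(sig, 5)
vis = correlate_fx(delay_correct(a, 0), delay_correct(b, 5), n_channels=256)
print(vis.shape)  # (256,) complex visibility channels for this baseline
```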

15. Better SNR than Mk4 hardware

16. Software correlation
• Working on benchmarking: single core processors so far, different CPUs available
• Already quite efficient; more work needed on memory performance
• Must deploy on cluster computers, and then on the Grid
• Organize the output so it can be used for astronomy
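A sketch of the kind of single-core benchmark mentioned here: time a toy cross-multiply/accumulate kernel and report its throughput. The kernel and the sizes are illustrative, not the actual SFXC benchmark:

```python
import time
import numpy as np

# Sketch of a single-core benchmark of the kind mentioned above: time the
# FFT/cross-multiply/accumulate kernel for one baseline and report throughput.
# The kernel and the sizes are illustrative, not the actual SFXC benchmark.
n_fft, n_seg = 512, 2000
x = np.random.standard_normal(n_fft * n_seg)
y = np.random.standard_normal(n_fft * n_seg)

t0 = time.perf_counter()
acc = np.zeros(n_fft // 2 + 1, dtype=complex)
for k in range(n_seg):
    X = np.fft.rfft(x[k * n_fft:(k + 1) * n_fft])
    Y = np.fft.rfft(y[k * n_fft:(k + 1) * n_fft])
    acc += X * np.conj(Y)
elapsed = time.perf_counter() - t0

samples = n_fft * n_seg
print(f"{samples / elapsed / 1e6:.1f} Msample/s for one baseline on one core")
```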

17. Side step: data intensive processing
• Radio astronomy can be extreme: user data sets can be large, a few to 100 GB now, and larger for LOFAR, eVLBI, APERTIF, SKA
• All data enter the imaging, with iterative calibration schemes, yet there are few operations per byte
• Parallel computing: not obviously suited to messaging systems; task (data oriented) parallelization instead
• Processing traditionally done interactively on the user's platform, with more and more pipeline approaches
• Addressed in RadioNet: project ALBUS resulted in Python for AIPS; looking for an extension in FP7, with interoperability with ALMA and LOFAR, but for the user domain

18. Goal of the project
Develop methods for high data rate e-VLBI using distributed correlation:
• High data rate eVLBI data acquisition and transport
  • Develop a scalable prototype for broadband data acquisition (prototype acquisition system)
  • Establish a transportation protocol for broadband e-VLBI; build it into the prototype and establish the interface with the normal system
  • Interface e-VLBI public networks with the LOFAR and e-MERLIN dedicated networks: correlate wide band Onsala data on e-MERLIN, demonstrate LOFAR connectivity
• Distributed correlation
  • Set up data distribution over the Grid, with a workflow management tool
  • Develop a software correlator
  • Run a modest distributed eVLBI experiment

19. Current eVLBI practice
(Figure: block diagram with the following elements: user correlator parameters; observing schedule in VEX format; earth orientation parameters; field system controls antenna and acquisition (BBC & samplers); correlator control including model calculation; output data; Mk4 formatter; Mk4 data in Mk5prop form over TCPIP; Mk5 recorder; Mk5 playback)

20. FABRIC components
(Figure: block diagram with the following elements: GRID resources; data; user correlator parameters; observing schedule in VEX format; earth orientation parameters; field system controls antenna and acquisition; resource allocation and routing; correlator control including model calculation; DBBC; VSI; output data; FABRIC = The GRID; VSIe?? on?? PC-EVN #2)
