
High Speed Links


Presentation Transcript


1. High Speed Links
F. Vasey, CERN, PH-ESE
• High speed links in LHC and commercial applications
• On-going common projects for LHC Phase I upgrades
• Towards HL-LHC
• Conclusions
francois.vasey@cern.ch

2. 1. High Speed Links in LHC
[Diagram: front-end DAQ interfaces 1…N connected through a switching network to CPUs, alongside trigger, timing/triggers/sync and control/monitoring paths]
francois.vasey@cern.ch

3. 1.1 Many Different Link Types
• Readout - DAQ:
  • Unidirectional
  • Event frames
  • High rate
  • Point-to-point
• Trigger data:
  • Unidirectional
  • High constant data rate
  • Short and constant latency
  • Point-to-point
• Detector Control System:
  • Bidirectional
  • Low/moderate rate ("slow control")
  • Bus/network or point-to-point
• Timing: clock, triggers, resets
  • Precise timing (low jitter and constant latency)
  • Low latency
  • Fan-out network (with partitioning)
• Different link types remain physically separate, each with its own specific implementation (compared in the sketch below)
francois.vasey@cern.ch
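
These requirement profiles can be captured compactly; a minimal sketch (hypothetical field names and class layout, not from any experiment codebase):

```python
from dataclasses import dataclass

@dataclass
class LinkType:
    """Coarse requirement profile of one LHC link class."""
    name: str
    direction: str             # "uni" or "bi"
    rate: str                  # qualitative data rate
    latency_critical: bool     # must latency be short and constant?
    topology: str              # point-to-point, bus/network, fan-out

LINK_TYPES = [
    LinkType("Readout-DAQ", "uni", "high (event frames)", False, "point-to-point"),
    LinkType("Trigger data", "uni", "high, constant", True, "point-to-point"),
    LinkType("DCS (slow control)", "bi", "low/moderate", False, "bus/network or point-to-point"),
    LinkType("Timing (clock/triggers/resets)", "uni", "low", True, "fan-out (with partitioning)"),
]

for lt in LINK_TYPES:
    print(f"{lt.name:34s} {lt.direction:3s} latency-critical={lt.latency_critical}")
```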

4. 1.2 For instance: Links in ATLAS
francois.vasey@cern.ch

5. 1.3 For instance: Links in CMS
francois.vasey@cern.ch

6. 1.4 High Speed Optical Links in LHC
• Large quantity o(100k), large diversity (aggregate bandwidth sketched below)
• Majority running @ o(1 Gb/s)
• From full custom to qualified COTS
• Tuned to specific application and environment
• Developed by independent teams
• Successful adoption of technology in HEP
francois.vasey@cern.ch
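
A back-of-envelope aggregate of the installed base, using the round figures from the slide:

```python
# Aggregate raw bandwidth of the LHC optical-link installed base.
# Round figures from the slide: order 100k links, order 1 Gb/s each.
n_links = 100_000
rate_gbps = 1.0

aggregate_tbps = n_links * rate_gbps / 1_000
print(f"~{aggregate_tbps:.0f} Tb/s aggregate raw bandwidth")  # ~100 Tb/s
```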

7. 1.5 High Speed Links in LHC: Lessons Learned
• Increase link bandwidth
  • Amortize system cost better (see the sketch below)
• Share R&D effort
  • Use limited resources optimally
• Strengthen quality assurance programs
  • Identify problems early
  • Test at system level
Joint ATLAS/CMS note: ATL-COM-ELEC-2007-001, CMS-IN-2007/066, https://edms.cern.ch/document/882775/3.8
francois.vasey@cern.ch
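
The bandwidth lesson is simple amortization: much of a link's cost (fibre plant, connectors, packaging, installation) is per-channel and independent of the line rate, so cost per Gb/s falls roughly linearly as the rate rises. A sketch with purely illustrative cost figures (not project numbers):

```python
# Hypothetical per-link cost split: the fixed per-channel part (fibre,
# connectors, packaging, installation) does not scale with line rate.
fixed_cost_per_link = 500.0   # illustrative CHF, not a real project figure
rate_dependent_cost = 100.0   # illustrative CHF per link

for rate_gbps in (1.6, 4.8, 10.0):
    cost_per_gbps = (fixed_cost_per_link + rate_dependent_cost) / rate_gbps
    print(f"{rate_gbps:5.1f} Gb/s -> {cost_per_gbps:6.1f} CHF per Gb/s")
```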

8. 1.6 High Speed (Short Distance) Links outside LHC
• Rapid progress driven by:
  • Successful standardization effort
    • 100 GbE standard ratified in 2010 by IEEE
  • Availability of hardware cores embedded in FPGAs
    • 50+ x 10G transceivers in modern FPGAs
  • Commercial availability of MSA-based hardware
    • Commodity 10G and 40G devices
    • Emerging 100G and 400G engines
• Current LAN rates @ o(10 Gb/s), ramping up to 40 Gb/s
• Widening performance gap compared to HEP (quantified in the sketch below)
• But consider:
  • Specific environmental constraints
  • Long development time: vintage-2000 hardware in LHC
francois.vasey@cern.ch
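
To quantify the gap: a single modern FPGA already carries more serial bandwidth than hundreds of LHC-era links combined:

```python
# Per-device serial bandwidth of a modern (ca. 2013) FPGA versus a typical
# LHC front-end link, making the "widening performance gap" concrete.
fpga_transceivers = 50      # "50+ x 10G transceivers" from the slide
fpga_rate_gbps = 10.0
hep_link_gbps = 1.0         # o(1 Gb/s) LHC links, previous slide

fpga_total = fpga_transceivers * fpga_rate_gbps
print(f"one FPGA: {fpga_total:.0f} Gb/s of serial I/O")           # 500 Gb/s
print(f"equivalent LHC links: {fpga_total / hep_link_gbps:.0f}")  # ~500
```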

9. 2. On-going Development Projects for LHC Phase I Upgrades
• Initially aiming at a single SLHC target
• Launched in 2008
• Timely developments for Phase I upgrades
• Working groups
  • Microelectronics User Group (MUG)
  • Optoelectronics Working Group (Opto WG)
• Common projects
  • Rad Hard Optical Link
    • GigaBit Transceiver (GBT) project (chip-set) & GBT-FPGA project
    • Versatile Link (VL) project (opto) & Gigabit Link Interface Board (GLIB)
  • Many others …
francois.vasey@cern.ch

10. 2.1 Rad Hard Optical Link Common Project
• Requirements:
  • General
    • Bi-directional
    • High data rate: 4.8 Gb/s (link budget sketched below)
    • Low and constant latency (for TTC and trigger data paths)
    • Error detection and correction
  • Environment
    • Radiation-hard ASICs (130 nm) and radiation-qualified opto-electronics at the front-end
    • Magnetic-field-tolerant devices at the front-end
  • Flexible chip interface (e-links) to adapt to various architectures
  • Compatibility with legacy fibre plants
  • Back-end COTS
    • High-end FPGAs with embedded transceivers
    • GBT-FPGA firmware
    • Parallel optics
• Commitment to deliver to ATLAS, CMS and LHCb in 2014-2015
francois.vasey@cern.ch
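
A back-of-envelope GBT link budget; the 120-bit frame at the 40 MHz LHC clock and the field widths below follow the public GBT documentation, but treat the exact split as an assumption here:

```python
# GBT link budget sketch: one 120-bit frame per LHC bunch crossing.
frame_bits = 120
frame_rate_mhz = 40                 # LHC clock

line_rate_gbps = frame_bits * frame_rate_mhz / 1000
print(f"line rate: {line_rate_gbps} Gb/s")       # 4.8 Gb/s, as on the slide

# Assumed frame fields: header, slow-control bits, forward error correction.
header, slow_ctrl, fec = 4, 4, 32
payload_bits = frame_bits - header - slow_ctrl - fec
payload_gbps = payload_bits * frame_rate_mhz / 1000
print(f"user payload: {payload_gbps} Gb/s")      # 3.2 Gb/s with FEC
```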

11. 2.2 Impact on System Architecture
• Custom development for difficult front-end
• Firmware only for COTS-based back-end
• Evaluation platform for system tests
[Diagram: rad-hard optical links connecting front-end interfaces through a switching network to CPUs, carrying the timing/triggers/sync and trigger paths; control/monitoring over a DCS network]
francois.vasey@cern.ch

12. 2.3 GBT Project Status
• Project started in 2008
• GBT (serializer/deserializer)
  • 1st iteration (GBT-Serdes) in 2009
  • 2nd iteration (GBTx) in 2012
  • Packaging in 2013
  • 3rd iteration and production in 2014
• GBLD (laser driver)
  • Final iteration (V4.1/V5) in 2013
• GBTIA (PIN-diode receiver)
  • Final iteration (V3) in 2012
• GBTSCA (slow control ASIC)
  • Final version expected in 2014
• GBT-FPGA firmware
  • Available
  • Tracking major FPGA families
• Project delivers
  • Chipset for the front-end
  • GBT-FPGA back-end firmware
francois.vasey@cern.ch

13. 2.4 VL Project Status
• Timeline
  • Kick-off: April 2008
  • Proof of concept: September 2009
  • Feasibility demo: September 2011
  • Production readiness: April 2014
  • Early delivery of rad-soft VTTX to CMS-Cal-Trig: December 2013
• Project delivers
  • Custom-built rad-hard VTRx
  • Recommendations for
    • Fibre and connectors
    • Back-end optics
  • Evaluation interface boards (GLIB)
• Experiments
  • Design their own system
  • Select passive and back-end components based on VL recommendations and on their own constraints
francois.vasey@cern.ch

14. 2.5 Packaging and Interconnects Status
• GBT
  • 20x20 BGA with on-package crystal and decoupling capacitors
  • CERN <> IMEC <> ASE
  • 5 iterations to freeze the design
    • 1-4 weeks per iteration
  • 6 months to first prototype
  • Mask error, re-spin, +2 months
• VL
  • High-speed PCB simulation and design
  • Injection-moulded ULTEM 2xLC connector latch and PCB support
  • Prototyping started in 2009, moulded parts delivered in 2013
francois.vasey@cern.ch

15. 2.6 Rad Hard Optical Link Project Effort
• 6 years of development
  • Launched in 2008
  • Delivery in 2014-15
• 6 institutes involved
  • CERN, FNAL, IN2P3, INFN, Oxford, SMU
• Estimated 80 man-years + 2-3 MCHF development effort (see the arithmetic below)
francois.vasey@cern.ch
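
The effort figure translates into a sustained team size:

```python
# Sanity check of the quoted development effort: 80 man-years over a
# 6-year project shared by 6 institutes.
man_years, years, institutes = 80, 6, 6

avg_fte = man_years / years
print(f"~{avg_fte:.0f} FTE sustained for {years} years")            # ~13 FTE
print(f"~{avg_fte / institutes:.1f} FTE per institute on average")  # ~2.2
```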

16. 3. Towards HL-LHC
• Higher data rate
• Lower power
• Smaller footprint
• Enhanced radiation resistance
(Not all in the same link)
• Not to be forgotten:
  • Fast electrical links
  • Radiation-soft links
francois.vasey@cern.ch

17. 3.1 Higher Data Rate and Lower Power
• Requires migrating to a more advanced technology node: 65 nm
  • Qualify the technology
  • Establish a stable support framework and design tools for the full duration of the development
  • Design new ASICs taking advantage of the technology
    • Either high speed (multiply data rate by two)
    • Or low power (divide power by four)
    • (Worked through in the sketch below)
• VL opto components are 10G capable
  • But new emerging components will need to be qualified
• Electrical interconnects and packaging become performance limiters
  • Build up expertise
  • Train with relevant simulation and design tools
  • Establish relationships with selected suppliers
francois.vasey@cern.ch
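
The speed/power trade stated above, applied to the Phase I GBT operating point (idealized scaling, illustrative only):

```python
# 130 nm -> 65 nm trade-off from the slide, applied to the 4.8 Gb/s
# Phase I operating point. Idealized scaling, not a measured result.
gbt_rate_gbps = 4.8
gbt_power = 1.0   # arbitrary units, Phase I reference

# Option A: spend the technology gain on speed.
print(f"high-speed option: {gbt_rate_gbps * 2} Gb/s at ~{gbt_power}x power")
# Option B: spend it on power at the same rate.
print(f"low-power option:  {gbt_rate_gbps} Gb/s at ~{gbt_power / 4}x power")
```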

18. 3.2 Smaller Footprint
[Photos: VTRx and SF-VTRx modules]
• The GBT package size can be shrunk by limiting the number of IO pads and going to fine-pitch BGA
  • Will affect host-board design
• The VTRx concept has been pushed to its size limit: SF-VTRx
  • Not sufficient for some tracker layouts
• Tracker front-ends will need custom packaging
  • Industry to be approached
francois.vasey@cern.ch

19. 3.3 Enhanced Radiation Resistance
[Plots: Tx and Rx radiation-test results, HL-LHC tracker fluences indicated]
• ASICs likely to be OK
• Active opto devices tested up to now do not withstand fluences beyond 10¹⁶ cm⁻²
  • Still OK for the HL-LHC tracker, not for pixels
  • Tight margins
• Are there alternatives for fluences beyond 10¹⁶ cm⁻²?
  • Reconsider passives? Modulators?
francois.vasey@cern.ch

20. 3.4 Si-Photonics, a Paradigm-Changing Technology?
• Si
  • is an excellent optical material with a high refractive index (but an indirect bandgap)
  • is widely available in high-quality grade
  • can be processed with extreme precision using deep-submicron CMOS processing techniques
• So, why not build a photonic circuit in a CMOS Si wafer? (The numbers below show why the scales match.)
francois.vasey@cern.ch
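
The scales do match: at telecom wavelengths, silicon's high index against its SiO2 cladding confines light in waveguides a few hundred nanometres across, exactly the feature size that deep-submicron lithography controls well. A rough sketch with textbook index values (assumed here, not from the talk):

```python
# Why Si photonics maps onto CMOS: the guided-wavelength scale in silicon
# lands in deep-submicron territory. Textbook index values assumed.
wavelength_nm = 1550        # telecom C-band
n_si, n_sio2 = 3.48, 1.44   # core and cladding refractive indices

print(f"index contrast: {n_si - n_sio2:.2f}")                    # ~2.0
print(f"wavelength in Si: {wavelength_nm / n_si:.0f} nm")        # ~445 nm
print(f"half-wave scale:  {wavelength_nm / (2 * n_si):.0f} nm")  # ~223 nm
# Typical single-mode Si waveguides are ~220 nm tall: CMOS feature sizes.
```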

21. 3.5 Si-Photonics, Status in the Community
• Commercial devices tested at CERN and ANL
  • Excellent functional performance
  • Moderate radiation resistance, limited by controller-ASIC failure
• On-going collaborations with one academic and two industrial partners
  • Simulation tools in hand
  • Selected building blocks under test
  • No usable conclusion so far, much more work needed
• Packaging is challenging
• Assess radiation hardness first!
[Photo: Luxtera QSFP+ Si-photonics chip]
francois.vasey@cern.ch

22. 3.6 Not To Be Forgotten
• Fast electrical data links are not obsolete!
  • Short-distance, on-board serial links
  • Aggregation to high-speed opto-hubs (aggregation arithmetic below)
  • Very high radiation fields (HL-LHC pixels)
  • Develop expertise and tools
• Detectors with modest radiation levels may not need custom front-ends
  • Qualify COTS and/or radiation-soft components
  • Shortlist recommended parts
  • Continuously track market evolution
francois.vasey@cern.ch
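
Aggregation is where the electrical and optical links meet. A sketch of how many front-end e-links one optical uplink can absorb; the 80/160/320 Mb/s e-link rates and the 3.2 Gb/s user payload are assumptions taken from the public GBT documentation:

```python
# How many front-end electrical e-links fit into one optical uplink.
# E-link rates and the uplink user payload are assumed per the public
# GBT documentation (see the link-budget sketch in section 2.1).
uplink_payload_gbps = 3.2

for elink_mbps in (80, 160, 320):
    n = int(uplink_payload_gbps * 1000 // elink_mbps)
    print(f"{elink_mbps:3d} Mb/s e-links per uplink: {n}")
```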

23. 4. Conclusions
• High speed links are the umbilical cord of the experiments
• Meeting the HL-LHC challenge will require:
  • Qualifying new, emerging technologies
  • Designing electronics, interconnects, packages and perhaps even optoelectronics
  • Maintaining expertise, tools and facilities
  • Investing with a few selected industrial partners
• The community is healthy, but small and fragmented
  • Working groups and common projects put in place for the Phase I upgrades are effective and should be continued in Phase II
  • Additional projects and working groups could be created
    • WG on fast electrical links & signal integrity, and on radiation-soft links
    • Project on Si-photonics for HEP applications
• Manpower is the real bottleneck, time is the contingency
  • Development time remains very long
  • 6 years, 6 institutes and 80 man-years were required to reach production readiness for Phase I
  • Widening gap with industry
[Diagram: support ecosystem (development, design service, commercial liaison, common projects, working groups, people)]
