
ROOT in the PHOBOS Online System


Presentation Transcript


  1. ROOT in the PHOBOS Online System Peter Steinberg, Brookhaven National Laboratory, Upton, NY, USA

  2. PHOBOS experimental setup [Figure: the detector layout, showing the Multiplicity/Vertex detectors and the Spectrometer]

  3. Readout Channels • PHOBOS is designed (per the CDR) to be a high-rate detector • However, while a “small” detector, it has a “big” channel count • Spectrometer: 73,728 channels • Octagon Multiplicity + Vertex detectors: 28,612 channels • Ring Multiplicity Detectors: 16,384 channels • Nearly 120,000 channels to read out → ~150 kB per central event • Main constraint: the 10 MB/s average bandwidth allowed by the RHIC Computing Facility (RCF)
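  For scale, the arithmetic these numbers imply: a 10 MB/s budget divided by ~150 kB per central event allows roughly 65-70 central events per second to be archived.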

  4. PHOBOS online data flow [Diagram: the Si and TOF detectors are read out via FASTBUS, the Trigger, and Mercury processors into the EventBuilder (PhATDAQ); data pass through Data Validation to the RCF; ORACLE, Slow Control, Run Control, and Online Monitoring are attached alongside, with RHIC providing the beam]

  5. PhATDAQ • PhAT is the general PHOBOS analysis toolkit (see Gunther Roland’s talk) • Originally used for offline tasks • Data analysis, algorithms, etc. • But what about the DAQ? • The original DAQ group started development using CODA from CEBAF • But one day, Andrei Sukhanov (BNL) mentioned that he would try to perform the event building directly using ROOT objects and I/O • (Very) soon thereafter, PhATDAQ was born

  6. PhATDAQ Architecture (Andrei Sukhanov, BNL) [Diagram: the Silicon ROC, Trigger ROC, and TOF feed events to the Event Builder (phatdaq.exe) over Gigabit TCP (TSocket); data & commands travel over UDP sockets; Run Control (root/phat), Slow Monitoring, PhatOnline, a ROCDB Gateway, and the Data Base are attached; a legend marks each piece as working, being developed, work started, or work not started yet] • Structure of the communication messages • Message: Length, Seq. number, Source, then a SubEvent or Command, SubEvent, SubEvent, … • SubEvent: Length, Header length, Command/Type, Event # (from the Event Manager), Run number, Data, and the marker 0x12345678 at the EB interface
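  As an illustration, the message framing above could be written down as plain C++ structs along these lines; the field names come from the slide, but the 32-bit widths and exact ordering are assumptions:

      #include <cstdint>

      // Hypothetical layout of the PhATDAQ communication messages.
      // Field names follow the slide; 32-bit widths are an assumption.
      struct Message {
          uint32_t length;     // total message length in bytes
          uint32_t seqNumber;  // sequence number
          uint32_t source;     // originating ROC
          // followed by one or more SubEvents (or a Command)
      };

      struct SubEventHeader {
          uint32_t length;       // total SubEvent length
          uint32_t headerLength; // length of this header
          uint32_t type;         // command / type code
          uint32_t eventNumber;  // event # assigned by the Event Manager
          uint32_t runNumber;    // current run number
          // payload ("Data") follows
      };

      const uint32_t kEBMarker = 0x12345678; // marker checked at the EB interface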

  7. PhATDAQ data format • Large segments of the detector are read out by “Front-End Controllers” (FECs) • Each FEC consists of several “strings” of “chips” • The chip data are concatenated within each string • Thus, channels are indexed by FEC/String/Channel (see the sketch below) [Diagram: the detector fans out into FECs, each FEC into strings, each string into chips, whose “channels” form the readout]
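  A minimal sketch of the FEC/String/Channel addressing just described; the per-FEC and per-string counts below are placeholders, not the actual PHOBOS geometry:

      #include <cstddef>

      // Placeholder sizes -- the real counts vary by sub-detector.
      const std::size_t kStringsPerFec     = 12;
      const std::size_t kChannelsPerString = 512;

      // Because chip data are concatenated within each string, the
      // triple (fec, string, channel) maps to one flat buffer offset.
      std::size_t rawOffset(std::size_t fec, std::size_t str, std::size_t chan)
      {
          return (fec * kStringsPerFec + str) * kChannelsPerString + chan;
      }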

  8. ROOT in PhATDAQ • ROOT objects are used to store and access raw data • Each FEC is its own object, which contains arrays of string data • Trees are used for raw data files • One event after the other, no tags as of yet (see the sketch below) • Heavy use of ROOT containers for flexibility • Our TPhEvent class is a hash table which stores relevant objects (TPhDataObject, TPhObjectContainer) • However, hash tables are “heavy” -- we have a substitute object (TPhSubEventArray) which can be turned into a TPhEvent and vice versa • PhATDAQ has been working perfectly for 6 months • Extensive performance benchmarks to come
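  A minimal, runnable sketch of the raw-data tree pattern: one entry per event, written sequentially with no tags. The real system writes FEC objects (TPhEvent et al.); this stand-alone macro uses simple leaf-list branches instead:

      // rawtree.C -- run with:  root -l rawtree.C
      #include "TFile.h"
      #include "TTree.h"

      void rawtree()
      {
          TFile file("rawdata.root", "RECREATE");
          TTree tree("T", "PHOBOS raw data (sketch)");

          Int_t   fec = 0;        // which FEC this block came from
          Short_t adc[512];       // placeholder for one string of chip data
          tree.Branch("fec", &fec, "fec/I");
          tree.Branch("adc", adc, "adc[512]/S");

          for (Int_t i = 0; i < 100; ++i) {       // 100 dummy events
              fec = i % 4;
              for (Int_t c = 0; c < 512; ++c) adc[c] = 0;  // ...from DAQ...
              tree.Fill();        // one event after the other
          }
          tree.Write();
      }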

  9. Run Control • So far, CINT is our run control interface • .L runcontrol.cxx • rc_start_run() • rc_end_run(), etc. • Works very well • flexible • robust • DAQ team is conspiring to move to a GUI • Inspired by NA45 RC GUI
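  For concreteness, a sketch of what a runcontrol.cxx along these lines might contain; rc_start_run and rc_end_run are named on the slide, but the host, port, and command strings here are invented placeholders:

      // runcontrol.cxx -- load in CINT with:  .L runcontrol.cxx
      #include "TSocket.h"
      #include "TString.h"
      #include <cstdio>

      static TSocket* gEB  = 0;   // connection to the event builder
      static Int_t    gRun = 0;

      void rc_start_run(Int_t run)
      {
          gRun = run;
          if (!gEB) gEB = new TSocket("evb-host", 9090); // hypothetical host/port
          gEB->Send(Form("START %d", gRun));             // hypothetical protocol
          printf("Run %d started\n", gRun);
      }

      void rc_end_run()
      {
          if (gEB) gEB->Send(Form("END %d", gRun));
          printf("Run %d ended\n", gRun);
      }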

  10. Online Monitoring • Designed and implemented by a team at the University of Illinois at Chicago • Judith Katzy, Clive Halliwell, Don McLeod, Mike Reuter (katzy@nanalpc.phy.uic.edu) [Diagram: PhATDAQ data flow from PHOBOS over TCP/IP to dual-Pentium monitoring machines, with connections to ORACLE and the RCF]

  11. Monitoring Implementation • Based fully on PhAT and ROOT • Class library encompassing various functions • Main process: event loop • Display classes • Histogram filling and managing • Histogram selecting • Geometry and electronics setup is self-configuring • GUI controls are created dynamically (see the sketch below) • Easily extendable to add new histogram controls • All online application options are available through the user interface: program control, histogram selection, display functions
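  A sketch of the dynamic-creation idea using ROOT's GUI classes; the widget layout is an assumption, not the UIC implementation:

      #include "TGClient.h"
      #include "TGFrame.h"
      #include "TGButton.h"
      #include <vector>
      #include <string>

      // One button per available histogram, created at run time, so a
      // new histogram needs no GUI code changes.
      void makeHistogramControls(const std::vector<std::string>& names)
      {
          TGMainFrame* main = new TGMainFrame(gClient->GetRoot(), 200, 300);
          for (std::size_t i = 0; i < names.size(); ++i) {
              TGTextButton* b = new TGTextButton(main, names[i].c_str());
              main->AddFrame(b, new TGLayoutHints(kLHintsExpandX, 2, 2, 2, 2));
          }
          main->SetWindowName("Histogram selection");
          main->MapSubwindows();
          main->Resize(main->GetDefaultSize());
          main->MapWindow();
      }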

  12. Online Monitoring GUI

  13. ROOT-based event display (Don McLeod, UIC) • Module of the Online Monitoring package • Allows selection of sub-detectors (Octagon, Spectrometer Planes) • Hits can be displayed either on the pixel or perpendicular to it (hedgehog display) • 2/3-dimensional display • Different display options: Lego, color-coded

  14. More event displays

  15. Oracle, ROOT & PhAT • Oracle was chosen long ago as the only PHOBOS database • Oracle is used comprehensively in the whole system • Silicon testing • Pedestals and calibrations • Data tracking • Experimental logbook • Slow-control information (from LabVIEW & PhATDAQ) • However, Oracle is not OO • For storing certain objects, simple tables will not do • Hough tables, tracking templates, ... • Want to store and retrieve arbitrary objects (as BLOBs) by means of a “generic” key (see the sketch below) • Should also work with ROOT TTrees • For more info, ask George Heintzelman (gah@bnl.gov)
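  As a sketch of the BLOB idea: serialize any ROOT object with TMessage and store the bytes under a string key. The table name, key scheme, and connection string are invented, the placeholder syntax depends on the database plugin, and TSQLStatement is a later ROOT interface than this talk, so this only illustrates the concept:

      #include "TSQLServer.h"
      #include "TSQLStatement.h"
      #include "TMessage.h"

      // Hypothetical table: phobos_blobs(objkey VARCHAR, objdata BLOB)
      Bool_t storeObject(TSQLServer* db, const char* key, const TObject* obj)
      {
          TMessage msg(kMESS_OBJECT);
          msg.WriteObject(obj);          // serialize via ROOT streamers

          TSQLStatement* st = db->Statement(
              "INSERT INTO phobos_blobs (objkey, objdata) VALUES (?, ?)");
          if (!st) return kFALSE;
          st->SetString(0, key);
          st->SetBinary(1, msg.Buffer(), msg.Length());
          Bool_t ok = st->Process();
          delete st;
          return ok;
      }

      // Illustrative connection only:
      //   TSQLServer* db =
      //       TSQLServer::Connect("oracle://dbhost/phobos", "user", "pw");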

  16. Signal Processing in PhAT • PHOBOS requires extensive low-level signal processing before any hits can be found • Requirements: • Granular: algorithms should be “plug-and-play” • Geometric: want to easily analyze data by subsection (FEC, String, Chip, Channel) without the overhead of traversing object trees • Easy to adapt to DSP code (C code, Mercury platform) • Easy to debug: simple interface between data and histograms • Need ROOT compatibility • Led to the development of TPhModules (sketched below) and several new data structures to satisfy these requirements...
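  The slide names TPhModules without showing an interface; a hypothetical sketch of a “plug-and-play” module base class might look like this:

      #include "TObject.h"
      #include "TString.h"

      class TPhEvent;   // PhAT event class; interface not shown in the talk

      // Hypothetical module interface: each algorithm step (pedestal
      // subtraction, noise removal, ...) is one interchangeable module.
      class TPhModule : public TObject {
      public:
          TPhModule(const char* name) : fModuleName(name) {}
          virtual ~TPhModule() {}
          virtual void Process(TPhEvent* event) = 0;
      private:
          TString fModuleName;
      };

      // Modules then run as a chain:
      //   for each module m in the chain:  m->Process(event);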

  17. TPhTopology (George Heintzelman, BNL) • Any particular geometry has a fixed electronics structure • Thus, every channel has a well-defined location • This lets us store a set of values as a flat array • The detector topology can then be used to find the start of the data; we can iterate from there [Diagram: the detector fans out into FECs, strings, and chips; the “TopoData” is the corresponding flat array]

  18. TPhTopoData (NEW) • With this “topology” we can create data structures of different “depths” • “Channel depth”: one entry for each channel • “Chip depth”: one entry for each chip (e.g. for common-mode noise (CMN) studies…) • etc. • The contents of this structure can in principle be any object (working with templates in CINT; see the sketch below) • Integer, Float, etc. • Any TObject: TPhHistogram, TPhCalibration, TPhFrog, etc.
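  A hypothetical sketch of the “depth” idea in plain C++ templates; the real TPhTopoData API is not shown in the talk:

      #include <cstddef>
      #include <vector>

      enum EDepth { kChannelDepth, kChipDepth, kStringDepth, kFecDepth };

      // One entry per channel, chip, string, or FEC, depending on depth;
      // the payload type T can be int, float, a histogram, etc.
      template <class T>
      class TopoData {
      public:
          TopoData(EDepth depth, std::size_t nEntries)
              : fDepth(depth), fData(nEntries) {}
          T&     At(std::size_t flatIndex) { return fData[flatIndex]; }
          EDepth Depth() const             { return fDepth; }
      private:
          EDepth         fDepth;
          std::vector<T> fData;
      };

      // e.g. pedestals per channel and common-mode noise per chip:
      //   TopoData<float> peds(kChannelDepth, nChannels);
      //   TopoData<float> cmn(kChipDepth, nChips);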

  19. Application to Signal Processing • Visualization of arrays • Makes histograms merely a visualization tool, rather than a fundamental object to be moved around (see the sketch below) [Figure: pedestals stored in a TPhTopoData<int>, visualized “by string”; a “TPhTopoData<TH1F>”]
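  A sketch of “histograms merely as a visualization tool”: a TH1F is filled on demand from one string's slice of a flat pedestal array; the 512-channel stride is a placeholder:

      #include "TH1F.h"
      #include "TString.h"

      TH1F* visualizeString(const int* pedestals, int stringIndex)
      {
          const int kChannelsPerString = 512;   // placeholder stride
          const int* slice = pedestals + stringIndex * kChannelsPerString;

          TH1F* h = new TH1F(Form("peds_str%d", stringIndex),
                             "Pedestals by string;channel;pedestal (ADC)",
                             kChannelsPerString, 0, kChannelsPerString);
          for (int c = 0; c < kChannelsPerString; ++c)
              h->SetBinContent(c + 1, slice[c]); // data stay in the flat array
          return h;
      }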

  20. Things we’d like to see... • A collection from the collaboration: • More & better template support • Reduces code duplication, increases type safety • Need working examples, especially with the STL • A simple shared-memory interface • At least a way to directly share simple structures with an object interface • Maybe not as critical if thread support improves • Rationalization of the relationship between objects and TDirectories • “Now, where did my pointer go?…” • Instead, we had to roll our own (TPhObjectManager)

  21. And something we really need • Help! • The PHOBOS group at BNL is also looking for a postdoc (or even two) • Available immediately • Scope very flexible, but a few ideas: • Improving the Oracle classes (could be of general use to the community) • We have a very powerful multiple-DSP system (~8 G4s) between the silicon readout and the event builder • Not fully exploited yet • Plans for Level 3 triggering: multiplicity, vertex, tracking • Physics! RHIC is a very exciting place to be right now. The offline framework is still under intensive development.
