The Fourth Level Trigger Online Reconstruction Farm of HERA-B
A. Gellrich, I.C. Legrand, H. Leich, U. Schwanke, F. Sun, P. Wegner
DESY Zeuthen, D-15738 Zeuthen, Germany

S. Scharein
Humboldt-University, Berlin, Germany

29 October 1998
HERA-B FARM note / HERA-B note

1 Introduction

The HERA-B experiment starts its physics program at the 920 GeV proton ring of HERA at DESY in Hamburg/Germany in 1999. HERA-B is dedicated to B-physics [1,2]. Its primary goal is to measure CP-violation in decays of neutral B-mesons in the golden channel, B0 -> J/Ψ K0s, with the subsequent decays J/Ψ -> µ+µ− and K0s -> π+π−. bb̄ pairs are produced in collisions of protons from the halo of the HERA proton ring with a system of eight wire targets. At HERA energies, the bb̄ cross section is estimated to be σ_bb = 12 nb, whereas the inelastic cross section is σ_inel = 40 mb. To obtain an adequate sample of reconstructed B0 decays of O(1000) per year in the golden channel, an event rate of 10 MHz, corresponding to the HERA bunch clock, with on average four overlaid interactions is needed; the target wires can be steered remotely accordingly. The rate of interesting physics is well below 1 Hz. The data acquisition and trigger system is designed to achieve a background reduction factor of 10^6. This results in a logging rate of 20 Hz. With an average event size of 100 kB, a data volume of 20 TB/year will be stored. This paper describes the concept and implementation of a processor farm which performs full online event reconstruction [5] on the fourth level of HERA-B's data acquisition and trigger system.

2 Data Acquisition and Trigger System

HERA-B uses a sophisticated four-level DAQ and trigger system [3,4,5] to read out the channels of the detector. Two main characteristics are used to distinguish B-physics from background events: the high b-quark mass leads to high-pT tracks, and the long B lifetime causes detached secondary vertices. These properties are exploited in subsequent trigger steps (table 1).
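The rates and volumes quoted above fit together; a quick cross-check, assuming ~10^7 effective seconds of running per year (an assumption of this sketch, not a number from the text):

```python
# Back-of-the-envelope check of the quantities quoted in the introduction.
SIGMA_BB = 12e-9       # barn, estimated bb-bar production cross section
SIGMA_INEL = 40e-3     # barn, inelastic cross section
LOGGING_RATE = 20.0    # Hz, logging rate after all trigger levels
EVENT_SIZE = 100e3     # bytes, average event size
RUN_SECONDS = 1e7      # effective seconds per year (assumption)

# Fraction of inelastic interactions containing a bb-bar pair.
bb_fraction = SIGMA_BB / SIGMA_INEL

# Stored data volume per year in TB.
volume_tb = LOGGING_RATE * EVENT_SIZE * RUN_SECONDS / 1e12

print(f"bb-bar fraction per interaction: {bb_fraction:.0e}")  # 3e-07
print(f"stored data volume: {volume_tb:.0f} TB/year")         # 20 TB/year
```

The bb̄ fraction of ~3x10^-7 is what forces the six orders of magnitude of background suppression discussed below.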
FLT: The First-Level-Trigger works on so-called Regions-of-Interest (RoI) which are defined by one of three pre-trigger sources: pad chambers to identify high-pT tracks, the electromagnetic calorimeter to look for high-pT clusters, and track segments from the muon system. Using a simple tracking mechanism based on RoIs in the tracking chambers behind the magnet, J/Ψ candidates are searched for. The FLT is realized in hardware. An overall background suppression factor of 200 is obtained.

SLT: The Second-Level-Trigger is based on the RoIs defined by the FLT. Using a simplified Kalman filter algorithm, drift-time information from additional detector planes behind the magnet is used to re-define the RoIs. Tracks are then projected backwards through the magnet into the vertex detector.

1. Talk presented at CHEP 98, Chicago, USA, 31 August 1998, Session D, Talk #.
2. Andreas.Gellrich@desy.de
3. Before 1998 HERA operated its proton ring at 820 GeV.
The SLT trigger algorithms run on a multi-processor PC farm of 240 nodes which receives event data via high-speed links through a DSP (Digital-Signal-Processors from Analog Devices, SHARC) based switch [5]. A reduction factor of 100 is obtained, mainly by rejecting ghosts and by applying vertex cuts.

TLT: The Third-Level-Trigger step is carried out on the same processor nodes following event building. Depending on the event type and the cuts already applied on the SLT, local pattern recognition in the vertex detector in- and outside the RoIs is planned. It could be shown that specially tailored algorithms can determine primary and secondary vertices in ~100 msec. In addition, information from the particle-id detectors can be exploited. To allow for full online event reconstruction on the 4LT farm, a reduction factor of 10 must be obtained.

4LT: The Fourth-Level-Trigger is the subject of this paper.

Table 1: Input rates, timing, and background suppression factors of the trigger levels.

Level  Input     Time       Supp.  Method
1      10 MHz    10 µsec    200    Simple tracking, high pT, di-lepton mass
2      50 kHz    <7 msec    100    Track re-fit, magnet tracking, vertexing
3      500 Hz    ~100 msec  10     Further tracking, particle-id
4      50 Hz     4 sec      2.5    Full event reconstruction

3 Design Criteria for the 4LT Farm

In order to reach its primary goal of measuring CP-violation in the golden channel, HERA-B needs to fully reconstruct O(1000) B0 decays per year. Due to the vast amount of event data and the long event reconstruction time, immediate data analysis can only be ensured by performing event reconstruction online, before the event data are stored. In addition, event classification and a final event selection are foreseen. Although the HERA-B trigger system provides a background suppression of six orders of magnitude, less than 1 Hz of interesting B-physics events is contained in the logging rate of 20 Hz. Data derived during reconstruction which are relevant for alignment and calibration purposes are collected and monitored.
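The suppression factors of the four levels combine to the quoted 20 Hz logging rate; a minimal sketch of the chain, using only the numbers from Table 1:

```python
# Cumulative effect of the per-level suppression factors of Table 1,
# starting from the 10 MHz HERA bunch-crossing rate.
suppression = [("FLT", 200), ("SLT", 100), ("TLT", 10), ("4LT", 2.5)]

rate_hz = 10e6  # Hz, input rate at the FLT
for level, factor in suppression:
    rate_hz /= factor
    print(f"rate after {level}: {rate_hz:g} Hz")
# rate after FLT: 50000 Hz
# rate after SLT: 500 Hz
# rate after TLT: 50 Hz
# rate after 4LT: 20 Hz
```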
In the HERA-B environment, event reconstruction takes ~4 sec on modern high-end processors. The computing power to perform event reconstruction at a rate of 50 Hz can be provided by a multi-processor system (farm). The main building blocks are: high-performance processor nodes; a network which allows event and control data to be routed from the TLT to the 4LT system; and a Unix-like operating system which provides an environment to run the software. Basic aspects of setting up the 4LT farm are: scalability, to allow for an incremental set-up; flexibility, to handle variations in rates and processing times; and a cost limitation of ~2.5 kDM per node. Requirements on hardware and software components are: reliability, availability for the lifetime of the experiment, support, and maintainability. The general goal is to give up the borderline between online and offline software. This allows HEP software such as reconstruction and analysis packages, which are usually developed in offline environments, to be used online without (major) modifications.
4 Implementation of the 4LT Farm

The HERA-B 4LT farm will be built exclusively from off-the-shelf products. Only standard components from the PC market will be used to realize the processing nodes and the networking. It is foreseen to set up the 4LT farm as modularly as possible by clustering small numbers of nodes in so-called mini-farms. As the operating system, Linux was chosen. This Unix-like system provides standard tools such as inter-process communication (shared memory, semaphores, message queues) and network services (udp, tcp/ip) which support modular software packages running in a multi-process environment. Even more importantly, Linux has recently become a widely accepted development platform in HEP and is heavily used in the offline world.

Processor nodes

The rapidly growing PC market regularly provides customers with newer, faster high-end processors; the price/performance ratio improves simultaneously. For a first order of 20 nodes, PCs with Intel Pentium II / 400 MHz CPUs were chosen (table 2). Earlier, other solutions such as VME boards or workstations were discussed [5,6,7] but could not stand up to the PC competition.

Table 2: Hardware components of the 4LT farm nodes.

Node hardware
  Motherboard:    Asus P2B (100 MHz)
  Processor:      PII / 400 MHz
  L2-cache:       512 kB
  Memory:         64 MB
  Hard disk:      2.5 GB Quantum Fireball 2.5EL
  Graphics card:  Elsa Winner 1000 S3 Trio
  Power supply:   200 W
  Housing:        high quality midi-tower (ATX), with fan

System software
  Operating system:  Linux
  Distribution:      S.u.S.E. 5.2
  Kernel:
  Network card:      Fast-Ethernet (100 Mb/s) SMC Ultra 9432 (autosense)

Networking

To transport data to and from the 4LT nodes, a network is needed (figure 1).
In addition to the event data stream, which goes from the TLT nodes to the 4LT nodes and, after processing, to a central logger, control messages and monitoring data must be routed through the system. The total rate of the event data stream is 50 Hz * 100 kB = 5 MB/sec, with on average 100 kB/event. This results in an input of 25 kB/sec/node for 200 nodes. The same amount of data leaves the nodes again towards the logger. Control messages are much smaller but are exchanged more frequently. An additional rate of up to 10% of the event data rate might be produced by monitoring data.

Nodes are clustered in so-called mini-farms. To connect up to 200 nodes in total, 5-10 mini-farms will be installed; the total bandwidth is divided accordingly. The backbones of the mini-farms are based on switches to ensure full bandwidth to/from all nodes concurrently. By using manageable devices, performance monitoring and remote control of the network are possible. All mini-farms are connected to a central switch which must be capable of standing the full data rate. Event data appear twice in the central switch: first when being routed from TLT to 4LT nodes, and second when being sent to the central logger. The logger is located close to the mass storage system, which consists of hard disks and a tape robot in the DESY computer center. Disks are used to cache data before they are stored on tape (DST). The full DST stream amounts to 20 TB/year, which cannot be stored on disk. A small fraction of events (~1%) will be selected for direct access. These data, as well as so-called MINIs, are provided to the user on disks. Currently, the 4LT farm nodes are set up as workstations without tools for interactive work. Home directories, which hold executables and data, are provided by a server (srv) via NFS and NIS (Network File System / Network Information Service). The network is built of standard Fast-Ethernet components and contains network cards and switches. Between single TLT and 4LT nodes, data rates of up to 6 MB/sec can be reached using a udp-based message passing system.
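Shipping an event buffer between nodes over udp, as the message passing system does, can be sketched as follows; this is an illustrative loopback example, and the 4-byte length framing is an assumption of this sketch, not the actual HERA-B protocol:

```python
# Illustrative udp transfer of one event buffer (stand-in for a
# TLT -> 4LT transfer over the udp-based message passing system).
import socket

def send_event(sock, addr, payload: bytes):
    """Send one event buffer as a single length-prefixed datagram."""
    sock.sendto(len(payload).to_bytes(4, "big") + payload, addr)

def recv_event(sock) -> bytes:
    """Receive one datagram and strip the length prefix."""
    datagram, _sender = sock.recvfrom(65535)
    length = int.from_bytes(datagram[:4], "big")
    return datagram[4:4 + length]

# Loopback demonstration: receiver on an ephemeral port.
rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
rx.bind(("127.0.0.1", 0))
rx.settimeout(5.0)
tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

send_event(tx, rx.getsockname(), b"event raw data")
print(recv_event(rx))  # b'event raw data'
```

Since udp gives no delivery guarantee, the real system has to handle lost datagrams at the application level.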
The connection to the computer center is realized by two dedicated FDDI links.

Figure 1: Layout of the 4LT farm network: k = 5-10 mini-farms of n = 10-20 PC nodes each, connected through switches to the Trigger Level 2/3 farm, the file and home directory server (srv), the farm control machine (ctl), and the computer center with logger, disk cache, and tape robot.

1. Data-Summary-Tapes contain entire events, including raw data.
2. MINIs contain only fractions of entire events, mainly reconstruction information.
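The bandwidth figures quoted for the event stream can be checked quickly; the mini-farm count of 10 used here is one of the quoted 5-10 options, an assumption of this sketch:

```python
# Per-node and per-mini-farm share of the 4LT event data stream.
RATE_HZ = 50.0        # Hz, 4LT input rate
EVENT_SIZE = 100e3    # bytes per event
NODES = 200           # total number of 4LT nodes
MINI_FARMS = 10       # assumption within the quoted 5-10 range

stream = RATE_HZ * EVENT_SIZE        # total event stream, bytes/sec
per_node = stream / NODES            # input per node, bytes/sec
per_mini_farm = stream / MINI_FARMS  # share of one mini-farm backbone

print(stream / 1e6, "MB/s total")       # 5.0 MB/s total
print(per_node / 1e3, "kB/s per node")  # 25.0 kB/s per node
```

The same amounts leave the nodes again towards the logger, so each link carries roughly twice these figures.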
Farm Control

The 4LT farm control scheme is depicted in figure 2. Its task is to control the event data flow from around 50 TLT nodes to up to 200 4LT nodes. The 4LT farm control consists of two processes which run on a dedicated machine (ctl). After an event has passed all trigger steps, a TLT node requests the ID (IP address) of a free 4LT node from the ID_Request process. This process looks up the Free_ID_Queue to get the ID of the first available free 4LT node; the associated IP address is stored in the ID_Table. Each 4LT node registers itself at start-up with the ID_Report process, which puts the node's ID into the Free_ID_Queue and stores the IP address in the ID_Table. After event processing, a 4LT node reports itself free again and waits for the next event. The ID_Table is also used to monitor the 4LT farm nodes. Communication between the two control processes on the control machine is done by means of the inter-process communication tools shared memory and message queue. Message and data transfer between machines over the network is realized by a HERA-B-specific message passing system based on udp.

Figure 2: Farm control: TLT nodes request free 4LT node IDs (over udp) from the ID_Request process, which serves them from the Free_ID_Queue (message queue); 4LT nodes report themselves free via the ID_Report process; both processes share the ID_Table (shared memory) with one slot per node.

Node Processes

Figure 3 shows the process scheme of a 4LT farm node. The major tasks are housed in separate processes, using the message passing system between nodes and inter-process communication within nodes. The reconstruction, event classification, and selection packages are contained in a frame program called Arte. It provides memory management for event data stored in tables. Arte is also used for offline analysis, including Monte Carlo event generation and detector simulation. Arte contains interfaces to event data files for offline purposes as well as to shared memory segments for online usage on the 4LT farm.
In the latter case, event data are received from a TLT node by a dedicated process, L4Recv, initiated by the 4LT farm control. Output is handled in a similar way: event data are sent to the central logger in the computer center.
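The ID bookkeeping performed by the two farm control processes can be sketched as follows; this is a minimal single-process illustration in which plain Python containers stand in for the SysV message queue (Free_ID_Queue) and shared memory (ID_Table) of the real system, and the IP addresses are invented:

```python
# Sketch of the 4LT farm control: ID_Report registers nodes and queues
# their IDs; ID_Request hands the next free node's IP to a TLT node.
from collections import deque

free_id_queue = deque()   # Free_ID_Queue: IDs of idle 4LT nodes
id_table = {}             # ID_Table: node ID -> IP address

def id_report(node_id: int, ip_addr: str) -> None:
    """ID_Report: a 4LT node registers at start-up or reports free."""
    id_table[node_id] = ip_addr
    free_id_queue.append(node_id)

def id_request():
    """ID_Request: return the IP address of a free 4LT node, or None."""
    if not free_id_queue:
        return None               # no node free; the TLT node must wait
    return id_table[free_id_queue.popleft()]

id_report(1, "10.0.0.1")          # node 1 starts up (addresses invented)
id_report(2, "10.0.0.2")          # node 2 starts up
print(id_request())               # 10.0.0.1 - first registered free node
id_report(1, "10.0.0.1")          # node 1 reports free after processing
```

Because the ID_Table keeps one slot per node regardless of queue state, it doubles as the monitoring view of the farm mentioned above.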
Figure 3: Node processes: L4Recv (steered by L4IdReq/L4IdRep) receives event data into a buffer; after I/O conversion and sparsification, the online ARTE frame runs the L4Reco, L4Clas, L4Trig, and L4Moni packages with database access through a publisher/DB cache; a sender ships the results to the logger (disk/tape), and the remote-histogramming package (rhp) feeds the Gatherer/Calibrator.

Monitoring

During the reconstruction procedure, data which are needed to align and calibrate the detector are derived. To make use of the large statistics of up to 200 nodes providing such data in parallel, a scheme was developed to collect the data in a central place (Gatherer). The gathered data are then used to compute alignment and calibration constants which are, if needed, updated in the central database. The updated database contents are published again to the 4LT farm (Publisher). For this purpose a so-called remote-histogramming-package (rhp) was developed.

5 Summary and Outlook

The concept and implementation of the 4LT farm of the HERA-B experiment at HERA were presented. The system is based on standard off-the-shelf components and makes use of commodity hardware and software. The goal of giving up the borderline between online and offline software, by providing an environment which allows typical HEP programs written in an offline style to be used directly for online event reconstruction, was achieved. Currently, the server (srv), the controller (ctl), and 7 nodes are installed and running. All software packages are written and tested. Commissioning is going on.
HERA-B's plans for this year's running focus on J/Ψ physics with a di-electron trigger derived from the electromagnetic calorimeter. A small 4LT farm, which is about to be included in the full data path, will be sufficient. In 1999 a first attempt to measure CP violation in the golden channel will be started with an almost complete detector; around 50 4LT nodes will be installed. From the year 2000 on, the full physics program will be carried out. Assuming an input rate of 50 Hz and processing times of 4 sec for the full event reconstruction, 200 4LT farm nodes are needed. The final purchase will be initiated in mid-1999.

Acknowledgements

We gratefully thank the electronics department of DESY Zeuthen and the computer centers of DESY Hamburg and DESY Zeuthen for their support.

References

[1] T. Lohse et al., Proposal, DESY-PRC 94/02 (1994).
[2] E. Hartouni et al., Design Report, DESY-PRC 95/01 (1995).
[3] M. Medinnis, HERA-B triggering, NIM A 368 (1995) 6.
[4] F. Sanchez, The HERA-B DAQ system, these proceedings, 1998.
[5] M. Dam, Second and third level trigger systems for the HERA-B experiment, these proceedings, 1998.
[6] R. Mankel, Online track reconstruction for HERA-B, NIM A 384 (1996) 20.
[7] A. Gellrich et al., The processor farm for online triggering and full event reconstruction of the HERA-B experiment at HERA, Proc. of CHEP 95, Rio de Janeiro, Brazil, 1995.
[8] A. Gellrich et al., A test system for the HERA-B online trigger and reconstruction farm, Proc. of DAQ 96, Osaka, Japan, 1996.
[9] A. Gellrich et al., A prototype system for the farm of the HERA-B experiment at HERA, Proc. of CHEP 97, Berlin, Germany, 1997.
[10] M. Dam et al., Higher level trigger systems for the HERA-B experiment, Proc. of IEEE Real-Time 97, Beaune, France, 1997.
[11] A. Gellrich and M. Medinnis, HERA-B higher-level triggers: architecture and software, NIM A 408 (1998).
More information8.882 LHC Physics. Track Reconstruction and Fitting. [Lecture 8, March 2, 2009] Experimental Methods and Measurements
8.882 LHC Physics Experimental Methods and Measurements Track Reconstruction and Fitting [Lecture 8, March 2, 2009] Organizational Issues Due days for the documented analyses project 1 is due March 12
More informationThe BaBar Computing Model *
SLAC PUB 9964 April 1997 The BaBar Computing Model * N. Geddes Rutherford Appleton Laboratory, Chilton, Didcot, England OX11 0QX Representing the BaBar Collaboration Abstract The BaBar experiment will
More informationThe High-Level Dataset-based Data Transfer System in BESDIRAC
The High-Level Dataset-based Data Transfer System in BESDIRAC T Lin 1,2, X M Zhang 1, W D Li 1 and Z Y Deng 1 1 Institute of High Energy Physics, 19B Yuquan Road, Beijing 100049, People s Republic of China
More information2008 JINST 3 S Online System. Chapter System decomposition and architecture. 8.2 Data Acquisition System
Chapter 8 Online System The task of the Online system is to ensure the transfer of data from the front-end electronics to permanent storage under known and controlled conditions. This includes not only
More informationThe ATLAS Tier-3 in Geneva and the Trigger Development Facility
Journal of Physics: Conference Series The ATLAS Tier-3 in Geneva and the Trigger Development Facility To cite this article: S Gadomski et al 2011 J. Phys.: Conf. Ser. 331 052026 View the article online
More informationTHE ATLAS DATA ACQUISITION SYSTEM IN LHC RUN 2
THE ATLAS DATA ACQUISITION SYSTEM IN LHC RUN 2 M. E. Pozo Astigarraga, on behalf of the ATLAS Collaboration CERN, CH-1211 Geneva 23, Switzerland E-mail: eukeni.pozo@cern.ch The LHC has been providing proton-proton
More information50GeV KEK IPNS. J-PARC Target R&D sub gr. KEK Electronics/Online gr. Contents. Read-out module Front-end
50GeV Contents Read-out module Front-end KEK IPNS J-PARC Target R&D sub gr. KEK Electronics/Online gr. / Current digitizer VME scalar Advanet ADVME2706 (64ch scanning )? Analog multiplexer Yokogawa WE7271(4ch
More informationALICE tracking system
ALICE tracking system Marian Ivanov, GSI Darmstadt, on behalf of the ALICE Collaboration Third International Workshop for Future Challenges in Tracking and Trigger Concepts 1 Outlook Detector description
More informationA generic firmware core to drive the Front-End GBT-SCAs for the LHCb upgrade
A generic firmware core to drive the Front-End GBT-SCAs for the LHCb upgrade F. Alessio 1, C. Caplan, C. Gaspar 1, R. Jacobsson 1, K. Wyllie 1 1 CERN CH-, Switzerland CBPF Rio de Janeiro, Brazil Corresponding
More informationAndrea Sciabà CERN, Switzerland
Frascati Physics Series Vol. VVVVVV (xxxx), pp. 000-000 XX Conference Location, Date-start - Date-end, Year THE LHC COMPUTING GRID Andrea Sciabà CERN, Switzerland Abstract The LHC experiments will start
More informationATLAS, CMS and LHCb Trigger systems for flavour physics
ATLAS, CMS and LHCb Trigger systems for flavour physics Università degli Studi di Bologna and INFN E-mail: guiducci@bo.infn.it The trigger systems of the LHC detectors play a crucial role in determining
More informationImplementation of a PC-based Level 0 Trigger Processor for the NA62 Experiment
Implementation of a PC-based Level 0 Trigger Processor for the NA62 Experiment M Pivanti 1, S F Schifano 2, P Dalpiaz 1, E Gamberini 1, A Gianoli 1, M Sozzi 3 1 Physics Dept and INFN, Ferrara University,
More informationTORCH: A large-area detector for precision time-of-flight measurements at LHCb
TORCH: A large-area detector for precision time-of-flight measurements at LHCb Neville Harnew University of Oxford ON BEHALF OF THE LHCb RICH/TORCH COLLABORATION Outline The LHCb upgrade TORCH concept
More informationPoS(TIPP2014)204. Tracking at High Level Trigger in CMS. Mia TOSI Universitá degli Studi di Padova e INFN (IT)
Universitá degli Studi di Padova e INFN (IT) E-mail: mia.tosi@gmail.com The trigger systems of the LHC detectors play a crucial role in determining the physics capabilities of the experiments. A reduction
More informationBTeV at C0. p p. Tevatron CDF. BTeV - a hadron collider B-physics experiment. Fermi National Accelerator Laboratory. Michael Wang
BTeV Trigger BEAUTY 2003 9 th International Conference on B-Physics at Hadron Machines Oct. 14-18, 2003, Carnegie Mellon University, Fermilab (for the BTeV collaboration) Fermi National Accelerator Laboratory
More informationCLAS12 Offline Software Tools. G.Gavalian (Jlab) CLAS Collaboration Meeting (June 15, 2016)
CLAS12 Offline Software Tools G.Gavalian (Jlab) Overview Data Formats: RAW data decoding from EVIO. Reconstruction output banks in EVIO. Reconstruction output convertor to ROOT (coming soon). Data preservation
More informationL1 and Subsequent Triggers
April 8, 2003 L1 and Subsequent Triggers Abstract During the last year the scope of the L1 trigger has changed rather drastically compared to the TP. This note aims at summarising the changes, both in
More informationInstrumentation and Control System
Instrumentation and Control System Instrumentation and control system plays a crucial role in the commissioning and routine operation of a modern light source like NSRRC. The system at NSRRC was implemented
More informationModules and Front-End Electronics Developments for the ATLAS ITk Strips Upgrade
Modules and Front-End Electronics Developments for the ATLAS ITk Strips Upgrade Carlos García Argos, on behalf of the ATLAS ITk Collaboration University of Freiburg International Conference on Technology
More informationDevelopment of Beam Monitor DAQ system for 3NBT at J-PARC
1th ICALEPCS Int. Conf. on Accelerator & Large Expt. Physics Control Systems. Geneva, 1-14 Oct 25, PO1.24-1 (25) Development of Beam Monitor DAQ system for 3NBT at J-PARC M.Ooi 1, T. Kai 1, S. Meigo 1,
More informationPoS(IHEP-LHC-2011)002
and b-tagging performance in ATLAS Università degli Studi di Milano and INFN Milano E-mail: andrea.favareto@mi.infn.it The ATLAS Inner Detector is designed to provide precision tracking information at
More informationThe FTK to Level-2 Interface Card (FLIC)
The FTK to Level-2 Interface Card (FLIC) J. Anderson, B. Auerbach, R. Blair, G. Drake, A. Kreps, J. Love, J. Proudfoot, M. Oberling, R. Wang, J. Zhang November 5th, 2015 2015 IEEE Nuclear Science Symposium
More informationAndy Kowalski Ian Bird, Bryan Hess
Building the Mass Storage System at Jefferson Lab Andy Kowalski Ian Bird, Bryan Hess SURA/Jefferson Lab Jefferson Lab Who are we? Thomas Jefferson National Accelerator Facility SURA/DOE What do we do?
More informationVUV FEL User Workshop 2005
VUV FEL User Workshop 2005 Data Acquisition and DOOCS for VUV-FEL experiments Vladimir Rybnikov DESY 15. 11. 2005 1 Contents DOOCS control system Data AcQuisition System Integration to DAQ Data types Synchronization
More informationData Reconstruction in Modern Particle Physics
Data Reconstruction in Modern Particle Physics Daniel Saunders, University of Bristol 1 About me Particle Physics student, final year. CSC 2014, tcsc 2015, icsc 2016 Main research interests. Detector upgrades
More informationThe ALICE electromagnetic calorimeter high level triggers
Journal of Physics: Conference Series The ALICE electromagnetic calorimeter high level triggers To cite this article: F Ronchetti et al 22 J. Phys.: Conf. Ser. 96 245 View the article online for updates
More informationDesign Concepts For A 588 Channel Data Acquisition & Control System
Design Concepts For A 588 Channel Data Acquisition & Control System Kenneth H. Bosin, Spectral Dynamics Inc. This paper discusses the design methods used to create a data acquisition and control system
More informationNew Development of EPICS-based Data Acquisition System for Millimeter-wave Interferometer in KSTAR Tokamak
October 10-14, 2011 Grenoble, France New Development of EPICS-based Data Acquisition System for Millimeter-wave Interferometer in KSTAR Tokamak October 11, 2011, Taegu Lee KSTAR Research Center 2 Outlines
More informationStefan Koestner on behalf of the LHCb Online Group ( IEEE - Nuclear Science Symposium San Diego, Oct.
Stefan Koestner on behalf of the LHCb Online Group (email: Stefan.Koestner@cern.ch) IEEE - Nuclear Science Symposium San Diego, Oct. 31 st 2006 Dedicated to B-physics : single arm forward spectrometer
More informationFull Offline Reconstruction in Real Time with the LHCb Detector
Full Offline Reconstruction in Real Time with the LHCb Detector Agnieszka Dziurda 1,a on behalf of the LHCb Collaboration 1 CERN, Geneva, Switzerland Abstract. This document describes the novel, unique
More informationPhysics Analysis Software Framework for Belle II
Physics Analysis Software Framework for Belle II Marko Starič Belle Belle II collaboration Jožef Stefan Institute, Ljubljana CHEP 2015 M. Starič (IJS) Physics Analysis Software Okinawa, 13-17 April 2015
More informationOPERA: A First ντ Appearance Candidate
OPERA: A First ντ Appearance Candidate Björn Wonsak On behalf of the OPERA collaboration. 1 Overview The OPERA Experiment. ντ Candidate Background & Sensitivity Outlook & Conclusions 2/42 Overview The
More informationVertex Detector Electronics: ODE to ECS Interface
Vertex Detector Electronics: ODE to ECS Interface LHCb Technical Note Issue: 1 Revision: 0 Reference: LHCb 2000-012 VELO Created: 1 February 2000 Last modified: 20 March 2000 Prepared By: Yuri Ermoline
More informationDesign, Implementation, and Performance of CREAM Data Acquisition Software
Design, Implementation, and Performance of CREAM Data Acquisition Software S. Y. Zinn*(1), H. S. Ahn (1), M. G. Bagliesi (2), J. J. Beatty (3), J. T. Childers (4), S. Coutu (3), M. A. DuVernois (4), O.
More informationSilvia Miglioranzi University College of London / Argonne National Laboratories. June 20, Abstract
Tagging secondary vertices produced by beauty decay and studies about the possibilities to detect charm in the forward region at the ZEUS experiment at HERA Silvia Miglioranzi University College of London
More information