Real-time Analysis with the ALICE High Level Trigger.

Real-time Analysis with the ALICE High Level Trigger

C. Loizides 1,3, V. Lindenstruth 2, D. Röhrich 3, B. Skaali 4, T. Steinbeck 2, R. Stock 1, H. Tilsner 2, K. Ullaland 3, A. Vestbø 3 and T. Vik 4, for the ALICE collaboration

1 Institute for Nuclear Physics, University of Frankfurt, Germany
2 Kirchhoff Institute for Physics, University of Heidelberg, Germany
3 Department of Physics, University of Bergen, Norway
4 Department of Physics, University of Oslo, Norway

The talk is based on the Technical Design Report CERN/LHCC 2003-062, http://www.ikf.physik.uni-frankfurt.de/~loizides/

ALICE = A Large Ion Collider Experiment, at the Large Hadron Collider at CERN, starting operation in 2007. Heavy-ion reactions to study a new state of matter. UrQMD model: M. Bleicher et al., J. Phys. G: Nucl. Part. Phys. 25 (1999) 1859-1896.

Event Display: Jets in PbPb (Pythia, dN/dy = 2000)

Jet Trigger

LHC Parameters for PbPb (and pp) and the Resulting ALICE Data Rate (TPC only)

Event rates:
- Central PbPb: < 200 Hz (past/future protected)
- Min. bias pp: < 1000 Hz (roughly 25-100 pileup events)

The TPC is the largest data source, with ~600000 channels, 512 timebins and a 10-bit ADC value. Event sizes (after zero suppression): ~75 MByte for PbPb, ~0.5 MByte for pp. Data rates: < ~15 GByte/s for PbPb, < ~2.5 GByte/s for pp (25 pileup events). The TPC data rate by far exceeds the foreseen total DAQ bandwidth of ~1-2 GByte/s.
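The quoted TPC figures are consistent with a few lines of arithmetic (a back-of-the-envelope sketch; the channel count, event size and rate are the numbers given above):

```python
# Cross-check of the TPC data-volume numbers quoted on the slide.
CHANNELS = 600_000        # TPC pads
TIMEBINS = 512            # time samples per pad
ADC_BITS = 10             # bits per sample

# Raw (non-zero-suppressed) event size in bytes.
raw_bytes = CHANNELS * TIMEBINS * ADC_BITS / 8
print(f"raw TPC event: {raw_bytes / 1e6:.0f} MByte")            # 384 MByte

# Zero suppression brings a central PbPb event down to ~75 MByte,
# i.e. an effective occupancy of roughly:
zs_bytes = 75e6
print(f"occupancy after zero suppression: {zs_bytes / raw_bytes:.0%}")

# At up to 200 Hz of central PbPb events:
rate_hz = 200
print(f"PbPb data rate: {rate_hz * zs_bytes / 1e9:.0f} GByte/s")  # 15 GByte/s
```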

Reducing the Data Rate: the HLT System

Without HLT, ~10^7 PbPb events and ~10^9 pp events can be taken per year. With HLT, ~20 x 10^7 PbPb events and ~10 x 10^9 pp events can be processed per year (using the TPC).

- Trigger events based on a detailed online analysis of their physics content.
- Select relevant parts of the event, or regions of interest, up to and including the entire event.
- Compress those parts of the event which are selected for readout, reducing the taping rate and associated costs without any loss of the contained physics.

All of this requires online pattern recognition.

Low cross-section signals: 10^10 pp events lead to ~10^5 jets above 100 GeV, ~10^3 Upsilons and ~10^5 Omegas; 2 x 10^8 PbPb events lead to ~10^4 jets above 100 GeV and ~2 x 10^4 Upsilons.

Logical View of the ALICE Event Flow

Connectivity of the HLT System: data from the detectors (TPC sectors S1-S36, TRD, ITS, Muon) flow through a hierarchy of Sub-Sector Nodes, Sector Nodes, Super-Sector Nodes, Event Nodes and Trigger Nodes.

Components of the HLT System

- Commercial off-the-shelf PCs: ~300 PCs equipped with FPGA coprocessor cards (RORC, Readout Receiver Card) plus ~300-500 compute nodes.
- Light-weight communication: NIC (GE, InfiniBand, SCI, Myrinet, ...) and protocol stack (TCP, STP, ...).
- Publisher-subscriber interface: after local pattern recognition on the RORC, a publisher announces each new event via shared memory; subscribers running the analysis code read the event input data and publish the event output data in turn.
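The publisher-subscriber idea can be illustrated with a minimal in-process sketch (illustrative only: the real framework passes event descriptors between processes, with the data itself staying in shared memory; the names below are hypothetical):

```python
import threading
import queue

shared_memory = {}                      # event id -> data block (stands in for SHM)

class Publisher:
    """Announces event descriptors to all subscribers; never copies data."""
    def __init__(self):
        self.subscribers = []
    def subscribe(self, q):
        self.subscribers.append(q)
    def publish(self, event_id, data):
        shared_memory[event_id] = data  # write the data once
        for q in self.subscribers:      # hand out the descriptor, not the data
            q.put(event_id)

def analysis_worker(in_q, out):
    event_id = in_q.get()
    data = shared_memory[event_id]      # "zero-copy" read from shared memory
    out.append((event_id, sum(data)))   # stand-in for the analysis code

pub = Publisher()
q = queue.Queue()
pub.subscribe(q)
results = []
t = threading.Thread(target=analysis_worker, args=(q, results))
t.start()
pub.publish(42, [1, 2, 3])
t.join()
print(results)                          # [(42, 6)]
```

The design point this mimics is that only small descriptors travel through the communication layer, so the data blocks themselves are never copied between processing stages.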

Online Pattern Recognition Schemes (TPC)

HLT task: reconstruct the ~10000 tracks produced by the charged particles traversing the TPC within ~5 ms. Due to the solenoidal magnetic field inside the TPC, the tracks follow helical trajectories.

- Low multiplicity scenario, sequential feature extraction: cluster finder and track follower on space points (the conventional approach).
- High multiplicity scenario, iterative feature extraction: tracklet finder on the raw data (Hough transform) followed by cluster evaluation (deconvoluter, refitter).

Low Multiplicity Tracker: Cluster Finder and Track Follower

Step 1, cluster centroid finding: reconstruct the positions of the space points, which are interpreted as the crossing points between the tracks and the centers of the padrows. Each padrow can be processed independently; cluster centroids are calculated as the weighted mean of the ADC values in the time and pad directions. Around 100 such space points assigned to one helix form a found track.

Step 2, conformal mapping: transform all space points with

x' = (x - x_v) / r^2
y' = (y - y_v) / r^2
r^2 = (x - x_v)^2 + (y - y_v)^2

so that bent tracks become straight lines (this requires knowledge of the vertex (x_v, y_v) or of the first point of the track).

Step 3, track follower (P. Yepes, A Fast Track Pattern Recognition, NIM A380 (1996) 582): starting at the outermost padrow, track segments are formed by connecting nearby hits. When segments are long enough, they are fitted; only new hits which (within certain cuts) are closest to the fit are added.
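The conformal mapping of Step 2 can be checked numerically: a circle through the vertex maps exactly onto a straight line, which is what lets the track follower work with straight-line fits (a minimal sketch, assuming the vertex at the origin):

```python
import math

def conformal_map(x, y, xv=0.0, yv=0.0):
    """Map a space point (x, y) relative to the vertex (xv, yv)."""
    r2 = (x - xv) ** 2 + (y - yv) ** 2
    return (x - xv) / r2, (y - yv) / r2

# A circle through the origin with center (a, b) and radius R = sqrt(a^2 + b^2)
# is a model of a primary track in the bending plane. Here R = 50.
a, b = 30.0, 40.0
points = [(a + 50 * math.cos(t), b + 50 * math.sin(t))
          for t in (0.5, 1.0, 1.5, 2.0)]

# After the mapping, all points satisfy one line equation: a*x' + b*y' = 1/2.
mapped = [conformal_map(x, y) for x, y in points]
for xp, yp in mapped:
    print(f"{a * xp + b * yp:.6f}")     # 0.500000 for every point
```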

Low Multiplicity Tracker: Efficiency and Timing

Efficiency is defined as the ratio of good found tracks to findable Monte Carlo tracks. [Figures: timing setup, integral efficiency, timing results.] pT resolution: 1-5%; fake tracks: < 1%.

Cluster Finder FPGA Implementation: synthesized on an Altera APEX 20K-1x, the FPGA cluster finder uses 1860 LEs (11%) and 8448 bits (4%) at a clock speed of 50 MHz.

Pileup Removal for pp (at high luminosity): track the complete event and apply a cut on the vertex position. Assumption: the vertex of the triggered event is known to within a few centimeters.

25 pileup events: a cut around 40 cm on the vertex position keeps all primaries and yields a 3/25 compression, at the cost of losing ~40% of the lambdas. [Figures: efficiency, fakes, lambdas.]
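The vertex cut itself is a one-line selection once the tracks are extrapolated to the beam line (a sketch with made-up numbers; the window and vertex values are illustrative, not the slide's 40 cm working point):

```python
# Pileup removal by vertex cut: keep only tracks whose z at the beam line
# lies within a window around the (approximately known) triggered vertex.
trigger_vertex_z = 2.0        # cm, assumed known to a few centimeters
window = 5.0                  # cm, half-width of the acceptance window

# Mock tracks: (z at beam line in cm, True if the track is from pileup).
tracks = [(1.2, False), (2.8, False), (-14.0, True),
          (20.5, True), (3.9, False), (-7.1, True)]

kept = [t for t in tracks if abs(t[0] - trigger_vertex_z) < window]
print(len(kept), "of", len(tracks), "tracks kept")      # 3 of 6 tracks kept
print("pileup among kept:", any(flag for _, flag in kept))   # False
```

With real pileup the vertices are spread along the beam line, so a window of a few centimeters removes most pileup tracks while keeping the triggered event's primaries, exactly the trade-off quoted above.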

Low Multiplicity Tracker: TPC Sector Benchmark. Simulated pp events with 25 pileup events (~400 particles in the TPC per event) achieved 430 events/s on the Heidelberg HLT cluster (19 nodes, P3 800 MHz, Fast Ethernet).

High Multiplicity Tracker: Hough Transform and Cluster Evaluation

Track candidate finding before cluster finding: the Hough transform provides track seeds for cluster fitting; clusters are fitted and unfolded based on the track parameters (2D Gauss fit of the clusters) before the final track fits are performed.

The Hough transform (HT) is a widely adopted technique in pattern recognition: the global detection of certain geometrical properties in image space is converted into a local peak detection in a suitable parameter space. Every charge value votes for all tracks it could possibly belong to; the intersections of these curves (peaks) define the track candidates (segments), which in turn are used to deconvolute overlapping clusters.

The cluster shape depends on the number of tracks contributing to it, the track crossing angle with the padrow plane and the drift length. Once the track parameters are roughly known, the clusters can be fitted to the expected shape and unfolding becomes straightforward.

High Multiplicity Tracker: Circle Hough Transform

- Divide the raw data into small slices in eta.
- Search for all primary track candidates by applying a circle Hough transform to every digit: the HT on the raw data detects all possible primary vertex tracks.
- Find the peaks (= track candidates) with a simple peak finder.
- Merge track candidates crossing slice boundaries.
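The circle HT can be sketched in a few lines (illustrative binning and parameters, not the TDR implementation). A primary track is a circle through the vertex; in polar coordinates its hits satisfy kappa = (2/r) sin(phi - phi0), with kappa the curvature and phi0 the emission angle, so each digit votes along a curve in (phi0, kappa) space and real tracks show up as peaks:

```python
import math

N_PHI0, N_KAPPA = 100, 100
PHI0_RANGE = (0.0, 1.0)            # rad
KAPPA_RANGE = (0.0, 0.02)          # 1/cm

def hough_fill(hits, acc):
    """Each hit (r, phi) votes once per phi0 bin at kappa = 2/r*sin(phi-phi0)."""
    for r, phi in hits:
        for i in range(N_PHI0):
            phi0 = (i + 0.5) * (PHI0_RANGE[1] - PHI0_RANGE[0]) / N_PHI0
            kappa = 2.0 / r * math.sin(phi - phi0)
            j = int((kappa - KAPPA_RANGE[0]) /
                    (KAPPA_RANGE[1] - KAPPA_RANGE[0]) * N_KAPPA)
            if 0 <= j < N_KAPPA:
                acc[i][j] += 1

# Simulate hits of one primary track: R = 200 cm (kappa = 0.005), phi0 = 0.3.
R, phi0_true = 200.0, 0.3
hits = [(2 * R * math.sin(0.02 + k * 0.02), phi0_true + 0.02 + k * 0.02)
        for k in range(30)]

acc = [[0] * N_KAPPA for _ in range(N_PHI0)]
hough_fill(hits, acc)

# Simple peak finder: the bin with the most votes is the track candidate.
i, j = max(((i, j) for i in range(N_PHI0) for j in range(N_KAPPA)),
           key=lambda ij: acc[ij[0]][ij[1]])
phi0_est = (i + 0.5) * (PHI0_RANGE[1] - PHI0_RANGE[0]) / N_PHI0
kappa_est = (j + 0.5) * (KAPPA_RANGE[1] - KAPPA_RANGE[0]) / N_KAPPA
print(f"phi0 ~ {phi0_est:.3f}, kappa ~ {kappa_est:.4f}")
```

The eta slicing mentioned above serves exactly to make this 2D transform applicable: within a thin eta slice the helix projects to a circle in the bending plane.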

High Multiplicity Tracker: Efficiency and Timing. The efficiency is high (>90%), but only after evaluating the raw data with a painfully slow method. Problem: the relative number of fake tracks is high, ~100-200%. A VHDL simulation is working, but not synthesizable yet.

Cluster Fitter / Deconvoluter

The cluster fitting utilities were adapted from NA49:
- Clusters are fitted to a 2D Gauss function with 5 parameters: pad, time, charge, and the widths in pad and time.
- The widths in pad and time are provided by the track parameters.

To check the performance of the cluster fitting algorithm, offline track parameters were used as input (offline tracks -> HLT cluster fitter -> clusters -> offline track finder). For dN/dy = 8000 and B = 0.4 T:

Resolution        Before fitting   After fitting
Relative pT       1.17%            1.07%
Phi               2.19             1.90
Theta             1.28             1.25
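The key point of the deconvoluter is that once the track candidates predict where each cluster sits and how wide it is, only the amplitudes remain unknown, and those follow from a linear least-squares fit. A 1D two-cluster sketch (the real fitter works in 2D, pad x time; all numbers here are made up):

```python
import math

def gauss(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2)

# Track-predicted cluster positions and widths; the amplitudes are unknown.
mu1, mu2, sigma = 4.0, 6.0, 1.5
a1_true, a2_true = 100.0, 60.0

xs = list(range(11))                                    # pad/time bins
signal = [a1_true * gauss(x, mu1, sigma) +
          a2_true * gauss(x, mu2, sigma) for x in xs]   # overlapping clusters

# Minimize ||a1*g1 + a2*g2 - signal||^2: a 2x2 linear system (normal equations).
g1 = [gauss(x, mu1, sigma) for x in xs]
g2 = [gauss(x, mu2, sigma) for x in xs]
s11 = sum(g * g for g in g1)
s22 = sum(g * g for g in g2)
s12 = sum(u * v for u, v in zip(g1, g2))
b1 = sum(g * s for g, s in zip(g1, signal))
b2 = sum(g * s for g, s in zip(g2, signal))
det = s11 * s22 - s12 * s12
a1 = (b1 * s22 - b2 * s12) / det
a2 = (b2 * s11 - b1 * s12) / det
print(f"unfolded amplitudes: {a1:.1f}, {a2:.1f}")       # 100.0, 60.0
```

This is why the method only works when the number of fake tracks is low: a fake candidate adds a spurious shape to the fit and pulls charge away from the real clusters.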

Tracking Efficiency Comparison. Different Hough schemes are being evaluated: using the conventional tracker to provide seeds in the outer padrows, local versus global transforms, and error propagation into parameter space.

Data Compression using Modeling Techniques

Store the quantized deviation of each cluster centroid from the track model; the cluster widths are parametrized according to the track parameters. Remaining clusters (not assigned to any track) may be stored in addition to the compressed data.

Track parameters (per track):
- Curvature: 4 byte (float)
- Begin X, Y, Z: 4 byte (float) each
- Dip angle: 4 byte (float)
- Azimuthal angle: 4 byte (float)
- Track length: 2 byte (int)
- No. of clusters: 1 byte (int)

Cluster parameters (per cluster):
- Cluster present: 1 bit
- Pad residual: 10 bit
- Time residual: 10 bit
- Cluster charge: 13 bit

Spatial resolution: 0.5 mm in pad, 0.8 mm in time.
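The cluster record implied by this table is 34 bits per cluster instead of the raw ADC sequences. A sketch of the packing and unpacking (field layout chosen for illustration; the actual bit order in the TDR format may differ):

```python
# Pack one compressed cluster record: 1-bit presence flag, quantized pad and
# time residuals to the track model, and the cluster charge.
PAD_BITS, TIME_BITS, CHARGE_BITS = 10, 10, 13

def pack_cluster(present, pad_res, time_res, charge):
    """Pack one cluster into a 34-bit integer (residuals already quantized)."""
    word = present & 0x1
    word = (word << PAD_BITS) | (pad_res & ((1 << PAD_BITS) - 1))
    word = (word << TIME_BITS) | (time_res & ((1 << TIME_BITS) - 1))
    word = (word << CHARGE_BITS) | (charge & ((1 << CHARGE_BITS) - 1))
    return word

def unpack_cluster(word):
    charge = word & ((1 << CHARGE_BITS) - 1); word >>= CHARGE_BITS
    time_res = word & ((1 << TIME_BITS) - 1); word >>= TIME_BITS
    pad_res = word & ((1 << PAD_BITS) - 1);   word >>= PAD_BITS
    return word & 0x1, pad_res, time_res, charge

w = pack_cluster(1, 123, 456, 7890)
print(unpack_cluster(w))        # (1, 123, 456, 7890)
```

Since the residuals are stored relative to the track model, 10 bits suffice for the quoted 0.5 mm (pad) and 0.8 mm (time) resolution, which is where the bulk of the compression factor comes from.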

Data Compression (low multiplicity). The compression ratio critically depends on the remaining clusters, which mainly come from low-pT tracks (< 100 MeV/c) and delta electrons.

Data Compression Results. Relative to zero-suppressed 8-bit RLE ADC data (100%): compressed data ~14% (12%); remaining clusters, without coding, ~41% (28%). [Figures: efficiency, resolution.]

Summary

- The HLT system consists of 500 to 1000 PCs, partially equipped with FPGA coprocessor cards and connected in a light-weight network.
- Physics topics include rare low cross-section signals such as jets, as well as pileup removal.
- The Low Multiplicity Tracker (conventional tracking scheme) is fully implemented; the timing benchmark for pp (pileup) is very satisfactory.
- The FPGA cluster finder is synthesized and currently being tested on the RORC hardware.
- The High Multiplicity Tracker shows reasonable results, but needs further investigation and research effort in order to be really useful.
- The cluster deconvoluter works when the number of fake tracks is low; it needs further evaluation with different kinds of track candidates.
- Using modeling techniques for compression, a factor of ~10 might be achieved with an acceptable efficiency loss.

Backup B1: Natural Readout of the Detectors. TPC receiver processors feed the SMP farm of the High Level Trigger system.

Backup B2: Cluster Finder Algorithm Details

- Input: a list of sequences on each pad.
- Sequence matching at neighboring pads.
- Only 2 pads in memory: the current and the previous one.
- Loop over each pad only once.
- The sequences of each cluster are not stored internally.
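The sequence-matching idea can be sketched in Python (a simplified software analogue of the FPGA algorithm; unlike the hardware version, this sketch keeps the sequences per cluster so the centroid can be computed at the end, and the data are made up):

```python
# Per pad: a list of above-threshold time sequences (t0, [adc, ...]).
# Sequences on neighboring pads that overlap in time are merged into one
# cluster; only the previous pad's open clusters need to stay in memory.

def centroid(seqs):
    """Charge-weighted mean in (pad, time) over (pad, t0, adcs) sequences."""
    q = sum(a for _, _, adcs in seqs for a in adcs)
    p = sum(pad * a for pad, _, adcs in seqs for a in adcs)
    t = sum((t0 + i) * a for _, t0, adcs in seqs for i, a in enumerate(adcs))
    return p / q, t / q, q

def find_clusters(pads):
    """pads: list over pad number of [(t0, [adc, ...]), ...] sequences."""
    clusters, prev = [], []            # prev: clusters still open on last pad
    for pad, seqs in enumerate(pads):
        cur = []
        for t0, adcs in seqs:
            t1 = t0 + len(adcs)
            # Attach to an open cluster if the time ranges overlap.
            for cl in prev:
                if any(s[1] < t1 and t0 < s[1] + len(s[2])
                       for s in cl if s[0] == pad - 1):
                    cl.append((pad, t0, adcs))
                    cur.append(cl)
                    break
            else:
                cur.append([(pad, t0, adcs)])   # start a new cluster
        clusters.extend(cl for cl in prev if cl not in cur)   # now closed
        prev = cur
    clusters.extend(prev)
    return [centroid(cl) for cl in clusters]

# Two pads with one overlapping sequence each -> one merged cluster.
pads = [[(10, [5, 20, 5])], [(11, [10, 10])]]
print(find_clusters(pads))             # [(0.4, 11.2, 50)]
```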

Backup B3: Low Multiplicity Tracker, Efficiency and Resolution (detailed). [Figures: efficiency, resolution.]

Backup B4: Cluster Finder FPGA Implementation, Verification.