PoS(High-pT physics09)036

Triggering on Jets and D0 in HLT at ALICE

University of Bergen, Allegaten 55, 5007 Bergen, Norway
E-mail: st05886@alf.uib.no

The High Level Trigger (HLT) of the ALICE experiment is designed to perform online reconstruction of p+p and Pb+Pb collisions. As a consequence, triggering on special events becomes possible. In its current state the HLT can process data from all major detectors. For this discussion only the detectors of the central barrel are relevant: the Inner Tracking System (ITS), the Time Projection Chamber (TPC), the Transition Radiation Detector (TRD) and the Photon Spectrometer (PHOS). Among other possibilities, the reconstructed tracks give us the chance to analyse jet and D0 topologies. At LHC energies, jet and D0 production will be important signatures of the Quark Gluon Plasma produced in heavy-ion collisions. Although jets and D0 mesons are produced quite abundantly, only a subset can be reconstructed, which gives the HLT the opportunity to select these events and thus enrich the recorded data sample.

4th International Workshop on High-pT Physics at LHC 09
Prague, Czech Republic, February 4-7, 2009

Copyright owned by the author(s) under the terms of the Creative Commons Attribution-NonCommercial-ShareAlike Licence. http://pos.sissa.it

1. Introduction

The Large Hadron Collider (LHC) at CERN is a proton and heavy-ion accelerator that will provide p+p collisions at a centre-of-mass energy of √s = 14 TeV and Pb+Pb collisions at √sNN = 5.5 TeV. The ALICE (A Large Ion Collider Experiment) detector is designed to study the heavy-ion collisions, but it will also have a substantial p+p programme, which can be used as reference data. The data rate of the Pb+Pb events will be larger than what the Data Acquisition (DAQ) system can handle (1.2 GB/s). This is why the High Level Trigger (HLT) has been developed. The HLT performs online event reconstruction and thus gives us the opportunity to do online physics analysis of the events. All the main tracking sub-detectors in ALICE will be part of the HLT. For the jet and D0 triggers, the main detectors are the tracking detectors (TPC, ITS, TRD) and the calorimeters (PHOS, EMCAL).

2. ALICE

An important task for the ALICE detector is to reconstruct the trajectories of the particles produced in the collisions. For this purpose a dedicated set of detectors is employed: the Time Projection Chamber (TPC) as the main tracking device, the Inner Tracking System (ITS) and the Transition Radiation Detector (TRD). The tracking detectors in the central barrel are placed inside a magnet which provides a magnetic field of 0.5 T.

2.1 Time Projection Chamber (TPC)

The TPC is a gas-filled detector with an inner radius of 80 cm and an outer radius of 250 cm. It is 510 cm long and filled with a mixture of Ne, CO2 and N2. It has an acceptance of |η| < 0.9. When a charged particle travels through the gas, it ionizes the gas along its path. Since there is an electric field inside the TPC, the electrons drift towards the end caps, which are instrumented with multi-wire proportional chambers (MWPCs). The amplified charge signal is read out by 557 568 readout channels [1]. The MWPC pads give us the x-y coordinates of a cluster, and the drift time from the collision to the arrival of the cluster at the readout pads gives the z coordinate. This provides a 3D picture of the tracks over a large volume. The size of the TPC and the number of readout channels lead to an event size of about 76 MB for the TPC alone in central collisions. If we expect 200 central events/s, we get a data rate of about 15 GB/s, while the limit for archiving the data is 1.2 GB/s. The HLT will reduce the data rate either by compressing the data or by selecting events. (A back-of-the-envelope version of this estimate is sketched below.)
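The rate estimate above is simple arithmetic. The following minimal C++ sketch makes the required reduction factor explicit; the constants are copied from the text, and the code is an illustration, not ALICE software:

```cpp
#include <cstdio>

// Back-of-the-envelope check of the TPC data-rate figures quoted above.
int main() {
    const double eventSizeMB = 76.0;   // central Pb+Pb event, TPC only
    const double eventRateHz = 200.0;  // expected central-event rate
    const double daqLimitGBs = 1.2;    // DAQ archiving limit

    const double rateGBs = eventSizeMB * eventRateHz / 1000.0; // ~15.2 GB/s
    printf("TPC data rate: %.1f GB/s\n", rateGBs);
    printf("required reduction: factor %.0f (compression and/or event selection)\n",
           rateGBs / daqLimitGBs);
}
```

Running it reproduces the roughly 15 GB/s quoted above, i.e. a reduction by about a factor of 13 is needed to fit the DAQ limit.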

2.2 Inner Tracking System (ITS)

The ITS consists of 6 layers of silicon detectors. The first two layers are silicon pixel detectors (SPD), located at 4 cm and 7 cm from the collision point. The next two layers are silicon drift detectors (SDD) at 14 cm and 24 cm, and at 39 cm and 44 cm there are silicon strip detectors (SSD) [2]. Thanks to the good vertex reconstruction capability of the ITS, the reconstructed tracks will have an impact parameter resolution better than 60 µm for tracks with momentum above 1 GeV/c. The resolution of the ITS is what makes open charm physics possible. The full ITS has an acceptance of |η| < 0.9, but the inner two pixel layers have a coverage of |η| < 1.98. The pixels also provide a good estimate of the primary vertex.

2.3 Transition Radiation Detector (TRD)

Like the TPC and the ITS, the TRD covers the full azimuthal angle; it is installed outside the TPC and has an acceptance of |η| < 0.84. The main purpose of the TRD is to identify electrons with momentum above 1 GeV/c. The TRD consists of 18 supermodules, each divided into 5 stacks of 6 layers. Each layer consists of a radiator, a drift section and an MWPC for readout. The detector is filled with a gas mixture of Xe and CO2 [3].

3. High Level Trigger (HLT)

The idea of the HLT is to receive the raw data from the detectors, reconstruct the events and trigger on interesting ones. This reduces the data volume and optimises the use of the available bandwidth. Events reconstructed in the HLT have the same output as the offline reconstruction.

Figure 1: The raw data from the detectors reach the HLT before DAQ. DAQ then treats the HLT as another detector in ALICE and writes the HLT output to mass storage.

The HLT receives the raw data in its computing cluster via a set of receiving nodes. For the TPC there are 216 readout partitions; with 4 Read-Out Receiver Cards (RORCs) per node, 54 nodes are needed just for reading out the TPC data [4]. Since the raw data sit in the memory of these nodes after readout, we perform as many local operations on them as possible there; cluster finding is such a process. Using the publisher/subscriber method, the data are then shipped further into the computing cluster. Tracking is done on parts of the detector, and the pieces are merged; in the end the HLT merges and matches the sub-detectors into a full event reconstruction. This scheme allows the HLT to parallelize the processing of the data and to speed up the reconstruction. (A toy sketch of this fan-in pattern is shown below.)
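As an illustration of that fan-in pattern (independent local processing per readout partition, followed by a global merge), here is a small, hypothetical C++ sketch; the toy types and functions stand in for the real publisher/subscriber framework:

```cpp
#include <cstdio>
#include <future>
#include <vector>

// Toy stand-ins for the real HLT data types.
struct Clusters     { int n = 0; };
struct SectorTracks { int n = 0; };

Clusters findClusters(int partition) { return {10 + partition % 3}; } // local step
SectorTracks trackSector(const Clusters& c) { return {c.n / 2}; }     // per-sector step

int main() {
    const int nPartitions = 216; // TPC readout partitions, as quoted in the text
    std::vector<std::future<SectorTracks>> jobs;
    for (int p = 0; p < nPartitions; ++p)
        // Each partition is processed independently, mimicking the per-node work.
        jobs.push_back(std::async(std::launch::async,
                                  [p] { return trackSector(findClusters(p)); }));

    // Global merge: combine the sector-level results into one event.
    int total = 0;
    for (auto& j : jobs) total += j.get().n;
    printf("merged %d toy tracks from %d partitions\n", total, nPartitions);
}
```

The point of the pattern is that each partition's work is independent, which is what lets the real system distribute the load across the 54 TPC readout nodes before merging.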

The output of this reconstruction has the same format as the offline reconstruction. This gives us the opportunity to run already developed and tested code on HLT output, which helps verify the HLT reconstruction and makes it possible to do online physics analysis and physics triggering. The framework is in place to run offline reconstruction code in the HLT, and to run HLT code in the offline simulation/reconstruction; this is important for testing code that will run online. The HLT also has access to calibration data via interfaces to the Offline Conditions Database (OCDB).

Figure 2: The raw data are read in by the receiving nodes, where local operations like cluster finding take place. On the next layer, tracking is done on parts of the detectors; these parts, and then the sub-detectors, are merged into a full event reconstruction.

4. Tracking in HLT

The tracking in the HLT has to be able to operate on local areas of the detector. For the TPC, a tracking method based on a Cellular Automaton has been chosen. The tracking works on a local level to build track seeds, which are then merged into full tracks. TPC tracks are propagated to the ITS and matched with the TRD to obtain the full event reconstruction.

    Creation of cells       0.5 ms
    Merging of cells        0.3 ms
    Creation of segments    0.3 ms
    Fit of segments         0.1 ms
    Merging of segments     1.8 ms

The table shows how much time the different steps of the TPC reconstruction take for a p+p event; the full TPC reconstruction takes about 3 ms. The timing was measured on an Intel Core2 Quad Q6600 CPU at 2.40 GHz. (A toy illustration of the Cellular Automaton idea follows below.)
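To make the Cellular Automaton idea concrete, here is a deliberately simplified 2D toy in C++: links between hits on adjacent pad rows become "cells", and repeated local updates let long chains (track segments) emerge. This sketches only the principle, not the actual HLT tracker:

```cpp
#include <algorithm>
#include <cmath>
#include <cstdio>
#include <vector>

// Toy 2D Cellular Automaton seeding: hits on pad rows, links ("cells")
// between neighbouring rows, chains of cells forming segments.
struct Hit  { int row; double y; };
struct Cell { int from, to; int length = 1; };

int main() {
    // Two toy straight tracks crossing 5 pad rows.
    std::vector<Hit> hits = {{0,0.0},{1,0.5},{2,1.0},{3,1.5},{4,2.0},
                             {0,3.0},{1,2.8},{2,2.6},{3,2.4},{4,2.2}};

    // Step 1: create cells -- links between hits on adjacent rows close in y.
    std::vector<Cell> cells;
    for (int i = 0; i < (int)hits.size(); ++i)
        for (int j = 0; j < (int)hits.size(); ++j)
            if (hits[j].row == hits[i].row + 1 &&
                std::fabs(hits[j].y - hits[i].y) < 0.6)
                cells.push_back({i, j});

    // Step 2: local evolution -- a cell's length counts the cells chained
    // behind it. Repeat once per pad row so the lengths fully propagate.
    for (int pass = 0; pass < 4; ++pass)
        for (auto& c : cells)
            for (const auto& prev : cells)
                if (prev.to == c.from)
                    c.length = std::max(c.length, prev.length + 1);

    // Step 3: cells of maximal length seed the track segments.
    int best = 0;
    for (const auto& c : cells) best = std::max(best, c.length);
    printf("longest chain: %d cells (%d hits)\n", best, best + 1);
}
```

All decisions are local (a cell only looks at its neighbours), which is exactly what allows the real tracker to run on sub-volumes of the TPC and merge the results afterwards.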

5. Open Charm trigger

The open charm reconstruction in the HLT will be based on the standard offline code. One of the decay modes that can be used is D0 → K-π+, with a branching ratio of 3.89%. This is an interesting decay since it has two charged decay products, which ionize the gas in the TPC. The expected yield for this channel is dN(D0 → K-π+)/dy = 0.53 in Pb+Pb events at √sNN = 5.5 TeV [6], which means roughly one D0 in this decay mode per event within the acceptance of the tracking detectors. Offline analysis has shown that, after applying the necessary reconstruction cuts, one is left with a signal/event ratio of 0.0013 for the D0 → K-π+ channel [5]. This ratio will be of the same order in the HLT as well, but since the HLT reconstructs every event it can trigger on D0 candidates. This makes better use of the limited DAQ bandwidth and enriches the D0 statistics.

The decay products, K- and π+, traverse the tracking detectors in ALICE. The D0 has a decay length of about cτ = 122.9 µm, which means that the tracks of the decay products do not point back to the primary vertex. To distinguish these tracks from the primary ones, the ITS is essential for achieving the required resolution. The open charm reconstruction involves several cuts:

Cuts on single tracks:
- pT of the decay products K and π
- impact parameters d0 of K and π

Cuts on two-track combinations:
- distance between the two tracks
- product of the impact parameters
- invariant mass (a minimal sketch of this computation is given below)
- pointing angle
- decay angle

Figure 3: The reconstructed invariant mass of the D0 using the HLT reconstruction [8].
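The invariant-mass cut is simply the two-body mass of the track pair under the K-π mass hypothesis. A minimal C++ sketch follows; the momenta are toy values, the PDG masses are hard-coded, and the mass window is an illustrative assumption, not the HLT value:

```cpp
#include <cmath>
#include <cstdio>

struct P3 { double px, py, pz; };

// Invariant mass of a two-track combination under given mass hypotheses.
double invMass(const P3& a, double ma, const P3& b, double mb) {
    auto E = [](const P3& p, double m) {
        return std::sqrt(m*m + p.px*p.px + p.py*p.py + p.pz*p.pz);
    };
    const double Esum = E(a, ma) + E(b, mb);
    const double px = a.px + b.px, py = a.py + b.py, pz = a.pz + b.pz;
    return std::sqrt(Esum*Esum - px*px - py*py - pz*pz);
}

int main() {
    const double mK = 0.493677, mPi = 0.139570; // GeV/c^2 (PDG)
    P3 kaon{0.8, 0.3, 0.1}, pion{-0.2, 0.9, 0.05}; // toy K- and pi+ momenta, GeV/c
    const double m = invMass(kaon, mK, pion, mPi);
    // A trigger would accept pairs falling in a window around m(D0) = 1.8648 GeV/c^2.
    printf("m(K pi) = %.3f GeV/c^2 -> %s\n", m,
           std::fabs(m - 1.8648) < 0.05 ? "D0 candidate" : "rejected");
}
```

In the real selection this mass cut is applied only after the single-track and topological cuts listed above have reduced the combinatorial background.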

6. Jet trigger

For jets, as for the D0, we take advantage of the online reconstructed event and apply a jet trigger. The table below (Figure 4) shows the expected number of jets produced at the LHC per month in minimum bias Pb+Pb events. This is the total production; the acceptance of the detectors still has to be taken into account. We would like to have the full jet cone inside the TPC, so an acceptance of |η| < 0.5 has to be used. If we also require the EMCAL (|η| < 0.3 and 83° < φ < 157° for R = 0.4), we get about 3000 jets with ET > ET,min = 250 GeV per month [6]. The ability to trigger on jets will enrich the samples through better use of the limited DAQ bandwidth.

Figure 4: Number of jets with ET > ET,min per month in Pb+Pb collisions.

    ET,min [GeV]   20          50          100         150         200         250
    N jets         2.4 x 10^8  5.8 x 10^6  3.1 x 10^5  4.8 x 10^4  1.2 x 10^4  3.0 x 10^3

Three methods are being tested for use as a jet trigger: a simple seeded cone algorithm without splitting and merging, and two FastJet methods based on the kt and anti-kt algorithms [7]. For triggering in the HLT, speed is important; no background subtraction is performed, since this would have to be done event by event. The different algorithms can run in parallel in the HLT, and effort is being invested in combining their results into one trigger decision. (A toy version of the seeded cone method is sketched below.)
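Here is a toy C++ version of the first method, a seeded cone without splitting and merging. Tower values, thresholds and the cone radius are illustrative assumptions; only the overall logic (take the highest-ET seed, sum ET within R, remove the used towers, repeat) reflects the algorithm named above:

```cpp
#include <algorithm>
#include <cmath>
#include <cstdio>
#include <vector>

struct Tower { double et, eta, phi; bool used = false; };

// Distance in the eta-phi plane, with phi wrapped onto [-pi, pi].
static double dR(const Tower& a, const Tower& b) {
    const double kPi = 3.14159265358979;
    double dphi = std::fabs(a.phi - b.phi);
    if (dphi > kPi) dphi = 2 * kPi - dphi;
    const double deta = a.eta - b.eta;
    return std::sqrt(deta * deta + dphi * dphi);
}

int main() {
    const double R = 0.4, seedEtMin = 5.0, jetEtMin = 20.0; // toy thresholds, GeV
    std::vector<Tower> towers = {
        {30, 0.10, 1.00}, {12, 0.25, 1.10}, {8, -0.05, 0.90}, // hard cluster
        {25, -0.30, 2.50}, {6, -0.40, 2.60},                  // second cluster
        {2, 0.50, 0.20},  {1, -0.60, 3.00},                   // soft noise
    };
    // Highest-ET towers are tried first as seeds.
    std::sort(towers.begin(), towers.end(),
              [](const Tower& a, const Tower& b) { return a.et > b.et; });

    for (auto& seed : towers) {
        if (seed.used || seed.et < seedEtMin) continue;
        double jetEt = 0;
        for (auto& t : towers)                // sum everything in the cone;
            if (!t.used && dR(seed, t) < R) { // no splitting/merging step
                jetEt += t.et;
                t.used = true;
            }
        if (jetEt > jetEtMin)
            printf("jet: ET = %.1f GeV at (eta, phi) = (%.2f, %.2f)\n",
                   jetEt, seed.eta, seed.phi);
    }
}
```

Skipping the split/merge step is what makes this variant fast enough for online use, at the cost of ambiguities when cones overlap.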

7. Summary

With the HLT, triggering on special events becomes possible. In ALICE we plan to have full online event reconstruction: all the main tracking detectors are part of the HLT, as are the calorimeters. Jets and the D0 are two examples of physics triggers. The HLT event reconstruction has the same output as the offline reconstruction, which makes it possible to run offline code on the data. Since the triggers have to run online, the performance of the code is important. By running the triggers on both offline and HLT reconstructed events, we can verify their output against the offline reconstruction of the same physics and thus tune the efficiency of the algorithms. Only two possible triggers are described in this paper; nevertheless, the generic structure of the trigger framework allows the implementation of other physics cases and will enrich the physics content of the recorded statistics.

References

[1] TPC Technical Design Report, ALICE-DOC-2004-006 v.1
[2] ITS Technical Design Report, ALICE-DOC-2005-002 v.1
[3] TRD Technical Design Report, ALICE-DOC-2004-009 v.1
[4] Trigger, Data Acquisition, High Level Trigger, Control System Technical Design Report, ALICE-DOC-2004-001 v.2
[5] A. Dainese, Charm production and in-medium QCD energy loss in nucleus-nucleus collisions with ALICE: a performance study, arXiv:nucl-ex/0311004v2
[6] J. Phys. G: Nucl. Part. Phys. 32 (2006) 1295-2040
[7] M. Cacciari, G. P. Salam and G. Soyez, FastJet release 2.3.4, http://www.lpthe.jussieu.fr/~salam/fastjet; Phys. Lett. B641 (2006), hep-ph/0512210
[8] Benchmarks and Implementation of the ALICE High Level Trigger, IEEE Trans. Nucl. Sci. 53 (2006) 854-858