Set up and programming of an ALICE Time-Of-Flight trigger facility and software implementation for its Quality Assurance (QA) during LHC Run 2


Summer Student Report

Francesco Toschi
Alma Mater Studiorum, University of Bologna

Supervisor: Daniele De Gruttola
Co-supervisor: Andreas Morsch

23 September 2016

Project Description

The Cosmic and Topology Trigger Module (CTTM) is the main component of a trigger based on the ALICE TOF detector. Taking advantage of the TOF fast response, this VME board implements the trigger logic and delivers several L0 trigger outputs, used since Run 1 to provide cosmic triggers and rare triggers in pp, p-Pb and Pb-Pb data taking. Due to an architectural change of the TOF DCS PCs controlling the CTTM (from 32 bit to 64 bit), it is mandatory to upgrade the software related to the CTTM, including the code programming the FPGA firmware. A dedicated CTTM board will be installed in a CERN laboratory (Meyrin site), with the aim of recreating the electronics chain of the TOF trigger, so that the code can be ported to the 64-bit environment in a convenient setting. The project proposed to the summer student is the setting up of the CTTM and the porting of the software. Moreover, in order to monitor the CTTM trigger board during real data taking, the implementation of new Quality Assurance (QA) code is also crucial, together with the development of new macros allowing fast checks, such as comparing the QA output for real data and Monte Carlo. This provides a test tool for the new CTTM software and automatic, fast and efficient checks of the QA outputs for the ALICE Time-Of-Flight detector in the reconstruction of LHC Run 2 data, with particular attention to the plots dedicated to TOF trigger monitoring.

ALICE TOF

ALICE (A Large Ion Collider Experiment) is one of the four main experiments at the Large Hadron Collider (LHC): it exploits heavy-ion collisions in order to study the properties of the QGP (Quark-Gluon Plasma). Since Particle IDentification (PID) is essential for ALICE, the Time-Of-Flight detector (TOF) is crucial: it has a modular structure corresponding to 18 sectors (called supermodules) in ϕ, each composed of 5 modules along the z direction. Each module contains 15 (central module) or 19 (outer modules) Multigap Resistive Plate Chambers (MRPCs), tilted by a certain angle in order to minimize the transverse path of the incident particles [1]. The MRPC is based on the same technology as the RPC, but the stack is divided into gas gaps by glass plates installed in the active volume. This allows smaller gaps where independent avalanches may form, avoiding the risk of streamers. The MRPCs developed and used by ALICE are 122 × 13 cm², 10-gap, double-stack strips with an active area of 120 × 7.4 cm², subdivided into two rows of 48 pads (3.7 × 2.5 cm²) [2]. The total number of readout channels (pads) is 152 928 (the three central modules in front of the PHOton Spectrometer (PHOS) are not installed, to reduce the amount of material crossed by photons).

The signal coming out of 24 pads is sent to a Front-End ASIC card (FEA) equipped with 3 NINO ASIC chips as front-end electronics [3]: the signals are sent to the TDC Readout Module (TRM) boards, each hosting 30 High Performance TDCs (HPTDC) for a total of 240 channels per TRM. A FEA performs the logic OR of its 24 pads and this signal is OR-summed with that of another daisy-chained FEA: the signals from 6 daisy-chained couples of FEAs are collected by a FEA Controller (FEAC), which monitors the FEA low voltage and temperature and sets the thresholds. A Local Trigger Module (LTM) board receives the signals from 8 FEACs for a total of 48 ORs: there are 4 LTMs for each supermodule, for a total of 72 LTMs. The readout electronics is hosted in four crates on the edges of each supermodule (two on the A side and two on the C side), organized as follows:

Slot #   Left Crate   Right Crate
1        DRM          DRM
2        LTM          LTM
3        TRM          CPDM
4        TRM          TRM
...      ...          ...
12       TRM          TRM

The outputs from the 72 LTMs are collected by the CTTM as 2-fold ORs: the basic element for the trigger logic is an OR-sum of 96 channels, corresponding to the area of two half MRPCs.
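As a quick consistency check of the numbers quoted above, the following self-contained C++ sketch (illustrative only; the constants are simply the segmentation figures given in this section) reproduces the total channel count, including the hole left by the three uninstalled central modules in front of PHOS:

    #include <cstdio>

    int main() {
        // Segmentation figures quoted in the text
        constexpr int kSupermodules    = 18;      // sectors in phi
        constexpr int kStripsCentral   = 15;      // strips in the central module
        constexpr int kStripsOuter     = 19;      // strips in each of the 4 outer modules
        constexpr int kPadsPerStrip    = 2 * 48;  // two rows of 48 pads
        constexpr int kMissingCentrals = 3;       // central modules not installed in front of PHOS

        constexpr int stripsPerSM  = kStripsCentral + 4 * kStripsOuter;            // 91
        constexpr int fullDetector = kSupermodules * stripsPerSM * kPadsPerStrip;  // 157 248
        constexpr int phosHole     = kMissingCentrals * kStripsCentral * kPadsPerStrip; // 4 320
        constexpr int channels     = fullDetector - phosHole;

        static_assert(channels == 152928, "channel count quoted in the text");

        // One trigger element is the OR of 96 channels (two half MRPCs)
        std::printf("readout channels: %d, channels per trigger OR: %d\n", channels, 96);
        return 0;
    }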

The CTTM is a large VME board (78 × 41 cm²) hosting three piggy-back cards, each mounting an FPGA [4]. The top and bottom FPGAs receive the signals coming from the LTMs and apply the trigger logic; the final decision is then taken by the central FPGA, which communicates through VME with a computer and delivers the trigger outputs to the Central Trigger Processor (CTP).

CTTM: porting to a 64-bit environment

In the near future the computer monitoring the CTTM (alitoftrg) will be upgraded from a 32-bit to a 64-bit architecture. This upgrade requires porting the code related to the CTTM and compiling it on a 64-bit computer. A 64-bit computer called alitoftm, located in 29/R-3, was used for this purpose: the entire setup was assembled in this laboratory.

As a first step the program ltman, used to monitor and control the daisy-chained FEACs connected to the LTMs, was compiled. In order to build it successfully, the following sources had to be compiled as object files:

- ltm_event, defining the functions acting on the LTM;
- console, implementing the user interface;
- v1392util, implementing the write and read operations on the registers;
- CAENVMEDemoVme, implementing the possibility to read a single register given its address.

After compiling these sources it was possible to compile and build the executable in the 64-bit environment. The only required change was adding the definition of the Linux OS as a preprocessor directive, since the symbol LINUX was not recognized while linux was.

The CTTM board (see Fig. 1) arrived on July 18, 2016 from the Bologna section of INFN (Istituto Nazionale di Fisica Nucleare) in Italy. During transport the low-voltage connections (3.3 V and 5.0 V) were broken and it was necessary to solder them again: using a multimeter the right connections were identified and the soldering was redone. Once the power supply was provided by inserting the low-voltage boards in a Wiener crate (able to supply 3.3 V), the VME cable was plugged into crate slot #4, since the base address was set to 0x4. The CTTM registers were read using a CAEN V2718 bridge by means of the program CAENVMEDemo, already installed on alitoftm. Reading the CTTM registers was initially impossible because of bus errors: in order to check whether the bridge was the cause of the problem, it was tested on a different VME module.
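For reference, reading a board register through a CAEN V2718 bridge follows the pattern below. This is a minimal sketch using the standard CAENVMElib C API (on which CAENVMEDemo is built); the base address and register offset are placeholders, not the actual CTTM register map.

    #include <cstdint>
    #include <cstdio>
    #include "CAENVMElib.h"   // CAEN VME library (C API)

    int main() {
        int32_t handle = -1;
        // Open the first V2718 bridge on the first optical link
        if (CAENVME_Init(cvV2718, 0, 0, &handle) != cvSuccess) {
            std::fprintf(stderr, "cannot open V2718 bridge\n");
            return 1;
        }

        // Placeholder values: the real CTTM base address and register
        // offsets come from the board documentation / server sources.
        const uint32_t kBaseAddress = 0x40000000; // hypothetical
        const uint32_t kRegOffset   = 0x00;       // hypothetical

        uint16_t value = 0;
        CVErrorCodes ret = CAENVME_ReadCycle(handle, kBaseAddress + kRegOffset,
                                             &value, cvA32_U_DATA, cvD16);
        if (ret == cvSuccess)
            std::printf("register 0x%08x = 0x%04x\n", kBaseAddress + kRegOffset, value);
        else if (ret == cvBusError)
            std::fprintf(stderr, "VME bus error\n");  // the symptom seen with the faulty crate

        CAENVME_End(handle);
        return 0;
    }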

Figure 1: Front side of the CTTM board: the three FPGAs are visible. On the backside there are the LVDS connectors for the signals coming from the 72 LTMs.

After replacing the suspected faulty bridge with a new, tested one, the errors were still there. The new bridge was not able to read the registers of the tester VME module in the Wiener crate (where the low-voltage boards were plugged), while it worked correctly when mounted in another crate. It was therefore decided to use two different crates: one to host the low-voltage boards and another to host the bridge and the CTTM VME connector. The final setup is shown in Fig. 2. The clock could be sent to the CTTM either through an optical fiber, as shown in Fig. 2 (the yellow cable connected to the front side of the CTTM), or using the one coming from the LTM; in the latter case the connector is on the backside.

Once the setup was completed, the central FPGA firmware was reprogrammed using a ByteBlaster cable and the Quartus II Programmer software: the file used for reprogramming was a .pof file generated from the VHDL design. The reprogramming was necessary since it was impossible to read some FPGA registers that had to be accessible. After reprogramming, every register was readable.

The next step was to compile the software epcs, which is used to program the FPGAs on the CTTM using .rbf files with no need for the ByteBlaster.

Figure 2: Setup to test the CTTM board located in 29/R-3, showing the VME connection crate, the optical fiber carrying the clock and the power supply crate.

Reprogramming using VHDL files and the ByteBlaster requires being physically close to the CTTM board, and this is not always possible. In order to produce the executable, the following object files were compiled:

- libvme, defining the functions handling the communication through VME;
- libepcs, the library of functions to reprogram the FPGA firmware.

The other libraries used were libncurses and libcaenvme, already available as shared libraries. The software epcs was then used to program the central FPGA once more, using a .rbf file taken from alitoftrg, more recent than the .pof file.

The main part of the porting to the 64-bit environment was recompiling the CTTM server. The source file used is running_server_v8.c. This program requires several other files in specific paths of alitoftrg: they were fetched and placed in the same paths on alitoftm in order to reproduce the environment present at P2 (the ALICE site). The object files to be compiled in order to obtain the executable were:

- utility, a collection of functions handling memory access, buffers and threads;
- stools, implementing logbook message tools;
- simplelog, handling logbook functions;
- libltm, implementing the functions to read the LTMs from the CTTM;
- libinfo and infologgerconfig, to handle the logger;
- cfgfile, implementing some functions from stools and handling configuration files;
- cttm_func, implementing the functions acting on the CTTM.

In addition to these files, libvme, libepcs and the libdim library (the DIM communication library) were also required. A makefile was written to easily build the server.

The executable produced from the alitoftrg sources had problems when running in the setup arranged in building 29, because the environment is not exactly the same as at P2. The following changes had to be applied to run it without errors:

- setting the crate status manually to ON, since it was detected as OFF. To detect the CTTM status the DIM communication system is used, with the server alitofldc (a computer in the laboratory) as node. The server has to detect whether the crate supplying the power is on; the cause of this misdetection may be that the low-voltage crate and the one used for VME readout are different;
- removing the threshold setting, because of problems accessing the memory registers of the bottom FPGA. These thresholds are used to suppress noisy channels.

Using the infoBrowser tool to check the server online, and the DIM database to handle the commands and services of the node alitoftm, it was possible to change the Detector Control System (DCS) status of the CTTM. The status identifies the current situation of each detector (or part of it): some operations are possible only in particular states. The CTTM status was set from STANDBY to STANDBY CONFIGURED and then to READY, meaning the CTTM was ready to work and had to be checked.
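The DCS states mentioned above form a simple finite-state machine. The sketch below is purely illustrative: the state names follow the text, while the transition commands are hypothetical; it only shows the idea that certain operations are allowed only from certain states.

    #include <cstdio>
    #include <string>

    // Illustrative sketch of the DCS states used for the CTTM in the text.
    enum class DcsState { STANDBY, STANDBY_CONFIGURED, READY };

    // Hypothetical transition function: only the transitions described in the
    // text (STANDBY -> STANDBY CONFIGURED -> READY) are allowed.
    DcsState apply(DcsState s, const std::string& cmd) {
        if (s == DcsState::STANDBY && cmd == "CONFIGURE")
            return DcsState::STANDBY_CONFIGURED;
        if (s == DcsState::STANDBY_CONFIGURED && cmd == "GO_READY")
            return DcsState::READY;
        return s;  // command not allowed in the current state: stay put
    }

    int main() {
        DcsState s = DcsState::STANDBY;
        s = apply(s, "GO_READY");    // ignored: not allowed directly from STANDBY
        s = apply(s, "CONFIGURE");   // STANDBY -> STANDBY CONFIGURED
        s = apply(s, "GO_READY");    // STANDBY CONFIGURED -> READY
        std::printf("final state: %s\n", s == DcsState::READY ? "READY" : "not READY");
        return 0;
    }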

In order to check that the CTTM works properly, the board was pulsed. As a first step the LTM located in 29/R-3 was connected to the backside of the CTTM through an LVDS cable, and the clock from the LTM was passed to the CTTM. Running the CTTM server (running_server_v8), the board was set to READY and, using the software ltman, the toggle status of the LTM was set to TOGGLING. In this way it was possible to pulse the CTTM board: once the toggling of the LTM was stopped, it was possible to verify that the CTTM had received the signal. This check relies on the fact that the server automatically updates the files rate_pre and rate_after with the trigger rates before and after the threshold setting for noisy channels; the files are updated as soon as the CTTM receives a pulse from the LTMs. Since the thresholds are null, due to the problems accessing the memory of the bottom FPGA, only rate_pre should change. Pulsing the CTTM, the file changed, and moving the LVDS connection on the backside of the CTTM, the rate_pre file changed accordingly.

AMORE framework

The Data Quality Monitoring (DQM) framework used by the ALICE experiment is called AMORE (Automatic MOnitoRing Environment). Several processes called agents, each analyzing a sample of raw data, run continuously during data acquisition [5]. They are based on generic libraries such as AliQAChecker (used to check the histograms and set the quality flag), specialized by inheritance for each detector (e.g. for the TOF detector the class is AliTOFQAChecker). This choice of basing the DQM on AliRoot libraries is not common to all detectors: a few of them use custom-made functions. A sample of data called a chunk is taken and analyzed every 60 s (MC, Monitor Cycle): each detector agent analyzes this dataset in a loop, updating the QA plots and setting the quality flags. Some of these plots are checked during data taking by the DQM shifter, always present in the ALICE Control Room. Every plot produced by each detector can be set as a shifter plot or an expert plot: in the first case it is shown by default in the amoregui window (the graphical user interface) and must be constantly checked by the shifter. The expert plots are instead not shown by default, but if there are problems concerning a detector the expert may need to check them. Each detector expert has to periodically set the detector quality flag for each run: in order to summarize the performance, a summary image is produced with the most significant plots.

Some histograms have a message box that changes according to their quality flag: the colour and the message shown change as the flag changes. The possible flags are:

- kINFO: everything is working fine. The box is set to green and a reassuring message is displayed;
- kWARNING: not an error, but something is not working as it should. In this case the box colour is set to yellow and a message is shown, usually suggesting to check the type of the run;
- kERROR: some parameters are out of range, which means the detector has problems. The box is set to red and a message telling the shifter what to do is shown (check the TWiki or call the on-call expert);
- kFATAL: the histogram is empty. In this case the box is empty and grey, since there are no data to analyze online. This problem has to be fixed immediately.
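The flag assignment can be pictured with a small ROOT-based sketch. This is not the actual AliTOFQAChecker logic, just an illustration of how a checker might map a histogram onto the four flags using limits on its mean; the limit values and tolerance below are made up.

    #include "TH1F.h"

    // Illustrative quality flags mirroring the ones described in the text.
    enum class QAFlag { kINFO, kWARNING, kERROR, kFATAL };

    // Hypothetical checker: empty histogram -> kFATAL, mean far outside the
    // expected window -> kERROR, slightly outside -> kWARNING, otherwise kINFO.
    QAFlag CheckHistogram(const TH1F& h, double meanLow, double meanHigh) {
        if (h.GetEntries() == 0) return QAFlag::kFATAL;        // grey box: no data online
        const double mean = h.GetMean();
        if (mean < meanLow || mean > meanHigh) {
            const double margin = 0.2 * (meanHigh - meanLow);  // made-up tolerance
            if (mean < meanLow - margin || mean > meanHigh + margin)
                return QAFlag::kERROR;                         // red box: call the expert
            return QAFlag::kWARNING;                           // yellow box: check run type
        }
        return QAFlag::kINFO;                                  // green box: all fine
    }

    int main() {
        TH1F h("hTOFRawsTime", "TOF raw hit time (ns)", 250, 0, 610);
        for (int i = 0; i < 10000; ++i) h.Fill(200. + (i % 50));  // dummy data around 200-250 ns
        QAFlag f = CheckHistogram(h, 175., 250.);                  // made-up limits for illustration
        return f == QAFlag::kINFO ? 0 : 1;
    }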

On the left side of the amoregui window all the histograms produced are shown as elements of a tree. Every histogram name ends with a coloured box representing the quality flag: if these are not green, the shifter has to find out what the problem is and act accordingly.

TOF DQM: QA histograms split by trigger classes

The QA plots for the TOF detector are initialized, built and updated by the class AliTOFQADataMakerRec: data are taken from the raw-data stream and used to fill the plots. At the end of each monitor cycle the plots are checked by AliTOFQAChecker, which sets the quality flag. Despite the number of raw histograms (>30), only 12 are not for experts: besides being checked by the DQM shifter, those plots are included in the TOF summary image (see Fig. 3). The most meaningful plots regarding the performance of the Time-Of-Flight detector are equipped with a message box [6]:

- hTOFRaws, plotting the multiplicity per event;
- hTOFRawsTime, plotting the arrival time per hit in ns;
- hTOFRawsToT, plotting the ToT (Time-over-Threshold) per hit in ns.

Splitting the histograms by trigger classes is possible by declaring it in the configuration file for the TOF Quality Assurance (tofqa.config). In order to add a copy of the clone to the summary image, the histogram name and the selected trigger classes must be included in the config file. For example, in the summary image of Fig. 3 the clones of hTOFRaws and hTOFRawsTime are present, indicated by the trigger class kINT7 (the minimum bias trigger); the original histograms are the second and the fourth in the first row, while the clones are the first and the third. Since most of the events have null multiplicity, hTOFRaws has a peak at zero multiplicity. Selecting the minimum bias trigger removes this peak, since in this case the very-low-multiplicity events are not taken into account. As shown in Fig. 3, the message box for the clone is disabled.
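As an illustration of the cloning mechanism (not the actual AliTOFQADataMakerRec code), the following ROOT-based sketch keeps one clone of a histogram per configured trigger class and fills only the clones whose class fired in the event; the class list and the event structure are placeholders.

    #include <map>
    #include <string>
    #include <vector>
    #include "TH1F.h"

    // Illustrative sketch: per-trigger-class clones of a QA histogram.
    struct TriggerClones {
        TH1F* original;                          // trigger-blind histogram
        std::map<std::string, TH1F*> clones;     // one clone per configured trigger class

        TriggerClones(TH1F* h, const std::vector<std::string>& classes) : original(h) {
            for (const auto& c : classes) {
                // e.g. "hTOFRaws_kINT7" for the kINT7 clone
                std::string name = std::string(h->GetName()) + "_" + c;
                clones[c] = static_cast<TH1F*>(h->Clone(name.c_str()));
            }
        }

        // Fill the original always, and the clones only for the classes that fired.
        void Fill(double value, const std::vector<std::string>& firedClasses) {
            original->Fill(value);
            for (const auto& c : firedClasses) {
                auto it = clones.find(c);
                if (it != clones.end()) it->second->Fill(value);
            }
        }
    };

    int main() {
        TH1F* hRaws = new TH1F("hTOFRaws", "TOF raw hit multiplicity", 200, 0, 1000);
        TriggerClones qa(hRaws, {"kINT7", "kLifeCENT"});  // classes as listed in the config file
        qa.Fill(40., {"kINT7"});   // minimum bias event: fills the original and the kINT7 clone
        qa.Fill(0.,  {});          // event with no selected class: only the original gets the zero-multiplicity peak
        return 0;
    }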

Figure 3: Summary image from run 26537. The check on trigger-class-split histograms was not yet implemented.

On the test machine pcaldblade2 the latest version of AliRoot (version 4.4.7) was built: the changes were made in its libraries. In order to simulate the real data taking and the real DQM environment, the data from run 254378 were used. Working on the AliTOFQAChecker library, the message box for the trigger class clones was enabled, and a bug in the AMORE framework was found: the flag next to the clone name in the tree list was always purple. The bug was fixed by the AMORE expert Barthélémy von Haller. A new feature for the summary image was also implemented: it is now possible to decide whether or not to show the original (trigger-blind) histograms and to specify which trigger class clones should appear in the summary image. This makes it possible, for example, to show a trigger class clone in the amoregui window but not in the summary image. Even if this feature is not used yet, it could become useful when the number of plots in the summary image increases and only a few trigger classes are required to set the detector quality flag at the end of a run. The amoregui window is shown in Fig. 4 and the final summary image in Fig. 5. The message box was correctly implemented and the detector flags are well set. The trigger classes used are kINT7, kLifeCENT and kCalibBarrel (different aliases for the minimum bias trigger).

Figure 4: The amoregui window showing the clone of hTOFRawsTime for the trigger class kINT7.

Figure 5: Summary image with the message box implemented for clones.

Conclusions

The testing setup for the CTTM board was successfully assembled in 29/R-3. In order to avoid bus errors, two different crates were used, one for the VME connection and one for the power supply. At first the FPGA firmware had to be reprogrammed using a ByteBlaster cable and a .pof file generated from the VHDL design; afterwards the FPGAs were reprogrammed using the epcs software. This software had to be recompiled in the 64-bit environment provided by the alitoftm computer. Once all the FPGAs were correctly reprogrammed, the server (running_server_v8) was compiled for the 64-bit architecture. This was the crucial part of the work: several libraries had to be compiled to make it possible. As a next step the DCS status of the CTTM board had to be set to READY: the differences between the setup in 29/R-3 and the one at P2 required some minor changes to the code. After the code was changed accordingly, it was possible to set the DCS status of the CTTM to READY, which is the status the CTTM needs in order to work correctly. The LTM was then used to test the board: a pulse was sent to the CTTM to check whether it worked properly, and the test was successful. A makefile was written in order to build the server properly once the architecture upgrade of alitoftrg is carried out.

The code of the TOF AMORE QA was changed in order to enable the message box also for trigger class clones. Using the test computer pcaldblade2 it was possible to simulate the online DQM situation: for this purpose the dataset from run 254378 (pp collisions) was used. The tests gave positive results in several situations, such as having more than one clone for the same histogram. The possibility to choose which trigger class clones to show in the summary image was implemented as a new feature during the coding. Moreover, it is possible to draw on the summary image only the clones of a histogram, with no need to show the original one. The changes to the code will be committed to the new AliRoot version in the near future. The message boxes enabled for clones in the summary image will help the TOF experts set the quality flag for TOF at the end of each run. This will also help the DQM shifters check whether there are problems concerning the TOF detector for particular trigger classes: so far there was no message box helping them in this case.

References

[1] The ALICE Collaboration. The ALICE experiment at the CERN LHC. Journal of Instrumentation, 3(08):S08002, 2008.

[2] G. Dellacasa et al. ALICE Technical Design Report of the Time-Of-Flight System (TOF). 2000.

[3] A. Akindinov et al. A topological trigger based on the Time-of-Flight detector for the ALICE experiment. Nuclear Instruments and Methods in Physics Research A, 602:372-376, 2009.

[4] Alessandro Silenzi. The topological trigger system of the TOF detector for the ALICE experiment at the LHC. PhD thesis, Alma Mater Studiorum, Università di Bologna, 2010. Supervisor: Prof. Maurizio Basile.

[5] AMORE Modules Developer Manual. https://alice-daq.web.cern.ch/products/amore-modules-developer-manual, 2015.

[6] AliRoot Reference Guide. http://aliroot-docs.web.cern.ch/aliroot-docs, 2016.