The Global Trigger Emulator System for the CMS experiment


K. Zachariadou, T. Geralis, S. Kyriazopoulou, C. Markou, I. Michailakis

Abstract -- We present the development of the Global Trigger Processor emulator system (egtp) of the CMS Data Acquisition System (DAQ). The egtp generates Level-1 triggers and exchanges information with other DAQ components. It is based on a multi-purpose PCI board (Generic PCI Platform III) built around an FPGA and hosted in a PC. The board is programmed to function as a trigger-summary pseudo-data source with partitioning capabilities. Data transmission is achieved via the S-LINK64 protocol. The egtp aims at decoupling the Level-1 trigger system from the readout system for testing, installation and maintenance purposes.

I. INTRODUCTION

In the CMS Data Acquisition System, for every beam crossing (25 ns), the Global Trigger Processor (GTP) calculates up to 128 different trigger conditions and combines them into a Level-1 Accept (L1A) signal. The L1A signal is sent via the Timing, Trigger and Control (TTC) optical network to all the Front-End Drivers of the sub-detectors and to the Data Acquisition Event Manager (EVM) (Fig. 1). Furthermore, the GTP controls the delivery of L1A signals according to feedback signals received through the Trigger Throttling System (aTTS, sTTS). At the same time it guarantees that the sequence of L1As complies with a set of trigger rules of the general form "no more than a certain number of L1A triggers within a given time interval", in order to minimize the probability of overflowing the sub-detector buffers. A detailed description of the Global Trigger can be found in [1].

The egtp (Global Trigger Processor emulator) system emulates most of the GTP's functions. Its implementation is driven by the necessity of testing the performance of the DAQ components during development (Preseries), as well as testing the performance of the DAQ system under run conditions (the egtp will be part of the final system, running in parallel with the GTP). Moreover, during the installation of the DAQ system on the surface, the GTP emulator will be used to decouple the Level-1 system (underground area) from the readout system (surface) (Fig. 1).

Corresponding author: K. Zachariadou, e-mail: Katerina.Zachariadou@cern.ch, telephone: +30-210-6503536. K. Zachariadou is with the Institute of Nuclear Physics, NCSR Demokritos, GR-15310 Ag. Paraskevi Attiki, Greece, and the University of the Aegean, Fostini 31, GR-82100 Chios, Greece. T. Geralis (Theodoros.Geralis@cern.ch), S. Kyriazopoulou (Sophia.Kyriazopoulou@cern.ch), C. Markou (christos.markou@inp.demokritos.gr) and I. Michailakis (imich@otenet.gr) are with the Institute of Nuclear Physics, NCSR Demokritos, GR-15310 Ag. Paraskevi Attiki, Greece.

Fig. 1. The Global Trigger Emulator system (egtp) in the CMS DAQ.

II. THE GTP EMULATOR SYSTEM

A. Overview

The egtp system performs the following functions: a) random generation of L1A triggers for each partition; the triggers are generated at non-empty beam crossings only, at a user-defined frequency in the range of 10 Hz to 123 kHz (moreover, the delivery of L1A triggers complies with a set of trigger rules imposed by the sub-detectors of each partition), and are sent via a distribution system used to broadcast signals to the FRLs; b) partitioning (there are eight DAQ partitions, and the number of sub-detector partitions in each DAQ partition is limited to eight [2]); c) emulation of the LHC proton beam structure; d) generation of trigger-summary pseudo-data, encapsulated in the FED Common Data Format [2], to be sent to the FED Builder; and e) receipt of feedback signals from the DAQ partitions and from the detector partitions (sTTS-FMM).

B. The egtp hardware and configuration code

The egtp system is based on a generic, multi-purpose PCI card, the Generic PCI Platform III (GIII) [3], developed within the CMS DAQ group. The GIII is a 64-bit/66 MHz PCI board featuring a single FPGA (Altera APEX, 400k gates), 1 MB of flash RAM, 32 MB of SDRAM, a set of user connectors, plus connectors compliant with the S-LINK64 (64 bit @ 66 MHz) [4] pin-out. The GIII board is plugged into a PC (64-bit/66 MHz PCI bus) running the Linux OS. The egtp is controlled via dedicated user-interface processes (LabView virtual instruments or C programs). For development purposes, data are transferred via the S-LINK64 protocol into a second GIII board (the receiver) and are retrieved by dedicated readout software. The general FPGA configuration is shown in Fig. 2. The code for the PCI controller, the S-LINK64 controller and the JTAG has been developed using the Quartus 2.2 (Altera) design software [5], whereas the code for the GTP emulation has been developed using the DK1 1.1 (Celoxica) design software [6].

Fig. 2. FPGA configuration schematic.

C. The egtp design

The main functional blocks of the egtp design are shown in Fig. 3. To emulate the LHC bunch-crossing frequency, a 40 MHz quartz oscillator on the GIII board is used. The BX_gen module emulates the LHC proton beam structure. The total length of each orbit is 3564 bunch-crossing intervals (89 µs). The proton bunches are grouped in 39 trains of 72 bunches each. At the end of the orbit there is a period of 119 missing bunches (3 µs long). This structure is represented by the formula

3564 bunches = {[(72b + 8e) × 3 + 30e] × 2 + [(72b + 8e) × 4 + 31e]} × 3 + {[(72b + 8e) × 3 + 30e] × 3 + 81e}    (1)

where b denotes full bunch crossings and e empty ones.

Fig. 3. The egtp functional block diagram.

The beam-structure signal (BX) is fed to the Level-1 generator module (L1_Gen). The L1_Gen module generates eight random L1A signals, one for each partition, by implementing 22-bit-long random number generators. L1A signals are generated only at non-empty bunch crossings and at frequencies defined by the user (10 Hz to 123 kHz). The partition selection and the associated rates for each partition are set via the PCI. The L1_Gen module associates the following data with each trigger, according to the trigger summary block [2]: a) a bunch crossing number (12 bits), counting the number of bunch crossings within each LHC orbit; b) an event number (24 bits), counting the number of L1A signals per DAQ partition; c) a trigger number (24 bits), counting all the L1A signals generated since the last egtp reset; d) a DAQ partition number (3 bits), indicating the DAQ partition that issued the L1A signal; and e) an orbit number (32 bits), counting the number of orbits since the last egtp reset. The trigger summary block data are formatted with the common FED Data encapsulation format [2] as seven 65-bit words.
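As a cross-check of Eq. (1), the orbit pattern that BX_gen emulates can be reproduced and counted in ordinary software. The following C sketch (illustrative only, not the actual firmware; all names are ours) builds the 3564-slot table from the formula and verifies the totals quoted above (39 trains of 72 bunches, ending in a 119-BX gap):

```c
#include <assert.h>
#include <stdio.h>
#include <string.h>

#define ORBIT_LEN 3564              /* bunch-crossing slots per LHC orbit */

/* Append n slots of value v (1 = filled bunch, 0 = empty) to the table. */
static void fill(unsigned char *orbit, int *pos, int n, unsigned char v)
{
    memset(orbit + *pos, v, (size_t)n);
    *pos += n;
}

/* One 72-bunch train followed by its 8-BX gap: the (72b + 8e) term. */
static void train(unsigned char *orbit, int *pos)
{
    fill(orbit, pos, 72, 1);
    fill(orbit, pos, 8, 0);
}

int main(void)
{
    unsigned char orbit[ORBIT_LEN];
    int pos = 0, filled = 0;

    /* {[(72b+8e)*3 + 30e]*2 + [(72b+8e)*4 + 31e]} * 3 */
    for (int i = 0; i < 3; i++) {
        for (int j = 0; j < 2; j++) {
            for (int k = 0; k < 3; k++) train(orbit, &pos);
            fill(orbit, &pos, 30, 0);
        }
        for (int k = 0; k < 4; k++) train(orbit, &pos);
        fill(orbit, &pos, 31, 0);
    }
    /* [(72b+8e)*3 + 30e]*3 + 81e; the trailing 8e + 30e + 81e of the
       last block form the 119-BX abort gap at the end of the orbit. */
    for (int i = 0; i < 3; i++) {
        for (int k = 0; k < 3; k++) train(orbit, &pos);
        fill(orbit, &pos, 30, 0);
    }
    fill(orbit, &pos, 81, 0);

    assert(pos == ORBIT_LEN);
    for (int i = 0; i < ORBIT_LEN; i++) filled += orbit[i];
    printf("slots=%d filled=%d trains=%d\n", pos, filled, filled / 72);
    /* Prints: slots=3564 filled=2808 trains=39 */
    return 0;
}
```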
Given that the assertion of Level-1 triggers must follow a set of trigger rules in order to minimize the probability of overflowing the sub-detector buffers, different trigger rules are implemented in the L1_Gen module code for each sub-detector of the DAQ partitions. The trigger rules for the sub-detectors are listed in Table I.

TABLE I
TRIGGER RULES [6]

  Detector(s)                 Trigger rule
  All sub-detectors           At least one empty BX between subsequent triggers
  Pixel, Tracker, Muon CSC    At least 2 empty BXs between subsequent triggers
  HCAL                        No more than 22 triggers per orbit
  Pixel                       Empty orbit after a period of one second
  Preshower                   No more than 1+13 triggers over the service time of 5.4 s

The L1_Gen module receives feedback signals (interpreted as an inhibit of all Level-1 triggers) from: a) atts_busy (x8) (DAQ partitions, FRL, etc.); b) stts_busy (x8) (sub-detector partitions); c) the S-LINK64 controller (LFF); the LFF signal is LOW (backpressure) when the receiver is not running or cannot accept more data; and d) the event buffer (see below), in case the local FIFO has fewer than 16 empty words.

The Write_Evm module receives the data fragments (seven 65-bit words in parallel) and temporarily stores them in a cyclic event buffer two events long. The buffering is necessary to serialize the output data stream and to avoid dead time when successive triggers occur. If the local FIFO (128 words of 65 bits) has at least sixteen empty words, the data fragments are fed into it and then flow out to the receiver via the S-LINK64 CMC cards upon receipt of a read_fifo command from the S-LINK64 controller (Section II.E). Furthermore, event losses and dead-time statistics are kept in counters that can be read via the PCI.

D. PCI controller

The PCI controller provides the PCI communication plus registers for control, status, error and reset operations. In total there are 13 registers that can be accessed via the PCI.

E. S-LINK64 controller

To transfer the GTP emulator pseudo-data, the S-LINK64 (Simple Link Interface in a Common Mezzanine Card) protocol is used; a detailed description of S-LINK64 can be found in [4]. Upon reception of the UWEN command (UWEN = low) from the PCI controller, the S-LINK64 controller initiates the reading of the local FIFO, so that data start to flow to the receiver side. A backpressure signal (LFF = low) is sent to the L1_Gen module when the receiver cannot accept more data, or when the LDOWN signal is low, indicating that the receiver is not running.

III. INTERFACES

The egtp system has the following hardware interfaces to external systems (Fig. 4):

Interface to the Event Manager: the egtp system sends (via S-LINK64) the L1A data fragments (encapsulated in the FED Common Data Format) to an FRL that broadcasts the data fragments to the FED Builder.

Interface to the trigger distribution system and the throttling (aTTS, sTTS) system: a) The egtp system distributes eight L1A signals to the eight sub-detector partitions (FRLs). b) Each of the eight sub-detector partitions sends to the egtp system, via the FRL-sTTS (the equivalent of the sTTS of the real system), four LVDS signals (READY, BUSY, OUT_OF_SYNC and WARNING). If a sub-detector sends a not-READY signal, the egtp system inhibits L1As for the partition to which that sub-detector belongs. c) Each of the eight DAQ partition controllers (aTTS) receives its status from the egtp system and sends back four LVDS signals (READY, BUSY, OUT_OF_SYNC and WARNING). If a partition sends a not-READY signal, the egtp system inhibits the L1A signal for this partition.

The signals in a), b) and c) are transmitted to the external systems via a specially designed module (egtp-io), based on an Altera ACEX 30k, which performs the necessary encoding/decoding from/to LVDS/TTL. The connection of the egtp-io to the egtp is made via a mezzanine card (egtp CMC) plugged onto the fast GIII connector (Fig. 4, Fig. 5).

Fig. 4. The egtp interfaces schematic.
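The inhibit decision described in a)-c) is simple combinational logic. The following C sketch captures its intent (illustrative names and types only; the real logic lives in the FPGA): a partition's L1A is inhibited when its aTTS or sTTS feedback is not READY, or globally when the S-LINK64 asserts backpressure (LFF low) or the local FIFO has fewer than 16 free words:

```c
#include <stdbool.h>
#include <stdint.h>

#define N_PARTITIONS 8

/* The four TTS feedback states carried by the LVDS pairs. */
typedef enum { TTS_READY, TTS_BUSY, TTS_OUT_OF_SYNC, TTS_WARNING } tts_state;

/* Returns an 8-bit mask with bit p set when partition p must not
 * receive L1A triggers. */
uint8_t l1a_inhibit_mask(const tts_state atts[N_PARTITIONS],
                         const tts_state stts[N_PARTITIONS],
                         bool lff_low, int fifo_free_words)
{
    /* LFF low or a nearly full local FIFO inhibits all partitions. */
    bool global_inhibit = lff_low || fifo_free_words < 16;
    uint8_t mask = 0;

    for (int p = 0; p < N_PARTITIONS; p++)
        if (global_inhibit || atts[p] != TTS_READY || stts[p] != TTS_READY)
            mask |= (uint8_t)(1u << p);
    return mask;
}
```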

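The performance tests reported in Section IV below include a check that the inter-trigger times follow Poisson statistics (Fig. 6): for triggers generated at a mean rate r, the gap t between consecutive L1As should be (approximately, since triggers are confined to non-empty BXs) exponentially distributed, p(t) = r exp(-rt). A minimal offline check of this kind, assuming a hypothetical dump of trigger timestamps in seconds, one per line on stdin, could be:

```c
#include <math.h>
#include <stdio.h>

int main(void)
{
    enum { NBINS = 50 };
    const double t_max = 100e-6;      /* histogram range 0..100 us,
                                         about 10 mean gaps at 100 kHz */
    const double bin_w = t_max / NBINS;
    long hist[NBINS] = {0}, n_gaps = 0;
    double t, prev = -1.0, sum_gaps = 0.0;

    /* Histogram the gaps between consecutive trigger timestamps. */
    while (scanf("%lf", &t) == 1) {
        if (prev >= 0.0) {
            double gap = t - prev;
            int bin = (int)(gap / bin_w);
            sum_gaps += gap;
            n_gaps++;
            if (bin >= 0 && bin < NBINS) hist[bin]++;
        }
        prev = t;
    }
    if (n_gaps == 0) return 1;

    double rate = n_gaps / sum_gaps;  /* estimate of the mean rate r */
    printf("# rate = %.1f Hz\n# t_center  observed  expected\n", rate);
    for (int i = 0; i < NBINS; i++) {
        double tc = (i + 0.5) * bin_w;
        /* Expected bin content for an exponential gap distribution. */
        double expect = (double)n_gaps * bin_w * rate * exp(-rate * tc);
        printf("%10.3g %9ld %9.1f\n", tc, hist[i], expect);
    }
    return 0;
}
```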
IV. PERFORMANCE TESTS

An egtp test bench has been set up in order to test the performance of the system (Fig. 4). It is implemented on a PC running Linux RedHat 7.3.3 with a 64-bit/66 MHz PCI bus. The egtp is implemented on a GIII card and sends event fragments, via the S-LINK64 CMC cards, into a second GIII card (the receiver) plugged into the same PC. To control the egtp, a set of drivers has been developed based on LabView (National Instruments [7]), whereas the FedKit 1_30 software [8] has been used to read and analyze the data on the receiver side.

Fig. 5. a) The egtp-io module (left), the egtp system (middle) and the EVM receiver board (right), together with the connection of the egtp to/from the egtp-io (flat cable) and the connection to the EVM receiver (LVDS S-LINK64 cable); b) close-up view of the egtp system board.

Two different sets of tests have been performed using this test bench. In the first test (slow test, limited to rates up to 20 kHz), the data arriving via the S-LINK64 are dumped to the PC's hard disk and analyzed offline, in order to verify that the system responds adequately to the S-LINK64 backpressure and that the LHC beam structure is correctly reproduced. In the second test, fast readout measurements have been taken: the data are read and analyzed online at full speed.

The trigger number, partition event number and trigger rules have been verified for a large number of accumulated events (up to 2×10^9), for trigger rates from 10 kHz up to a total of 750 kHz, and for different DAQ-partition and sub-detector-partition schemes (up to eight partitions). The synchronization of the L1A signal with the corresponding event fragment has been verified for up to 2×10^8 events, using a fast CAEN counter-timer, at a rate of 120 kHz. Moreover, the event timing has been tested, for different trigger rates, and follows Poisson statistics (Fig. 6).

Fig. 6. Probability of observing a subsequent trigger as a function of time, for events generated at 100 kHz.

V. CONCLUSIONS

A Global Trigger Processor emulator system (egtp) has been developed, aiming at decoupling the L1 trigger system (underground) from the readout system (surface DAQ) for testing, installation and maintenance purposes. The egtp system emulates the Global Trigger Processor by generating Level-1 triggers, applying partitioning, emulating the LHC proton beam structure, generating trigger-summary pseudo-data and receiving feedback signals from the DAQ and detector partitions. The egtp system is based on the multi-purpose PCI card Generic III, hosted in a personal computer; data transmission to the receiver board (FRL) is achieved via the S-LINK64 protocol. Extensive tests have verified that the system responds satisfactorily to the S-LINK64 backpressure and that the LHC beam structure is correctly reproduced. Furthermore, the trigger number, partition event number, trigger rules and event timing have been verified for a large number of accumulated events, for different trigger rates and for different DAQ-partition and sub-detector-partition schemes.

VI. ACKNOWLEDGMENT

We wish to thank Eric Cano and Dominique Gigi for their constant support during the development and testing of the egtp system. Fruitful discussions with Christoph Schwick and Joao Varela are gratefully acknowledged.

VII. REFERENCES

[1] CMS Collaboration, "The Trigger and Data Acquisition Project, Volume I, TDR", CERN/LHCC 2000/038.
[2] CMS Collaboration, "The Trigger and Data Acquisition Project, Volume II, TDR", CERN/LHCC 2002/026.
[3] "GIII FedKit", http://cmsdoc.cern.ch/cms/tridas/html/documents.html
[4] A. Racz, R. McLaren, E. van der Bij, "The S-LINK 64 bit extension", http://hsi.web.cern.ch/hsi/s-link/spec.
[5] Altera, http://www.altera.com.
[6] Celoxica, http://www.celoxica.com.
[7] National Instruments, http://www.ni.com.
[8] V. Brigljevic, G. Bruno, E. Cano, S. Cittolin, S. Erhan, D. Gigi, F. Glege, R. Gomez-Reino Garrido, M. Gulmini, J. Gutleber, C. Jacobs, M. Kozlovszky, H. Larsen, I. Magrans de Abril, F. Meijers, E. Meschi, S. Murray, A. Oh, L. Orsini, L. Pollet, A. Racz, D. Samyn, P. Scharff-Hansen, C. Schwick, P. Sphicas, J. Varela, "FEDkit: a design reference for CMS data acquisition inputs", CMS CR 2003/036.