CLAS12 DAQ & Trigger Status and Timeline. Sergey Boyarinov, Oct 3, 2017

Notation

ECAL: old EC (electromagnetic calorimeter)
PCAL: preshower calorimeter
DC: drift chamber
HTCC: high threshold Cherenkov counter
FT: forward tagger
TOF: time-of-flight counters
MM: micromegas tracker
SVT: silicon vertex tracker
VTP: VXS trigger processor (used on trigger stages 1 and 3)
SSP: subsystem processor (used on trigger stage 2)
FADC: flash analog-to-digital converter
FPGA: field-programmable gate array
VHDL: VHSIC Hardware Description Language (used to program FPGAs)
Xilinx: FPGA manufacturer
Vivado_HLS: Xilinx High-Level Synthesis (translates C++ to VHDL)
Vivado: Xilinx Design Suite (translates VHDL to a binary image)
GEMC: CLAS12 GEANT4-based simulation package
CLARA: CLAS12 reconstruction and analysis framework
DAQ: data acquisition system

CLAS12 DAQ Status

- Online computer cluster is 100% complete and operational
- Networking is 100% complete and operational
- DAQ software is operational; reliability is acceptable
- Performance observed (KPP): 5 kHz, 200 MB/s, 93% livetime (livetime is defined by a hold-off timer of 15 µs)
- Performance expected: 10 kHz, 400 MB/s, >90% livetime
- Performance limit expected: 50 kHz, 800 MB/s, >90% livetime (based on Hall D DAQ performance)
- Electronics installed (and used for KPP): ECAL, PCAL, FTOF, LTCC, DC, HTCC, CTOF, SVT
- TO DO: electronics installed recently or to be installed soon: FT, FTMM, CND, MM, scalers, helicity
- TO DO: CAEN v1290/v1190 board (re)calibration
- TO DO: full DAQ performance test
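As a quick consistency check on the KPP numbers (a sketch, assuming each accepted event holds the DAQ off for the full 15 µs timer and nothing else contributes to dead time):

    \text{livetime} \approx 1 - f_{\text{trig}} \, t_{\text{hold-off}} = 1 - (5\,\text{kHz})(15\,\mu\text{s}) = 0.925 \approx 93\%

which agrees with the observed livetime.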

DAQ/Trigger Hardware Layout (diagram)

- Fast-detector PMTs (ECAL, PCAL, FTOF, CTOF, HTCC, LTCC, FT, CND) go through CAMAC (1 crate) and NIM (3 crates) ORTEC CFDs into 24 VXS crates [18 installed] holding FADCs, VTPs, SD, TI and ROC, and 20 VME crates [18 installed] holding v1290n/v1290/v1190 TDCs, DSC2s, FOUT, TI and ROC
- Drift chamber: 18 VXS crates with DCRBs, VTPs, SD, TI and ROC
- SVT: 3 VXS crates with VSCMs, SD, TI and ROC; MM: 1 VXS crate with SSPs, SD, TI and ROC
- Clusters and segments are sent to the global trigger crate (1 VXS: SSPs, GTP, SD, ROC) and distributed via the trigger distribution crate (1 VXS: TS, SD, TDs, ROC)
- Totals: 72 crates, ~110 readout controllers; installed: 65 crates, 87 readout controllers

Runcontrol: setting DAQ/Trigger parameters

CLAS12 DAQ Commissioning

- Cosmic test (no central detectors): set an electron trigger without HTCC (ECAL+PCAL and ECAL+PCAL+DC); check data integrity, check noise in the DC and other detectors, check monitoring tools
- Cosmic test (central detectors only): set a CTOF self-trigger and/or SVT self-trigger; check data integrity and monitoring tools
- Random pulser test: set low channel thresholds; measure event rate, data rate and livetime; check monitoring tool performance
- Beam performance test: running on the electron trigger, measure event rate, data rates and livetime with different beam conditions and DAQ/Trigger settings
- Timeline: cosmic and pulser tests one month (November); beam test one shift; we assume most problems will be discovered and fixed during the cosmic and pulser tests

CLAS12 Trigger Status

- All available trigger electronics are installed: 25 VTP boards (stages 1 and 3 of the trigger system) in the ECAL, PCAL, HTCC and FT crates, in Drift Chamber R3 for all sectors and R1/R2 for sector 5, and in the main trigger crate; plus 9 SSP boards in stage 2
- 20 more VTP boards have arrived and are being prepared for installation in the remaining DC crates and the FTOF crates; they should be ready soon
- The C++ trigger implementation is complete for ECAL, PCAL, DC and HTCC; the TOF C++ implementation and the FT C++ simulation remain to be done
- The trigger hardware implementation is complete for ECAL, PCAL and DC; HTCC and FT are in progress; TOF will follow
- A pixel trigger is implemented in ECAL and PCAL for cosmic calibration purposes
- Validation procedures are under development

CLAS12 Trigger System Layout (diagram)

Stage 1, per sector S1-S6:
- VTP (VXS Trigger Processor), inputs: ECAL and LTCC FADCs; outputs: energy sums, clusters
- VTP, inputs: PCAL FADCs; outputs: energy sums, clusters
- VTP, inputs: FTOF FADCs; outputs: hits
- VTPs, inputs: Drift Chamber Regions 1, 2 and 3 (one per region); outputs: lists of track segments
Stage 1, central and forward:
- VTP, inputs: Central TOF; outputs: hits
- VTP, inputs: HTCC; outputs: hits
- VTPs for FT (3)
Stage 2: one SSP (Sub System Processor) per sector (S1-S6) plus a central SSP make the sector-based trigger decision
Stage 3: a VTP acting as Global Trigger Processor makes the global trigger decision, driving the TS with 16 trigger bits and 16 calibration bits

PCAL+DC trigger event example

CLAS12 Trigger Components Summary

Stage 1, ECAL: clock 32 ns; algorithm: cluster finding with energy correction; input: FADC; output to trigger: 4 clusters; output to data stream: list of clusters; control registers: thresholds, windows
Stage 1, PCAL: clock 32 ns; algorithm: cluster finding with energy correction; input: FADC; output to trigger: 4 clusters; output to data stream: list of clusters; control registers: thresholds, windows
Stage 1, DC: clock 32 ns; algorithm: segment finding; input: DCRB; output to trigger: segment mask; output to data stream: segment mask; control registers: thresholds, windows
Stage 1, HTCC: clock 4 ns; algorithm: cluster finding; input: FADC; output to trigger: cluster mask; output to data stream: cluster mask; control registers: thresholds, windows
Stage 1, FT: clock 4 ns; algorithm: cluster finding; input: FADC; output to trigger: cluster mask; output to data stream: cluster mask; control registers: thresholds, windows
Stage 2, sector: clock 4 ns; algorithm: coincidence of stage 1 components, DC road finding; input: stage 1; output to trigger: component mask; output to data stream: component mask; control registers: coincidence logic, windows
Stage 3, global: clock 4 ns; algorithm: coincidence of stage 2 components; input: stage 2; output to trigger: 16-bit trigger; output to data stream: component mask; control registers: coincidence logic, windows

NOTE: FTOF, CTOF and CND can be added to trigger stage 1.
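To make the stage 2 row concrete, below is a minimal software sketch of a sector-based coincidence over stage 1 component masks. The names, bit assignments and window handling are illustrative assumptions, not the actual SSP firmware; the real decision runs in the FPGA on the 4 ns clock, with windows set through the configuration file.

    #include <cstdint>
    #include <deque>

    // Sketch only: sector-level (stage 2) coincidence over stage 1 component
    // masks. Bit assignments and class names are hypothetical.
    enum Component : uint16_t {
      ECAL_CLUSTER = 1 << 0,
      PCAL_CLUSTER = 1 << 1,
      DC_SEGMENT   = 1 << 2,
      HTCC_CLUSTER = 1 << 3,
    };

    class SectorTrigger {
    public:
      // 'required' is the AND-mask of components; 'window' is the
      // coincidence window in 4 ns clock ticks.
      SectorTrigger(uint16_t required, int window)
          : required_(required), window_(window) {}

      // Call once per clock tick with that tick's stage 1 component mask.
      // Returns true when every required component was seen in the window.
      bool tick(uint16_t mask) {
        history_.push_back(mask);
        if (static_cast<int>(history_.size()) > window_) history_.pop_front();
        uint16_t seen = 0;
        for (uint16_t m : history_) seen |= m;  // extend components over the window
        return (seen & required_) == required_;
      }

    private:
      uint16_t required_;
      int window_;
      std::deque<uint16_t> history_;
    };

    // Example: an ECAL*PCAL*HTCC electron trigger with a 100 ns (25 tick) window:
    //   SectorTrigger trig(ECAL_CLUSTER | PCAL_CLUSTER | HTCC_CLUSTER, 25);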

CLAS12 Trigger Requirements

#  | Notation | Reaction           | Final state    | Trigger                              | Purpose
1  | R1       |                    |                | Generator                            | DAQ
2  | R2       |                    |                | Random trigger                       | Faraday cup, background
3  | E1       | ep -> e'X          | e' (CLAS)      | ECtot*(DC track)                     | ECAL only, prescaled
4  | E2       | ep -> e'X          | e' (CLAS)      | PCAL*ECtot*(DC track)                | PCAL*ECAL, prescaled
5  | E3       | ep -> e'X          | e' (CLAS)      | HTCC*E2                              | inclusive electron trigger
6  | FT       | ep -> e'X          | e' (FT)        | FT cluster (Emin, Emax)              | FT only, prescaled
7  | FT HODO  | ep -> e'X          | e' (FT)        | FT*HODO1*HODO2                       | electron in FT, prescaled
8  | FT π0    | ep -> e'X + π0     | e' (FT) + π0   | FT_HODO * 2 clusters                 | electron in FT + 2γ, prescaled
9  | H1       | ep -> e'h + X      | 1 hadron       | FTOF1a*FTOF2*PCAL*(DC track)         | one hadron, prescaled
10 | H2       | ep -> e'2h + X     | 2 hadrons      | FTOF1a*FTOF2*PCAL*(DC track)         | two hadrons, prescaled
11 | H3       | ep -> e'3h + X     | 3 hadrons      | FTOF1a*FTOF2*PCAL*(DC track)         | three hadrons, prescaled
12 | mu2      | ep -> e'p + µ+µ-   | p + µ+µ-       | FTOF1a*FTOF2*[PCAL+ECtot]*(DC track) | µ+µ- + hadron
13 | FT1      | ep -> e' + h       | e' (FT)+h+X    | FT_hodo*H1                           | FT + 1 hadron, prescaled
14 | FT2      | ep -> e' + 2h      | e' (FT)+2h+X   | FT_hodo*H2                           | FT + 2 hadrons, prescaled
15 | FT3      | ep -> e' + 3h      | e' (FT)+3h+X   | FT_hodo*H3                           | FT + 3 hadrons

Trigger Configuration file VTP_CRATE adcecal1vtp # trigger stage 1 VTP_ECINNER_HIT_EMIN 100 # strip energy threshold for 1-dim peak search VTP_ECINNER_HIT_DT 8 # +-8clk = +-32ns: strip coincidence window to form 1-dim peaks, # and peak-coincidence window to form 2-dim clusters VTP_ECINNER_HIT_DALITZ 568 584 # x8: 71<dalitz<73.. SSP_CRATE trig2 # trigger stage 2 SSP_SLOT all SSP_GT_STRG_ECALIN_CLUSTER_EMIN_EN 0 1 # enable ECAL_INNER clusters in trigger SSP_GT_STRG_ECALIN_CLUSTER_EMIN 0 2000 # ECAL_INNER cluster threshold SSP_GT_STRG_ECALIN_CLUSTER_WIDTH 0 100 # coincidence window to coincide with PCAL etc.. VTP_CRATE trig2vtp # trigger stage 3 # slot: 10 13 09 14 08 15 07 16 06 17 05 18 04 19 03 20 # payload: 01 02 03 04 05 06 07 08 09 10 11 12 13 14 15 16 VTP_PAYLOAD_EN 0 0 0 0 1 0 1 0 1 0 1 0 1 0 1 0 # 6780 corresponds to 7900 FADC latency VTP_GT_LATENCY 6780 # sector bits: trig number, ssp trig mask, ssp sector mask, multiplicity, # coincidence=number_of_extended_clock_cycles VTP_GT_TRGBIT 8 1 1 1 1 There is a plan to provide GUI to handle configuration files 12

CLAS12 Trigger Validation

- All stage 1 trigger algorithms are either implemented in C++ (ECAL, PCAL, HTCC) or have a C++ simulation of the hardware implementation (FT to do)
- How the higher stages will be simulated is under discussion, likely the same way as stage 1
- The validation process has two steps:
  1. Comparison between the C++ implementation and the hardware response
  2. Comparison between the C++ implementation and GEANT/reconstruction results

FPGA Image Building Chain and Validation

Step 1: comparison between the C++ implementation and the hardware response (cosmic and beam).

Trigger C++ --(Xilinx Vivado_HLS)--> .VHDL --(Xilinx Vivado)--> .bin (FPGA image) --(DAQ)--> .evio --> Step 1 validation
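For a feel of what the front of this chain consumes, here is a generic Vivado_HLS-style kernel (a sketch, not CLAS12 trigger source code): a plain C++ function with HLS pragmas that the tool translates to VHDL.

    #include <ap_int.h>  // Vivado HLS arbitrary-precision integer types

    // Generic sketch: integrate 16 FADC samples above a pedestal. With the
    // pragmas below, Vivado_HLS pipelines the function to accept one input
    // set per clock and fully unrolls the sample loop.
    ap_uint<16> pulse_integral(const ap_uint<12> samples[16], ap_uint<12> pedestal) {
    #pragma HLS PIPELINE II=1
      ap_uint<16> sum = 0;
      for (int i = 0; i < 16; i++) {
    #pragma HLS UNROLL
        ap_uint<12> s = samples[i];
        if (s > pedestal) sum += (ap_uint<16>)(s - pedestal);
      }
      return sum;
    }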

Xilinx Vivado_HLS (C++ to VHDL)

Xilinx Vivado (VHDL to FPGA image)

Validation Step 1: Comparison between C++ implementation and hardware response

/work/boiarino/data/vtp1_001212.evio.0, event 3:

FPGA implementation data bank:
    TRIG PEAK [0][0]: coord=298 energy=1610 time=7
    TRIG PEAK [1][0]: coord=174 energy=1012 time=7
    TRIG PEAK [1][1]: coord=174 energy=294 time=8
    TRIG PEAK [2][0]: coord=447 energy=1226 time=7
    TRIG HIT [0]: coord=298 174 447 energy=4936 time=7

C++ implementation using FADC data:
    U: IT=7 peaku[0]: coord=298 energy=1610
    V: IT=7 peakv[0]: coord=174 energy=1012
    W: IT=7 peakw[0]: coord=447 energy=1226
    SOFT HIT: coord=298 174 447 energy=4936
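In code, the step 1 check amounts to decoding both banks and demanding bit-for-bit agreement. A minimal sketch with hypothetical structures (the production tool parses the EVIO banks directly):

    #include <cstdio>
    #include <vector>

    // Sketch only: compare peaks decoded from the FPGA trigger bank with
    // those from the C++ emulation run on the same event's FADC data.
    struct Peak { int view, coord, energy, time; };  // view: 0=U, 1=V, 2=W

    bool same_peaks(const std::vector<Peak>& fpga, const std::vector<Peak>& soft) {
      if (fpga.size() != soft.size()) return false;
      for (size_t i = 0; i < fpga.size(); i++)
        if (fpga[i].view != soft[i].view || fpga[i].coord != soft[i].coord ||
            fpga[i].energy != soft[i].energy)  // hardware and C++ must agree exactly
          return false;
      return true;
    }

    int main() {
      // The time=7 peaks that formed the hit in the event 3 example above.
      std::vector<Peak> fpga = {{0, 298, 1610, 7}, {1, 174, 1012, 7}, {2, 447, 1226, 7}};
      std::vector<Peak> soft = {{0, 298, 1610, 7}, {1, 174, 1012, 7}, {2, 447, 1226, 7}};
      printf("%s\n", same_peaks(fpga, soft) ? "MATCH" : "MISMATCH");
    }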

Validation Step 2: Comparison between C++ implementation and GEANT/CLARA

GEMC or DAQ (.evio) -> Trigger C++ (.evio) -> CLARA offline reconstruction

NOTE: this step allows the Offline Reconstruction team to get ready to process trigger results during beam time.

C++ trigger on GEANT data: ECAL event example

Validation Step 2: Comparison between C++ implementation and GEANT/CLARA (cont.)

ECAL cluster finding, trigger minus offline differences (histograms, 4319 entries each):
- Delta energy (%): mean 1.284, RMS 0.438
- Delta coord U (strip*8): mean -0.042, RMS 0.277
- Delta coord V (strip*8): mean -0.023, RMS 0.273
- Delta coord W (strip*8): mean -0.013, RMS 0.252

Validation Step 2: Comparison between C++ implementation and GEANT/CLARA (cont.)

ECAL cluster finding for 5 GeV electrons (no PCAL): energy deposition for single-cluster events, offline vs C++ trigger (histograms, 4319 entries each):
- Offline: mean 1329 MeV, RMS 82; Gaussian fit: mean 1335 ± 0.9 MeV, sigma 58.6 ± 0.7 MeV
- Trigger: mean 1339 MeV, RMS 84.4; Gaussian fit: mean 1345 ± 0.9 MeV, sigma 60.2 ± 0.7 MeV

Sampling fraction is about 27% (roughly 1335 MeV deposited out of 5 GeV).

CLAS12 Trigger Commissioning

- Validation step 1 (cosmic): run on the electron trigger (ECAL+PCAL and ECAL+PCAL+DC) and compare hardware trigger results with the C++ trigger implementation results
- Validation step 2 (GEANT with different event generators): process the GEANT evio file (currently contains ECAL and PCAL; HTCC, DC and FT will be added) through the C++ trigger implementation, producing an evio file with trigger banks for subsequent offline reconstruction analysis
- Validation step 1 (beam): run on the electron trigger (ECAL+PCAL, ECAL+PCAL+DC, ECAL+PCAL+HTCC) and compare hardware trigger banks with the C++ trigger implementation banks
- Beam data with a random pulser, for trigger efficiency studies in offline reconstruction
- Beam data with the electron trigger (ECAL+PCAL [+HTCC] [+DC]) at different trigger settings, for trigger efficiency studies in offline reconstruction
- Beam data with the FT trigger, for trigger efficiency studies in offline reconstruction
- Timeline: cosmic and GEANT tests are under way until beam time; beam validation one shift, after which data taking for detector and physics groups

Future trigger development plans

- Geometrical matching between different detectors (stage 2)
- Drift chamber road finding, probably using drift-time information (stage 2)
- Trigger on track multiplicity (stage 3)
- Timeline: during the following year; may require new hardware; requests from physics groups will help set priorities

Conclusion

- CLAS12 DAQ, computing and networking were tested during the KPP run and worked as expected; reliability meets CLAS12 requirements; performance looks good, but final tests have to be done with the full CLAS12 configuration (waiting for the remaining detector electronics and the complete trigger system)
- The trigger system (including ECAL and PCAL) worked as expected during KPP; other trigger components (DC, HTCC and FT) were added after KPP and are being tested with cosmics; the remaining hardware has been received and is being installed
- Timeline: in the few weeks before beam, the full system will be commissioned by taking cosmic data and running on the random pulser; two shifts are planned for beam commissioning