GNAM for MDT and RPC commissioning


GNAM for MDT and RPC commissioning. G. Crosetti, D. Salvatore, P.F. Zema (Cosenza); M. Della Pietra (Napoli); R. Ferrari, G. Gaudio, W. Vandelli (Pavia). 10th Muon Week, Cetraro, July 2005

Summary
* GNAM and OHP (in collaboration with the Pisa GNAM group)
* GNAM for the MDT and RPC test beam 2004
* From CTB to commissioning: MDT
* From CTB to commissioning: RPC
* Open issues
* Conclusions
10th Muon Week, Cetraro, Daniela Salvatore

GNAM
GNAM is a low-level monitoring system. It separates common actions from detector-specific ones.
* Core: interaction with the DAQ, data unpacking, histogram management, handling of histogram-related commands
* Plugins: data decoding and histogram filling, implementing the detector-specific physics ---> each detector group develops its own code
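The core/plugin split above can be sketched as follows. This is a minimal Python sketch with hypothetical names (the real GNAM is a C++ framework integrated with the ATLAS TDAQ online software): the core owns the event loop and the histogram bookkeeping, while a detector group supplies only a decode/fill plugin.

```python
class DetectorPlugin:
    """Interface each detector group implements (hypothetical names)."""
    def decode(self, raw_event):
        raise NotImplementedError
    def fill(self, decoded, histograms):
        raise NotImplementedError

class MDTPlugin(DetectorPlugin):
    """Toy MDT plugin: raw_event is a list of (tube_id, tdc_count)."""
    def decode(self, raw_event):
        # drop hits with an invalid (negative) TDC count
        return [hit for hit in raw_event if hit[1] >= 0]
    def fill(self, decoded, histograms):
        hit_map = histograms.setdefault("hit_map", {})
        for tube_id, tdc in decoded:
            hit_map[tube_id] = hit_map.get(tube_id, 0) + 1

class GnamCore:
    """Common part: event loop + histogram management."""
    def __init__(self, plugin):
        self.plugin = plugin
        self.histograms = {}
    def run(self, events):
        for event in events:
            decoded = self.plugin.decode(event)
            self.plugin.fill(decoded, self.histograms)
        return self.histograms

core = GnamCore(MDTPlugin())
hists = core.run([[(1, 120), (2, 95)], [(1, 130), (3, -1)]])
print(hists["hit_map"])  # prints {1: 2, 2: 1}
```

Swapping `MDTPlugin` for an RPC plugin changes only the decode/fill code, which is the point of the separation described above.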

Monitoring chain

Online Histogram Presenter
OHP retrieves histograms from the OH service and displays them in a powerful GUI.
* ROOT canvases available
* manages reference histograms
* OHP can be used as a browser, to quickly find and look at any histogram published on OHS...
* ... or to display a selection of significant histograms

OHP as a browser for OHS

OHP: a configured tab

GNAM at the MDT and RPC test beam
GNAM was used during the 2004 test beam to inspect the detector status, find faulty states and check calibrations.
* Four detector groups provided their specific code: TileCal, Pixel, MDT and RPC (MDT code by I. Boyko, D. Dedovitch, K. Nikolaev)
* MDT plots: hit maps, noise, TDC errors, beam profile, number of hits per event
* RPC plots (M. Della Pietra): hit profile for eta and phi, time correlation between confirm and pivot planes, time distribution for each strip

GNAM at the MDT and RPC test beam
Example plots: hit strip profile (ETA BML PIVOT), beam profile, time vs ijk (PAD-1_CM-0) for the pivot, confirm and trigger planes; the generated trigger is on time.

From CTB to commissioning: MDT
The MDT code was modified from the CTB version to satisfy the different requests of commissioning at the Pit:
1. manage a variable number and type of chambers instead of a fixed one
2. choose how to retrieve the MDT parameters and mapping ---> the new MapInterface routine (under construction) can access the DB and/or a text file
3. once read, the information is used in the same way independently of its source ---> map format modifications only require changes in MapInterface

From CTB to commissioning: MDT
4. some new histograms have been implemented (noise frequency, occupancy)... and others can be added on request
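The MapInterface idea (points 2 and 3 above) can be sketched as follows. All names and the map format are assumptions for illustration: the mapping can be built either from a text file or from a database query, but downstream code sees one uniform interface, so a change of map format only touches the loader.

```python
import io

def load_map_from_text(stream):
    """Parse lines of the (assumed) form: chamber_name n_mezzanines"""
    mapping = {}
    for line in stream:
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blanks and comments
        name, n_mezz = line.split()
        mapping[name] = {"n_mezzanines": int(n_mezz)}
    return mapping

def load_map_from_db(rows):
    """Build the same structure from DB query rows (here: tuples)."""
    return {name: {"n_mezzanines": n} for name, n in rows}

class MapInterface:
    """Uniform access, independent of where the map came from."""
    def __init__(self, mapping):
        self._map = mapping
    def chambers(self):
        return sorted(self._map)
    def n_mezzanines(self, chamber):
        return self._map[chamber]["n_mezzanines"]

text = io.StringIO("# BB5 test stand\nBML1 6\nBOS2 4\n")
from_text = MapInterface(load_map_from_text(text))
from_db = MapInterface(load_map_from_db([("BML1", 6), ("BOS2", 4)]))
print(from_text.chambers())  # prints ['BML1', 'BOS2']
```

Histogram-filling code only ever calls `MapInterface`, which is why a new map format requires no changes outside the loader.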

From CTB to commissioning: RPC (M. Della Pietra)
No major changes are needed for the monitoring at the BB5 cosmic ray test stand. Some work is needed for commissioning at the Pit:
1. better handling of the different kinds of stations
2. retrieving/storing the test results (dead or noisy strips) from/into a DB

GNAM will provide
* On-line histograms
  * MDT: TDC spectra vs mezzanine, noise vs channel, occupancy, ...
  * RPC: strip profile, time vs channel, correlation between two layers on the same plane
  * correlation histograms (MDT vs RPC)
* Histograms saved in a ROOT file
* MDT: text file with the noise frequency per tube
* RPC: text file with dead and/or noisy channels
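The "text file with the noise frequency per tube" output could look like the sketch below. The file name, format and the definition of noise frequency (out-of-window counts divided by live time) are assumptions, not the documented GNAM output.

```python
def noise_frequency(counts_out_of_window, live_time_s):
    """Per-tube noise rate in Hz from out-of-time hit counts."""
    return {tube: n / live_time_s
            for tube, n in counts_out_of_window.items()}

def write_noise_file(path, freqs):
    """Write one line per tube, sorted by tube id."""
    with open(path, "w") as f:
        f.write("# tube_id  noise_Hz\n")
        for tube in sorted(freqs):
            f.write(f"{tube}  {freqs[tube]:.1f}\n")

# toy example: tube 101 is noisy, tube 102 is quiet
freqs = noise_frequency({101: 50, 102: 5}, live_time_s=10.0)
write_noise_file("mdt_noise.txt", freqs)
print(freqs)  # prints {101: 5.0, 102: 0.5}
```

A plain-text format like this keeps the per-tube results easy to diff between runs and to feed back into a database later, as foreseen for the RPC dead/noisy channel lists.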

Resources needed
* DAQ:
  * GNAM needs the ATLAS TDAQ environment
  * events must have the ATLAS structure
  * ROD level is enough: GNAM doesn't need the whole DataFlow chain
  These requirements are already fulfilled thanks to the work done by Enrico.
* Computational power
  The computational power still has to be understood: it depends on the number of events, the trigger rate, the total time and the sampling efficiency.

Info needed: MDT
* Configuration
  * many mapping formats so far; we are trying to define a common, unique one
  * database: which interface?
  Producers are kindly asked to provide cabling schemes and geometrical information. P. Fleischmann has already been contacted for the BB5 mappings.

Info needed: RPC
* Configuration (M. Della Pietra)
  * as for the MDT, some effort is needed to prepare and add the cabling scheme, both for the RPC readout and for the LVL1 trigger
  * database: which interface?
* LVL1 trigger
  * it is not yet defined what kind of online monitoring has to be done for the LVL1 trigger

More open issues
* Error handling
  * should any kind of automatic check be performed?
  * what kind of alarms should be generated?
* Time scale
  * when will a running version be necessary? End of August?

Conclusions
GNAM is being constantly upgraded to satisfy the commissioning requests:
* many different chambers can be managed
* information will be retrieved from a DB or from files
* histograms can be added if needed
but we are still trying to clarify the pending open issues. More documentation will soon be available at http://pfzema.cs.infn.it/phys/gnam