CMS event display and data quality monitoring at LHC start-up


I Osborne et al 2008 J. Phys.: Conf. Ser. 119 032031

Ianna Osborne 1,*, George Alverson 1, Giulio Eulisse 1, Shahzad Muzaffar 1, Lucas Taylor 1, Lassi Tuura 1, Nicola Amapane 2,3, Riccardo Bellan 2,3, Gianluca Cerminara 3, Domenico Giordano 4, Norbert Neumeister 5, Chang Liu 5, Gianni Masetti 6, Maria-Santa Mennea 7, Giuseppe Zito 7 and Ilaria Segoni 8

1 Northeastern University, Boston, MA, USA
2 Università di Torino, Italy
3 INFN, Torino, Italy
4 Università & INFN di Bari, Italy
5 Purdue University, West Lafayette, USA
6 Università di Bologna e Sezione dell'INFN, Bologna, Italy
7 INFN Bari, Italy
8 CERN, Geneva, Switzerland

* To whom any correspondence should be addressed. E-mail: Ianna.Osborne@cern.ch

Abstract. The event display and data quality monitoring visualisation systems are especially crucial for commissioning CMS in the imminent CMS physics run at the LHC. They have already proved invaluable for the CMS magnet test and cosmic challenge. We describe how these systems are used to navigate and filter the immense amounts of complex event data from the CMS detector and to present clear and flexible views of the salient features to the shift crews and offline users. These views allow shift staff and experts to navigate from a top-level general view to very specific monitoring elements in real time, helping them to validate data quality and to ascertain the causes of problems. We describe how events may be accessed in the High Level Trigger filter farm, at the CERN Tier-0 centre, and in offsite centres, to help ensure good data quality at all points in the data processing workflow. Emphasis has been placed on deployment issues, so that experts and general users alike may use the visualization systems at CERN, in remote operations and monitoring centres offsite, and from their own desktops.

1. Introduction
Heading toward the LHC start-up, the CMS offline group is preparing and testing its final software systems, especially the tools for data quality monitoring. An important component of monitoring the quality of data in any experiment is an event display: software that provides a visual interpretation of the data and the detector on an event-by-event basis. The deployment and operation of the CMS event display started with the CMS magnet test and cosmic challenge (MTCC) [1]. The ongoing sub-detector commissioning and the monthly global runs give invaluable feedback to the developers. This experience of operating in a nearly running experiment helps to build a stable and robust system, one that should run with minimal maintenance effort for years.

The CMS event display is built on the Interactive Graphics for User ANAlysis (IGUANA) visualization framework and toolkit [4, 5, 6] and is fully integrated with the CMS software framework.

2. CMS Event Display Operation during the MTCC
During the MTCC, a combined 20-degree slice of the CMS sub-detectors, operated with the magnet, triggered on and recorded cosmic rays and checked noise and inter-operability using readout and auxiliary systems as close to final as possible. The 24/7 CMS operation procedures were tried out, which encouraged the use of the final systems as far as possible while deliberately limiting any MTCC-specific development. Around 25 million good events, i.e. events in which a cosmic muon was reconstructed, were recorded with at least the drift tube (DT) triggers and the electromagnetic calorimeter (ECAL) and tracker (TK) in readout, including 15 million events at a stable field of 3.8 T. The data-taking efficiency reached over 90% for extended periods.

Figure 1: The CMS event display workflow during the MTCC.

The real-time feedback from the CMS event display and the Data Quality Monitoring system helped in tuning and timing-in the sub-detectors. Both systems were deployed in the LHC Point 5 (P5) control room (see Fig. 1) for the 24/7 MTCC operations. The multiple instances of running event displays can be divided into three categories:

- On-line event displays, which connect directly to the central DAQ data streams and display data in real time.
- Quasi-on-line event displays, which receive the data from local files on the local network with a very short latency.
- Off-line event displays, which get the data from the Tier-0 or a Tier-1 centre.

An on-line event display is installed in the control room on a private network. The connection to the data source is implemented via an HTTP connection to the storage manager running in the central DAQ. Whenever the storage manager was not running, the event display would fetch the data from local files or from an event playback server. The on-line event displays regularly produce snapshots and publish them to a web server. The MTCC quasi-on-line event displays, such as the custom tracker event display and the event displays at the Remote Operations Centre (ROC), are installed on a public network and may experience some latency in accessing the latest data. The off-line event displays access the data distributed by the data management systems.

The event display operation at P5 is monitored via the Cacti logging and graphing system [9] and a web-cam, and is remotely controlled via Virtual Network Computing (VNC). The event display operator on shift is responsible for submitting regular reports to a logbook.

3. What is IGUANA
IGUANA is a visualization environment: it allows a user to assemble his or her own visualization application, for example a CMS event display, at run time. IGUANA is based on plug-ins, known as iglets. An iglet is a software module that adds specific features or services to an IGUANA application. An iglet-based event display is fully configurable at start-up, and its functionality is defined by the capabilities of the loaded iglets (see Fig. 2). A text label assigned to each iglet at compile time can be used in a configuration file to load the iglet at run time, as the sketch below illustrates.
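To make the plug-in mechanism concrete, the following self-contained C++ sketch mirrors the pattern described above: each iglet registers a factory under a compile-time text label, and the application instantiates whichever labels its configuration lists. All names here (Iglet, IgletRegistrar, TrackIglet) are hypothetical illustrations, not the actual IGUANA interfaces.

```cpp
// Hypothetical sketch of label-based plug-in registration and loading.
#include <functional>
#include <iostream>
#include <map>
#include <memory>
#include <string>
#include <vector>

// Base interface an iglet would implement.
class Iglet {
public:
  virtual ~Iglet() = default;
  virtual void attach() = 0;  // add this iglet's features to the application
};

using IgletFactory = std::function<std::unique_ptr<Iglet>()>;

// Registry mapping compile-time text labels to iglet factories.
std::map<std::string, IgletFactory>& igletRegistry() {
  static std::map<std::string, IgletFactory> registry;
  return registry;
}

// Registers a factory at static-initialisation time, standing in for a
// compile-time registration macro.
struct IgletRegistrar {
  IgletRegistrar(const std::string& label, IgletFactory factory) {
    igletRegistry()[label] = std::move(factory);
  }
};

// An example iglet adding a track-display capability.
class TrackIglet : public Iglet {
public:
  void attach() override { std::cout << "track views enabled\n"; }
};

static IgletRegistrar trackRegistrar("Tracks", [] {
  return std::unique_ptr<Iglet>(new TrackIglet);
});

int main() {
  // Labels as they might be read from a configuration file.
  const std::vector<std::string> configured = {"Tracks", "CaloTowers"};
  for (const auto& label : configured) {
    auto it = igletRegistry().find(label);
    if (it != igletRegistry().end())
      it->second()->attach();
    else
      std::cout << "no loaded iglet provides '" << label << "'\n";
  }
}
```

In the real framework the loading involves shared libraries and a plug-in manager; the sketch only mirrors the mapping from configuration-file labels to capabilities.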

4. IGUANA and CMSSW

Figure 2: The CMS event display loads the iglets listed in the configuration file to define its functionality.

4.1. CMSSW Event Data Model
The CMSSW event data model [5] requires that all event data processing modules communicate only through a single data structure called the Event. The modules are executed according to the schedule specified in the job configuration path. A special input source module can be used to insert into the Event the raw data received from the CMS DAQ. Thus the offline software can run in a completely transparent fashion both in purely offline applications and in the High Level Trigger (HLT). The reconstruction modules get the input data collections they need from the Event and produce higher-level reconstructed data collections, which in turn become part of the Event. Filter modules are used online to avoid forwarding events to the Storage Manager application, whose task is to collect accepted events from several Filter Unit (FU) nodes and assign them to event streams identified on the basis of the HLT pattern [3].

4.2. Event Data Collection Display
The event display is fully integrated with the CMSSW framework: it instantiates and runs the Event Processor, just as a usual CMS analysis job does. Like an analysis job, the event display needs a configuration file that defines a process with a data source and/or Event Setup. The event display uses the CMS event data model to display Event data and allows the user to add CMSSW modules to the process path to produce and display new transient data collections. The event display takes auxiliary information from the Event Setup (see Fig. 3). The configuration file can also define the event display configuration, for example a list of iglets to load.

Figure 3: CMS event processing. IGUANA runs on a configuration file: a process that defines the data source, the Event Setup producers and the module schedule.

The data source or data file has to be accessible from the system where the event display runs, preferably with a very short latency. The event display dynamically discovers the Event content: it retrieves all event data collections from the data source. The data collections within the Event are uniquely identified by four quantities:

- the C++ class type of the data;
- a module label;
- a product instance label;
- a process name.

The event display retrieves the complete list of data collections from the data source and presents them in a hierarchical tree.
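As an illustration of how these four identifiers appear in code, here is a minimal sketch in the style of the CMSSW analyzer interface of that era. The class name TrackDisplayClient and the module label "generalTracks" are illustrative choices, not details taken from the paper.

```cpp
// Sketch of retrieving one event-data collection by its identifiers.
#include "FWCore/Framework/interface/EDAnalyzer.h"
#include "FWCore/Framework/interface/Event.h"
#include "FWCore/ParameterSet/interface/ParameterSet.h"
#include "DataFormats/Common/interface/Handle.h"
#include "DataFormats/TrackReco/interface/TrackFwd.h"

class TrackDisplayClient : public edm::EDAnalyzer {
public:
  explicit TrackDisplayClient(const edm::ParameterSet&) {}

  virtual void analyze(const edm::Event& event, const edm::EventSetup&) {
    // The C++ class type is the Handle template argument; the string is
    // the module label. The product instance label defaults to "" and a
    // process name may be added via an edm::InputTag when needed.
    edm::Handle<reco::TrackCollection> tracks;
    event.getByLabel("generalTracks", tracks);
    if (tracks.isValid()) {
      // ... hand the collection to the Twig that knows how to draw tracks
    }
  }
};
```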

The data collection type is used to register an iglet capability. Every data collection is represented by a Twig, a C++ class. There can be more than one Twig for the same collection data type, defined in different iglets, and a single Twig can represent more than one collection. Every Twig defines how it displays its data collection within its Twig::update member functions. At start-up the user decides which iglets to load to obtain a specific representation (see Fig. 4). If a collection cannot be displayed, it means that no iglet capable of displaying that particular data type has been loaded.

Figure 4: Dynamic track representation.

4.3. Event Setup Display
The Event data constituents usually belong to a detector unit and are positioned in its local coordinate system. IGUANA displays the data and geometry in global coordinates; the local-to-global transformation is done via the Event Setup (see Fig. 5). The iglets are notified when the Event Setup producer delivers the geometry.

Figure 5: Local-to-global coordinate conversion is done via the Event Setup.
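The local-to-global step might look like the following sketch, assuming the standard CMSSW geometry interfaces; the use of the tracker geometry and the function name localToGlobal are illustrative choices.

```cpp
// Sketch: convert a point from a detector unit's local frame to the CMS
// global frame, using the geometry delivered by the Event Setup.
#include "FWCore/Framework/interface/EventSetup.h"
#include "FWCore/Framework/interface/ESHandle.h"
#include "Geometry/TrackerGeometryBuilder/interface/TrackerGeometry.h"
#include "Geometry/Records/interface/TrackerDigiGeometryRecord.h"
#include "DataFormats/DetId/interface/DetId.h"
#include "DataFormats/GeometryVector/interface/LocalPoint.h"
#include "DataFormats/GeometryVector/interface/GlobalPoint.h"

GlobalPoint localToGlobal(const edm::EventSetup& setup,
                          DetId id, const LocalPoint& local) {
  // The Event Setup producer delivers the geometry record.
  edm::ESHandle<TrackerGeometry> geometry;
  setup.get<TrackerDigiGeometryRecord>().get(geometry);
  // Each detector unit (GeomDet) carries the transformation between its
  // local coordinate system and the global CMS frame.
  return geometry->idToDet(id)->surface().toGlobal(local);
}
```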

The configuration file defines which Event Setup producers deliver the geometry for the application. It is possible to have more than one geometry producer in the same application, for example when it is necessary to compare the simulated geometry with the reconstructed one. The definition of the geometry comes from an XML description. The same Event Setup producer delivers either the MTCC geometry or the reconstructed geometry; that is why these two geometries cannot be used together in the same application.

5. Statistical Data Quality Monitoring
Complementary to the event-by-event view of the detector, cumulative information gives a very good overview of what is going on. The Physics and Data Quality Monitoring framework (DQM) [10] aims at providing such information within a homogeneous monitoring environment across the various applications related to data taking. Statistical DQM data in the form of 1D, 2D and 3D histograms, 1D and 2D profiles, scalars (integer and real numbers) and string messages can be booked and filled or updated anywhere in the context of reconstruction and analysis code. The DQM infrastructure takes care of publishing these objects, tracking updates, and transporting the updates to subscriber processes.

An IGUANA-based DQM GUI prototype has been deployed and successfully tested during the MTCC (see Fig. 1). The statistical data histograms were produced online and compared with reference histograms as part of the daily shift operation. The main advantage of the current DQM GUI implementation is its seamless integration with the event display's event-by-event views and vice versa: the event display views can include statistical data produced by DQM.
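To illustrate how such statistical objects enter the monitoring stream, here is a minimal booking-and-filling sketch assuming the DQMStore service interface of CMSSW; the folder and histogram names are illustrative only.

```cpp
// Sketch of booking and filling a DQM monitor element.
#include "FWCore/ServiceRegistry/interface/Service.h"
#include "DQMServices/Core/interface/DQMStore.h"
#include "DQMServices/Core/interface/MonitorElement.h"

// Booking, typically done once (e.g. at the start of a job).
MonitorElement* bookClusterChargeHistogram() {
  edm::Service<DQMStore> store;
  store->setCurrentFolder("Tracker/Clusters");  // illustrative folder
  return store->book1D("clusterCharge", "Cluster charge", 100, 0., 500.);
}

// Filling, done anywhere in reconstruction or analysis code; the DQM
// infrastructure publishes the element and ships updates to subscribers
// such as the DQM GUI.
void fillClusterCharge(MonitorElement* histogram, double charge) {
  histogram->Fill(charge);
}
```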
6. Summary and Outlook
IGUANA itself does not guarantee good quality data, but it does guarantee an honest view of the detector response in real time. The IGUANA-based event display and data quality monitoring systems described here are being used on a daily basis in the ongoing commissioning effort and global runs. The systems have proved to be robust and flexible: for example, the amount of displayed information can be configured, and the systems can be tailored to the needs of a detector-specific task. Feedback from the shift crews is being collected and future improvements are being implemented. The experience with event displays and visualization systems used in other experiments is reviewed whenever a newly requested feature is designed. The multiple specific views developed by the detector groups are being integrated into the central system. Finally, CMS physicists are involved in the development process. Early deployment of the tools guarantees better training of the CMS community and rapid improvement of the tools themselves, and helps in understanding the detector operation, which is the key to CMS readiness at the LHC start-up.

References
[1] CMS Collaboration 2006 The CMS Magnet Test and Cosmic Challenge (MTCC Phase I and II): Operational Experience and Lessons Learnt, CMS NOTE-2007/005
[2] G. Bruno, Software for the CMS Cosmic Challenge, CMS CR 2006/017
[3] CMS Collaboration 2002 The Trigger and Data Acquisition Project, Technical Design Report, Volume 2: Data Acquisition & High-Level Trigger, CERN/LHCC 2002-26
[4] CMS Collaboration 2006 CMS Physics Technical Design Report, Volume 1, Section 2.10: Visualization
[5] V. Innocente, G. Eulisse, S. Muzaffar, I. Osborne, L.A. Tuura, L. Taylor, Composite Framework for CMS Applications, CHEP04, Interlaken, Switzerland, September 27-October 1, 2004
[6] G. Alverson, G. Eulisse, S. Muzaffar, I. Osborne, L.A. Tuura, L. Taylor, IGUANA Architecture, Framework and Toolkit for Interactive Graphics, CHEP03, La Jolla, California, March 24-28, 2003
[7] M.S. Mennea, I. Osborne, A. Regano and G. Zito, CMS Tracker Visualization Tools, Nucl. Instrum. Meth. A 548 (2005) 391-400
[8] M.S. Mennea, A. Regano, G. Zito, I. Osborne, CMS Visualisation Tools, CHEP04, Interlaken, Switzerland, September 26-October 1, 2004
[9] Cacti, http://www.cacti.net/
[10] C. Leonidopoulos, E. Meschi, I. Segoni, G. Eulisse, D. Tsirigkas, Physics and Data Quality Monitoring at CMS, CHEP06, Mumbai, India, February 2006