The ATLAS Conditions Database Model for the Muon Spectrometer

Monica Verducci (1)
INFN Sezione di Roma, P.le Aldo Moro 5, 00185 Rome, Italy
E-mail: monica.verducci@cern.ch
on behalf of the ATLAS Muon Collaboration

The Muon System has started to make extensive use of the LCG conditions database project 'COOL' as the basis for all its conditions data storage, both at CERN and throughout the worldwide collaboration, as decided by the ATLAS Collaboration. The management of the Muon COOL conditions database will be one of the most challenging applications for the Muon System, both in terms of data volumes and rates and in terms of the variety of data stored. The Muon conditions database is responsible for the storage of almost all of the 'non-event' data and detector quality flags needed for debugging the detector operations and for performing reconstruction and analysis. COOL allows database applications to be written independently of the underlying database technology and ensures long-term compatibility with the entire ATLAS software. COOL implements an interval-of-validity database: objects stored or referenced in COOL have an associated start and end time between which they are valid, and the data are stored in folders, which are themselves arranged in a hierarchical structure of folder sets. The structure is simple and mainly optimized for storing and retrieving the object(s) associated with a particular time. In this work, an overview of the entire Muon conditions database architecture is given, including the different sources of the data and the storage model used. In addition, the software interfaces used to access the conditions data are described, with particular emphasis on the offline reconstruction framework ATHENA and the services developed to provide the conditions data to the reconstruction.

XII Advanced Computing and Analysis Techniques in Physics Research
Erice, Italy, 3-7 November 2008

(1) Speaker

Copyright owned by the author(s) under the terms of the Creative Commons Attribution-NonCommercial-ShareAlike Licence. http://pos.sissa.it

1. Introduction

The ATLAS Muon Spectrometer is the outer part of the ATLAS detector and is designed to detect charged particles exiting the barrel and end-cap calorimeters and to measure their momentum in the pseudorapidity range |η| < 2.7. It is also designed to trigger on these particles in the region |η| < 2.4 [1]. The quality of the muon measurement has been one of the guiding design criteria for the ATLAS experiment: the performance goal is a stand-alone transverse momentum resolution of approximately 10% for 1 TeV tracks, which translates into a sagitta along the z (beam) axis of about 500 µm, to be measured with a resolution of ~60 µm (the fractional momentum resolution being roughly the fractional sagitta resolution). This concern is reflected in the choice of the main components of the muon spectrometer: a system of three large superconducting air-core toroid magnets, precision tracking detectors with ~60 µm intrinsic resolution, and a powerful dedicated trigger system. Muon chamber planes are attached to the toroids to measure the muon trajectories. In the barrel, the natural layout consists of three layers of chambers: at the inner and outer edges of the magnetic volume and in the mid-plane, to measure the sagitta. In the forward direction the chambers are placed at the front and back faces of the toroid cryostats, with a third layer against the cavern wall, to maximize the lever arm of the point-angle measurement. The high-precision chambers are complemented by an independent fast trigger chamber system.

The precision momentum measurement is performed by the Monitored Drift Tube chambers (MDTs) and the Cathode Strip Chambers (CSCs). The MDT chambers cover the pseudorapidity range |η| < 2.7 (except in the innermost end-cap layer, where their coverage is limited to |η| < 2.0). These chambers consist of three to eight layers of drift tubes, operated at an absolute pressure of 3 bar, which achieve an average resolution of 80 µm per tube, or about 35 µm per chamber. In the forward region (2 < |η| < 2.7), CSC chambers are used in the innermost tracking layer because of their higher rate capability and time resolution. The precision-tracking chambers are complemented by a system of fast trigger chambers capable of delivering track information within a few tens of nanoseconds after the passage of the particle. In the barrel region (|η| < 1.05), Resistive Plate Chambers (RPCs) were selected for this purpose, while in the end-caps (1.05 < |η| < 2.4) Thin Gap Chambers (TGCs) were chosen.

The conditions data will be used during the ATLAS data taking, reconstruction and subsequent processing to describe the environment in which the events have been taken. These data have many different origins and are stored in different types of databases: a configuration database and a conditions database (see [2]). The configuration database will store all the data needed at the start of a run, including the sub-detector hardware and software configuration. The conditions database will store all the parameters describing the run conditions and logging, i.e. all the data which will be accessed offline by the reconstruction or analysis software.

The conditions database is closely related to the configuration database, which is needed to set up and run the detector hardware and the associated online and event selection software. Conditions data vary with time and are usually characterized by an interval of validity (IoV). They include data archived from the ATLAS Detector Control System (DCS), online book-keeping data, online and offline calibration and alignment data, and monitoring data characterizing the performance of the detector and software during any particular period of time. Figure 1 sketches the connections between the configuration and the conditions database in the ATLAS offline and online environment.

Fig. 1: Data flow of the events and conditions data in the ATLAS experiment. The configuration and conditions databases are shown (on the left of the picture) together with their links to the detector hardware and software of the experiment.

2. The ATLAS Muon Conditions Data

The Muon conditions data have different sources, both from the online side and from the offline data quality analysis, including the calibration and alignment parameters. The online data primarily define the configuration setup of the data taking (DAQ parameters) and of the Detector Control System (DCS), including hardware monitoring values such as temperatures, gas pressures, voltages, etc.
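As a purely illustrative example of the kind of DCS payload listed above, the following minimal Python sketch models a single time-stamped hardware monitoring record before it is archived to the conditions database; the class, the field names and the values are hypothetical and are not taken from the actual ATLAS DCS schema.

from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class DcsSample:
    """One hypothetical DCS monitoring sample for a muon chamber."""
    chamber_id: str          # detector element the reading refers to (illustrative name)
    timestamp: datetime      # when the values were read out
    temperature_c: float     # chamber temperature in degrees Celsius
    gas_pressure_bar: float  # absolute gas pressure in bar
    hv_volts: float          # applied high voltage in volts

# A single invented sample, as it might look before archival.
sample = DcsSample(
    chamber_id="BIL1A01",
    timestamp=datetime(2008, 11, 3, 12, 0, 0, tzinfo=timezone.utc),
    temperature_c=22.4,
    gas_pressure_bar=3.0,
    hv_volts=3080.0,
)
print(sample)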

Calibration and alignment constants come from analysis algorithms running on a dedicated muon calibration stream coming from the second level of the trigger [3, 4]. Finally, the data quality information summarizes the quality flags from the online and offline sides into a common result to be given to the reconstruction; for example, the 'dead' status flag of a detector element is a merge of the corresponding status from the online, offline and calibration streams. Each subdetector has its own database architecture and its own schema inside the ATLAS conditions servers ATONR and ATLR, the Oracle RACs running in the CERN computer centre for the online and offline environments, and at the Tier-1s. The folder structures, as well as the amounts of data, differ because of the different hardware layouts and analysis approaches of the detectors. The amount of data for each subdetector depends on its role and strategy in the tracking algorithms, and the data vary strongly with time. The Muon conditions deployment is in line with the overall ATLAS project.

3. The ATLAS Conditions Database

The ATLAS conditions database is based on Oracle; the conditions data, in particular those used for offline reconstruction, are implemented using the COOL technology [5]. COOL, an LCG product, is a library for managing conditions data in terms of intervals of validity (IoV), versions and tags, using CORAL as a backend. CORAL allows database applications to be written independently of the underlying database technology, which means that COOL databases can be stored in Oracle, SQLite or MySQL; see [5] for more details. Moreover, the COOL API has been integrated into the ATLAS online software. Several special-purpose higher-level interfaces are also being developed, including the Condition Database Interface (CDI) for archiving Information System (IS) data to COOL, the PVSS-to-COOL interface for archiving Detector Control System (DCS) data, and specialized interfaces for saving monitoring data.

The objects stored or referenced in COOL have an associated start and end time between which they are valid (the IoV). COOL data are stored in folders, which are themselves arranged in a hierarchical structure of folder sets. Within each folder, several objects of the same type are stored, each with its own interval of validity. These times are specified either as run/event numbers or as absolute timestamps, and the choice between the two formats is made according to meta-data associated with each folder. The objects in COOL folders can optionally be identified by a channel number (or channel ID) within the folder. Each channel has its own intervals of validity, but all channels can be dealt with together in bulk updates or retrievals.

COOL implements each folder as a relational database table, with each stored object corresponding to a row in the table. COOL creates columns for the start and end times of each object and, optionally, for the channel ID and the tag if they are used. Several other columns are also created (e.g. insertion time and object ID) to be used internally by the COOL system, but these are generally of no concern to the user. The payload columns (where the data are stored) are defined by the user when the table is created.
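To make the folder-as-table mapping concrete, the following self-contained Python sketch models a single COOL-like folder with SQLite: each stored object carries a validity range, a channel ID and one user-defined payload column, and a query retrieves the object valid at a given time for a given channel. This is only a conceptual illustration of the storage model described above, under assumed names; it is not the actual COOL schema or API, and all table, column and function names are invented.

import sqlite3

# Conceptual model of one conditions "folder": each row is a stored object
# with an interval of validity [since, until), a channel ID and a payload.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE folder_mdt_status (
        object_id   INTEGER PRIMARY KEY,   -- internal bookkeeping
        channel_id  INTEGER NOT NULL,      -- e.g. one chamber per channel
        iov_since   INTEGER NOT NULL,      -- start of validity (run/event or timestamp)
        iov_until   INTEGER NOT NULL,      -- end of validity
        status      INTEGER NOT NULL       -- user-defined payload column (inline data)
    )
""")

# Store a few objects: channel 1 changes status at time 150 (values invented).
rows = [
    (1, 1, 0, 150, 1),      # channel 1 "good" for [0, 150)
    (2, 1, 150, 10**9, 0),  # channel 1 "dead" from 150 onwards
    (3, 2, 0, 10**9, 1),    # channel 2 "good" for the whole range
]
conn.executemany("INSERT INTO folder_mdt_status VALUES (?, ?, ?, ?, ?)", rows)

def lookup(channel_id: int, time: int) -> int:
    """Return the payload valid at 'time' for 'channel_id' (IoV lookup)."""
    cur = conn.execute(
        "SELECT status FROM folder_mdt_status "
        "WHERE channel_id = ? AND iov_since <= ? AND ? < iov_until",
        (channel_id, time, time),
    )
    (status,) = cur.fetchone()
    return status

print(lookup(1, 100))  # -> 1 (good)
print(lookup(1, 200))  # -> 0 (dead)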

In ATLAS, the payload data can be stored in three ways. First, the payload data can be stored directly in one or more payload columns (inline data), where the columns directly represent the data being stored (e.g. a mixture of float and integer values representing status and parameter information). Second, the payload (in this case a single column) can be used to reference data stored elsewhere; this reference can be a foreign key to another database table, or a reference to something outside of COOL, e.g. a POOL object reference allowing an external object to be associated with intervals of validity. Third, the data can be stored as an inline CLOB in the database, i.e. the payload is defined to be a large character object (CLOB) whose internal structure is invisible to the COOL database; COOL is then responsible only for storing and retrieving the CLOB, and its interpretation is up to the client code.

Retrieving and storing the data inside a reconstruction job in the Athena framework (the offline reconstruction framework) is possible through the IOVService, a software interface between the COOL database and the reconstruction algorithms based on IoV ranges.

4. Conclusions

In this paper, an overview of the Muon conditions database has been given. The model has been tested during the computing and detector commissioning. During the cosmic-ray runs a full-chain test was performed, including the production of the calibration and alignment constants, their subsequent storage in the COOL folders and their later access during a reconstruction job. The results have been promising, and the Muon conditions schema is now in production; additional features will be implemented in the coming months.

References

[1] ATLAS Muon Collaboration, The ATLAS Muon Spectrometer Technical Design Report, CERN-LHCC/97-22, 31 May 1997.
[2] ATLAS Collaboration, ATLAS Computing Technical Design Report, CERN/LHCC/2005-022, 2005.
[3] E. Pasqualucci et al., Muon detector calibration in the ATLAS experiment: data extraction and distribution, in Proceedings of Computing in High Energy and Nuclear Physics (CHEP06), Mumbai, India, 13-17 February 2006.
[4] M. Verducci, ATLAS Conditions Database and Calibration Streams, Nucl. Phys. B (Proc. Suppl.) 172 (2007) 250-252.
[5] A. Valassi, COOL web page, http://lcgapp.cern.ch/project/conddb/