The Compact Muon Solenoid Experiment. Conference Report. Mailing address: CMS CERN, CH-1211 GENEVA 23, Switzerland


Available on CMS information server: CMS CR-2009/098, 15 April 2009

FROG: The Fast And Realistic OpenGL Event Displayer

Abstract

FROG [1] [2] is a generic framework dedicated to the visualisation of events in high energy physics experiments. It is suitable for any particular physics experiment or detector design. The code is light (< 3 MB) and fast (browsing time of 20 events per second for a large high energy physics experiment) and can run on various operating systems, as its object-oriented structure (C++) relies on the cross-platform OPENGL [3] and GLUT [4] libraries. Moreover, FROG does not require the installation of third-party libraries for the visualisation. This document describes the features and principles of FROG version 1.106, its working scheme and its numerous functionalities, such as: 3D and 2D visualisation, graphical user interface, mouse interface, configuration files, production of pictures in various formats, integration of personal objects, etc. Finally, the application of FROG to physics experiments and environments such as GASTOF, CMS, ILD and Delphes is presented for illustration.

Presented at the 17th International Conference on Computing in High Energy and Nuclear Physics (CHEP09), 21-27 March 2009, Prague, Czech Republic, 30/04/2009.

1 Introduction

In high energy physics experiments, the possibility to visualise each event is crucial for several reasons. Understanding the event topologies can help in developing better data analysis algorithms. Visualisation can also serve as a powerful debugging tool for the experiment's simulation and reconstruction software. A visualisation tool has to be:

- fluid: draw > 60 frames per second
- fast: scan hundreds of events in a few seconds (required for commissioning/debugging purposes)
- light: the entire package should not exceed 10 MB (required for outreach purposes)
- easily upgradable
- an intuitive debugging tool
- able to provide nice illustrations for communications

Satisfying all of the above requirements simultaneously can require heavy usage of a computer's resources. In general, these resources are the main limitation of 3D visualisation. In large experiments such as CMS [5], complex algorithms are used to reconstruct and analyse physics data. Since these algorithms are processor and memory consuming, a fast visualisation tool should be decoupled from simultaneous physics calculations. The FROG [1] [2] philosophy is therefore to divide the software into a Producer and a Displayer (Figure 1). This choice has an impact on several other elements of the software, such as the input file format, the operating system portability and the structure of the software.

Figure 1: Factorisation of FROG into the Producer and Displayer components. The Producer is integrated in the experimental software. It extracts, processes and stores the data required by the Displayer, using the FROG File Format for data encapsulation. The Displayer is completely disconnected from the experimental software.

2 The FROG File Format

A dedicated FROG File Format (FFF) is needed in order to make the two parts of the code communicate with each other. It has to be highly compact, but the decoding time of the files also has to remain as small as possible. Finally, the FFF also has to be flexible. Its binary encoding ensures its universality with respect to the computer type and the operating system.
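The paper does not spell out the FFF byte layout, but a compact, flexible binary encoding of the kind it describes can be sketched as a sequence of typed chunks. The names, type codes and exact layout below are illustrative assumptions, not the real FFF (we also assume a little-endian host for brevity):

```cpp
#include <cassert>
#include <cstdint>
#include <sstream>
#include <string>
#include <vector>

// Hypothetical chunk-based binary event record: [uint32 type][uint32 size][payload].
// Fixed-width integers keep the encoding independent of the host word size.
struct Chunk {
    uint32_t type;              // e.g. 1 = hit, 2 = track (illustrative codes)
    std::vector<uint8_t> data;  // opaque payload, decoded by the Displayer
};

void writeChunk(std::ostream& out, const Chunk& c) {
    const uint32_t size = static_cast<uint32_t>(c.data.size());
    out.write(reinterpret_cast<const char*>(&c.type), sizeof(c.type));
    out.write(reinterpret_cast<const char*>(&size), sizeof(size));
    out.write(reinterpret_cast<const char*>(c.data.data()), size);
}

bool readChunk(std::istream& in, Chunk& c) {
    uint32_t size = 0;
    if (!in.read(reinterpret_cast<char*>(&c.type), sizeof(c.type))) return false;
    if (!in.read(reinterpret_cast<char*>(&size), sizeof(size))) return false;
    c.data.resize(size);
    return static_cast<bool>(in.read(reinterpret_cast<char*>(c.data.data()), size));
}
```

One appeal of such a scheme is that a reader meeting an unknown chunk type can simply skip `size` bytes, which is one way a format of this kind stays flexible as new object types are added.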

3 The FROG Producer

The Producer is the interface between the physics software of the detector/experiment 1) and the FROG Displayer. The FROG Producer builds the Events and the Geometry of a particular detector and stores them respectively into .vis and .geom files. The Producer is the only part of FROG that has to be adapted to the user's needs, to the experiment software and to the data format. Some Producer examples are available online as tutorials [6].

The Producer is in general code (generally C++) that converts the experiment data format to the FFF. Its task is to extract and process, once and for all, the data required by the Displayer. For instance, it can convert the position of a hit from the local coordinates of the detector to global coordinates. Such computations are in general relatively fast, but can slow down the 3D visualisation when repeated many times. Since the Producer does not use specific graphical libraries, it can be run in parallel on a computer cluster. The experiment software only needs to include the FROG class definitions in order to produce the .vis and .geom files. This reduced FROG package, containing only the class definitions, takes of the order of 0.5 MB on disk and is freely distributable.

4 The FROG Displayer

The Displayer has the unique function of displaying the content of the .geom and .vis files. It does not depend on the experiment software. The Displayer can be distributed and run on different platforms: Windows, Linux and MacOS are currently supported. The only requirement to execute the Displayer is the presence of the graphic libraries OPENGL [3] and GLUT [4]. FROG is completely generic in the sense that it has been designed to be usable for any experiment, and to allow each user to easily upgrade or add files in order to adapt FROG to his particular case. It can therefore be used for any physics experiment, and eventually also for other purposes.
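As an illustration of the kind of local-to-global computation the Producer performs once and for all (Section 3), a minimal hit transformation might look like the sketch below. The classes and alignment constants are invented for the example; they are not FROG code:

```cpp
#include <array>
#include <cassert>
#include <cmath>

// Hypothetical detector module placement: a 2D rotation plus a translation.
// A real Producer would read such alignment constants from the experiment
// software, apply them once per hit, and store the global position in the
// .vis file so the Displayer never repeats the computation while rendering.
struct ModulePlacement {
    double angle;   // rotation of the module in the global x-y plane (rad)
    double tx, ty;  // translation of the module origin (cm)

    std::array<double, 2> toGlobal(double lx, double ly) const {
        const double c = std::cos(angle), s = std::sin(angle);
        return {c * lx - s * ly + tx,   // global x
                s * lx + c * ly + ty};  // global y
    }
};
```

Doing this work in the Producer means the Displayer only ever draws precomputed global coordinates, which is precisely why the rendering loop can stay fast.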
All the style parameters are loaded from the FROG configuration file (FROG/config.txt). The geometry and the current event are displayed in the different 3D/2D views. The Displayer can easily render more than 60 frames per second. Mouse clicks are handled in order to outline (by flashing) the selected object and print out its information. Figure 2 shows a screenshot taken by the Displayer for a fictitious tracking experiment composed of eleven tracking layers, a beam source and an Ending Block that stops the beam.

5 The FROG Features

FROG is fully configurable by a set of parameters defined in an ASCII file (FROG/config.txt). This file contains, amongst other things 2):

- the path to the input .vis file and/or input .geom file
- the first event to display
- the objects' colours, styles and thresholds
- the views to be used

The paths in the configuration can be either local paths or URLs pointing to .vis/.geom files or to gzipped files (.vis.gz/.geom.gz). In the latter case, the files are automatically downloaded and uncompressed if needed. Many types of view are available (e.g. 3D, 2D, LegoPlot, Text, etc.) and users can easily add new types of view better suiting their needs.

FROG can be used to take screenshots (with the <ENTER> key) in various picture formats. Both vectorial picture formats (e.g. PS, EPS, TEX, PDF, SVG and PGF) and pixelized picture formats (e.g. PNG) are supported thanks to the GL2PS [7] and PNGLIB [8] libraries.

FROG can be used online to get quick clues on the quality of the data taken by the experiment. This is transparent to the user; online and offline usage of FROG only differ by the fact that in the online case, the Producer and the Displayer are used simultaneously. It is indeed possible to read the .vis file while the Producer is still pushing events into the same file.

1) Two examples of detector software are CMSSW in the CMS experiment and Athena in the ATLAS experiment.
2) For a complete list of features, please see [1].
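To make the Section 5 description of FROG/config.txt concrete, a configuration of this kind could look as follows. The key names and values here are invented for illustration; the real parameter names are documented in [1]:

```text
# FROG/config.txt -- hypothetical example; key names are illustrative
InputVisFile   = http://example.org/events/run1234.vis.gz   # local path or URL, gzipped files are unpacked automatically
InputGeomFile  = detector.geom
FirstEvent     = 0
TrackColour    = red
HitThreshold   = 0.5          # e.g. minimum energy at which a hit is drawn
Views          = 3D, 2D_Transversal, LegoPlot
```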

Figure 2: Three different FROG Displayer views: a large 3D view (top), a 2D longitudinal view (bottom right) and a transversal view (bottom left). The Particle Gun (grey), the Ending Block (grey) and the eleven tracking layers (blue) are visible. The particle track is shown in red.

6 Applications

FROG is already used in several experiments and environments, some of which are shown in this section. The examples described below show that FROG can be used in many different applications.

6.1 GASTOF: The Ultra-Fast Gas Time-of-Flight Detector

GASTOF [9] [10] detectors are Cherenkov gas detectors that will be located at 420 m from the CMS [5] and ATLAS [11] interaction points (IP), as part of the FP420 project [12] for the LHC [13]. The aim of these detectors is to reduce backgrounds due to accidental coincidences between events detected in the central detectors and in the FP420 detectors on each side of the IP. To achieve this, the event vertex z-coordinate measured by the central detectors is compared to the vertex reconstructed by measuring the difference in arrival times of the forward protons at the GASTOF detectors on the two sides of the given IP. Cherenkov photons produced by high energy protons traversing the gas medium are reflected by a mirror onto a very fast photomultiplier. Figure 3 shows a simulated GASTOF event displayed by FROG.

Figure 3: 3D view zoomed on the GASTOF mirror. The proton trajectory is represented by the blue line, while photon rays are shown in yellow. The Cherenkov photon production is clearly visible along the proton path. The majority of the produced photons are reflected by the curved mirror and focused on the photo-cathode.
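The vertex matching used by GASTOF reduces to a one-line formula: for stations placed symmetrically at z = ±L (L = 420 m) and protons travelling essentially at the speed of light, the arrival times from a vertex at z are tA = (L + z)/c and tB = (L - z)/c, so z = c·(tA - tB)/2. A minimal sketch under these assumptions (the function name and units are our own, not GASTOF software):

```cpp
#include <cassert>
#include <cmath>

// Reconstruct the vertex z-coordinate from the proton arrival times at the
// two GASTOF stations, A at z = -L and B at z = +L. Only the time
// *difference* matters, so the common flight time L/c cancels.
constexpr double kSpeedOfLight = 0.299792458;  // metres per nanosecond

double vertexZ(double tA_ns, double tB_ns) {
    return kSpeedOfLight * (tA_ns - tB_ns) / 2.0;  // in metres
}
```

As a sense of scale, a 10 ps time difference corresponds to c·Δt/2 ≈ 1.5 mm, which is the order of precision needed to compare against the vertex measured by the central detector.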

6.2 CMS: The Compact Muon Solenoid

The Compact Muon Solenoid [5] is one of the two general-purpose particle physics detectors built at the Large Hadron Collider (LHC) [13]. It is capable of studying many aspects of proton collisions at a centre-of-mass energy of up to 14 TeV. A FROG Producer has been created as an additional module of the CMS Software (CMSSW) [14]. The displayed geometry is guaranteed to be exactly the same as the geometry used by CMSSW for the event reconstruction, since it is automatically extracted from the CMS database by the Producer. This is important for debugging purposes. An open view of the CMS geometry is shown in Figure 4 (left). This FROG Producer was used intensively during the summer 2008 cosmic data taking; a screenshot of a cosmic muon crossing the CMS detector is shown in Figure 4 (right).

Jets, from QCD events and other energetic processes, are very important at hadron colliders and challenge event display applications. In FROG, jets can be displayed in many different ways, one of which is represented in Figure 5.

Figure 4: (left) An open 3D view of the CMS geometry. (right) A transversal view of a real data event recorded during the summer of 2008, showing a cosmic muon crossing the CMS detector.

Figure 5: A 3D (left) and transversal (right) view of a QCD di-jet event in CMS. The jets are represented by cones pointing towards the calorimeter hits associated with each jet. The highest-pT jet is red, the second one blue and the third one green. In this particular event, a third jet, likely coming from a soft radiation off the second jet, is also visible. In the 3D view, the muon geometry is also shown.
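One way a display can orient such a jet cone, sketched below under our own assumptions (the paper does not show FROG's drawing code), is to point its axis at the energy-weighted barycentre of the calorimeter hits associated with the jet:

```cpp
#include <cassert>
#include <cmath>

struct CaloHit { double x, y, z, energy; };  // hit position (cm), energy (GeV)
struct Axis    { double x, y, z; };          // unit vector from the origin

// Energy-weighted barycentre direction of the hits; assumes at least one hit
// with non-zero energy. The total energy cancels in the normalisation, so the
// weighted position sums can be normalised directly.
Axis jetConeAxis(const CaloHit* hits, int n) {
    double sx = 0, sy = 0, sz = 0;
    for (int i = 0; i < n; ++i) {
        sx += hits[i].energy * hits[i].x;
        sy += hits[i].energy * hits[i].y;
        sz += hits[i].energy * hits[i].z;
    }
    const double norm = std::sqrt(sx * sx + sy * sy + sz * sz);
    return {sx / norm, sy / norm, sz / norm};
}
```

Given this axis, the cone apex sits at the interaction point and its aperture can be chosen to enclose the associated hits, matching the cone-towards-the-hits picture of Figure 5.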

6.3 DELPHES: a Framework for the Fast Simulation of a General Purpose Collider Experiment

The DELPHES framework [15] is a fast and realistic simulator of a general purpose experiment, like CMS or ATLAS at the LHC. In addition to the usual components of such a detector, DELPHES also simulates very forward detectors arranged along the beamline. The visualisation of the collision final states, as well as of the detector geometry, is possible via a dedicated Producer interfacing FROG to DELPHES. An example of a detector parametrisation used by DELPHES is shown in Figure 6. The same figure also shows how DELPHES events are visualised at the vertex level (before propagation through the detector).

Figure 6: (left) A 3D open view of the generic detector used by DELPHES. The parametrisation of the detector is given by the user. (right) Display of a photon-photon event at the vertex.

6.4 ILD: The ILC Detector Design

The ILD [16] is a detector design for the future International Linear Collider. Part of the CALICE [17] collaboration is using FROG for the design of the ILD hadronic calorimeter, but also during test beams. Figure 7 shows one of the ILD HCAL barrel designs, composed of 48 RPC layers. In this simulated event, all the details of the hadronic shower initiated by a single K-long are clearly visible.

Figure 7: A single simulated K-long event creating a hadronic shower in one of the ILD HCAL barrel designs.

7 Summary

The goal of the FROG project was to create a new event display able to quickly display a significant fraction of the event data of a high energy physics experiment. As a high rendering speed is desirable, the FROG software has been split into two parts: the Producer and the Displayer. In addition to the speed enhancement, this structure, coupled with a home-made file format, brings many interesting features for free, such as the portability of the framework across several operating systems, the independence with respect to the experiment or environment, and the possibility to easily share visualisation data. Another important application of FROG is the online display of experiment data, as the Producer and the Displayer can work simultaneously.

In addition to the previous features, FROG is based on an object-oriented design. Therefore, it can display almost any imaginable object without degrading the Displayer's speed. It can work on almost any recent computer running Linux, MacOS or Windows. The size of the full FROG package, including the source code, is less than 4 MB. It is well suited for outreach applications, since it can be freely distributed and installed in only a few clicks. Moreover, the latest available data can be automatically downloaded from the internet by the FROG Displayer. One of the main goals of the Displayer was to allow a large number of users to visualise event data in quasi-real time.

In conclusion, FROG can be used in many applications: it is highly tunable and allows a large number of users to visualise event data. The software is already used by several physicist communities for outreach and detector debugging.

References

[1] L. Quertenmont and V. Roberfroid, FROG: the Fast and Realistic OpenGL Displayer, http://projects.hepforge.org/frog/, 2009. arXiv:0901.2718v1 [hep-ex].
[2] L. Quertenmont and V. Roberfroid, FROG: the Fast and Realistic OpenGL Displayer, CMS Note in preparation, 2009.
[3] OPENGL: Open Graphical Library, http://www.opengl.org.
[4] GLUT: OPENGL Utility Toolkit, http://www.opengl.org/resources/libraries/glut/.
[5] M. Della Negra, A. Petrilli, A. Hervé et al., CMS Physics: Technical Design Report 1, 2006. CERN/LHCC 2006-001.
[6] L. Quertenmont and V. Roberfroid, FROG Tutorials, http://projects.hepforge.org/frog/tutorial/tutorial.html.
[7] GL2PS: an OPENGL to PostScript printing library, http://www.geuz.org/gl2ps/.
[8] PNGLIB: Portable Network Graphics Library, http://www.libpng.org/.
[9] L. Bonnet, T. Pierzchala, K. Piotrzkowski and P. Rodeghiero, GASTOF: Ultra-fast ToF forward detector for exclusive processes at the LHC, 2006. arXiv:hep-ph/0703320; CP3-06-18.
[10] L. Bonnet et al., GASTOF, paper in preparation, 2009.
[11] The ATLAS Collaboration, ATLAS detector and physics performance: Technical Design Report, 1999. CERN/LHCC 1999-14/15.
[12] M. Albrow, R. Appleby et al., The FP420 R&D Project: Higgs and New Physics with forward protons at the LHC, 2008. arXiv:0806.0302 [hep-ex].
[13] M. Benedikt, P. Collier, V. Mertens, J. Poole and K. Schindl, LHC Design Report. Geneva: CERN, 2004.
[14] M. Della Negra, A. Petrilli, A. Hervé et al., CMS Physics: Technical Design Report 2, 2006. CERN/LHCC 2006-021.
[15] V. Lemaitre, S. Ovyn and X. Rouby, DELPHES, 2009. arXiv:0903.2225 [hep-ph].
[16] The ILD Collaboration, http://www.ilcild.org/.
[17] The CALICE Collaboration, https://twiki.cern.ch/twiki/bin/view/calice.