ATLAS NOTE
December 4, 2014
ATL-SOFT-PUB /12/2014

ATLAS offline reconstruction timing improvements for run-2

The ATLAS Collaboration

Abstract

From 2013 to 2014 the LHC underwent an upgrade to boost the available centre-of-mass energy for collisions from 8 TeV to 13 TeV. During this interval, known as Long Shutdown 1 (LS1), the ATLAS software group began a campaign to substantially reduce the CPU time needed to process data. This reduction could not come at the expense of physics performance. The campaign was undertaken to prepare for the increase of the trigger bandwidth from 500 Hz to 1 kHz and for the increase in the number of interactions per LHC proton bunch crossing, commonly referred to as pile-up. This article summarises the main improvements and presents measurements of the data processing time and of a key performance indicator, the tracking efficiency, as a function of major software releases.

© Copyright 2014 CERN for the benefit of the ATLAS Collaboration. Reproduction of this article or parts of it is allowed as specified in the CC-BY-3.0 license.
1 Introduction

The performance of the LHC in run-1 exceeded the design specifications for pile-up, mainly because proton bunch crossings occurred every 50 ns. The average number of interactions per bunch crossing, µ, which corresponds to the mean of the Poisson distribution of the number of interactions per crossing calculated for each colliding bunch, is a direct measure of pile-up. It is calculated from the instantaneous per-bunch luminosity as µ = L_bunch σ_inel / (n_bunch f_r), where L_bunch is the per-bunch instantaneous luminosity, σ_inel is the inelastic cross section, assumed to be 71.5 mb for 7 TeV collisions and 73.0 mb for 8 TeV collisions, n_bunch is the number of colliding bunches and f_r is the LHC revolution frequency. More details can be found in Ref. [1].

During run-1 the pile-up benchmark of µ = 20 was exceeded in a majority of fills, with fills of µ = 35 being common in the latter part of run-1. In run-2 pile-up will increase: with the first 1 fb⁻¹ expected to be taken with a 50 ns bunch crossing interval, one can expect fills of µ = 40 in the near term, and one should be prepared for fills with µ = 60. Coupled with the increased collision energy and pile-up in run-2 is an increase of the trigger bandwidth to 1 kHz, which is required to keep the important single-lepton triggers near their run-1 transverse energy and momentum thresholds.

Given the run-2 requirements, the following goals were set: reduce the data processing time by a factor of 3 without compromising physics performance; increase the maintainability of the code; and validate that the physics results are invariant under code changes. The data processing time of interest here is the time taken to process Raw Data Object (RDO) files into Event Summary Data (ESD) files, in what is known as the reconstruction step; the data processing time is therefore referred to as the reconstruction time. Reconstruction time measurements were conducted on both simulated and real data samples.
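As a numerical illustration of the formula above, the following sketch evaluates µ for a run-1-like machine configuration. Here the luminosity is treated as the total instantaneous luminosity summed over bunches, so that dividing by n_bunch f_r gives the per-crossing rate; the luminosity and bunch-count values are illustrative assumptions, not numbers quoted in this note.

```python
# Illustration of mu = L sigma_inel / (n_bunch f_r) from the Introduction.
# All machine parameters below are assumed, illustrative values.

def average_interactions(lumi_cm2s, sigma_inel_mb, n_bunch, f_rev_hz):
    """Average number of interactions per bunch crossing.

    lumi_cm2s     : instantaneous luminosity in cm^-2 s^-1
    sigma_inel_mb : inelastic cross section in millibarn (1 mb = 1e-27 cm^2)
    n_bunch       : number of colliding bunches
    f_rev_hz      : LHC revolution frequency in Hz
    """
    sigma_cm2 = sigma_inel_mb * 1e-27  # mb -> cm^2
    return lumi_cm2s * sigma_cm2 / (n_bunch * f_rev_hz)

# Assumed run-1-like values: L = 7e33 cm^-2 s^-1, 1368 colliding bunches
# (50 ns spacing), 73.0 mb at 8 TeV, and the ~11245.5 Hz revolution frequency.
mu = average_interactions(7.0e33, 73.0, 1368, 11245.5)
print(f"mu = {mu:.1f}")  # around 33, comparable to high-pile-up run-1 fills
```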
The Monte Carlo simulated samples consist of top-quark pair production (t t) events generated at a collision energy of 14 TeV with a bunch crossing (BC) spacing of 25 ns and an average number of interactions per BC of µ = 0, 20 and 40. Run-1 data sets were sourced from the JetTauEtmiss stream and span measured values of µ from 16.3 upward. These samples provide events with the highest track multiplicities and therefore provide an upper bound on the reconstruction time. Unless otherwise stated, measurements of the reconstruction time are performed on a machine with a HEPSPEC scaling factor of 11.95; specifically, the CPU used was an Intel Xeon 2.26 GHz dual processor with 16 cores.

The following sections describe the updates and improvements made to the reconstruction software. Bear in mind that estimates of improvements in the reconstruction time are approximate: it is very difficult to factorise out the improvement due to one change, as many changes were implemented concurrently. The entire ATLAS software suite consists of around two thousand software packages and is currently maintained by around 400 developers. The software is complex because it necessarily matches the complex nature of the ATLAS detector, and further complexity is needed for the very sophisticated analysis of proton-proton collision data. Reconstruction times were measured for three versions of the software release:

- release 17.2, used to reconstruct data at the end of run-1;
- release 19.0, with updates in software technology and an Inner Detector track seeding strategy optimised for 8 TeV;
- release 19.1, with updates for track seeding at 13 TeV and region of interest (ROI) seeded back tracking.
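Per-event times quoted for the benchmark machine above can be translated into rough estimates for other machines by scaling with the HEPSPEC (HS06) rating. A minimal sketch, assuming a simple inverse-linear scaling convention (the convention and the second machine's rating are assumptions, not taken from this note):

```python
# Rescale a per-event time measured on one machine to an estimate for another,
# assuming reconstruction time scales inversely with the machine's HS06 rating.

def rescale_time(t_seconds, hs06_measured, hs06_target):
    """Estimate the per-event time on a machine rated hs06_target,
    given t_seconds measured on a machine rated hs06_measured."""
    return t_seconds * hs06_measured / hs06_target

# 14.1 s/event is the total event processing time quoted later in the text for
# the benchmark machine (HS06 factor 11.95); 9.0 is an assumed rating for a
# hypothetical slower machine.
t_slow = rescale_time(14.1, 11.95, 9.0)
print(f"{t_slow:.1f} s/event")
```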
2 Upgrades and improvements in software technology

The following is a list of the upgrades and improvements in software technology.

- A new method to read the value of the magnetic field strength within ATLAS was implemented, because this functionality was identified as a CPU bottleneck. The code was newly written in C++, where previously it was written in FORTRAN. The field value was cached for fast lookup. Unit conversions between Tesla and Gauss were minimised, which also had the effect of reducing the call depth, and the functions were made auto-vectorisable. These changes resulted in a 20% gain in speed in detector simulation tasks.

- The CLHEP library, used for linear algebra vector and matrix operations, was replaced with the Eigen C++ template library. Eigen's use of expression templates removes intermediate steps performed in calculations. This migration affected thousands of lines of code in up to a thousand packages and took approximately eight months to complete. The CLHEP library is still needed, however, as it is used to declare Lorentz vectors and in the description of the detector geometry.

- Millions of evaluations of trigonometric functions occur in the reconstruction. In run-1 these were handled by the GNU libm math library. A switch was made to the Intel math library (libimf), which is part of the Intel C++ compiler and contains highly optimised and very accurate mathematical functions. The average time spent evaluating trigonometric functions with libm was 2.1 seconds out of a total event processing time of 14.1 seconds; the use of libimf reduces the evaluation time by, on average, 10%.

- The build was updated from a 32-bit to a 64-bit architecture, which provided a 25% overall reduction in data processing time.

- The Google memory allocator package, tcmalloc, was updated from version 0.99 to 2.1 in order to fix an issue with unaligned memory blocks, which caused problems with Eigen.
Moreover, the updated version provides effective use of single instruction multiple data (SIMD) CPU functionality.

- The compiler was updated from version 4.3 to 4.7, which allows the study of auto-vectorisation.

- The event data model was simplified, reducing dynamic memory allocation.

3 Optimisation of track seeding and track finding in high pile-up environments

The ATLAS Inner Detector (ID) charged-particle tracking algorithms are the biggest consumers of the CPU budget in reconstruction. In a typical reconstruction job on run-1 data, the Inner Detector algorithms alone consumed up to 60% of the total reconstruction time. This expense is to be expected: more pile-up means more space points from which tracks can be formed in the ID, so the ID algorithms are susceptible to an exponential growth in reconstruction time. At µ = 40 the ID algorithms are expected to take as much as 75% of the reconstruction time. In general, the tracking optimisation depends on the expected level of pile-up, both for timing and for physics performance; tuning and renewed optimisation of ID tracking has therefore been a top priority during LS1. ATLAS has commissioned dedicated optimal track seeding strategies for run-2 that depend on the level of pile-up and that were re-tuned to fully exploit the capabilities of the newly installed Insertable B-Layer (IBL). These have resulted in a factor of 2 speedup in reconstruction time in conditions of
µ = 40 and a bunch crossing spacing of 25 ns. A first optimisation using 8 TeV data-sets was implemented in release 19.0; release 17.2, referred to in the figures below, was used for reconstruction at the end of run-1. Further, it was found that, for the purpose of photon conversion reconstruction, dedicated tracking in the Transition Radiation Tracker sub-component of the ID (known within ATLAS as back tracking and TRT-only tracking) need only be run in a region of interest defined by the presence of an energy deposit in the calorimeter. This change resulted in a factor of three reduction in the reconstruction time spent in TRT-only tracking; the only client of TRT-only tracking, conversion finding, was not affected. This change was commissioned in software release 19.1, together with a further evolution of the track seeding for 13 TeV.

4 Measurements

Fig. 1 displays the measured reconstruction time for all algorithms and for Inner Detector algorithms only, as a function of the software release, for top-quark pair production events. It shows that a factor of 3 reduction in processing time was achieved during LS1, with the majority of the improvement due to the ID algorithms. Fig. 2 displays the ID track reconstruction efficiency as a function of the software release. It has slightly improved from release to release, indicating that the performance has not been compromised by the changes that reduced the overall reconstruction time. Fig. 3 displays the measured reconstruction time as a function of the average number of interactions per bunch crossing in data events from the so-called JetETMiss stream; these are events triggered by the presence of jets, missing transverse energy or tau-leptons. The data were collected in the latter part of run-1. It shows that a factor of 4 reduction in processing time was achieved in LS1 when comparing the reconstruction time between the three software releases. Fig.
4 displays the measured reconstruction time as a function of the average number of interactions per bunch crossing in data events from the same JetETMiss stream in the latter part of run-1. The reconstruction times shown are taken from the actual Tier-0 prompt reconstruction log files and plotted separately for each CPU type deployed at the Tier-0. Fig. 4 demonstrates that real reconstruction times are consistent with the dedicated measurements made on the benchmark machine, but can sometimes fluctuate to as high as 100 seconds per event on some machines in data with µ = 35.4.

References

[1] G. Aad et al. [ATLAS Collaboration], Eur. Phys. J. C 71 (2011) 1630.
Figure 1: Time per event, measured in seconds, to reconstruct Monte Carlo top-quark pair production (t t) events as a function of the ATLAS software release version. The events are generated at an LHC collision energy of 14 TeV with a bunch crossing (BC) spacing of 25 ns and an average number of interactions per BC of ⟨µ⟩ = 40. Two sets of points are displayed: the full reconstruction time (red), and the time of the Inner Detector sub-system reconstruction only (blue), which is the dominant contribution to the full reconstruction time. The simulation is performed for the run-1 ATLAS detector geometry. Measurements were performed on a machine with an HS06 scaling factor of 11.95. The data processing time is the time taken to process Raw Data Object (RDO) files into Event Summary Data (ESD) files, in what is known as the reconstruction step.
Figure 2: ATLAS Inner Detector track reconstruction efficiency, as a function of the ATLAS software release version, for true charged particles from t t events that originate within a radius of 20 mm from the z-axis of the ATLAS detector, which is defined along the beam-line. The true charged particle must have a true transverse momentum greater than 0.8 GeV/c and create at least 7 hits in the silicon tracker. The events are generated at an LHC collision energy of 14 TeV with a bunch crossing (BC) spacing of 25 ns and an average number of interactions per BC of ⟨µ⟩ = 40.
Figure 3: Time per event, measured in seconds, to reconstruct data events triggered by the presence of jets, missing transverse energy or tau-leptons, as a function of the average number of interactions per bunch crossing and the software release. The data were collected at the end of 2012, at the conclusion of LHC run-1.
Figure 4: Time per event, measured in seconds, to reconstruct data events triggered by the presence of jets, missing transverse energy or tau-leptons, as a function of the average number of interactions per bunch crossing. The time is shown for several thousand jobs per CPU type, measured on the various CPU types deployed at the Tier-0 (five Intel Xeon models, with between roughly 7000 and 16000 jobs each, distinguished by the colours of the points and detailed in the legend). The data were collected at the end of 2012, at the conclusion of LHC run-1. Software release 17.2 was deployed at the Tier-0 to reconstruct these events.
More informationData oriented job submission scheme for the PHENIX user analysis in CCJ
Journal of Physics: Conference Series Data oriented job submission scheme for the PHENIX user analysis in CCJ To cite this article: T Nakamura et al 2011 J. Phys.: Conf. Ser. 331 072025 Related content
More informationDESY at the LHC. Klaus Mőnig. On behalf of the ATLAS, CMS and the Grid/Tier2 communities
DESY at the LHC Klaus Mőnig On behalf of the ATLAS, CMS and the Grid/Tier2 communities A bit of History In Spring 2005 DESY decided to participate in the LHC experimental program During summer 2005 a group
More informationFull Offline Reconstruction in Real Time with the LHCb Detector
Full Offline Reconstruction in Real Time with the LHCb Detector Agnieszka Dziurda 1,a on behalf of the LHCb Collaboration 1 CERN, Geneva, Switzerland Abstract. This document describes the novel, unique
More informationRobustness Studies of the CMS Tracker for the LHC Upgrade Phase I
Robustness Studies of the CMS Tracker for the LHC Upgrade Phase I Juan Carlos Cuevas Advisor: Héctor Méndez, Ph.D University of Puerto Rico Mayagϋez May 2, 2013 1 OUTLINE Objectives Motivation CMS pixel
More informationData handling and processing at the LHC experiments
1 Data handling and processing at the LHC experiments Astronomy and Bio-informatic Farida Fassi CC-IN2P3/CNRS EPAM 2011, Taza, Morocco 2 The presentation will be LHC centric, which is very relevant for
More informationLHC Detector Upgrades
Su Dong SLAC Summer Institute Aug/2/2012 1 LHC is exceeding expectations in many ways Design lumi 1x10 34 Design pileup ~24 Rapid increase in luminosity Even more dramatic pileup challenge Z->µµ event
More informationSimulating the RF Shield for the VELO Upgrade
LHCb-PUB-- March 7, Simulating the RF Shield for the VELO Upgrade T. Head, T. Ketel, D. Vieira. Universidade Federal do Rio de Janeiro (UFRJ), Rio de Janeiro, Brazil European Organization for Nuclear Research
More informationUse of hardware accelerators for ATLAS computing
Use of hardware accelerators for ATLAS computing Matteo Bauce 2, Rene Boeing 3 4, Maik Dankel 3 4, Jacob Howard 1, Sami Kama 5 for the ATLAS Collaboration 1 Department of Physics Oxford University, Oxford,
More informationMachine Learning for (fast) simulation
Machine Learning for (fast) simulation Sofia Vallecorsa for the GeantV team CERN, April 2017 1 Monte Carlo Simulation: Why Detailed simulation of subatomic particles is essential for data analysis, detector
More informationFrom raw data to new fundamental particles: The data management lifecycle at the Large Hadron Collider
From raw data to new fundamental particles: The data management lifecycle at the Large Hadron Collider Andrew Washbrook School of Physics and Astronomy University of Edinburgh Dealing with Data Conference
More informationPoS(ACAT)049. Alignment of the ATLAS Inner Detector. Roland Haertel Max-Planck-Institut für Physik, Munich, Germany
Max-Planck-Institut für Physik, Munich, Germany E-mail: haertel@mppmu.mpg.de The ATLAS experiment at the LHC is currently under construction at CERN and will start operation in summer 2008. The Inner Detector
More informationTHE ATLAS DATA ACQUISITION SYSTEM IN LHC RUN 2
THE ATLAS DATA ACQUISITION SYSTEM IN LHC RUN 2 M. E. Pozo Astigarraga, on behalf of the ATLAS Collaboration CERN, CH-1211 Geneva 23, Switzerland E-mail: eukeni.pozo@cern.ch The LHC has been providing proton-proton
More informationUpdate of the Computing Models of the WLCG and the LHC Experiments
Update of the Computing Models of the WLCG and the LHC Experiments September 2013 Version 1.7; 16/09/13 Editorial Board Ian Bird a), Predrag Buncic a),1), Federico Carminati a), Marco Cattaneo a),4), Peter
More informationAlignment of the ATLAS inner detector tracking system
Journal of Instrumentation OPEN ACCESS Alignment of the ATLAS inner detector tracking system To cite this article: Heather M Gray Related content - The ATLAS trigger: high-level trigger commissioning and
More informationAlignment of the CMS Silicon Tracker
Alignment of the CMS Silicon Tracker Tapio Lampén 1 on behalf of the CMS collaboration 1 Helsinki Institute of Physics, Helsinki, Finland Tapio.Lampen @ cern.ch 16.5.2013 ACAT2013 Beijing, China page 1
More informationThe CMS Computing Model
The CMS Computing Model Dorian Kcira California Institute of Technology SuperComputing 2009 November 14-20 2009, Portland, OR CERN s Large Hadron Collider 5000+ Physicists/Engineers 300+ Institutes 70+
More informationMonte Carlo Production Management at CMS
Monte Carlo Production Management at CMS G Boudoul 1, G Franzoni 2, A Norkus 2,3, A Pol 2, P Srimanobhas 4 and J-R Vlimant 5 - for the Compact Muon Solenoid collaboration 1 U. C. Bernard-Lyon I, 43 boulevard
More informationATLAS Dr. C. Lacasta, Dr. C. Marinas
ATLAS Dr. C. Lacasta, Dr. C. Marinas cmarinas@uni-bonn.de 1 http://www.atlas.ch/multimedia/# Why? In particle physics, the processes occur on a scale that is either too brief or too small to be observed
More informationTracking and Vertex reconstruction at LHCb for Run II
Tracking and Vertex reconstruction at LHCb for Run II Hang Yin Central China Normal University On behalf of LHCb Collaboration The fifth Annual Conference on Large Hadron Collider Physics, Shanghai, China
More informationPrecision Timing in High Pile-Up and Time-Based Vertex Reconstruction
Precision Timing in High Pile-Up and Time-Based Vertex Reconstruction Cedric Flamant (CERN Summer Student) - Supervisor: Adi Bornheim Division of High Energy Physics, California Institute of Technology,
More information1. INTRODUCTION 2. MUON RECONSTRUCTION IN ATLAS. A. Formica DAPNIA/SEDI, CEA/Saclay, Gif-sur-Yvette CEDEX, France
&+(3/D-ROOD&DOLIRUQLD0DUFK 1 Design, implementation and deployment of the Saclay muon reconstruction algorithms (Muonbox/y) in the Athena software framework of the ATLAS experiment A. Formica DAPNIA/SEDI,
More informationTrigger and Data Acquisition: an ATLAS case study
Trigger and Data Acquisition: an ATLAS case study Standard Diagram of ATLAS Trigger + DAQ Aim is to understand most of this diagram by the end of the lecture! 1 Outline Basic Trigger and DAQ concepts The
More informationThe LHCb VERTEX LOCATOR performance and VERTEX LOCATOR upgrade
Journal of Instrumentation OPEN ACCESS The LHCb VERTEX LOCATOR performance and VERTEX LOCATOR upgrade To cite this article: P Rodríguez Pérez Related content - Upgrade of the LHCb Vertex Locator A Leflat
More informationThe ATLAS Conditions Database Model for the Muon Spectrometer
The ATLAS Conditions Database Model for the Muon Spectrometer Monica Verducci 1 INFN Sezione di Roma P.le Aldo Moro 5,00185 Rome, Italy E-mail: monica.verducci@cern.ch on behalf of the ATLAS Muon Collaboration
More informationSoftware for implementing trigger algorithms on the upgraded CMS Global Trigger System
Software for implementing trigger algorithms on the upgraded CMS Global Trigger System Takashi Matsushita and Bernhard Arnold Institute of High Energy Physics, Austrian Academy of Sciences, Nikolsdorfer
More informationPreparing ATLAS reconstruction software for LHC's Run 2
Journal of Physics: Conference Series PAPER OPEN ACCESS Preparing ALAS reconstruction software for LHC's Run 2 o cite this article: Jovan Mitrevski and 215 J. Phys.: Conf. Ser. 664 7234 View the article
More informationThe evolving role of Tier2s in ATLAS with the new Computing and Data Distribution model
Journal of Physics: Conference Series The evolving role of Tier2s in ATLAS with the new Computing and Data Distribution model To cite this article: S González de la Hoz 2012 J. Phys.: Conf. Ser. 396 032050
More informationCMS Simulation Software
CMS Simulation Software Dmitry Onoprienko Kansas State University on behalf of the CMS collaboration 10th Topical Seminar on Innovative Particle and Radiation Detectors 1-5 October 2006. Siena, Italy Simulation
More informationarxiv: v4 [physics.comp-ph] 15 Jan 2014
Massively Parallel Computing and the Search for Jets and Black Holes at the LHC arxiv:1309.6275v4 [physics.comp-ph] 15 Jan 2014 Abstract V. Halyo, P. LeGresley, P. Lujan Department of Physics, Princeton
More informationKalman Filter Tracking on Parallel Architectures
Kalman Filter Tracking on Parallel Architectures Giuseppe Cerati 1,a, Peter Elmer,b, Slava Krutelyov 1,c, Steven Lantz 3,d, Matthieu Lefebvre,e, Kevin McDermott 3,f, Daniel Riley 3,g, Matevž Tadel 1,h,
More information