ATLAS PILE-UP AND OVERLAY SIMULATION

LPCC Detector Simulation Workshop, June 26-27, 2017
ATL-SOFT-SLIDE-2017-375, 22/06/2017
Tadej Novak, on behalf of the ATLAS Collaboration

INTRODUCTION

In addition to the hard interaction, events contain pile-up from other collisions in the current and surrounding bunch crossings, cosmics, beam-gas, beam-halo, cavern background, and detector noise.

- Option 1: Pile-up MC (current default). Simulate both the hard and the soft interactions in MC and mix them together in the proper ratios.
- Option 2: Overlay. Simulate only the hard interaction in MC and overlay a random background event.
  - Data Overlay: overlay a random data event.
  - MC+MC Overlay: overlay a pre-mixed zero-hard-scatter MC event.

The Pile-up MC method has mostly been used so far at ATLAS. Data Overlay is being used for some studies, specialised analyses, and Heavy-Ion; MC+MC Overlay is being investigated as an alternative to Pile-up MC.

ATLAS SENSITIVE TIME WINDOW

[Figure: sensitive time windows of the ATLAS sub-detectors (MDT, LAr, Tile, RPC, TGC, CSC, TRT, BCM, SCT, Pixels) in 25 ns bunch-crossing steps, spanning from -800 ns (bunch crossing -32) to +150 ns (bunch crossing +6) around the triggering bunch crossing. Earlier bunch crossings could affect the trigger BC; later ones have no effect on the trigger BC.]

OPTION 1: SIMULATING PILE-UP

1. Run the event generation for minimum-bias (minbias) single pp interactions: Pythia8, A3 tune, NNPDF23LO PDFs, inelastic (non-diffractive) and single/double diffractive.
2. Run GEANT4 on each minbias event to simulate the detector energy/time HITS.
3. Combine multiple (thousands!) of HITS events during digitization: use a representative number of interactions per bunch crossing, shifted in time, to reproduce in-time and out-of-time pile-up, sampling the bunch spacing/pattern within the sensitive time window of the ATLAS detectors, [-800, 150] ns (see the sketch after this list).
4. Add a model of detector noise separately.
5. No cavern background is added by default.
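A minimal, hypothetical sketch of step 3, assuming hits are simple records with a time field and that every bunch crossing in the window is filled (the real bunch pattern handling is described on the next slide). The function and field names are illustrative only, not the actual ATLAS (Athena) digitization code.

```python
import random
import numpy as np

# Hypothetical sketch: merge pre-simulated minbias HITS with the hard-scatter
# HITS, shifting every pile-up interaction to the time of its bunch crossing.
# All names and structures are illustrative, not Athena code; the bunch
# pattern is assumed fully filled here.

BUNCH_SPACING_NS = 25.0
SENSITIVE_WINDOW_BC = range(-32, 7)        # [-800, +150] ns in 25 ns steps

def build_pileup_event(hard_scatter_hits, minbias_library, mu):
    """Return the merged hit list for one hard-scatter event at pile-up mu."""
    merged = list(hard_scatter_hits)       # hard scatter sits at t = 0
    for bc in SENSITIVE_WINDOW_BC:
        t_offset = bc * BUNCH_SPACING_NS
        for _ in range(np.random.poisson(mu)):      # interactions in this BC
            event = random.choice(minbias_library)  # reuse simulated minbias
            merged.extend({**hit, "time": hit["time"] + t_offset}
                          for hit in event)
    return merged
```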

SIMULATING PILE-UP: VARIABLE BUNCH STRUCTURE

[Figure: example of a pile-up model with a fixed 50 ns spacing between filled bunches, showing the signal, minbias and cavern contributions per 25 ns tick ('bunch').]

In reality the structure of filled and empty bunch crossings can be more complicated, and the pile-up/detector response is affected by the position of the triggering BCID in the bunch train. This is modelled in the simulation:

- Patterns can be up to 3564 elements in length and wrap around if required.
- The triggering bunch crossing is picked from the filled bunch crossings in the pattern, with a probability proportional to the relative luminosity of each bunch crossing (see the sketch below).
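A hypothetical sketch of this choice, assuming the bunch pattern is given as a list of per-BCID relative luminosities (zero for empty crossings); the function names are made up and do not correspond to the actual Athena services.

```python
import random

# Pick the triggering BCID from the filled bunch crossings, with probability
# proportional to its relative luminosity.  Illustrative only.

def choose_trigger_bcid(bunch_pattern):
    """bunch_pattern: list of relative luminosities indexed by BCID."""
    filled = [(bcid, lumi) for bcid, lumi in enumerate(bunch_pattern) if lumi > 0]
    bcids, weights = zip(*filled)
    return random.choices(bcids, weights=weights, k=1)[0]

# The pattern wraps around, so bunch crossings relative to the trigger are
# looked up modulo the pattern length (up to 3564 elements):
def relative_lumi(bunch_pattern, trigger_bcid, delta_bc):
    return bunch_pattern[(trigger_bcid + delta_bc) % len(bunch_pattern)]
```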

SIMULATING PILE-UP: VARIABLE BUNCH LUMINOSITY

- It is well known that <μ> varies over time; μ can also vary greatly from BCID to BCID in data.
- We usually set the same μ for each BCID, as so far no analysis has shown sensitivity to this effect.
- Based on the real <μ> from data, a PDF is prepared for the simulation to choose the value for each event (see the sketch below).

[Figures: delivered luminosity versus mean number of interactions per crossing, ATLAS Online, sqrt(s) = 13 TeV, integrated luminosity 42.7 fb^-1 (2015: <μ> = 13.7, 2016: <μ> = 24.9, total: <μ> = 23.7); and the resulting μ distribution used as the simulation PDF.]
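A minimal sketch of this step, assuming the μ profile is available as a binned histogram of delivered luminosity versus μ (as in the plot): normalise it into a probability distribution and draw one μ per simulated event. The bin contents below are placeholders, not the real 2015+2016 profile; a drawn value could then feed a routine like build_pileup_event in the earlier sketch.

```python
import numpy as np

# Turn the measured luminosity-vs-mu histogram into a probability
# distribution and draw one mu per simulated event.  Illustrative only.

def make_mu_sampler(bin_centres, delivered_lumi):
    prob = np.asarray(delivered_lumi, dtype=float)
    prob /= prob.sum()                          # normalise to a PDF
    return lambda n: np.random.choice(bin_centres, size=n, p=prob)

# Example with made-up numbers (more luminosity delivered around mu ~ 25):
# sample = make_mu_sampler([10, 20, 25, 30, 40], [5., 60., 120., 80., 10.])
# mu_values = sample(1000)      # one mu per event
```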

SIMULATING PILE-UP: DIGITIZATION

- For each hard-scatter event, make a selection of in-time and out-of-time pile-up events.
- To reduce memory usage, one filled bunch crossing is processed at a time for all sensitive sub-detectors and then immediately discarded.
- Only read in / cache the parts of each event that are needed (e.g. discard HITS in the silicon strips outside [-50, 50] ns), as sketched below.
- Generating huge samples of minbias background is expensive! 40M minbias events were simulated for 2015 & 2016 (20M low-pT, 20M high-pT); low-pT: no AntiKt6Truth jets with pT > 35 GeV.
- Simulated minbias events are reused: across various MC samples, and low-pT out-of-time minbias events are reused within an MC sample.
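A hypothetical sketch of the caching idea: when reading a pile-up event, keep only the hits that can still contribute to a given sub-detector's readout, based on its sensitive time window. The window values and names below are examples only (the silicon-strip case quoted on the slide); the real per-detector windows are configured in the digitization.

```python
# Keep only the hits that fall inside a sub-detector's sensitive time window
# once shifted to their bunch crossing; everything else is never cached.

SENSITIVE_WINDOW_NS = {
    "SCT": (-50.0, 50.0),       # silicon strips, from the slide
    # ... other sub-detectors have their own (wider) windows
}

def cache_hits(event_hits, subdetector, bc_offset_ns):
    """Return the subset of hits worth keeping for this sub-detector."""
    lo, hi = SENSITIVE_WINDOW_NS[subdetector]
    return [hit for hit in event_hits
            if lo <= hit["time"] + bc_offset_ns <= hi]
```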

OPTION 2: OVERLAY PILE-UP FROM DATA

1. Start with input zerobias luminosity-weighted RAW events, ordered in time.
2. Simulate a hard-scatter GEANT4 event, with conditions matching each selected data event (beamspot/tilt, alignments, magnetic fields, etc.).
3. Overlay each zerobias data event with the matching GEANT4 event at the detector channel level, then digitize the combined signals (see the sketch below).
4. Reconstruct the combined event as data.

[Workflow diagram: generator 4-vectors and the run/LBN/time of the zerobias data (RAW) drive the G4 simulation (HITS); the overlay digitization produces an RDO, which goes through data reconstruction to analysis, with conditions taken from the conditions database.]
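A very simplified, hypothetical sketch of the channel-level combination in step 3, assuming both inputs can be represented as maps from detector channel to analogue signal; the real overlay runs per sub-detector inside Athena with detector-specific data formats.

```python
from collections import defaultdict

# Add the simulated hard-scatter signal onto the zerobias data background
# channel by channel; the detector-specific digitization then runs on the
# combined signal.  Illustrative only, not the Athena implementation.

def overlay_channels(mc_signal, zerobias_data):
    combined = defaultdict(float, zerobias_data)   # start from the data event
    for channel, signal in mc_signal.items():
        combined[channel] += signal                # add simulated hard scatter
    return dict(combined)
```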

BACKGROUND DATA PREPARATION

Zerobias data:
- Select the event one accelerator turn after a high-pT trigger (e.g. L1_EM14) fires, i.e. the next event in the same BC position, so the rate is proportional to the luminosity in each BC.
- Prescaled to keep ~10 Hz in Run 2; ~3 Hz is selected at the HLT to have a jet with pT > 40 GeV; ~100M events/year.
- No zero-suppression (except in the silicon), ~2 MB/event compressed.

Offline, zerobias data are sampled from lumi-blocks in the desired time period to reproduce the luminosity profile of a high-pT trigger (accounting for dead-time, prescales, the mix of HLT jets, etc.).

[Figure: selected zerobias events versus run number, compared to the full LHC Run 2 dataset.]

DATA OVERLAY VS. PILE-UP DIGITIZATION

Advantages: pp overlay gets the details of the pile-up correct by using real data.
- Analyses sensitive to pile-up modelling should benefit.
- Should help detector performance groups where pile-up modelling is a problem, e.g. jet calibration at high pile-up.
- True detector noise and occupancy, including lumi-weighted variations and correlations between channels.
- Conditions (beamspot, dead channels, etc.) track the data more closely.
- Faster jobs with lower memory requirements.

Disadvantages:
- Cannot simulate imaginary situations (e.g. future high luminosity, different collision energies, or upgrades to the detector).
- Less accurate when combining overlapping background and signal on the same channel for some sub-detectors (e.g. silicon); the RDO contains less information than background HITS.
- Inaccuracies when simulating the trigger from overlay, since not all the information used in the real trigger is stored in the RAW data.
- Probably cannot have as many overlay events as simulated events.
- Have to wait until all the data are collected before generating simulations.
- No MC truth information for the background.

EXAMPLE: HEAVY-ION COLLISIONS

It is very challenging to simulate the soft parts of nuclear collisions: generators such as HIJING have limitations, and simulating the ~10k particles in GEANT4 is slow. The ATLAS Heavy-Ion group has therefore made extensive use of overlay simulation:
- Record minimum-bias HI events (want a collision, and for HI runs there is <1 collision per BC).
- Overlay the minimum-bias HI event on a simulated hard-scatter (pp!) collision at the same vertex position.
- Use matching alignments and conditions, just like for pp overlay simulation.

SPECIAL OPTION: SIMULATION USING EMBEDDING

Embedding takes a data event (e.g. Z→μμ) and replaces objects with another type of object (e.g. taus) to emulate a related process. It is critical for modelling the Z→ττ background in the H→ττ analysis.

So far this has been done at the reconstruction level at ATLAS, with tracks and calorimeter cells, but this has inaccuracies:
- the taus are simulated and reconstructed without pile-up,
- no cells below the zero-suppression threshold in the data event,
- cannot run the L1 trigger simulation.

Working on using overlay MC techniques to perform embedding (sketched below):
- Recorded ~5 Hz of non-zero-suppressed Z→μμ in 2016, to use as overlay input (instead of zerobias).
- Simulate taus in GEANT4 at the same vertex/momenta as the reconstructed muons.
- Overlay the MC taus and the Z→μμ data event, removing the digits that were used to form the reconstructed Z→μμ tracks' hits.
- Reconstruct the overlaid event, subtracting the muons' calorimeter cell energies (as in the original embedding technique).
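A rough, hypothetical sketch of the overlay step above, assuming the inputs can be reduced to simple maps of channel id to digit: digits associated with the reconstructed muon tracks are dropped from the data event before the simulated tau digits are merged in. None of these names correspond to actual Athena code.

```python
# Drop the digits that belonged to the reconstructed Z->mumu tracks, then
# merge in the simulated tau digits.  Illustrative only.

def embed_taus(data_digits, muon_track_channels, tau_digits):
    embedded = {ch: d for ch, d in data_digits.items()
                if ch not in muon_track_channels}   # remove muon track digits
    for ch, digit in tau_digits.items():            # add simulated tau digits
        embedded[ch] = embedded.get(ch, 0) + digit
    return embedded
```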

OPTION 2: OVERLAY PILE-UP FROM PRE-MIXED MC

1. Simulate a hard-scatter GEANT4 event with the usual configuration.
2. Pre-mix the pile-up events: digitize zero-hard-scatter events (e.g. single neutrinos) together with the minimum-bias pile-up background.
3. Digitize the simulated hard-scatter event and overlay it on the pre-mixed pile-up digits (see the sketch below).
4. Reconstruct the combined event as ordinary MC.

[Workflow diagram: minimum-bias events and a zero-hard-scatter event (HITS) go through the pre-mixing digitization to a pile-up RDO; the simulated hard-scatter event (HITS) is combined with it in the MC+MC overlay step into an RDO, which is passed to MC reconstruction and analysis.]
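A hypothetical sketch of the bookkeeping: the expensive pile-up digitization runs once per background event (pre-mixing), and each hard-scatter job only digitizes its own event and merges digits with the stored pile-up RDO. The digitize() placeholder (a simple threshold) is an assumption; combining already-thresholded digits is also the source of the limitation noted on the next slide.

```python
# MC+MC overlay sketch: pile-up is digitized once, hard scatter per job.
# digitize() is a stand-in for the detector response (threshold + rounding);
# names are illustrative, not Athena code.

THRESHOLD = 1.0

def digitize(analogue_signals):
    """Placeholder digitization: keep channels above threshold, as counts."""
    return {ch: round(sig) for ch, sig in analogue_signals.items()
            if sig > THRESHOLD}

def overlay_premixed(hard_scatter_signals, premixed_rdo):
    """Per-event step: digitize the hard scatter alone, then merge digits."""
    merged = dict(premixed_rdo)                     # pile-up digitized once
    for ch, digit in digitize(hard_scatter_signals).items():
        merged[ch] = merged.get(ch, 0) + digit      # digit-level combination
    return merged

# Note: a hard-scatter deposit of 0.7 and a pile-up deposit of 0.7 on the same
# channel each fail the threshold separately and are lost, whereas the
# standard pile-up digitization would see their 1.4 sum.
```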

MC+MC OVERLAY VS. PILE-UP DIGITIZATION

Advantages:
- CPU requirements for an MC campaign would be substantially reduced: the background dataset only has to be digitized once.
- I/O requirements for most production jobs would be reduced: a 2k-event overlay + reconstruction job would only require one 8 GB pile-up RDO file, rather than 40 GB of pile-up HITS.
- Different datasets would have the same pile-up backgrounds.

Disadvantages:
- Channels with hard-scatter and pile-up deposits just below threshold would be lost.
- RDOs are 4 MB per event compared to 900 kB per event for HITS; a 1B-event dataset would be 4 PB (see the worked estimate below). Could probably get away with fewer copies of the pile-up dataset.
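To make the storage comparison concrete, a short back-of-the-envelope check of the quoted numbers (4 MB/event for RDO versus 900 kB/event for HITS), assuming decimal prefixes:

```python
# Back-of-the-envelope check of the storage numbers quoted above
# (1 PB = 1e15 bytes, decimal prefixes assumed).

EVENTS = 1_000_000_000            # 1B-event pile-up dataset
RDO_PER_EVENT = 4e6               # 4 MB/event
HITS_PER_EVENT = 900e3            # 900 kB/event

print(f"RDO dataset:  {EVENTS * RDO_PER_EVENT / 1e15:.1f} PB")    # 4.0 PB
print(f"HITS dataset: {EVENTS * HITS_PER_EVENT / 1e15:.2f} PB")   # 0.90 PB
```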

CONCLUSIONS

ATLAS uses several methods for modelling pile-up and other detector backgrounds in its simulation:
- simulated Pythia8 minbias event selection and digitization is currently used for most simulations,
- data overlay is an alternative method, currently used for some performance studies, pp and HI physics analyses, and detector upgrade studies,
- embedding is used for specialised studies, e.g. for Z→ττ,
- MC+MC overlay is being investigated as an alternative to pile-up simulation.

Pile-up will become increasingly important with larger instantaneous luminosity. We are working to improve the accuracy and speed of all these methods.