Computing strategy for PID calibration samples for LHCb Run 2

Public Note, Reference: LHCb-PUB
Issue: 1, Revision: 0
Created: June 2016; last modified: July 18, 2016

Prepared by: Lucio Anderlini (a), Sean Benson (b), Vladimir Gligorov (b), Oliver Lupton (c), Barbara Sciascia (d)
(a) INFN, Sezione di Firenze; (b) CERN; (c) University of Oxford; (d) INFN, Laboratori Nazionali di Frascati


Abstract

The samples for the calibration of the Particle Identification (PID) are sets of data collected by LHCb in which the decay candidates have a kinematic structure that allows unambiguous identification of one of the daughters with a selection strategy that does not rely on its PID-related variables. PID calibration samples are used in a data-driven technique to measure the efficiency of selection requirements on PID variables in many LHCb analyses. Since the second run of the LHCb experiment, PID variables are computed as part of the software trigger and then refined in the offline reconstruction (which involves two different applications, named Brunel and DaVinci). Physics analyses rely on selections combining PID requirements on the online and offline versions of the PID variables. Because of the large, but not full, correlation between online and offline PID variables, the PID samples must be built such that both versions of the full PID information can be accessed, particle by particle, so that combined requirements on the offline and online versions can be defined. The only viable solution is to write the PID calibration samples to a dedicated stream where the information evaluated in the trigger (the online version) is stored together with the full raw event, which is then reconstructed and processed offline. A dedicated algorithm matches online and offline candidates and produces the datasets used as input for the package through which analysts access the calibration samples. This note focuses on PID calibration samples, but the general layout can be applied to any other calibration sample in Run 2.
Contents

1 Introduction
2 PIDCalib package
3 Trigger configuration for LHCb Run 2
4 Use cases for PID calibration
4.1 Analyses on Full stream with PID unbiased trigger
4.2 Analyses on Turbo stream datasets
4.3 Analyses on Full stream with PID requirements in the Trigger line
4.4 Considerations about statistics of control samples
4.5 Considerations about online-offline convergence
4.6 Summary of the specifications
5 Calibration lines in Run 2
6 Operations with calibration lines during Run 2
7 Matching the online and offline candidates
8 Conclusion
9 Acknowledgements
10 References
A Appendix

List of Figures

1 Schematic representation of the data flow for the Calibration samples.

List of Tables

1 Summary of processing schemes.
2 List of PID-related variables that are inputs to the PIDANN algorithms (Run 1 version) and stored in the TurboReport.
3 List of PID-related variables that are inputs to the PIDANN algorithms (Run 2 version) and stored in the TurboReport.
4 Overlap between different samples.

1 Introduction

Most LHCb analyses rely on Particle Identification (PID) to determine the nature of the charged tracks reconstructed by the detector. The detectors used to identify these particles are the Rich detectors (Rich1 and Rich2), the muon system (Muon), the electromagnetic calorimeter (ECAL), and marginally the hadronic calorimeter (HCAL) [1]. As part of the candidate selection of physics analyses, PID requirements are applied by setting thresholds on variables built to summarize the contributions of the detectors listed above to the overall PID. These variables are grouped in two families: the Combined Differential Log-likelihood (DLL, defined for each particle species as the log-likelihood difference between that particle hypothesis and the pion hypothesis), and the response of an Artificial Neural Network (named PIDANN) [2].
(The network output is normalized between 0 and 1 and is therefore named ProbNN.) The PIDANN algorithm is tuned on simulated signal and background samples. Depending on the arrangement of the input samples, on the quality of the simulation, and on the available number of simulated events, the response of the PIDANN algorithm can vary. The algorithm that calculates the ProbNN variables has therefore been retuned several times. Every time the data is processed for a specific physics analysis, the input variables are read and the ProbNN variables are calculated using the latest tuning. Calibration samples provide pure samples of the five most common charged, long-lived particle species produced in LHCb (kaons, pions, protons, electrons and muons), used to measure the distributions of the PID variables for the five different particles. These distributions and their correlations are an essential ingredient to measure the efficiency and background rejection of the PID requirements applied as part of the physics analyses. In Section 2, we briefly introduce the PIDCalib package, the tool provided to the Collaboration to measure the efficiency of complex PID requirements. Section 3 summarizes the LHCb trigger structure during the second run of the LHC. The trigger configuration dictates analysis needs in terms of PID calibration samples, as discussed in Section 4. Section 5 is devoted to the implementation in the computing model that fulfills the requirements on the size and availability of the calibration samples, while Section 6 complements the discussion by defining the context in which the path for this special kind of data has been defined, including the allocated bandwidth and the overlap between different streams. Finally, Section 7 discusses the matching of the online and offline candidates used to build the full PID information for the calibration samples, the input of the PIDCalib package. Concluding remarks and outlook compose Section 8.

2 PIDCalib package

The PIDCalib package is the interface between the needs of analysts and the samples prepared by the PID group. It provides simple access to the Calibration samples through a set of scripts that allow different studies to be performed using the PID calibration samples as input. A detailed description of the PIDCalib project is available elsewhere [3]; here only a short introduction to the main use case is presented, to introduce naming conventions and concepts used throughout the discussion. The first step performed as part of the PIDCalib procedure is the statistical background subtraction of the calibration samples. This is achieved with the sPlot technique [4]. Most LHCb analyses that apply PID requirements as part of their candidate selection need a data-driven technique to estimate the efficiency of these requirements on their signal, and on possible backgrounds. The efficiency is measured by assigning to each i-th candidate in the calibration sample a weight w_i, chosen to ensure that the binned distributions of a simulated signal (or background) sample and of the weighted calibration sample are consistent. The considered distributions are usually the joint pdf of kinematic variables, such as momentum and pseudorapidity, and event-complexity observables, such as the number of tracks or the number of hits in the Scintillating Pad Detector (SPD). After reweighting, the efficiency of PID requirements on the calibration samples is the same (within systematic uncertainty) as in the analysed data sample. To ensure flexibility, the PIDCalib package allows selection requirements on different variables associated to the same track to be combined. This is essential to allow analysts to assess the efficiency of widely used requirements combining the DLL of different hypotheses (e.g. the request DLLp > 2 and DLLp - DLLK > 2 to select protons while rejecting pions and kaons), or even requirements combining DLL and ProbNN variables.

3 Trigger configuration for LHCb Run 2

In order to face the new challenges of the second run of the LHC, the LHCb trigger [5, 6] has evolved into a heterogeneous configuration with a different output data format for different groups of trigger selections. Here we focus on two alternative data formats for physics analyses, named Full stream and Turbo stream. Trigger selections (also called trigger lines) that are part of the Full stream are intended for precision measurements and searches. Although the software trigger fully reconstructs candidates, including vertexing and isolation quantities, these are not saved. If the trigger decision is affirmative, the raw event is saved together with some summary information on the trigger decision, named SelReport. The raw event is then reprocessed offline. Trigger lines writing to the Turbo stream are intended for analyses of very large samples where only the information related to the candidates is needed. Trigger lines that are part of the Turbo stream produce a decay candidate, for which a large number of detector-related variables is computed and stored in a summarized report (named TurboReport) together with the candidate itself. The TurboReport can be processed offline using the Tesla application [7], designed to process the information computed by the trigger, with the resulting output used to directly perform physics measurements. Any data processing that relies on the TurboReport is independent of the detector raw data, which is no longer part of the long-term computing model [8].
During the very first part of Run 2 (June-July 2015, named Early Measurement run), to aid the commissioning of the Turbo stream mechanism, the raw information was also saved together with the TurboReport, to allow comparison between data stored in the TurboReport and the equivalent data obtained after offline processing. The PID information saved in the SelReport (Full stream) is limited to variables commonly used in the trigger selection strategy. This includes the basic information from the Muon system [9] and the DLL variables. More PID information is stored in the TurboReport, including all the input variables used to compute the ProbNN discriminants. (The full lists, for both the 2015 and 2016 data-taking periods, are reported in the Appendix, Tables 2 and 3.)

4 Use cases for PID calibration

Analyses based on both the Full and the Turbo stream need to measure the efficiency of PID-based selection requirements. In the former case, these requirements may be applied offline and may also have been applied online, using the respective sets of computed PID observables. Potential differences between the online and offline calculated PID observables lead to the need for data samples usable to calibrate both. This section is devoted to the case-by-case consideration of the physics needs in terms of calibration samples, specifically: the case where no PID-based selection is applied in the trigger (4.1), the case of analyses using Turbo stream data (4.2), and the case where PID-based selections are applied both in the trigger and offline (4.3).

4.1 Analyses on Full stream with PID unbiased trigger

The highest precision analyses, which need to fully control the systematic uncertainties related to PID-based selections, generally do not introduce PID requirements as part of their trigger selection. The PID-based selection strategy relies on offline-reconstructed variables, which are available in both the calibration sample and the analysis sample. Where neural-network-based combinations of PID observables [2, 10] (ProbNN) are used, these may regularly be recomputed offline as a result of regular updates to their tunings. Therefore, it must be possible to compute ProbNN variables on demand both on analysis samples and on calibration samples. To allow this, all the variables that are inputs to the algorithms that calculate the ProbNN variables are reconstructed offline and stored in the output file (called (micro)DST, from the historical name Data Summary Tape) of both Calibration and Physics samples.

Specification: Calibration samples must include all the PID observables (see Tables 2 and 3 in the Appendix) as reconstructed online.
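The on-demand recomputation described above amounts to re-evaluating a stored tuning on the input variables persisted in the (micro)DST. The sketch below is purely illustrative: the function name, the single-neuron form, and all values and weights are invented for the example; the real PIDANN tunings are multilayer networks maintained inside the LHCb software.

```python
import math

# Toy single-neuron "tuning": names and numbers are invented for illustration;
# real PIDANN tunings are multilayer networks kept inside the LHCb software.
def probnn_proton_toy(inputs, weights, bias):
    """Evaluate a tuning on the per-track input variables stored in the (micro)DST."""
    z = bias + sum(w * inputs[name] for name, w in weights.items())
    return 1.0 / (1.0 + math.exp(-z))  # normalised to (0, 1), like ProbNN

# Per-track inputs (names follow Tables 2 and 3; the values are made up):
track = {"DLLp": 12.3, "DLLk": 4.1, "GHOST PROBABILITY": 0.02}
tuning_v2 = {"DLLp": 0.30, "DLLk": -0.10, "GHOST PROBABILITY": -5.0}

# Because the inputs themselves are persisted, a new tuning can be applied
# to already-recorded data without re-running the reconstruction.
p = probnn_proton_toy(track, tuning_v2, bias=-1.0)
assert 0.0 < p < 1.0
```

The point of the sketch is the data-flow design choice: persisting the inputs rather than only the outputs is what makes every future retuning applicable to old data.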
When PID performance is very important for an analysis, dedicated studies of the PID performance down to the detector level are often performed. When this is the case, assessing the PID efficiency of new algorithms on real data may require access to the detector raw data stored in the calibration samples. This includes:

- Rich and Calorimeter raw data (to study detector calibrations);
- Muon raw data (for custom association of hits to the muon candidate);
- Trigger raw data. This is particularly important because part of the PID information (namely that from the Calorimeter and Muon systems) is included in the earlier stages of the trigger, so care is required to ensure that the calibration samples are not biased by the trigger itself. A method, TisTos [5], has been implemented to allow physics analyses to factorise the PID and trigger contributions to the total efficiencies;
- Tracking raw data (for isolation studies).

Specification: Calibration samples must include Rich, Calorimeter, Muon, Trigger and Tracking raw data.

4.2 Analyses on Turbo stream datasets

For analyses based on the Turbo stream, the offline version of the PID observables does not exist. The Calibration samples have to provide the PID information as computed online, in order to assess the efficiency:

- of cuts applied in the Trigger line on online variables;
- of cuts applied offline on the PID variables stored in the TurboReport.

Since many trigger selections use requirements on ProbNN, it is important that the information needed to compute these variables is stored in the TurboReport. Once restored with Tesla, the online version of the ProbNN variables can be recomputed, together with different tunings of the same variables, which can be useful for analyses relying on the Turbo stream.

Specification: Calibration samples must include the input variables of the PIDANN algorithms as they are available online and stored in the TurboReport of analyses based on the Turbo stream.

4.3 Analyses on Full stream with PID requirements in the Trigger line

When PID-based selection requirements are applied as part of both the online and offline selections, a measurement of their efficiency requires the presence in the calibration sample of both the online and offline reconstructed PID observables, on a candidate-by-candidate basis. Consider, for example, a physics analysis selecting prompt Λc+ → pK−π+ decays. In order to decrease the accepted rate of the trigger selection without applying a downscale, the analysts decide to apply a loose DLL requirement on the proton (DLLp > 5). This is enough to reduce the trigger rate to an acceptable level, and has very high efficiency on signal. Since the requirement is very loose, the offline analysis relies on a further cut on the proton (ProbNNp > 0.2). The analysts then want to use the PIDCalib package to assess the efficiency of the PID requirement on the proton, namely

[DLLp]_online > 5 AND [ProbNNp]_offline > 0.2.

Because of the different algorithms used online and offline, this is different from the requirement

[DLLp]_online > 5 AND [ProbNNp]_online > 0.2,

and, of course, from

[DLLp]_offline > 5 AND [ProbNNp]_offline > 0.2.
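The consequence of the online-offline correlation for such combined requirements can be reproduced with a toy experiment: for two strongly correlated variables, the efficiency of a combined cut is not the product of the single-cut efficiencies. The variables below are correlated Gaussians standing in for the PID observables, and the correlation and cut values are invented for the illustration.

```python
import numpy as np

# Toy stand-ins for [DLLp]_online and [ProbNNp]_offline: correlated Gaussians.
rng = np.random.default_rng(0)
cov = [[1.0, 0.9], [0.9, 1.0]]
online, offline = rng.multivariate_normal([0.0, 0.0], cov, size=100_000).T

cut_online = online > 0.0    # stand-in for [DLLp]_online > 5
cut_offline = offline > 0.0  # stand-in for [ProbNNp]_offline > 0.2

eps_joint = np.mean(cut_online & cut_offline)
eps_product = np.mean(cut_online) * np.mean(cut_offline)

# For strongly correlated variables the joint efficiency does not factorise:
assert eps_joint > 1.3 * eps_product
```

With a correlation of 0.9 the joint efficiency is well above the naive product, which is why the calibration samples must allow mixed online/offline cuts to be evaluated per candidate rather than per variable.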
Furthermore, as a result of the strong correlation between the two variables, the online requirement on DLLp drastically modifies the distribution of ProbNNp; considering the efficiency ε, one can write that

ε([DLLp]_online > 5 AND [ProbNNp]_offline > 0.2) ≠ ε([DLLp]_online > 5) × ε([ProbNNp]_offline > 0.2).

It is therefore clear that the calibration samples must allow the evaluation of mixed requirements such as the one in the example above, on a candidate-by-candidate basis. This can be achieved by matching the candidates reconstructed online, and saved as part of the Turbo stream, to the candidates reconstructed offline. The easiest way to achieve this is to create a sample containing, for each event, both the online and offline candidates.

Specification: It must be possible to match online and offline candidates stored in the calibration samples, and for each candidate, the online and offline reconstructed PID observables must be stored.

4.4 Considerations about statistics of control samples

A key point for the calibration samples is that they be large enough to provide acceptable uncertainty in the determination of the selection efficiency in the binning scheme defined by the analysis needs. The challenge is to cover the wide range of kinematic variables used to measure the detector and PID performance. During Run 1, for a dataset corresponding to an integrated luminosity of 3 fb⁻¹, each calibration sample accounted for roughly 10M events. Despite the size of the samples, some analyses were already dominated by the systematic uncertainty on the PID selection efficiency (especially when considering protons). During Run 2, more and more results will face limitations due to systematic effects, and control samples are one of the handles to keep these effects under control. Moreover, samples need to be of a sufficient size to allow the study of correlated systematic effects.
It has been estimated that calibration samples one order of magnitude larger than those collected in Run 1, together with binned prescaling to improve the coverage of low-statistics regions (cf. [11]), should result in systematic uncertainties of an acceptable size. Roughly 100M events per particle type (p, K, π, µ, e) implies a total of 500M events for the calibration stream. This should be achievable by allocating 100 Hz of bandwidth to the trigger output dedicated to calibration samples during Run 2 operations. The different types of events that are of interest for the PID calibration sample are selected by a suite of trigger lines, which are appropriately prescaled to match the above requirements [11]. The analysis of data collected in 2015 allowed the pre-scale factors for the calibration samples to be determined for most of Run 2.

4.5 Considerations about online-offline convergence

In Run 2, the availability of a larger CPU budget in the HLT and optimized code allows the same reconstruction to be run online and offline. The online and offline tracking reconstructions show very small differences, making the offline quality available also in the online reconstruction. The reconstruction of the PID observables is the same online and offline; residual differences are still present due to a slightly different reconstruction of the Calorimeter information. Although efforts are in place to reduce this difference to a negligible level, having both the online and offline information stored in the calibration samples allows the PID performance to be measured safely. Even in the case of identical online and offline reconstruction, it makes sense to keep the online information, as this is what is used in the trigger selections. Future developments of the offline reconstruction, and subsequent improvements of the offline performance, could cause the offline reconstructed PID observables to change, with the loss of the values computed and used online if those are not retained. To avoid this, the online reconstructed PID observables are stored in the raw data.
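The sample sizes discussed in Sect. 4.4 can be related to the per-bin statistical precision through the usual binomial estimate δε = sqrt(ε(1-ε)/N). A minimal sketch, assuming a purely illustrative binning of 1000 kinematic bins and a 90% efficient cut (both numbers are invented for the example):

```python
import math

def eff_uncertainty(eps, n):
    """Binomial (statistical) uncertainty on an efficiency eps measured from n events."""
    return math.sqrt(eps * (1.0 - eps) / n)

# Hypothetical numbers: 1000 kinematic bins, a 90% efficient PID cut.
d_run1 = eff_uncertainty(0.90, 10_000_000 // 1000)    # Run 1 scale: 10M events/sample
d_run2 = eff_uncertainty(0.90, 100_000_000 // 1000)   # Run 2 target: 100M events/sample

# A tenfold larger sample shrinks the per-bin statistical uncertainty by sqrt(10):
assert abs(d_run1 / d_run2 - math.sqrt(10.0)) < 1e-9
```

The sqrt(10) gain applies bin by bin, which is why the tenfold increase is paired with binned prescaling: the prescales redistribute the bandwidth towards the bins where N, not ε, limits the precision.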
4.6 Summary of the specifications

The specifications discussed above are listed here for convenience:

1. Calibration samples must include all the PID observables (see Tables 2 and 3 in the Appendix) as reconstructed online.
2. Calibration samples must include the Rich, Calorimeter, Muon, Trigger and Tracking raw event.
3. Calibration samples must include the input variables of the PIDANN algorithms as they are available online and stored in the TurboReport of analyses based on the Turbo stream.
4. It must be possible to match online and offline candidates stored in the calibration samples, and for each candidate, the online and offline reconstructed PID observables must be stored.

As a first result, Calibration samples were available together with the first data collected in June and July 2015, allowing the presentation of physics results at the 2015 Summer Conferences [12, 13].

5 Calibration lines in Run 2

During the second run of the LHC, the calibration samples are selected by Hlt2 lines writing to a dedicated stream, TurboCalib, that contains the Turbo information and retains the detector raw data. Throughout Run 2, both the TurboReport and the raw event have to be stored in the TurboCalib stream. The TurboReport is processed offline using Tesla [7]. The Brunel reconstruction is run centrally, while no further offline processing, i.e. selections and streaming of events, is foreseen, because further selections in addition to those that select the events in the trigger would defeat the purpose of the sample. The creation of candidate decays is part of the ntuple production step, instead of a separate offline processing step. To produce the final samples needed by PIDCalib, three steps, described in a single configuration file, are needed:

[Figure 1: Schematic representation of the data flow for the Calibration samples. Note that there is no offline selection, since candidates are produced online and resurrected using Tesla. Daughter particles are matched to the offline candidates in the DaVinci step while producing ntuples. sTables contain the sWeights needed to statistically subtract the residual background following the sPlot technique [4].]

1. Production and selection of the decay candidates;
2. Matching of the online and offline candidates;
3. Output to ntuples, providing the input of the PIDCalib package.

An extension of the PIDCalib package has been implemented to support the production of MicroDST datasets including the sWeight of each calibration candidate, obtained from a binned likelihood fit of the signal-background discriminating distributions. These light-weight calibration samples will be available to the Collaboration to study the performance of ad-hoc tunings of the existing algorithms, or to study new algorithms, without rerunning the fit and sPlot parts of PIDCalib. The data flow is sketched in Figure 1.
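The fit and sPlot steps can be illustrated with a self-contained toy. This is not the PIDCalib code: PIDCalib uses a binned likelihood fit, whereas the sketch below fits the two yields of an unbinned toy "mass" distribution and then computes the sWeights of Ref. [4].

```python
import numpy as np

rng = np.random.default_rng(1)
# Toy "mass" sample: Gaussian signal (1000 events) on a flat background (2000 events).
x = np.concatenate([rng.normal(0.0, 1.0, 1000), rng.uniform(-5.0, 5.0, 2000)])
n_tot = len(x)

fs = np.exp(-0.5 * x**2) / np.sqrt(2.0 * np.pi)   # unit-normalised signal pdf
fb = np.full_like(x, 1.0 / 10.0)                  # unit-normalised flat pdf on [-5, 5]

# Extended maximum-likelihood fit of the signal yield: the log-likelihood is
# concave in ns (with nb = n_tot - ns), so bisection on its derivative suffices.
lo, hi = 1e-3, n_tot - 1e-3
for _ in range(200):
    mid = 0.5 * (lo + hi)
    d = mid * fs + (n_tot - mid) * fb
    lo, hi = (mid, hi) if np.sum((fs - fb) / d) > 0.0 else (lo, mid)
ns = 0.5 * (lo + hi)
nb = n_tot - ns

# sPlot [4]: Vinv_ij = sum_e f_i f_j / D^2 ;  w_s(e) = (V_ss fs + V_sb fb) / D
d = ns * fs + nb * fb
vinv = np.array([[np.sum(fs * fs / d**2), np.sum(fs * fb / d**2)],
                 [np.sum(fb * fs / d**2), np.sum(fb * fb / d**2)]])
v = np.linalg.inv(vinv)
w_sig = (v[0, 0] * fs + v[0, 1] * fb) / d
w_bkg = (v[1, 0] * fs + v[1, 1] * fb) / d

# At the maximum-likelihood solution the sWeights sum to the fitted yields
# and, event by event, the weights of the two species sum to one.
assert abs(np.sum(w_sig) - ns) < 1e-3
assert np.allclose(w_sig + w_bkg, 1.0, atol=1e-6)
```

These two closure properties are what make a w_sig-weighted histogram of any control variable behave as a statistically background-subtracted signal sample, which is exactly how the sWeighted ntuples are consumed downstream.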
6 Operations with calibration lines during Run 2

The needs that have to be fulfilled by the calibration samples were different in the Early Measurement (first part of 2015), validation (second part of 2015 data taking), and long Run 2 phases. Throughout 2015, the rate of the calibration samples (for both the PID and Tracking lines) was between 1.2 kHz (Early Measurement) and 0.6 kHz (validation). Data taken in 2015 allowed a first optimization of both the PID and Tracking lines, leading to a rate of 0.4 kHz in the 2016 configuration. The optimization will continue in 2016 with the aim of reaching a rate of 0.1 kHz for TurboCalib for most of Run 2. Besides the rate, another important aspect to be kept under control is the overlap between different streams, because it can badly affect the use of computing resources, in particular the storage. The details of these aspects depend on the specific configuration used. (The situation at the time of writing can be found in the Appendix, see Table 4.) As of this writing, the HLT provides the Full, Turbo and TurboCalib streams, defined in Sect. 3, together with ancillary ones, including Lumi, which provides compact information about the recorded luminosity. The Turbo stream goes through the Turbo and Turbo validation workflows, the latter only in 2015 for validation purposes. A schematic representation is given in Table 1.

Trigger      RB   Workflow           Offline          Raw data          File name
Full         87   Full               Brunel+DaVinci   RAW               FULL.DST
Turbo        88   Turbo              Tesla            RAW+TurboReport   TURBO.MDST
Turbo        88   Turbo validation   Brunel+Tesla     RAW+TurboReport   FULLTURBO.DST
TurboCalib   90   Calibration        Brunel+Tesla     RAW+TurboReport   FULLTURBO.DST

Table 1: Summary of processing schemes. Each stream is routed by a specific routing bit (RB). In 2015, for validation purposes, Turbo data have been duplicated as different outputs of the Turbo and Turbo validation workflows. In 2016 only the Turbo workflow is used.

From the beginning of Run 2 data taking, calibration data followed the Calibration workflow. The latter, schematically shown in the left part of Fig. 1, is meant to process TurboCalib data. Offline and online recreated candidates are saved separately (technically, by defining two different Transient Event Store (TES) locations in the same file), allowing for an easy matching of online and offline candidates (see 4.3). The same matching was needed during the validation of the Turbo stream reconstruction. As stated in Section 4.4, for PID calibration purposes a total trigger rate of about 100 Hz should be sufficient to guarantee the evaluation of the PID performance with the required precision. For very high yield Turbo analyses, this might not be the case. Taking advantage of the fact that only online reconstructed quantities are required to calibrate PID observables for Turbo analyses, additional selections could be added to the Turbo stream itself, where only the small TurboReport is retained, to provide the required additional calibration samples.

7 Matching the online and offline candidates

The online and offline candidates are stored separately within each processed event, so comparing their PID observables requires them to be matched. The matching can be done on the basis of the number of shared LHCbIDs (corresponding to physical hits/clusters in the detector), by exploiting the TisTos algorithm [5], or by combining the two techniques.
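A matching based on shared hits can be sketched as follows. The data layout (plain Python sets of integer LHCbIDs) and the 70% overlap threshold are assumptions for the illustration, not the LHCb implementation:

```python
# Assumed data layout for illustration: each track is a plain Python set of
# integer LHCbIDs (detector hits/clusters); threshold and names are invented.
def match_online_to_offline(online_hits, offline_tracks, min_fraction=0.7):
    """Return the key of the offline track sharing the largest fraction of the
    online candidate's LHCbIDs, or None if no track passes the threshold."""
    best_key, best_frac = None, 0.0
    for key, hits in offline_tracks.items():
        frac = len(online_hits & hits) / len(online_hits) if online_hits else 0.0
        if frac > best_frac:
            best_key, best_frac = key, frac
    return best_key if best_frac >= min_fraction else None

online = {1, 2, 3, 4, 5, 6, 7, 8, 9, 10}             # online probe track
offline = {"trk_A": {1, 2, 3, 4, 5, 6, 7, 8, 99},    # shares 8/10 hits
           "trk_B": {50, 51, 52}}                    # disjoint
assert match_online_to_offline(online, offline) == "trk_A"
```

Matching on hit overlap is robust against small differences between the online and offline track fits, since both fits consume (mostly) the same underlying detector hits.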
To build the samples, first the online candidate, as selected by the trigger, is written. Then the track of the probe particle is matched to a track reconstructed in the offline processing. The PID variables of the online and offline reconstructed candidates are written into two different locations and are available for the users to create combined requirements, as described in Section 4. The Tesla-Brunel matching algorithm is now part of the LHCb software.

8 Conclusion

In order to provide calibration samples for analyses relying on the Turbo and Full streams, their selection strategy has been implemented in the trigger. Selected events need to be reconstructed offline, and therefore the detector raw data must be retained. On the other hand, the full PID information, needed to reconstruct the ProbNN variables, can only be stored for candidates selected by a line in the Turbo stream. The simplest technical solution to satisfy these specifications is to write the Calibration samples to a dedicated stream, TurboCalib, containing both the TurboReport and the raw event. The stream is processed centrally, to restore the online candidates and, with the standard offline reconstruction, to produce the offline candidates, which are stored separately in each event of the output file. Online and offline candidates are then matched to produce ntuples containing, candidate by candidate, the online and offline PID variables. The TurboCalib stream, developed for the PID calibration samples, has already been used also for the Tracking calibration samples. It had a key role in the validation process of both the Turbo stream and the Tesla application. Finally, it is also a fundamental tool to develop a viable strategy for the calibration samples in the LHCb Upgrade.

9 Acknowledgements

We warmly thank our colleagues R. Aaij and P. Charpentier for the careful review of this text in the passage from Internal to Public note.

10 References

[1] LHCb Collaboration, The LHCb Detector at the LHC, J. Instrum. 3 (2008) S08005
[2] Chris Jones, ANN PID
[3] Sneha Malde, PIDCalib Packages
[4] M. Pivk and F. R. Le Diberder, sPlot: a statistical tool to unfold data distributions, arXiv:physics/
[5] R. Aaij et al., The LHCb Trigger and its Performance in 2011, JINST 8 (2013) P
[6] J. Albrecht et al. [LHCb HLT project], Performance of the LHCb High Level Trigger in 2012, J. Phys. Conf. Ser. 513 (2014)
[7] R. Aaij et al., Tesla: an application for real-time data analysis in High Energy Physics, arXiv:
[8] LHCb Collaboration, LHCb Trigger and Online Upgrade Technical Design Report, CERN-LHCC; LHCB-TDR-016
[9] LHCb MuonID group, Performance of the Muon Identification at LHCb, J. Instrum. 8 (2013) P10020; arXiv:
[10] LHCb Collaboration, LHCb Detector Performance, Int. J. Mod. Phys. A 30 (2015), arXiv:
[11] L. Anderlini, O. Lupton, B. Sciascia, V. Gligorov, Calibration samples for particle identification at LHCb in Run 2, LHCb-PUB; CERN-LHCb-PUB
[12] R. Aaij et al. [LHCb Collaboration], Measurement of forward J/ψ production cross-sections in pp collisions at √s = 13 TeV, JHEP 1510 (2015) 172, arXiv:
[13] R. Aaij et al. [LHCb Collaboration], Measurements of prompt charm production cross-sections in pp collisions at √s = 13 TeV, JHEP 1603 (2016) 159, arXiv:

A Appendix

Tracking: P, PT, CHI2NDOF, NDOF, LIKELIHOOD, GHOST PROBABILITY, FIT MATCH CHI2, CLONE DISTANCE, FIT VELO CHI2, FIT VELO NDOF, FIT T CHI2, FIT T NDOF
Rich: USED R1 GAS, USED R2 GAS, ABOVE MU THRESHOLD, ABOVE K THRESHOLD, DLLe, DLLmu, DLLk, DLLp, DLLbt
Calorimeters: Ecal PIDe, Ecal PIDmu, HCal PIDe, HCal PIDmu, Prs PIDe, InAccBrem, BremPIDe
Muon: Background Likelihood, Muon Likelihood, IsMuon, nshared, InMuon Acceptance, IsLooseMuon
Velo: VeloCharge

Table 2: List of PID-related variables that are inputs to the PIDANN algorithms (Run 1 version) and stored in the TurboReport.

Tracking: P, PT, CHI2NDOF, NDOF, GHOST PROBABILITY, FIT MATCH CHI2, FIT VELO CHI2, FIT VELO NDOF, FIT T CHI2, FIT T NDOF
Rich: USED R1 GAS, USED R2 GAS, ABOVE MU THRESHOLD, ABOVE K THRESHOLD, DLLe, DLLmu, DLLk, DLLp, DLLbt
Calorimeters: Ecal PIDe, Ecal PIDmu, HCal PIDe, HCal PIDmu, Prs PIDe, InAccBrem, BremPIDe
Muon: Background Likelihood, Muon Likelihood, IsMuon, nshared, InMuon Acceptance, IsLooseMuon

Table 3: List of PID-related variables that are inputs to the PIDANN algorithms (Run 2 version) and stored in the TurboReport.

[Table 4: Overlap between different samples; each cell is normalised to the rate expected from each sample (shown in the second column). Rows and columns cover the ALL, CharmFull, CharmTurbo, Topo, Leptons, TurboCalib, EW, LowMult, Other and Technical samples.]


More information

High Level Trigger System for the LHC ALICE Experiment

High Level Trigger System for the LHC ALICE Experiment High Level Trigger System for the LHC ALICE Experiment H Helstrup 1, J Lien 1, V Lindenstruth 2,DRöhrich 3, B Skaali 4, T Steinbeck 2, K Ullaland 3, A Vestbø 3, and A Wiebalck 2 for the ALICE Collaboration

More information

The Belle II Software From Detector Signals to Physics Results

The Belle II Software From Detector Signals to Physics Results The Belle II Software From Detector Signals to Physics Results LMU Munich INSTR17 2017-02-28 Belle II @ SuperKEKB B, charm, τ physics 40 higher luminosity than KEKB Aim: 50 times more data than Belle Significantly

More information

Machine Learning in Data Quality Monitoring

Machine Learning in Data Quality Monitoring CERN openlab workshop on Machine Learning and Data Analytics April 27 th, 2017 Machine Learning in Data Quality Monitoring a point of view Goal Maximize the best Quality Data for physics analysis Data

More information

ALICE tracking system

ALICE tracking system ALICE tracking system Marian Ivanov, GSI Darmstadt, on behalf of the ALICE Collaboration Third International Workshop for Future Challenges in Tracking and Trigger Concepts 1 Outlook Detector description

More information