The MICE Physics Computing Capabilities


MICE-NOTE-COMP-0439

C. T. Rogers, D. Rajaram

Abstract

In this note we present the MICE simulation, online and offline analysis capabilities, and detail the framework that is under construction in preparation for MICE data taking.

1 The MICE Computing Project

Development of the computing aspects of MICE [1] is described by the MICE computing work package. MICE has a broad spectrum of computing tasks, ranging from reconstruction and detector readout to web services and configuration management tools.

Fig. 1: Top-level work breakdown structure for the software and computing work package.

The package is broken down into four areas, as shown in Fig. 1. The tasks cover the following areas:

Offline: Monte Carlo simulation and reconstruction of the MICE experiment.

Online: computing aspects of the control room and readout of the MICE detector electronics.

Controls and Monitoring: slow control and monitoring of the experimental equipment and environment.

Infrastructure: configuration management and batch production of Monte Carlo and reconstruction data, data curation tools, MICE web services and mailing lists.

In this note the Monte Carlo simulation and reconstruction of the data are described, along with the infrastructure that is used to drive them.

Fig. 2: Data flow for the MICE computing project. Items in grey are external to the computing work package.

1.1 MICE Computing Process

A diagram describing the processing of the MICE data is shown in Fig. 2. In principle, the Analysis group run the Monte Carlo and use it to design a set of experimental settings, which are passed to operations; the Operations group take some data; slow controls are passed through the Controls and Monitoring infrastructure (CAM), with some configuration information stored; and detector readout is handled by the Data Acquisition (DAQ) system, which is used both for calibration and physics data. The detectors are reconstructed and the reconstruction output is passed to the Analysis group for further processing. At each stage of the process, the computing group seeks to provide an audit trail, enabling users to understand which version of which software has handled any given data, and to reprocess the data if necessary.

1.2 Code Hosting and Data Policy

All code in MICE is hosted on Launchpad [2], using the Bazaar distributed version control system [3] to manage version control. All code developed by MICE is open source, typically licensed under the GNU General Public Licence. MICE data is open and available on the web, in line with STFC data policies. There is no embargo on MICE data, but external papers must acknowledge the MICE collaboration. Papers published by collaboration members must use validated production datasets.

Fig. 3: Work breakdown structure for the infrastructure subproject.

2 Infrastructure

The work breakdown structure of the MICE computing infrastructure is shown in Fig. 3. The structure is split into three parts: Configuration Management, GRID Services and Web Services.

2.1 Configuration Management

The configuration management tools consist of a Configuration Filestore for hosting raw configuration data, and a Configuration Database for storing production configuration constants.

The Configuration Filestore hosts data such as detector calibration data, field mapping data and surveys. It is intended to provide an area for medium-term storage of raw configuration data, enabling experts to review configuration data as part of an analysis. It is anticipated that, for example, data analysis may reveal shortcomings in a calibration; the Configuration Filestore provides an area to store calibration data for subsequent re-calibration. Long-term storage is provided by GRID services.

The Configuration Database stores configuration information. Both nominal settings and as-run configuration data are stored; the data include geometries, calibrations and experimental settings. As-run data is recorded with a validity time range, and multiple revisions can be stored against a given validity time range to enable correction of erroneous data. The interface to the Configuration Database is provided by a web service layer (WSDL) running on Tomcat, enabling read-only access over HTTP. Write access is available only to selected servers within the MICE Local Control Room (MLCR), also over HTTP. Authentication is by IP address.

2.2 GRID Services

MICE currently makes use of the large-scale distributed-computing facilities made available by the GridPP Collaboration [4] in the UK. MICE carries out two principal activities on the Grid: long-term data curation, and batch execution of Monte Carlo and reconstruction.

Four classes of data are stored in the MICE data storage: raw data, reconstructed data, simulation data, and miscellaneous data. For long-term archival we use the Castor tape repository at the RAL Tier 1. Data files are replicated to storage nodes at several GridPP Tier 2 sites including Imperial College, whose data store is publicly readable via the web. Raw data is moved onto the Grid from the MLCR in two steps: a data packer script gathers and creates checksums of the raw data and online reconstruction outputs, and an autonomous Data Mover process uploads the resulting tarballs to our long-term storage (a sketch of the packing step appears at the end of this section).

The reconstruction and Monte Carlo process is handled by a set of batch production agents submitting jobs to selected GridPP Tier 2 sites (over 3000 CPUs available to MICE). Control variables, calibrations and geometries are accessed from the Configuration Database. Three routes are envisaged for data production: Monte Carlo data is produced on user request, without reference to the raw data; batch reconstruction data is produced against a given set of raw data runs and reconstruction control variables; and fast reconstruction is produced as raw data is moved onto the Grid, to allow fast turnaround of physics analysis. The reconstructed data always has an associated run-specific Monte Carlo dataset.

The Miscellaneous Data activity covers the archival to tape of a wide variety of material from the construction, testing, and calibration of various subsystems (e.g. QA data, field maps, test beam and cosmics data). This is stored by an independent, manually-run set of scripts.
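The packing step amounts to collecting the files for a run, checksumming them and producing a tarball for upload. The following minimal Python sketch illustrates the idea; the file layout, tarball naming and use of MD5 checksums are assumptions for illustration, not the actual MICE data packer.

```python
import hashlib
import tarfile
from pathlib import Path

def md5sum(path, chunk_size=1 << 20):
    """Compute the MD5 checksum of a file, reading in chunks."""
    digest = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def pack_run(run_dir, output_tarball):
    """Gather raw data files, write a checksum manifest, and tar everything up.

    run_dir:        directory holding the raw data and online reconstruction
                    outputs for one run (assumed layout).
    output_tarball: path of the tarball handed to the Data Mover.
    """
    run_dir = Path(run_dir)
    manifest = run_dir / "checksums.md5"
    with open(manifest, "w") as out:
        for path in sorted(run_dir.glob("*.dat")):
            out.write(f"{md5sum(path)}  {path.name}\n")
    with tarfile.open(output_tarball, "w:gz") as tar:
        tar.add(run_dir, arcname=run_dir.name)

# Example (hypothetical run directory): pack_run("/data/run07469", "run07469.tar.gz")
```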

3 MAUS

MAUS (MICE Analysis User Software) [5] is MICE's simulation, reconstruction, and analysis software framework. MAUS aims to provide the capabilities to perform a Monte Carlo simulation of the experiment, to reconstruct tracks and identify particles from simulations and real data, and to provide monitoring and diagnostics while running the experiment. The work breakdown structure for MAUS is shown in Fig. 4.

3.1 Framework

MAUS is written in a mixture of Python and C++. C++ is used for complex or low-level algorithms where processing time is important; Python is used for simple or high-level algorithms where development time is the more stringent requirement. A Python user interface is provided.

Installation is by a set of batch scripts. The MAUS installation is relatively straightforward, but MAUS has a number of dependencies which must also be built. MAUS developers support Scientific Linux 5 and 6, although MAUS is regularly built successfully on the Ubuntu, CentOS and OpenSUSE Linux distributions.

MAUS has an Application Programmer Interface (API) that provides a framework on which developers can hang individual routines. Four types of modules are available, as sketched below. Inputters generate data, for example by reading data from disk or generating an input beam. Mappers modify the data in some way, for example by tracking primary particles to generate Monte Carlo data; after the beginning of the run, mappers can have no internal state, in order to facilitate distributed processing. Most of the Monte Carlo and reconstruction routines are mappers. Reducers collate the data in some way, for example by summing data to create a histogram; reducers can have internal state. Outputters save the data in some way, for example providing data over a socket or writing data to disk.
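The following Python sketch illustrates the four module roles and how a framework might chain them over spills. The class and method names are illustrative only; they are not the actual MAUS API.

```python
class BeamInputter:
    """Inputter: generates spills, here as plain dictionaries."""
    def emit(self, n_spills):
        for i in range(n_spills):
            yield {"spill_number": i, "digits": [2 * i, 2 * i + 1]}

class DoubleDigitsMapper:
    """Mapper: transforms one spill at a time and keeps no internal state,
    so spills can be processed in any order on any worker."""
    def process(self, spill):
        spill["digits"] = [2 * d for d in spill["digits"]]
        return spill

class SumReducer:
    """Reducer: collates across spills, so it may hold state."""
    def __init__(self):
        self.total = 0
    def process(self, spill):
        self.total += sum(spill["digits"])
        return spill

class PrintOutputter:
    """Outputter: persists or forwards each spill."""
    def save(self, spill):
        print(spill)

# A minimal "framework" loop chaining the four module types.
inputter, mapper, reducer, outputter = (
    BeamInputter(), DoubleDigitsMapper(), SumReducer(), PrintOutputter())
for spill in inputter.emit(3):
    outputter.save(reducer.process(mapper.process(spill)))
print("reduced total:", reducer.total)
```

Keeping mappers stateless is what allows the same modules to be fanned out across many processes in the online reconstruction described in Section 5.3.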

Fig. 4: Work breakdown structure for the offline subproject.

MAUS has six event types that can be passed between modules. The principal event type is the spill; the majority of physics data is contained within the spill object, and MAUS reconstructs data on a spill-by-spill basis, where a spill corresponds to the data associated with one dip of the MICE target. Additionally, MAUS has a JobHeader and JobFooter, which contain metadata associated with a single execution of the code, and a RunHeader and RunFooter, which contain metadata associated with a single MAUS data run. Finally, MAUS has an Image datatype which contains image data for display by the online routines. Routines are available to represent the data either as a binary ROOT object or as an ASCII JSON object.
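As an illustration of the ASCII representation, a spill serialised as JSON can be inspected with standard tools. The field names here are invented for the example, not the real MAUS data structure.

```python
import json

# A toy spill in its ASCII (JSON) representation; field names are
# illustrative, not the actual MAUS schema.
spill_json = '{"spill_number": 42, "daq_event_type": "physics_event", "recon_events": []}'

spill = json.loads(spill_json)      # parse into a Python dictionary
print(spill["spill_number"])        # -> 42
print(json.dumps(spill, indent=2))  # pretty-print it back out
```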

3.2 Testing

MAUS has a set of tests at the unit level and the integration level. Unit tests are implemented against a single function, while integration tests operate against a complete workflow. Unit tests check that each function operates as intended by the developer, and can achieve a high level of coverage and good test complexity. Integration tests check that the overall design of the code meets the specifications laid out, and that interfaces with external codes or systems operate correctly.

The MAUS team aims to provide unit test coverage that executes % of the total code base. This level of test coverage typically results in code that performs the major workflows without problems, but has errors in some of the less well-used functions and can behave ungracefully following user error. At the most recent release, MAUS-v0.8.2, test coverage was 68% for Python code, 78% for non-legacy C++ code and 36% for legacy (pre-2011) C++ code.

MAUS operates a continuous integration stack using a pair of test servers that mimic an online (control room) and an offline environment. Build and test are driven by the Jenkins test environment [6]. Developers are asked to perform a build and test on a personal code branch, using the test server, before integrating with the development trunk. This enables MAUS to make frequent clean releases; typically MAUS works on a 2-4 week release cycle.

3.3 Documentation

The MAUS installation, development and deployment workflows are documented on the MAUS wiki [5]. Documentation on individual functions is provided as inline comments. It is the aim of the development team that each function should have a comment describing the expected inputs, outputs and any pertinent details of the implementation. The comments are parsed by doxygen into HTML format that is published on the MAUS wiki. User-level documentation is provided by a LaTeX document that is included with the MAUS source code and also published on the MAUS wiki.

3.4 Data Flow

The data flow of the Monte Carlo and reconstruction algorithms is shown in Fig. 5. The data flow is grouped into three principal areas: the Monte Carlo data flow is used to generate digits (electronics signals) from tracking; the reconstruction data flow is used to generate digits from raw data; and the digits are converted to physics parameters of interest by the reconstruction.

4 Monte Carlo

A Monte Carlo simulation of MICE encompasses beam generation, a geometrical description of the detectors and fields, tracking of particles through the detectors, and digitization of the detectors' response to particle interactions.

4.1 Beam Generation

The simulation within MAUS starts from the upstream end of the D2 magnet. The composition and phase space description of beam particles can be specified with datacards, or can be read from an external file in G4Beamline [7], ICOOL [8] and user-defined formats. Efforts are underway to use G4Beamline to generate a realistic beam of particles tracked through the beamline upstream of D2. The G4Beamline beam generation is being validated against data from Step I, after which the generator will be integrated with MAUS. This will enable users to generate beam particles using G4Beamline and then simulate them with MAUS.

4.2 Geometry

There has been considerable progress in integrating a CAD-based geometry description into MAUS. A CAD model of the MICE elements in their surveyed positions is converted to Geometry Description Markup Language (GDML) files, which are then stored in the CDB. A user wanting to simulate MICE retrieves a geometry from the CDB by specifying its unique ID, asking for the geometry valid at a specific time, or asking for the geometry valid for a specific data run. Once the geometry is downloaded, the positions of the survey nests on the various detectors are fit to their surveyed positions and the resulting geometry is used in the simulation. CAD-based geometries are available in MAUS for Step I and idealized Step IV configurations. Verification by the beamline and detector groups and validation against Step I data are ongoing. Until the validation is complete, MAUS continues to allow the use of legacy geometry descriptions through flat text files.

4.3 Tracking, Field Maps and Beam Optics

MAUS tracking is performed by GEANT4 [9]. By default, MAUS uses 4th-order Runge-Kutta for tracking, although other routines are available. 4th-order Runge-Kutta has been shown to have very good precision relative to the MICE detector resolutions, even for step sizes of several cm.

Magnetic field maps are implemented as a series of overlapping regions, each of which contains a field. On each tracking step, MAUS iterates over the list of fields, transforms to the local coordinate system of each field map, and calculates the field. The field values are transformed back into the global coordinate system, summed and passed to GEANT4 (a toy version of this step-and-sum loop follows below).
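To make the step-and-sum pattern concrete, here is a toy Python sketch of 4th-order Runge-Kutta tracking through summed overlapping field regions. It is deliberately non-relativistic, and the field shapes and constants are invented; in MAUS the real stepping is delegated to GEANT4.

```python
import numpy as np

# Toy overlapping field regions: each returns its contribution to B (tesla)
# at a global position, or zero outside its bounds. Purely illustrative.
def solenoid_region(x):
    return np.array([0.0, 0.0, 4.0]) if abs(x[2]) < 1.0 else np.zeros(3)

def fringe_region(x):
    return np.array([0.0, 0.1, 0.0]) if 0.5 < x[2] < 1.5 else np.zeros(3)

FIELD_REGIONS = [solenoid_region, fringe_region]

def total_field(x):
    """Sum the contributions of all overlapping field regions."""
    return sum(region(x) for region in FIELD_REGIONS)

def derivative(state, q_over_m):
    """d(state)/dt for state = (position, velocity), Lorentz force only."""
    x, v = state[:3], state[3:]
    return np.concatenate([v, q_over_m * np.cross(v, total_field(x))])

def rk4_step(state, dt, q_over_m):
    """One classical 4th-order Runge-Kutta step."""
    k1 = derivative(state, q_over_m)
    k2 = derivative(state + 0.5 * dt * k1, q_over_m)
    k3 = derivative(state + 0.5 * dt * k2, q_over_m)
    k4 = derivative(state + dt * k3, q_over_m)
    return state + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

# Track a particle for a few steps (positions in m, velocities in m/s).
state = np.array([0.0, 0.01, -0.5, 0.0, 0.0, 2.0e8])
for _ in range(5):
    state = rk4_step(state, 1.0e-10, q_over_m=8.5e8)  # approx. muon q/m, C/kg
print(state[:3])
```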

Numerous field types have been implemented within the MAUS framework. Solenoid fields can be calculated numerically from cylindrically symmetric 2D field maps, by taking derivatives of an on-axis solenoidal field, or by summing the fields of a set of cylindrical current sheets. Pillbox fields can be calculated by using the Bessel functions appropriate for a TM010 cavity or by reading a cylindrically symmetric field map. Multipole fields can be calculated from a 3D field map, or by taking derivatives from the usual multipole expansion formulae. Linear, quadratic and cubic interpolation routines have been implemented for field maps.

Matrix transport routines for propagating particles and beams through these field maps have been implemented. Transport matrices are calculated by taking the numerical derivative of tracking output. These can be used to transport beam ellipses and single particles, for example enabling optics work, beam matching and so forth. Higher-order transport routines are also available. The accelerator modelling routines in MAUS have been validated against ICOOL and G4Beamline, and have been used to model a number of beamlines and rings, including the Neutrino Factory front end.

4.4 Detector Response and Digitization

The modelling of the detector response and electronics enables MAUS to provide test data for the reconstruction algorithms and to estimate the errors introduced by the detector and its readout. The interaction of particles with material is also performed by GEANT4. A sensitive detector class for each detector processes the hits in active volumes and stores the relevant hit information, such as the volume that was hit, the energy deposited and the time of the hit. The digitizers then simulate the detector response to these hits, modelling processes such as the photon yield of scintillator light, attenuation in light guides and the pulse shape in the electronics (a toy digitizer is sketched at the end of this subsection). The data structure of the outputs from the digitizers is designed to mock the output from the unpacking of the data from the DAQ.

At the moment, detector descriptions and digitizations have been implemented for the TOF, Tracker and KL detectors. The Monte Carlo for the TOFs and Trackers has been available for some time; a recent change to the Tracker MC has been the addition of noise hits. The KL simulation and digitization is a recent addition and is currently being validated against Step I data. Simulation of the EMR is not yet in the production release of MAUS; however, the simulation software is advancing, and the EMR group is verifying the description of the detector geometry and validating the simulation against data from cosmic rays and from the recently concluded EMR run. The description of the Cherenkov is being optimized, and the simulation and digitization of optical photons are under development.
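The following toy sketch shows the flavour of such a digitization: converting an energy deposit into a smeared photoelectron count after attenuation, and then into an ADC value. The yields, attenuation length and ADC conversion are invented numbers, not the MICE calibrations.

```python
import math
import random

# Invented constants for illustration only; not the MICE calibrations.
PHOTONS_PER_MEV = 2000.0       # scintillation photon yield
ATTENUATION_LENGTH_CM = 140.0  # light-guide attenuation length
COLLECTION_EFFICIENCY = 0.01   # fraction of photons reaching the photocathode
ADC_PER_PE = 4.0               # ADC counts per photoelectron

def digitize_hit(energy_dep_mev, distance_to_pmt_cm):
    """Turn a GEANT4-style energy deposit into a toy ADC count."""
    n_photons = energy_dep_mev * PHOTONS_PER_MEV
    n_photons *= math.exp(-distance_to_pmt_cm / ATTENUATION_LENGTH_CM)
    mean_pe = n_photons * COLLECTION_EFFICIENCY
    # Gaussian approximation to Poisson photoelectron statistics.
    n_pe = max(0.0, random.gauss(mean_pe, math.sqrt(mean_pe)))
    return int(round(n_pe * ADC_PER_PE))

print(digitize_hit(energy_dep_mev=2.0, distance_to_pmt_cm=60.0))
```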

4.5 Plans

The priority is to validate the CAD-based geometry implementation and study its speed and performance under a full simulation. Simulations of the Cherenkov and EMR detectors are being tested before merging into the production software. The Monte Carlo framework currently does not have a trigger; work has started on developing the framework and algorithm for a realistic trigger simulation, in order to model the effects of pile-up.

5 Reconstruction

The reconstruction chain takes as its input either digitized hits from the Monte Carlo or DAQ digits from real data. Regardless, the detector reconstruction algorithms, by requirement and design, operate the same way on both MC and data.

5.1 Detector Reconstruction

MAUS is currently capable of performing at least some reconstruction on all of the MICE detectors.

The TOF reconstruction has been stable for several years. It has three steps: PMT hits are found; matching PMT hits on each end of a scintillator slab are associated to make slab hits; and matching slab hits are combined to make a space point (a toy sketch of this slab-matching pattern appears later in this subsection). Calibration constants are stored in the Configuration Database and applied during space point reconstruction. The TOF has been shown to provide good time resolutions, with errors at the 50 ps level. Improvements are being made to the calibration algorithms to improve the reconstructed resolutions and to examine some suspected systematic effects.

The Tracker reconstruction has undergone a refactoring process over several years; this process is now nearly complete. Hits from adjacent fiber channels are used to form clusters. Clusters in different planes within a station are then matched to form (x, y) spacepoints. The pattern recognition algorithm then attempts to fit tracks to the spacepoints, and the final track fit is performed using a Kalman filter to account for energy loss and multiple scattering. In the case of Monte Carlo data, noise hits are added before the spacepoint reconstruction stage. Studies of spacepoint and track reconstruction efficiencies have begun and will feed back into optimization of the reconstruction.

The Ckov reconstruction takes the raw flash-ADC data, subtracts pedestals, calculates the charge and applies calibrations to determine the photoelectron yield. Work is ongoing to optimize the calibration corrections and understand the efficiencies.

Hit-level reconstruction of the KL is also implemented in MAUS. Individual PMT hits are unpacked from the DAQ or simulated from the Monte Carlo, and the reconstruction associates them to identify the slabs that were hit and calculates the charge and charge-product corresponding to each slab hit. Further reconstruction will be performed at the global level.
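A toy Python version of the TOF-style hit, slab hit and space point chain described above; the hit fields, coincidence window and slab naming are invented for illustration, not the MAUS reconstruction code.

```python
# Toy PMT hits: (slab_id, end, time_ns). Fields and values are invented.
pmt_hits = [
    ("x3", 0, 21.4), ("x3", 1, 21.9),   # two ends of horizontal slab x3
    ("y5", 0, 21.6), ("y5", 1, 21.5),   # two ends of vertical slab y5
    ("x7", 0, 40.0),                    # unmatched hit, dropped
]
COINCIDENCE_NS = 2.0

def make_slab_hits(hits):
    """Pair hits on opposite ends of the same slab within a time window."""
    by_slab = {}
    for slab, end, t in hits:
        by_slab.setdefault(slab, {})[end] = t
    slab_hits = {}
    for slab, ends in by_slab.items():
        if 0 in ends and 1 in ends and abs(ends[0] - ends[1]) < COINCIDENCE_NS:
            slab_hits[slab] = 0.5 * (ends[0] + ends[1])  # mean slab time
    return slab_hits

def make_space_points(slab_hits):
    """Cross matching horizontal (x) and vertical (y) slab hits."""
    points = []
    for sx, tx in slab_hits.items():
        for sy, ty in slab_hits.items():
            if sx.startswith("x") and sy.startswith("y") \
                    and abs(tx - ty) < COINCIDENCE_NS:
                points.append((sx, sy, 0.5 * (tx + ty)))
    return points

print(make_space_points(make_slab_hits(pmt_hits)))
```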

Hit-level reconstruction of the EMR is now implemented in MAUS; the integrated ADC and time over threshold are calculated for each bar that was hit. The next steps are to use the bar hits to reconstruct tracks and to calculate the range and energy; these higher-level reconstructions are currently being developed.

5.2 Global Reconstruction

The aim of the global reconstruction is to take the reconstructed outputs from the individual detectors and tie them together to form a global track, providing a likelihood for various particle hypotheses. The global track reconstruction has been under development for several years, but a lack of manpower with the appropriate level of seniority has hindered development. The data structure to hold global parameters has been implemented, and work has started on importing space points from the TOF and reconstructed tracks from the Tracker, then propagating them and forming global tracks. The software group recognizes that this is a critical item, essential for Step IV reconstruction, and a detailed work plan is being developed to monitor progress and allocate resources where necessary to ensure its timely completion.

Particle identification in MICE typically requires the combination of several detectors. Principally, the time-of-flight between TOF detectors can be used to calculate the velocity β, which is compared with the momentum p measured in the tracker to calculate the particle mass, m = (p/c)√(1/β² − 1), and hence the particle type. Additional information can be gleaned from the Ckov, KL and EMR detectors. The global particle identification framework is designed to tie this disparate information into a set of hypotheses of particle types, with an estimate of the likelihood of each hypothesis. The framework for this is now in place, and estimators drawing information from each detector are under development.

5.3 Online Reconstruction

During data taking it is essential to visualize the detectors' performance and to have diagnostic tools to identify and debug unexpected behaviour. This is accomplished through summary histograms of high- and low-level reconstructions from the detectors. These are available for the TOF, Cherenkov, KL and Trackers; EMR displays were developed during the run in 2013, and integration with the MAUS framework is underway.

For online reconstruction, MAUS uses a distributed processing model to enable scalable reconstruction of the MICE dataset. Raw data is passed to a networked message queue for multiprocessing across multiple CPUs and servers. Reconstructed data is handed to a second message queue; histogramming routines pick data from this second queue and collate it into histograms, which are written to disk. A web-based visualisation tool enables viewing of the histograms. A production version of the online reconstruction is available; further development work is underway on the user interface and some of the backend infrastructure. Though the framework for the online reconstruction is based on distributed processing of spills, the reconstruction modules are the same as those used for offline processing (a minimal sketch of this spill-level fan-out appears below).
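A minimal Python sketch of the fan-out pattern just described, using the standard library's multiprocessing queues in place of the networked message queue; the spill content and the "reconstruction" are placeholders.

```python
import multiprocessing as mp

def reconstruct(raw_queue, recon_queue):
    """Worker: pull raw spills, run the (placeholder) reconstruction, and
    push results to the second queue. Stateless, so many can run at once."""
    while True:
        spill = raw_queue.get()
        if spill is None:          # sentinel: no more spills
            break
        spill["tof_time_ns"] = spill["raw_adc"] * 0.025  # placeholder recon
        recon_queue.put(spill)

def histogram(recon_queue, n_spills):
    """Collate reconstructed spills, mimicking the histogramming stage."""
    counts = {}
    for _ in range(n_spills):
        spill = recon_queue.get()
        bin_ = int(spill["tof_time_ns"])
        counts[bin_] = counts.get(bin_, 0) + 1
    print("histogram bins:", counts)

if __name__ == "__main__":
    raw_q, recon_q = mp.Queue(), mp.Queue()
    workers = [mp.Process(target=reconstruct, args=(raw_q, recon_q))
               for _ in range(4)]
    for w in workers:
        w.start()
    spills = [{"spill_number": i, "raw_adc": 1000 + 10 * i} for i in range(20)]
    for spill in spills:
        raw_q.put(spill)
    for _ in workers:
        raw_q.put(None)            # one sentinel per worker
    histogram(recon_q, len(spills))
    for w in workers:
        w.join()
```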

An event display summarising the global reconstruction data is also under development. This will enable visualisation of the phase space distribution of the beam at various points along the beamline, together with a comparison against the nominal beam envelope propagation. The event display is intended to enable online validation of the behaviour of the accelerator components, by comparing the propagation of the beam envelope with the detected beam parameters.

5.4 Plans

The remaining piece in the detector reconstruction chain is the EMR, and it is anticipated that it will be available in MAUS within the next three months. That will give MAUS the capability to reconstruct every Step IV detector individually. The highest-priority task for Step IV is the global reconstruction and particle identification.

References

[1] G. Gregoire et al., "Proposal to the Rutherford Appleton Laboratory: An International Muon Ionization Cooling Experiment (MICE)", Tech. rep. (2003), MICE0021/MICE0021.pdf

[2] Launchpad code hosting, https://launchpad.net/

[3] Bazaar distributed version control system, http://bazaar.canonical.com/

[4] The GridPP Collaboration, http://www.gridpp.ac.uk/

[5] C. D. Tunnell and C. T. Rogers, "MAUS: MICE Analysis User Software", IPAC (2011)

[6] Jenkins continuous integration, https://jenkins-ci.org/

[7] G4beamline, Muons, Inc.

[8] R. C. Fernow, "ICOOL: A simulation code for ionization cooling of muon beams", Proc. Particle Accelerator Conference, New York (1999)

[9] S. Agostinelli et al., "GEANT4 - A Simulation Toolkit", Nucl. Instrum. Meth. A 506 (2003) 250

Fig. 5: Data flow for the MAUS project. Items shown in black have some production implementation, although not necessarily the final one. Those shown in grey are under development.
