
MICE-NOTE-COMP-0439

The MICE Physics Computing Capabilities

C. T. Rogers, D. Rajaram

Abstract

In this note we present the MICE simulation, online and offline analysis capabilities, and detail the framework that is under construction in preparation for MICE data taking.

1 The MICE Computing Project

Development of the computing aspects of MICE [1] is described by the MICE computing work package. MICE has a broad spectrum of computing tasks, ranging from reconstruction and detector readout to web services and configuration management tools.

Fig. 1: Top-level work breakdown structure for the software and computing work package.

The package is broken down into four areas, as shown in Fig. 1. The tasks cover the following areas:

Offline: Monte Carlo simulation and reconstruction of the MICE experiment.

Online: computing aspects of the control room and readout of the MICE detector electronics.

Controls and Monitoring: slow control and monitoring of the experimental equipment and environment.

Infrastructure: configuration management and batch production of Monte Carlo and reconstruction data, data curation tools, MICE web services and mailing lists.

In this note the Monte Carlo simulation and reconstruction of the data are described, along with the infrastructure that is used to drive them.

Fig. 2: Data flow for the MICE computing project. Items in grey are external to the computing work package.

1.1 MICE Computing Process

A diagram describing the processing of the MICE data is shown in Fig. 2. In principle, the Analysis group run the Monte Carlo and use it to design a set of experimental settings, which are passed to operations; the Operations group take some data; slow controls are passed through the Controls and Monitoring infrastructure (CAM), with some configuration information stored; and detector readout is handled by the Data Acquisition (DAQ) system, which is used both for calibration and physics data. The detectors are reconstructed and the reconstruction output is passed to the Analysis group for further processing. At each stage of the process, the computing group seeks to provide an audit trail, enabling users to understand which version of which software handled any data, and to reprocess the data if necessary.

1.2 Code Hosting and Data Policy

All code in MICE is hosted on Launchpad [2], using the Bazaar distributed version control system [3] to manage version control. All code developed by MICE is open source, typically licensed under the GNU General Public Licence. MICE data is open and available on the web, in line with STFC data policies. There is no embargo on MICE data, but external papers must acknowledge the MICE collaboration. Papers published by collaboration members must use validated production datasets.

Fig. 3: Work breakdown structure for the infrastructure subproject.

2 Infrastructure

The work breakdown structure of the MICE computing infrastructure is shown in Fig. 3. The structure is split into three parts: Configuration Management, GRID services, and web services.

2.1 Configuration Management

The configuration management tools consist of a Configuration Filestore for hosting raw configuration data, and a Configuration Database for storing production configuration constants.

The Configuration Filestore hosts data such as detector calibration data, field mapping data and surveys. It is intended to provide an area for medium-term storage of raw configuration data, so that experts can review configuration data as part of an analysis. It is anticipated that, for example, data analysis may reveal shortcomings in a calibration, and the Configuration Filestore provides an area to store calibration data for subsequent re-calibration. Long-term storage is provided by GRID services.

The Configuration Database provides a database for storing configuration information. Both nominal settings and as-run configuration data are stored. Data includes geometries, calibrations and experimental settings. As-run data is recorded with a validity time range; multiple revisions can be stored against a given validity time range to enable correction of erroneous data. The interface to the Configuration Database is provided by a web service layer (WSDL) running on Tomcat, enabling read-only access over HTTP. Write access is available only to selected servers within the MICE Local Control Room (MLCR), also over HTTP. Authentication is by IP address.

2.2 GRID Services

MICE currently makes use of the large-scale distributed-computing facilities made available by the GridPP Collaboration [4] in the UK. MICE carries out two principal activities on the Grid: long-term data curation, and batch execution of Monte Carlo and reconstruction.

Four classes of data are stored in the MICE data storage: raw data, reconstructed data, simulation data, and miscellaneous data. For long-term archival we use the Castor tape repository at the RAL Tier 1. Data files are replicated to storage nodes at several GridPP Tier 2 sites including Imperial College, whose data store is publicly readable via the web. Raw data is moved onto the Grid from the MLCR in two steps: a data packer script gathers the raw data and online reconstruction outputs and creates checksums of them; an autonomous Data Mover process then uploads the resulting tarballs to our long-term storage (a minimal sketch of the packing step is given at the end of this subsection).

The reconstruction and Monte Carlo process is handled by a set of batch production agents submitting jobs to selected GridPP Tier 2 sites (over 3000 CPUs are available to MICE). Control variables, calibrations and geometries are accessed from the Configuration Database. Three routes are envisaged for data production: Monte Carlo data is produced without reference to the raw data, on user request; batch reconstruction data is produced against a given set of raw data runs and reconstruction control variables; and fast reconstruction is produced as raw data is moved onto the Grid, to allow fast turnaround of physics analysis. The reconstructed data always has an associated run-specific Monte Carlo dataset.

The Miscellaneous Data activity covers the archival to tape of a wide variety of material from the construction, testing, and calibration of various subsystems (e.g. QA data, field maps, test beam and cosmics data). This is stored by an independent, manually run set of scripts.
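The packing step described above amounts to bundling a run's raw files with their checksums into a tarball for the Data Mover to ship to long-term storage. The following is a minimal, purely illustrative Python sketch; the directory layout, file names and choice of checksum are assumptions, not the actual MICE packer script.

    import hashlib
    import tarfile
    from pathlib import Path

    def md5sum(path, block_size=1 << 20):
        """Return the MD5 checksum of a file, read in blocks."""
        digest = hashlib.md5()
        with open(path, "rb") as handle:
            for block in iter(lambda: handle.read(block_size), b""):
                digest.update(block)
        return digest.hexdigest()

    def pack_run(run_dir, output_dir):
        """Bundle all files of one run, plus a checksum manifest, into a tarball."""
        run_dir = Path(run_dir)
        output_dir = Path(output_dir)
        manifest = run_dir / "checksums.md5"
        with open(manifest, "w") as out:
            for data_file in sorted(run_dir.glob("*")):
                if data_file == manifest:
                    continue
                out.write(f"{md5sum(data_file)}  {data_file.name}\n")
        tarball = output_dir / f"{run_dir.name}.tar"
        with tarfile.open(tarball, "w") as tar:
            tar.add(run_dir, arcname=run_dir.name)
        return tarball

    # Example call with hypothetical paths:
    # pack_run("/daq/raw/run07469", "/daq/outbox")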

3 MAUS

MAUS (MICE Analysis User Software) [5] is MICE's simulation, reconstruction, and analysis software framework. MAUS aims to provide the capability to perform a Monte Carlo simulation of the experiment, to reconstruct tracks and identify particles from simulations and from real data, and to provide monitoring and diagnostics while running the experiment. The work breakdown structure for MAUS is shown in Fig. 4.

Fig. 4: Work breakdown structure for the offline subproject.

3.1 Framework

MAUS is written in a mixture of Python and C++. C++ is used for complex or low-level algorithms where processing time is important. Python is used for simple or high-level algorithms where development time is a more stringent requirement. A Python user interface is provided. Installation is by a set of batch scripts. The MAUS installation is relatively straightforward, but MAUS has a number of dependencies which must also be built. MAUS developers support Scientific Linux 5 and 6, although MAUS is regularly built successfully on the Ubuntu, CentOS and OpenSUSE Linux distributions.

MAUS has an Application Programming Interface (API) that provides a framework on which developers can hang individual routines. Four types of modules are available. Inputters generate data, for example by reading data from disk or generating an input beam. Mappers modify the data in some way, for example by tracking primary particles to generate Monte Carlo data; after the beginning of the run, mappers can have no internal state, in order to facilitate distributed processing, and most of the Monte Carlo and reconstruction routines are mappers. Reducers collate the data in some way, for example by summing data to create a histogram; reducers can have internal state. Outputters save the data in some way, for example providing data over a socket or writing data to disk.
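As an illustration of how these four module types fit together, the following sketch wires toy versions of an inputter, mapper, reducer and outputter into a single-process pipeline. The class and method names are illustrative assumptions only and do not reproduce the actual MAUS API.

    import io
    import json

    class BeamInputter:
        """Inputter: generates events, e.g. a toy beam of primary particles."""
        def emit(self, n_spills):
            for i in range(n_spills):
                yield {"spill_number": i, "primaries": [{"pz_MeV": 200.0}]}

    class TrackingMapper:
        """Mapper: transforms one spill at a time; holds no per-run state."""
        def process(self, spill):
            # Toy 'tracking': copy each primary into a reconstructed-track slot.
            spill["tracks"] = [dict(p) for p in spill["primaries"]]
            return spill

    class MomentumReducer:
        """Reducer: accumulates over spills, e.g. to fill a histogram."""
        def __init__(self):
            self.pz_values = []          # internal state is allowed in reducers
        def process(self, spill):
            self.pz_values += [t["pz_MeV"] for t in spill["tracks"]]
            return spill

    class DiskOutputter:
        """Outputter: persists the data stream, here one JSON document per spill."""
        def save(self, spill, stream):
            stream.write(json.dumps(spill) + "\n")

    if __name__ == "__main__":
        out = io.StringIO()
        inputter, mapper, reducer, outputter = (
            BeamInputter(), TrackingMapper(), MomentumReducer(), DiskOutputter())
        for spill in inputter.emit(3):
            outputter.save(reducer.process(mapper.process(spill)), out)
        print(len(reducer.pz_values), "tracks histogrammed")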

MAUS has six event types that can be passed between modules. The principal event type is the spill, and the majority of physics data is contained within the spill object. MAUS reconstructs data on a spill-by-spill basis, where a spill corresponds to the data associated with one dip of the MICE target. Additionally, MAUS has a JobHeader and JobFooter, which contain metadata associated with a single execution of the code, and a RunHeader and RunFooter, which contain metadata associated with a single MAUS data run. Finally, MAUS has an Image datatype which contains image data for display by the online routines. Routines are available to represent the data either as a binary ROOT object or as an ASCII JSON object.
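To make the JSON representation concrete, a spill document might look roughly like the following. The field names are invented for illustration and do not correspond to the actual MAUS data structure.

    import json

    # Purely illustrative spill document; the keys are NOT the real MAUS schema.
    spill = {
        "run_number": 7469,
        "spill_number": 12,
        "daq_data": {"tof0": [], "tof1": [], "tracker": []},
        "mc_events": [],
        "recon_events": [],
    }
    print(json.dumps(spill, indent=2))   # ASCII JSON view of one event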

3.2 Testing

MAUS has a set of tests at the unit level and at the integration level. Unit tests are implemented against a single function, while integration tests operate against a complete workflow. Unit tests check that each function operates as intended by the developer, and can achieve a high level of coverage and good test complexity. Integration tests check that the overall design of the code meets the specifications laid out, and that interfaces with external codes or systems operate correctly.

The MAUS team aims to provide unit test coverage that executes 70-80% of the total code base. This level of test coverage typically results in code that performs the major workflows without any problem, but has errors in some of the less well-used functions and can behave ungracefully following user error. At the most recent release, MAUS-v0.8.2, test coverage was 68% for Python code, 78% for non-legacy C++ code and 36% for legacy (pre-2011) C++ code.

MAUS operates a continuous integration stack using a pair of test servers that mimic an online (control room) and an offline environment. Build and test are driven by the Jenkins test environment [6]. Developers are asked to perform a build and test on a personal code branch, using the test server, before integrating with the development trunk. This enables MAUS to make frequent clean releases; typically MAUS works on a 2-4 week release cycle.

3.3 Documentation

The MAUS installation, development and deployment workflows are documented on the MAUS wiki [5]. Documentation on individual functions is provided as inline comments. It is the aim of the development team that each function should have a comment describing the expected inputs, outputs and any pertinent details of the implementation. The comments are parsed by doxygen into HTML format, which is published on the MAUS wiki. User-level documentation is provided by a LaTeX document that is included with the MAUS source code and also published on the MAUS wiki.

3.4 Data Flow

The data flow of the Monte Carlo and reconstruction algorithms is shown in Fig. 5. The data flow is grouped into three principal areas: the Monte Carlo data flow is used to generate digits (electronics signals) from tracking; the reconstruction data flow is used to generate digits from raw data; and the digits are converted to physics parameters of interest by the reconstruction.

4 Monte Carlo

A Monte Carlo simulation of MICE encompasses beam generation, the geometrical description of detectors and fields, tracking of particles through detectors, and digitization of the detectors' response to particle interactions.

4.1 Beam generation

The simulation within MAUS starts from the upstream end of the D2 magnet. The composition and phase space description of beam particles can be specified with datacards or can be read from an external file in G4Beamline [7], ICOOL [8] and user-defined formats. Efforts are underway to use G4Beamline to generate a realistic beam of particles tracked through the beamline upstream of D2. The G4Beamline beam generation is being validated against data from Step I, after which the generator will be integrated with MAUS. This will enable users to generate beam particles using G4Beamline and then simulate them with MAUS.

4.2 Geometry

There has been considerable progress in integrating a CAD-based geometry description into MAUS. A CAD model of MICE elements in their surveyed positions is converted to Geometry Description Markup Language (GDML) files, which are then stored in the CDB. A user wanting to simulate MICE retrieves a geometry from the CDB by specifying its unique ID, by asking for the geometry valid at a specific time, or by asking for the geometry valid for a specific data run. Once the geometry is downloaded, the positions of the survey nests on the various detectors are fitted to their surveyed positions and the resulting geometry is used in the simulation. CAD-based geometries are available in MAUS for Step I and idealized Step IV configurations. Verification by the beamline and detector groups and validation against Step I data are ongoing. Until the validation is complete, MAUS continues to allow the use of legacy geometry descriptions through flat text files.

4.3 Tracking, Field Maps and Beam Optics

MAUS tracking is performed by GEANT4 [9]. By default, MAUS uses fourth-order Runge-Kutta for tracking, although other routines are available. Fourth-order Runge-Kutta has been shown to have very good precision relative to the MICE detector resolutions, even for step sizes of several cm.

Magnetic field maps are implemented as a series of overlapping regions, each of which contains a field. On each tracking step, MAUS iterates over the list of fields, transforms to the local coordinate system of each field map, and calculates the field. The field values are transformed back into the global coordinate system, summed and passed to GEANT4.
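The field superposition and the Runge-Kutta stepping described above can be illustrated with a short sketch. The geometry is deliberately simplified (axis-aligned regions, uniform fields, non-relativistic equation of motion) and the names are invented for illustration; the real MAUS/GEANT4 machinery handles full 3D transformations, arbitrary field types and relativistic kinematics.

    import numpy as np

    class FieldRegion:
        """A uniform field defined on a box in its own local coordinate system."""
        def __init__(self, centre, half_size, b_field_tesla):
            self.centre = np.asarray(centre, dtype=float)
            self.half_size = np.asarray(half_size, dtype=float)
            self.b_local = np.asarray(b_field_tesla, dtype=float)

        def field_at(self, position_global):
            local = position_global - self.centre      # global -> local transform
            if np.all(np.abs(local) <= self.half_size):
                return self.b_local                    # local -> global is trivial here
            return np.zeros(3)

    def total_field(position, regions):
        """Sum the contribution of every overlapping region at a point."""
        return sum((r.field_at(position) for r in regions), np.zeros(3))

    def lorentz_rhs(state, charge, mass, regions):
        """d/dt of (position, velocity); non-relativistic Lorentz force, SI units."""
        position, velocity = state[:3], state[3:]
        force = charge * np.cross(velocity, total_field(position, regions))
        return np.concatenate([velocity, force / mass])

    def rk4_step(state, dt, charge, mass, regions):
        """One classical fourth-order Runge-Kutta step."""
        k1 = lorentz_rhs(state, charge, mass, regions)
        k2 = lorentz_rhs(state + 0.5 * dt * k1, charge, mass, regions)
        k3 = lorentz_rhs(state + 0.5 * dt * k2, charge, mass, regions)
        k4 = lorentz_rhs(state + dt * k3, charge, mass, regions)
        return state + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

    if __name__ == "__main__":
        solenoid = FieldRegion([0, 0, 1.0], [0.2, 0.2, 0.5], [0, 0, 4.0])
        state = np.array([0.01, 0.0, 0.0, 0.0, 1.0e6, 2.0e8])   # x [m], v [m/s]
        for _ in range(100):
            state = rk4_step(state, 1.0e-10, 1.602e-19, 1.883e-28, [solenoid])
        print("final position [m]:", state[:3])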

Numerous field types have been implemented within the MAUS framework. Solenoid fields can be calculated numerically from cylindrically symmetric 2D field maps, by taking derivatives of an on-axis solenoidal field, or by using the sum of fields from a set of cylindrical current sheets. Pillbox fields can be calculated by using the Bessel functions appropriate for a TM010 cavity or by reading a cylindrically symmetric field map. Multipole fields can be calculated from a 3D field map or by taking derivatives from the usual multipole expansion formulae. Linear, quadratic and cubic interpolation routines have been implemented for field maps.

Matrix transport routines for propagating particles and beams through these field maps have been implemented. Transport matrices are calculated by taking the numerical derivative of tracking output. These can be used to transport beam ellipses and single particles, for example enabling optics work, beam matching and so forth. Higher-order transport routines are also available. The accelerator modelling routines in MAUS have been validated against ICOOL and G4Beamline. The routines have been used to model a number of beamlines and rings, including the Neutrino Factory front end.
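The numerical transport-matrix calculation can be sketched as follows: track a reference particle and particles offset in each phase-space coordinate, form the Jacobian from finite differences, and propagate a beam ellipse (covariance matrix) as sigma' = M sigma M^T. The tracking function here is a toy placeholder; this is an illustration of the numerical approach, not the MAUS code.

    import numpy as np

    def transfer_matrix(track, x_in, delta=1.0e-6):
        """Numerical Jacobian M_ij = d x_out_i / d x_in_j of a tracking function.

        `track` maps an input phase-space vector (e.g. [x, px, y, py]) to the
        output vector downstream; it stands in for the real tracking engine.
        """
        x_in = np.asarray(x_in, dtype=float)
        n = len(x_in)
        reference = np.asarray(track(x_in), dtype=float)
        matrix = np.empty((n, n))
        for j in range(n):
            offset = x_in.copy()
            offset[j] += delta
            matrix[:, j] = (np.asarray(track(offset)) - reference) / delta
        return matrix

    def transport_ellipse(sigma, matrix):
        """Propagate a beam covariance matrix: sigma' = M sigma M^T."""
        return matrix @ sigma @ matrix.T

    if __name__ == "__main__":
        # Toy 'tracking': a thin quadrupole-like kick followed by a drift.
        def toy_track(v, k=0.5, length=2.0):
            x, px, y, py = v
            px, py = px - k * x, py + k * y
            return np.array([x + length * px, px, y + length * py, py])

        m = transfer_matrix(toy_track, [0.0, 0.0, 0.0, 0.0])
        sigma_in = np.diag([1.0e-4, 1.0e-6, 1.0e-4, 1.0e-6])   # toy beam ellipse
        print(np.round(transport_ellipse(sigma_in, m), 8))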

4.4 Detector response and digitization

The modelling of the detector response and electronics enables MAUS to provide test data for reconstruction algorithms and to estimate the errors introduced by the detector and its readout. The interaction of particles in material is also performed by GEANT4. A sensitive detector class for each detector processes the hits in active volumes and stores relevant hit information such as the volume that was hit, the energy deposited and the time of the hit. The digitizers then simulate the detector response to these hits, modelling processes such as the photon yield of scintillator light, attenuation in light guides and the pulse shape in the electronics. The data structure of the outputs from the digitizers is designed to mock the output from the unpacking of the data from the DAQ.

At the moment, detector descriptions and digitizations have been implemented for the TOF, Tracker and KL detectors. The Monte Carlo for the TOFs and Trackers has been available for some time; a recent change to the Tracker MC has been the addition of noise hits. The KL simulation and digitization is a recent addition and is currently being validated against Step I data. Simulation of the EMR is not yet in the production release of MAUS. However, the simulation software is advancing and the EMR group is verifying the description of the detector geometry and validating the simulation against data from cosmic rays and from the recently concluded EMR run. The description of the Cerenkov is being optimized and the simulation and digitization of optical photons is under development.

4.5 Plans

The priority is to validate the CAD-based geometry implementation and study its speed and performance under a full simulation. Simulations of the Cherenkov and EMR detectors are being tested before merging into the production software. The Monte Carlo framework currently does not have a trigger. Work has started on developing the framework and algorithms for a realistic trigger simulation, in order to model the effects of pile-up.

5 Reconstruction

The reconstruction chain takes as its input either digitized hits from the Monte Carlo or DAQ digits from real data. Regardless of the source, the detector reconstruction algorithms, by requirement and design, operate in the same way on both Monte Carlo and data.

5.1 Detector Reconstruction

MAUS is currently capable of performing at least some reconstruction on all of the MICE detectors.

The TOF reconstruction has been stable for several years. It has three steps: PMT hits are found; matching PMT hits on each end of a scintillator slab are associated to make slab hits; and matching slab hits are combined to make a space point (a schematic sketch of this matching is given at the end of this subsection). Calibration constants are stored in the Configuration Database and applied during space point reconstruction. The TOF has been shown to provide good time resolution, with errors at the 50 ps level. Improvements are being made to the calibration algorithms to improve the reconstructed resolutions and to examine some suspected systematic effects.

The Tracker reconstruction has undergone a refactoring process over several years; this process is now nearly complete. Hits from adjacent fiber channels are used to form clusters. Clusters in different planes of a station are then matched to form (x, y) spacepoints. The pattern recognition algorithm then attempts to fit tracks to the spacepoints, and the final track fit is performed using a Kalman filter to account for energy loss and multiple scattering. In the case of Monte Carlo, noise hits are added before the spacepoint reconstruction stage. Studies of spacepoint and track reconstruction efficiencies have begun and will feed back into optimization of the reconstruction.

The Ckov reconstruction takes the raw flash-ADC data, subtracts pedestals, calculates the charge and applies calibrations to determine the photoelectron yield. Work is ongoing to optimize the calibration corrections and to understand the efficiencies.

Hit-level reconstruction of the KL is also implemented in MAUS. Individual PMT hits are unpacked from the DAQ or simulated from Monte Carlo, and the reconstruction associates them to identify the slabs that were hit and calculates the charge and charge-product corresponding to each slab hit. Further reconstruction will be performed at the global level.

Hit-level reconstruction of the EMR is now implemented in MAUS; the integrated ADC and time over threshold are calculated for each bar that was hit. The next steps are to use the bar hits to reconstruct tracks and to calculate the range and energy. These higher-level reconstructions are currently being developed.
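The three-step TOF matching can be illustrated schematically as follows, for a station with horizontal and vertical scintillator slabs. The data layout, coincidence window and omission of the calibration step are simplifying assumptions for illustration, not the MAUS implementation.

    COINCIDENCE_NS = 4.0   # assumed window for matching hits within one slab/station

    def make_slab_hits(pmt_hits):
        """Pair hits from the two PMTs on each end of a slab into slab hits."""
        slab_hits = []
        by_slab = {}
        for hit in pmt_hits:                   # hit: {"plane", "slab", "end", "t_ns"}
            by_slab.setdefault((hit["plane"], hit["slab"]), []).append(hit)
        for (plane, slab), hits in by_slab.items():
            ends = {h["end"]: h for h in hits}
            if (0 in ends and 1 in ends
                    and abs(ends[0]["t_ns"] - ends[1]["t_ns"]) < COINCIDENCE_NS):
                slab_hits.append({"plane": plane, "slab": slab,
                                  "t_ns": 0.5 * (ends[0]["t_ns"] + ends[1]["t_ns"])})
        return slab_hits

    def make_space_points(slab_hits):
        """Combine a horizontal (plane 0) and a vertical (plane 1) slab hit."""
        points = []
        horizontal = [h for h in slab_hits if h["plane"] == 0]
        vertical = [h for h in slab_hits if h["plane"] == 1]
        for h in horizontal:
            for v in vertical:
                if abs(h["t_ns"] - v["t_ns"]) < COINCIDENCE_NS:
                    points.append({"slab_x": v["slab"], "slab_y": h["slab"],
                                   "t_ns": 0.5 * (h["t_ns"] + v["t_ns"])})
        return points

    pmt_hits = [{"plane": 0, "slab": 3, "end": 0, "t_ns": 21.2},
                {"plane": 0, "slab": 3, "end": 1, "t_ns": 22.0},
                {"plane": 1, "slab": 5, "end": 0, "t_ns": 21.6},
                {"plane": 1, "slab": 5, "end": 1, "t_ns": 21.9}]
    print(make_space_points(make_slab_hits(pmt_hits)))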

5.2 Global reconstruction

The aim of the global reconstruction is to take the reconstructed outputs from the individual detectors and tie them together to form a global track, and to provide a likelihood for the various particle hypotheses. The global track reconstruction has been under development for several years, but a lack of manpower with the appropriate level of seniority has hindered development. The data structure to hold global parameters has been implemented, and work has started on importing space points from the TOF and reconstructed tracks from the Tracker and then propagating them and forming global tracks. The software group realizes that this is a critical item, essential for Step IV reconstruction, and a detailed work plan is being developed to monitor progress and allocate resources if necessary to ensure its timely completion.

Particle identification in MICE typically requires the combination of several detectors. Principally, the time-of-flight between TOF detectors can be used to calculate velocity, which is compared with the momentum measured in the Tracker to calculate the particle mass and hence the particle type (a worked sketch of this calculation is given at the end of this subsection). Additional information can be gleaned from the Ckov, KL and EMR detectors. The global particle identification framework is designed to tie this disparate information into a set of hypotheses of particle types, with an estimate of the likelihood of each hypothesis. The framework for this is now in place, and estimators drawing information from each detector are under development.
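As a worked illustration of this time-of-flight particle identification: with a flight path L and flight time dt between two TOF stations, beta = L / (c dt), and combining this with the Tracker momentum p gives a mass estimate m = p * sqrt(1/beta^2 - 1) (with p and m in units where c = 1). The numbers below are round illustrative values, not MICE measurements.

    import math

    C_LIGHT_M_PER_NS = 0.299792458   # speed of light in m/ns

    def mass_from_tof_and_momentum(path_length_m, dt_ns, p_mev_c):
        """Estimate particle mass (MeV/c^2) from time-of-flight and momentum.

        beta = L / (c * dt);  m = p * sqrt(1/beta^2 - 1)
        """
        beta = path_length_m / (C_LIGHT_M_PER_NS * dt_ns)
        if not 0.0 < beta < 1.0:
            raise ValueError("unphysical velocity: beta = %.3f" % beta)
        return p_mev_c * math.sqrt(1.0 / beta**2 - 1.0)

    # Round, illustrative numbers: an 8 m flight path, a 30.2 ns flight time and a
    # 200 MeV/c Tracker momentum give a mass near the muon's ~105.7 MeV/c^2.
    print(mass_from_tof_and_momentum(8.0, 30.2, 200.0))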

5.3 Online reconstruction

During data taking, it is essential to visualize a detector's performance and to have diagnostic tools to identify and debug unexpected behaviour. This is accomplished through summary histograms of high- and low-level reconstruction output from the detectors. These are available for the TOF, Cherenkov, KL and Trackers. EMR displays were developed during the run in 2013, and integration with the MAUS framework is underway.

For online reconstruction, MAUS uses a distributed processing model to enable a scalable reconstruction of the MICE dataset. Raw data is passed to a networked message queue for multiprocessing across multiple CPUs and servers. Reconstructed data is handed to another message queue. Histogramming routines pick data from this second message queue and collate it into histograms, which are written to disk. A web-based visualisation tool enables viewing of the histograms. A production version of the online reconstruction is available; further development work is underway on the user interface and some of the backend infrastructure. Though the framework for the online reconstruction is based on distributed processing of spills, the reconstruction modules are the same as those used for offline processing.

An event display summarising global reconstruction data is also under development. This will enable visualisation of the phase space distribution of the beam at various points along the beamline, together with a comparison with the nominal beam envelope propagation. The event display is intended to enable online validation of the behaviour of accelerator components, by comparing the propagation of the beam envelope with the detected beam parameters.

5.4 Plans

The remaining piece in the detector reconstruction chain is the EMR, and it is anticipated that it will be available in MAUS in the next three months. That will give MAUS the capability to reconstruct every Step IV detector individually. The highest priority task for Step IV is the global reconstruction and particle identification.

References

[1] G. Gregoire et al., Proposal to the Rutherford Appleton Laboratory: An International Muon Ionization Cooling Experiment (MICE), Tech. Rep. (2003), http://mice.iit.edu/micenotes/public/pdf/MICE0021/MICE0021.pdf

[2] http://launchpad.net

[3] http://bazaar.canonical.com/

[4] http://www.gridpp.ac.uk/

[5] C. D. Tunnell and C. T. Rogers, MAUS: MICE Analysis User Software, IPAC (2011); http://micewww.pp.rl.ac.uk/projects/maus

[6] http://jenkins-ci.org

[7] http://g4beamline.muonsinc.com

[8] R. C. Fernow, ICOOL: A simulation code for ionization cooling of muon beams, Proc. 1999 Particle Accelerator Conference, New York (1999).

[9] S. Agostinelli et al., GEANT4: A Simulation Toolkit, Nucl. Instrum. Meth. A 506 (2003) 250-303.

Fig. 5: Data flow for the MAUS project. Items shown in black have some production implementation, although not necessarily the final one. Those shown in grey are under development.