Computing for LHC in Germany


1 Computing for LHC in Germany
Günter Quast, Universität Karlsruhe (TH)
Meeting with RECFA, Berlin, October 5th 2007
- WLCG Tier1 & Tier2
- Additional resources for data analysis: HGF "Physics at the Terascale", D-Grid
- Grid computing for HEP in Germany

2 Mastering the LHC Data Volume... / Grid Computing in Germany
[Diagram of the LHC computing model: CERN at the centre, surrounded by Tier1 centres (e.g. GridKa, FermiLab, Brookhaven and centres in the UK, France, Italy and NL) that serve Tier2 and Tier3 sites at labs, universities and desktops. A map of Germany marks the participating sites, among them Hamburg (DESY), Aachen, Dortmund, Wuppertal, Münster, Siegen, Göttingen, Berlin, Dresden, Bonn, Frankfurt, Mainz, Heidelberg, Karlsruhe (GridKa), Freiburg, München and GSI.]

3 WLCG Sites in Germany
Together building a performant Grid structure for LHC data analysis:
- Tier1 / Tier2 / Tier3 (= institute clusters), embedded within the international LHC Grid
- participation in data and service challenges
- Grid sites: FZK, DESY, GSI, MPI and installations at (so far) 7 universities
- very active user communities, frequent training and schools
- analysis facilities are emerging (funded by universities, BMBF & HGF)
Sites (October '07, the list is growing): DESY Hamburg & Zeuthen, Uni Dortmund, Uni Wuppertal, Uni Siegen, RWTH Aachen, GSI Darmstadt, GridKa Karlsruhe, Uni Karlsruhe, Uni Freiburg, MPI München, LMU München, plus non-HEP sites.
The LHC Grid is the largest Grid in Germany. Particle physics groups are important partners in the German D-Grid initiative.
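The tier structure outlined above (Tier0 at CERN, the GridKa Tier1, the Tier2 sites and Tier3 institute clusters) can be summarised in a small, purely illustrative sketch. The site names are taken from the slides; the simplified data-flow chain and all identifiers in the code are assumptions made only for illustration.

```python
# Illustrative sketch of the tiered WLCG structure described on this slide.
# Site names come from the slides; the simplified data-flow chain and all
# code names are assumptions made purely for illustration.

TIERS = {
    "Tier0": ["CERN"],                    # raw data recording, first-pass reconstruction
    "Tier1": ["GridKa (FZK Karlsruhe)"],  # custodial storage, reprocessing, serves the T2s
    "Tier2": ["DESY Hamburg/Zeuthen", "RWTH Aachen", "GSI Darmstadt",
              "Uni Freiburg", "MPI München", "LMU München", "Uni Wuppertal"],
    "Tier3": ["institute clusters at universities"],  # end-user analysis
}

TIER_ORDER = ["Tier0", "Tier1", "Tier2", "Tier3"]

def downstream(tier):
    """Return the tier that receives data from `tier` in this simplified chain."""
    i = TIER_ORDER.index(tier)
    return TIER_ORDER[i + 1] if i + 1 < len(TIER_ORDER) else None

if __name__ == "__main__":
    for tier in TIER_ORDER:
        sites = ", ".join(TIERS[tier])
        target = downstream(tier)
        arrow = f"  -> feeds {target}" if target else ""
        print(f"{tier}: {sites}{arrow}")
```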

4 GridKa, the German T1
Forschungszentrum Karlsruhe, member of the Helmholtz Association
- inaugurated in 2002; a well-established Tier1
- supports ALICE, ATLAS, CMS, LHCb & 4 non-LHC experiments
- hosts the Global Grid User Support (GGUS)
- Certification Authority (CA) for Germany

5 GridKa, the German T1
Presently GridKa has ~2500 job slots, all of which are permanently in use; this will rise by a factor of ~4 in 2008. Successful participation in WLCG and experiment service challenges.
GridKa resources are pledged in the WLCG MoU, with funding already approved until 2009:
- ~1 average T1 for ALICE, ATLAS and LHCb
- ~10% of CMS T1 computing (< 1 average T1)
The collection of resource requests for 2010-2012 has started; these are reviewed by the GridKa Technical Advisory Board and later by the Overview Board, and will lead to a funding request to FZK/HGF.
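As a quick back-of-the-envelope check of the capacity statement above, a minimal sketch using only the approximate figures quoted on the slide (~2500 job slots, growth by a factor of ~4):

```python
# Back-of-the-envelope projection using only the numbers quoted on the slide.
slots_2007 = 2500      # ~2500 job slots, all permanently in use
growth_factor = 4      # expected rise by a factor of ~4 in 2008

slots_2008 = slots_2007 * growth_factor
print(f"Projected GridKa job slots in 2008: ~{slots_2008}")  # ~10000
```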

6 GridKa associated T2s
- GridKa supports >20 T2s in 6 countries, among them ALICE T2 sites in Russia
- successful data transfers from/to all CMS T1s & >20 CMS T2s world-wide
- the most complex T2 cloud of any T1 in WLCG, coordinated by frequent T2 meetings

7 T2 Structure in Germany
- CMS: federated T2 Aachen-DESY, equivalent to 1.5 average CMS T2s
- ALICE: GSI Darmstadt is the ALICE T2
- ATLAS: 3 average ATLAS T2s distributed over 5 sites; a 6th partner (Göttingen) is being discussed
- LHCb (planned): DESY/Dortmund/MPI Heidelberg as a future LHCb T2 for MC production
(see Appendix for numbers on T2 sizes)
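The T2 capacities above are quoted in units of an "average T2" as defined in the WLCG MoU. A minimal sketch of this bookkeeping follows; the per-site fractions are assumptions made for illustration, and only the totals (1.5 average CMS T2, 3 average ATLAS T2 spread over 5 sites) are taken from the slide.

```python
# Bookkeeping of German T2 pledges in units of an "average T2" (WLCG MoU).
# The per-site fractions below are assumptions for illustration; only the
# totals (1.5 average CMS T2, 3 average ATLAS T2 over 5 sites) come from
# the slide and its appendix.

cms_t2_fractions = {
    "DESY (federated with Aachen)": 0.75,   # assumed split of the federated T2
    "RWTH Aachen (federated with DESY)": 0.75,
}

atlas_t2_fractions = {
    "DESY": 1.0,            # assumed distribution over the 5 ATLAS T2 sites
    "MPI München": 0.5,
    "LMU München": 0.5,
    "Uni Freiburg": 0.5,
    "Uni Wuppertal": 0.5,
}

def total_average_t2(fractions):
    """Sum per-site pledges expressed as fractions of one average T2."""
    return sum(fractions.values())

print(f"CMS   total: {total_average_t2(cms_t2_fractions):.1f} average T2")    # 1.5
print(f"ATLAS total: {total_average_t2(atlas_t2_fractions):.1f} average T2")  # 3.0
```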

8 Funding of Tier2 Resources
Before July 2007, funding existed only for:
- 1 ATLAS & 1 CMS T2 @ DESY (Hamburg & Zeuthen)
- ½ ATLAS T2 @ MPI Munich
- ~¼ CMS T2 @ RWTH Aachen (in 2008)
- 1 ALICE T2 @ GSI Darmstadt
Since July 2007: the Alliance of HGF institutes and German universities "Physics at the Terascale" has been approved and is active. It provides additional funding over a period of five years for:
- T2 hardware at universities (~3M over 5 years): 3 × ½ T2 for ATLAS @ Freiburg, Munich & Wuppertal, and ½ T2 for CMS @ RWTH Aachen
- hardware operation and grid services secured by the universities (the cost exceeds the HGF funding!)
- funding of personnel for experiment-specific tasks by BMBF (~8 FTE for the ATLAS & CMS T2s)
WLCG MoU resources are pledged by all sites, which participate in WLCG and experiment challenges with prototypes.

9 End-user Analysis Facilities
Dedicated user analysis facilities are also needed:
- institute clusters @ universities
- National Analysis Facility (NAF) @ DESY
- D-Grid resources dedicated to HEP @ D-Grid sites
The required size approximately equals the T2 capacity; the services are complementary!
Requirements have been specified by ATLAS and CMS ("Data samples and proposed services of CMS Analysis Facility", D-CMS Workshop, DESY, June 2007; "Data samples and Analysis in ATLAS", German ATLAS groups, Aug. 2007) and are presently under review by DESY IT.
Long-term goal: a virtual IT centre for LHC analysis; funding for development work within the Helmholtz Alliance.

10 Funding by the Helmholtz Alliance
- for Grid projects: 6.7 M over five years (hardware & personnel!)
- contributions by the partners (personnel, operation and maintenance of hardware): 15.3 M over five years
Projects:
- Hardware: Tier2 at DESY and three universities; National Analysis Facility
- Network: virtual private network among the HEP sites
Work packages:
- WP1: Establishing a virtual computing centre
- WP2: Development of Grid tools and optimization of Grid components
- WP3: Training, workshops and schools

11 D-Grid projects - the German Grid initiative
Goals of the HEP project:
- distributed and dynamic data management; job scheduling, accounting and monitoring of data resource utilization
- automated user support tasks, monitoring of jobs, error identification, and direct access to executing jobs for early control and steering
- Grid technologies for individual scientific data analysis
9 partners from HEP and computer science.
Additionally, the D-Grid initiative has provided extra funding for hardware; the share of HEP and Hadrons & Nuclei in 2007 is ~4.0 M (to be integrated into the national Grid and analysis infrastructure @ T1 and T2 sites and some universities).
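To make the job-monitoring and error-identification goal above concrete, here is a minimal conceptual sketch. It does not use any real Grid middleware API; the job states and the poll_status() stub are assumptions made purely for illustration.

```python
# Conceptual sketch of the "monitoring of jobs / error identification" goal.
# No real Grid middleware API is used; job states and poll_status() are
# assumptions made purely for illustration.

import random
from dataclasses import dataclass

STATES = ("submitted", "running", "done", "failed")

@dataclass
class GridJob:
    job_id: str
    state: str = "submitted"

def poll_status(job):
    """Stand-in for a middleware status query (randomized here for the demo)."""
    job.state = random.choice(STATES)
    return job.state

def monitor(jobs):
    """Group jobs by state so that failed jobs can be flagged for user support."""
    summary = {state: [] for state in STATES}
    for job in jobs:
        summary[poll_status(job)].append(job.job_id)
    return summary

if __name__ == "__main__":
    jobs = [GridJob(f"job-{i:03d}") for i in range(10)]
    summary = monitor(jobs)
    for state, ids in summary.items():
        print(f"{state:>9}: {len(ids)} job(s)")
    if summary["failed"]:
        print("Needs error identification:", ", ".join(summary["failed"]))
```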

12 End-user Analysis Facilities
Tier3 & analysis centres are funded by:
- Universities: clusters with 50-100 CPU cores and several 10 TB of storage
- HGF Alliance: CPU & storage resources & personnel; volume ~1M over five years for the NAF@DESY and ~40 FTE-years of personnel
- D-Grid: additional resources, mainly at T1/T2 sites, for German HEP; ~4.0 M for HEP and Hadrons & Nuclei (approved two weeks ago, details being negotiated)
- BMBF: extra initial funding for the NAF @ DESY; the aim is to have sufficient analysis capacity available for LHC data (details still being negotiated)
Projects of the HGF Alliance in the area of T2/T3/NAF:
- virtualisation of resources
- application-driven monitoring
- improved data access management
- training, schools and workshops
- Grid-based storage systems
- high-performance network between HEP sites: Virtual Computer Centre

13 Training & Schools
Since 2002: the GridKa School, with large participation from HEP (140 participants from 14 countries). Lectures and hands-on sessions on:
- Grid middleware (usage as well as installation & administration)
- Grid and HEP applications
- developments in science and industry
Complemented by training sessions of the experimental communities during national workshops; the HGF Alliance is also committed to training and education.
Grid computing and efficient usage of WLCG resources are well established in German HEP!

14 Organisational Structure
The German T1/T2 are well represented in the WLCG committees.
[Organisational chart: the WLCG bodies (C-RRB, CB, OB, MB, GDB); HGF, BMBF and the experiments; the GridKa Overview Board and GridKa Technical Advisory Board for the GridKa T1; the HGF Alliance Management Board and HGF Alliance Grid Project Board for the T2s and T3s/analysis; note the large personal overlap with the experimental communities.]

15 Summary & (my) Conclusions
GridKa is a well-established T1 within WLCG:
- supports ALICE, ATLAS, CMS and LHCb
A T2 structure is being built up for all 4 LHC experiments:
- funding was secured recently by the HGF Alliance "Physics at the Terascale"
- all sites involved have already shown working prototypes
T3 and analysis infrastructure is evolving:
- complicated funding
- development of tools and Grid integration of end-user analysis facilities depends strongly on personnel funded by the HGF Alliance
- HEP community projects within the German Grid initiative are important
Very tight situation concerning personnel:
- difficult to find qualified people for development and operation
- lack of positions with long-term prospects

16 Appendix
- Pledged WLCG Tier2 resources in Germany
- Program of the HGF Alliance "Physics at the Terascale"

17 Tier2 in Germany (WLCG MoU, August 2007)

18 Tier2 Resources: ALICE and CMS

19 Tier2 Resources (2): ATLAS

20 Physics at the Terascale