GRID Activity in Russia and at JINR


GRID Activity in Russia and at JINR. Korenkov Vladimir (LIT, JINR). MMCP-2009, Dubna, 07.07.2009

Grid computing. Today there are many definitions of Grid computing; the definitive one is given by Ian Foster in his article "What is the Grid? A Three Point Checklist". The three points of the checklist are: computing resources are not administered centrally; open standards are used; non-trivial quality of service is achieved.
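As a toy illustration only (not from the talk), Foster's three criteria can be encoded as a simple predicate; the field names below are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class System:
    """Hypothetical description of a distributed computing system."""
    centrally_administered: bool   # single administrative domain?
    uses_open_standards: bool      # open, general-purpose protocols?
    nontrivial_qos: bool           # coordinated use delivers real quality of service?

def is_grid(s: System) -> bool:
    """Ian Foster's three-point checklist, encoded directly."""
    return (not s.centrally_administered
            and s.uses_open_standards
            and s.nontrivial_qos)

# Example: a cluster under a single administration is not a Grid by this test.
print(is_grid(System(centrally_administered=True,
                     uses_open_standards=True,
                     nontrivial_qos=True)))   # -> False
```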

Global Community

CERN - European Organization for Nuclear Research: Research/Discovery, Technology, Training, Collaborating.

Tier 0 at CERN: acquisition, first-pass reconstruction, storage and distribution; data recording at up to 1.25 GB/sec (ions).

The tier model:
- Tier-0 (CERN): data recording, initial data reconstruction, data distribution.
- Tier-1 (11 centres): permanent storage, re-processing, analysis.
- Tier-2 (>200 centres): simulation, end-user analysis.
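As an illustrative sketch only (not part of the talk), the tier responsibilities above can be written down as a small mapping; the layout is hypothetical and carries no information beyond the slide.

```python
# Illustrative only: the WLCG tier roles from the slide as a simple mapping.
TIER_ROLES = {
    "Tier-0 (CERN)":         ["data recording", "initial reconstruction", "data distribution"],
    "Tier-1 (11 centres)":   ["permanent storage", "re-processing", "analysis"],
    "Tier-2 (>200 centres)": ["simulation", "end-user analysis"],
}

for tier, roles in TIER_ROLES.items():
    print(f"{tier}: {', '.join(roles)}")
```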

Some history:
- 1999: MONARC project, early discussions on how to organise distributed computing for the LHC.
- 2001-2003: EU DataGrid project, middleware and testbed for an operational grid.
- 2002-2005: LHC Computing Grid (LCG), deploying the results of DataGrid to provide a production facility for the LHC experiments.
- 2004-2006: EU EGEE project phase 1, starting from the LCG grid, a shared production infrastructure expanding to other communities and sciences.
- 2006-2008: EU EGEE-II, building on phase 1, expanding applications and communities.
- 2008-2010: EU EGEE-III.

The Worldwide LHC Computing Grid. Purpose: develop, build and maintain a distributed computing environment for the storage and analysis of data from the four LHC experiments; ensure the computing service and common application libraries and tools.
- Phase I (2002-2005): development and planning.
- Phase II (2006-2008): deployment and commissioning of the initial services.

EGEE & OSG. LCG depends on two major science grid infrastructures: EGEE (Enabling Grids for E-sciencE) and OSG (the US Open Science Grid).

[Map slide] The Map of OSG Sites (in the US): sites at US national laboratories (FNAL, BNL, LBL, ANL, ORNL, NERSC, SDSC) and many universities. (Ruth Pordes, FNAL)

EGEE (Enabling Grids for E-sciencE). The aim of the project is to create a pan-European computing infrastructure of the Grid type:
- integrate regional Grid efforts;
- represent the leading Grid activities in Europe.
10 federations, 27 countries, 70 organizations.

EGEE-III: the flagship Grid infrastructure project co-funded by the European Commission. Main objectives:
- expand and optimise the existing EGEE infrastructure, including more resources and user communities;
- prepare the migration from a project-based model to a sustainable federated infrastructure based on National Grid Initiatives.
Duration: 2 years. Consortium: ~140 organisations across 33 countries. EC co-funding: €32 million. (Bob Jones, EGEE'08, 22 September 2008)

Infrastructure operation:
- currently includes >350 sites across 55 countries;
- continuous monitoring of grid services and automated site configuration/management;
- support for ~300 Virtual Organisations from diverse research disciplines.
EGEE - what can we deliver?
- Middleware: production-quality middleware distributed under a business-friendly open-source licence.
- User support: a managed process from first contact through to production usage.
- Training: expertise in grid-enabling applications, online helpdesk, networking events (User Forum, conferences, etc.).

EGEE achievements - applications: >300 VOs from several scientific domains (Astronomy & Astrophysics, Civil Protection, Computational Chemistry, Computational Fluid Dynamics, Computer Science/Tools, Condensed Matter Physics, Earth Sciences, Fusion, High Energy Physics, Life Sciences), with further applications under evaluation. Applications have moved from testing to routine daily usage at ~80-95% efficiency. (Bob Jones, 4th Grid & e-collaboration workshop, Frascati, 25 February 2009)

Collaborating e-infrastructures Potential for linking ~80 countries by 2008

Grid projects around EGEE - registered collaborating projects:
- Infrastructures: geographical or thematic coverage;
- Applications: improved services for academia, industry and the public;
- Support actions: key complementary functions.

350 sites, 55 countries, 120,000 CPUs, 26 PetaBytes, >15,000 users, >300 VOs, >370,000 jobs/day. Application domains: Archeology, Astronomy, Astrophysics, Civil Protection, Computational Chemistry, Earth Sciences, Finance, Fusion, Geophysics, High Energy Physics, Life Sciences, Multimedia, Material Sciences.

The future of Grids: from e-infrastructures to knowledge infrastructures. The network infrastructure connects computing and data resources and allows their seamless usage via Grid infrastructures. Federated resources and new technologies enable new application fields: distributed digital libraries, distributed data mining, digital preservation of cultural heritage, data curation. The layers build on each other: network infrastructure -> Grid infrastructure -> knowledge infrastructure, a major opportunity for academia and business alike. (Erwin Laure, EGEE'08, 22 September 2008)

The Russian consortium RDIG (Russian Data Intensive Grid) was set up in September 2003 as a national federation in the EGEE project. Member institutes:
- IHEP - Institute of High Energy Physics (Protvino);
- IMPB RAS - Institute of Mathematical Problems in Biology (Pushchino);
- ITEP - Institute of Theoretical and Experimental Physics;
- JINR - Joint Institute for Nuclear Research (Dubna);
- KIAM RAS - Keldysh Institute of Applied Mathematics;
- PNPI - Petersburg Nuclear Physics Institute (Gatchina);
- RRC KI - Russian Research Center "Kurchatov Institute";
- SINP MSU - Skobeltsyn Institute of Nuclear Physics, MSU.

The RDIG infrastructure. The RDIG infrastructure now comprises 15 Resource Centres with more than 7000 kSI2K of CPU and more than 1850 TB of disk storage. RDIG Resource Centres: ITEP, JINR-LCG2, Kharkov-KIPT, RRC-KI, RU-Moscow-KIAM, RU-Phys-SPbSU, RU-Protvino-IHEP, RU-SPbSU, Ru-Troitsk-INR, ru-impb-lcg2, ru-moscow-fian, ru-moscow-gcras, ru-moscow-mephi, ru-pnpi-lcg2, ru-moscow-sinp.

Development and maintenance of the RDIG e-infrastructure. The main directions are the following:
- support of basic grid services;
- support of the Regional Operations Centre (ROC);
- support of the Resource Centres (RC) in Russia;
- RDIG Certification Authority;
- RDIG monitoring and accounting;
- participation in integration, testing and certification of grid software;
- support of users, Virtual Organizations (VO) and applications;
- user and administrator training and education;
- dissemination, outreach and communication of grid activities.

Portal www.egee-rdig.ru

RDIG Certification Authority

GGUS (ITEP from RDIG)

VOs.
- Infrastructure VOs (all RCs): dteam, ops.
- Most RCs support the WLCG/EGEE VOs: ALICE, ATLAS, CMS, LHCb.
- Supported by some RCs: gear, biomed, fusion.
- Regional VOs: ams, eearth, photon, rdteam, rgstest.
Flagship applications: LHC, fusion (towards ITER), nanotechnology. Current interest from: medicine, engineering.

RDIG monitoring & accounting (http://rocmon.jinr.ru:8080). Monitoring makes it possible to keep an eye on the operational parameters of Grid sites in real time; accounting tracks resource utilization on Grid sites by virtual organizations and individual users.
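As an illustration only (not the actual RDIG tool), the kind of per-VO and per-site aggregation an accounting system performs can be sketched as follows; the record format and values are hypothetical.

```python
from collections import defaultdict

# Hypothetical accounting records: (site, VO, user DN, normalised CPU hours).
records = [
    ("JINR-LCG2",        "cms",   "/DC=ru/CN=user1", 1200.0),
    ("JINR-LCG2",        "alice", "/DC=ru/CN=user2",  800.0),
    ("RU-Protvino-IHEP", "cms",   "/DC=ru/CN=user3",  450.0),
]

def aggregate(records, key_index):
    """Sum normalised CPU hours grouped by the chosen key (site, VO or user)."""
    totals = defaultdict(float)
    for rec in records:
        totals[rec[key_index]] += rec[3]
    return dict(totals)

print("Per VO:  ", aggregate(records, 1))   # {'cms': 1650.0, 'alice': 800.0}
print("Per site:", aggregate(records, 0))
```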

Production Normalised CPU time per EGEE Region (2008 and June 2009)

Production Normalised CPU time per Country (Dec2008-June 2009)

Russia and JINR: Normalised CPU time per site (December 2008 - June 2009)

Production: Normalised CPU time per site for the LHC VOs (June 2008 - January 2009):
- FZK: 8,640,981
- GRIF: 6,413,585
- IN2P3-CC-T2: 6,274,114
- IN2P3-CC: 5,869,444
- NIKHEF: 5,479,998
- TRIUMF: 5,342,865
- RAL: 4,007,746
- JINR: 3,970,242
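As a small worked example using only the figures on this slide, JINR's share among the eight listed sites can be computed directly (the slide itself does not quote a percentage).

```python
# Normalised CPU time per site for the LHC VOs, June 2008 - January 2009
# (values copied from the slide above).
cpu_time = {
    "FZK": 8_640_981, "GRIF": 6_413_585, "IN2P3-CC-T2": 6_274_114,
    "IN2P3-CC": 5_869_444, "NIKHEF": 5_479_998, "TRIUMF": 5_342_865,
    "RAL": 4_007_746, "JINR": 3_970_242,
}

total = sum(cpu_time.values())
jinr_share = 100.0 * cpu_time["JINR"] / total
print(f"Total over the listed sites: {total:,}")          # 45,998,975
print(f"JINR share of the listed sites: {jinr_share:.1f}%")  # ~8.6%
```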

LHC Computing Grid Project (LCG). The protocol between CERN, Russia and JINR on participation in the LCG Project was approved in 2003. The tasks of the Russian institutes in the LCG were defined as:
- LCG software testing;
- evaluation of new Grid technologies (e.g. Globus Toolkit 3) in the context of their use in the LCG;
- event generator repository and database of physics events: support and development.

LHC Computing Grid Project (LCG). The tasks of the Russian institutes and JINR in the LCG (2008):
- Task 1: MW (gLite) test suite (supervisor O. Keeble);
- Task 2: LCG vs Experiments (supervisor I. Bird);
- Task 3: LCG monitoring (supervisor J. Andreeva);
- Task 4/5: GENSER / MCDB (supervisor A. Ribon).

LHC experiment support at JINR. A necessary level of all elements of the JINR telecommunication, network and information infrastructure has to be provided:
- high-throughput telecommunication data links;
- the JINR local area network (LAN) backbone;
- the central computer complex and Grid segment;
- software support of the LHC experiments.

JINR - Moscow telecommunication channel Collaboration with RSCC, RIPN, MSK-IX, JET Infosystems, Nortel

JINR Central Information and Computing Complex (CICC)

Central Information and Computing Complex. In 2008 the total CICC performance was 1400 kSI2K and the disk storage capacity 100 TB; at present the CICC performance equals 2300 kSI2K and the disk storage capacity 400 TB. CICC resources by year:
                2008   2009   2010
CPU (kSI2K):    1250   1750   2500
Disk (TB):       400    800   1200
Tapes (TB):        -    100    200
SuperBlade: 5 boxes, 80 CPU Xeon 5430 2.66 GHz Quad Core (two with InfiniBand).

JINR WLCG infrastructure. The CICC comprises: 65 servers, 4 interactive nodes, 960 computing nodes (Xeon 5150, 8 GB RAM, Gigabit Ethernet; 160 computing nodes with Xeon X5450, 16 GB RAM, InfiniBand). Site name: JINR-LCG2. Internal CICC network: 1 Gbit/sec. Operating system: Scientific Linux CERN 4.6; middleware version: gLite 3.1. File systems: AFS (the Andrew File System), a worldwide distributed file system, for user software and home directories - AFS makes it easy to share files in a heterogeneous distributed environment (UNIX variants, NT) with a single authentication scheme (Kerberos); dCache for data. User registration system: Kerberos 5 (AFS uses Kerberos 5 for authentication).

JINR WLCG infrastructure. JINR provides the following services in the WLCG environment.
Basic services:
- Berkeley DB Information Index (top-level BDII);
- site BDII;
- 2 x Computing Element (CE);
- Proxy Server (PX);
- 2 x Workload Management System (WMS);
- Logging & Bookkeeping Service (LB);
- R-GMA-based monitoring system collector server (MON box);
- LCG File Catalog (LFC);
- Storage Element (SE): dCache, 400 TB, 4 x GridFTP door, 14 x pool;
- 4 x User Interface (UI), installed in AFS.
Special services: VO boxes for ALICE and CMS; ROCMON; PPS and testing infrastructure with a pre-production gLite version.
Software for VOs: dCache xrootd door; AliRoot, ROOT and GEANT packages for ALICE; ATLAS packages; CMSSW packages for CMS; DaVinci and Gauss packages for LHCb.
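The BDII services listed above publish site and service information over LDAP using the GLUE schema. As an illustrative sketch only, the query below shows how a client could list the services advertised by a top-level BDII; the hostname is hypothetical, and the standard BDII port 2170 with base DN o=grid is assumed.

```python
# Illustrative sketch: query a (hypothetical) top-level BDII for published
# grid services over LDAP, which is how the BDII serves GLUE information.
# Assumes: BDII listening on port 2170, base DN "o=grid", GLUE 1.x attributes.
from ldap3 import Server, Connection, ALL

BDII_HOST = "bdii.example.org"              # hypothetical host name
server = Server(BDII_HOST, port=2170, get_info=ALL)
conn = Connection(server, auto_bind=True)   # anonymous bind, as BDIIs allow

conn.search(
    search_base="o=grid",
    search_filter="(objectClass=GlueService)",
    attributes=["GlueServiceType", "GlueServiceEndpoint"],
)

for entry in conn.entries:
    print(entry.GlueServiceType, entry.GlueServiceEndpoint)
```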

JINR in LCG/EGEE:
- support and development of the WLCG/EGEE infrastructure;
- participation in middleware testing/evaluation;
- participation in Data and Service Challenges;
- grid monitoring and accounting system development;
- FTS monitoring and testing;
- MCDB development;
- participation in ARDA activities in coordination with the experiments;
- HEP applications;
- user and administrator training and education;
- support of the JINR Member States in Grid activities.

User training and induction: courses, lectures, practical training. Russian and JINR physicists participating in the ATLAS experiment train and practise with the Grid and with GANGA.
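As a minimal sketch of the kind of exercise used in such training (not taken from the talk), a Ganga session submitting a trivial job to the Grid could look as follows; it assumes a working Ganga installation with the LCG backend configured and a valid grid proxy.

```python
# Minimal Ganga exercise sketch: submit a trivial executable to the Grid.
# Run inside an interactive `ganga` session (Ganga provides Job, Executable,
# LCG and Local in its namespace); a valid grid proxy is assumed.
j = Job(name="hello-grid")
j.application = Executable(exe="/bin/echo", args=["Hello from the Grid"])
j.backend = LCG()          # or Local() to test on the submission host first
j.submit()

print(j.status)            # 'submitted', later 'running' / 'completed'
# After completion, the job's stdout can be inspected with j.peek('stdout').
```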

gLite training infrastructure schema

Frameworks for Grid cooperation of JINR in 2009:
- Worldwide LHC Computing Grid (WLCG);
- Enabling Grids for E-sciencE (EGEE);
- RDIG development (FASI project);
- CERN-RFBR project "Grid Monitoring from the VO perspective";
- BMBF grant "Development of the Grid infrastructure and tools to provide joint investigations performed with participation of JINR and German research centres";
- development of a Grid segment for the LHC experiments, supported within the JINR - South Africa cooperation agreement;
- NATO project "DREAMS-ASIA" (Development of grid EnAbling technology in Medicine & Science for Central ASIA);
- JINR-Romania cooperation, Hulubei-Meshcheryakov programme;
- project "SKIF-GRID" (programme of the Belarusian-Russian Union State);
- project GridNNN (National Nanotechnological Net).
We work in close cooperation with and provide support to our partners in Armenia, Belarus, Bulgaria, the Czech Republic, Georgia, Germany, Poland, Romania, South Africa, Ukraine and Uzbekistan.

Useful references:
- Open Grid Forum: http://www.ogf.org
- Globus: http://www.globus.org
- TeraGrid: http://www.teragrid.org
- LCG: http://lcg.web.cern.ch/lcg/
- EGEE: http://www.eu-egee.org
- EGEE-RDIG: http://www.egee-rdig.ru
- GridClub: http://www.gridclub.ru

The blind men and the elephant in the room: Cyberinfrastructure, Grids, SaaS, SOA, Shared Infrastructure / Shared Services, Virtualization, Web 2.0, Automation.