Grid and Cloud Activities in KISTI

Grid and Cloud Activities in KISTI
March 23, 2011
Soonwook Hwang, KISTI, Korea

Outline
- Grid Operation and Infrastructure
  - KISTI ALICE Tier-2 Center
  - FKPPL VO: Production Grid Infrastructure
- Global Science Data Center
- Partnership & Leadership for Nationwide Supercomputing Infrastructure
- Development of AMGA within EMI

KISTI ALICE Tier-2 Center

KISTI ALICE Tier-2 Center
- Signed the WLCG MoU for the ALICE Tier-2 center (Oct. 23, 2007)
- Has been part of the ALICE distributed computing grid as an official Tier-2 since then
- Provides a reliable and stable node in the ALICE Grid
- Funded by MEST at ~200,000 US dollars/year

KISTI's Contribution to ALICE Computing
- ~1.2% contribution to ALICE computing in terms of total job execution
- 120 CPU cores and 50 TB of storage dedicated to the ALICE experiment
- Processes nearly 8,000 jobs per month on average

Accounting and Availability of KISTI

FKPPL VO: Production Grid Infrastructure

FKPPL VO Grid
- Based on gLite
- Objective: foster the adoption of Grid technology and provide researchers in Korea and France with a production Grid infrastructure
- Operation:
  - Up and running since October 2008, providing about 10,000 CPU cores and 30 TB of disk storage
  - Last December, KEK joined the FKPPL VO, contributing ~1,600 CPU cores and 27 TB of disk
  - Moving toward the construction of a France-Asia VO is under discussion
  - As of now, ~70 users have joined the FKPPL VO
[Diagram: FKPPL VO services - LFC, CE, SE, WMS at IN2P3; CE, SE, WMS, VOMS, UI, wiki at KISTI]
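To illustrate how a researcher would run work on the FKPPL VO, the sketch below shows a minimal gLite job description (JDL). The executable and file names here are hypothetical examples, not taken from the slides; in a standard gLite workflow the user would first obtain a VOMS proxy (`voms-proxy-init --voms fkppl.kisti.re.kr`) and then submit with `glite-wms-job-submit -a hello.jdl`.

```
[
  VirtualOrganisation = "fkppl.kisti.re.kr";
  Executable    = "/bin/hostname";
  Arguments     = "-f";
  StdOutput     = "std.out";
  StdError      = "std.err";
  OutputSandbox = {"std.out", "std.err"};
]
```

The WMS matches the job to a CE advertising support for the VO, and the sandbox files are retrieved afterwards with `glite-wms-job-output`.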

Application Porting Support on the FKPPL VO
- Deployment of Geant4 applications
  - Used extensively by the National Cancer Center in Korea to carry out compute-intensive simulations relevant to cancer treatment planning
  - In collaboration with Dr. Jungwook Shin and Se Byeong Lee of the National Cancer Center in Korea
- Deployment of two-color QCD (Quantum Chromodynamics) simulations in theoretical physics
  - Several hundred to several thousand QCD jobs need to be run on the Grid, each job taking about 10 days
  - In collaboration with Prof. Seyong Kim of Sejong University

Distribution of Normalized CPU Time (HEPSPEC06), Grouped by VO (Sep. 2010 - Nov. 2010)

All VOs:
 1. atlas              712,986,044
 2. cms                228,199,324
 3. alice              204,675,996
 4. lhcb                79,816,476
 5. theophys            54,402,212
 6. dzero               17,933,284
 7. compchem            14,215,152
 8. ilc                 14,167,236
 9. vo.cta.in2p3.fr      6,971,240
10. biomed               6,684,528
11. superbvo.org         5,375,896
12. auger                5,358,432
13. hone                 5,136,952
14. pheno                4,866,264
15. icecube              4,363,112
16. fkppl.kisti.re.kr    4,150,920
17. see                  4,086,732
18. cosmo                3,862,432
19. vo.lal.in2p3.fr      3,095,064
20. enmr.eu              2,799,256
Total (all VOs):     1,420,148,568

Excluding the LHC VOs, the ranking of non-LHC VOs (same values) is: theophys, dzero, compchem, ilc, vo.cta.in2p3.fr, biomed, superbvo.org, auger, hone, pheno, icecube, fkppl.kisti.re.kr (12th), see, cosmo, vo.lal.in2p3.fr, enmr.eu.

Grid Training (1/2)
- In February, we organized the Geant4 and Grid Tutorial 2010 for the Korean medical physics community
- About 34 participants from major hospitals in Korea
- About 20 new users joined the FKPPL VO

Grid Training (2/2)
- The 2010 Summer Training Course on Geant4, GATE and Grid Computing was held in Seoul in July
- About 50 participants from about 20 institutes in Korea

Global Science Experimental Data Hub Center

KISTI GSDC Center
- Supported by the Korean government:
  - Computing and storage infrastructure
  - Technology development
  - Applying Grid technology to legacy applications
- Hosts an ALICE Tier-1 prototype (receiving RAW data), an ALICE Tier-2, and KiAF (KISTI Analysis Farm)
- Supporting data-centric research communities and promoting research collaboration

Roadmap
- Phase 1 (2009-2011): National Data Center - 2 PB storage / 2,000 CPUs
  - Provide a global science data analysis environment: HEP (ALICE/CERN, CDF/FNAL, STAR/BNL, Belle/KEK), astronomy (LIGO/LLO), etc.
- Phase 2 (2012-2014): Asia-Pacific Hub Center - 5 PB storage / 5,000 CPUs
  - Expand supported fields: earth environment, biometrics, nanotechnology, etc.
  - Global computing resource allocation and information system
  - Cyber research and training environment

Partnership and Leadership for Nationwide Supercomputing Infrastructure (PLSI)

PLSI
- Consortium of 14 HPC computing centers in Korea
- Distributed HPC computing environments for world-class computational science research
- Period: 2007 ~
- Budget: ~2M USD / year
- Goal: 400 TFlops across 14 HPC centers around Korea
- Current status: established ~80 TFlops of computing capacity by combining 15 computing resources at 10 partner sites over dedicated high-performance networks

Distributed HPC Infrastructure
[Figure 1: PLSI unified computing service infrastructure]

PLSI Portal
- PLSI User Portal: resource allocation, job submission and management, user accounting information, SSH & SFTP terminal
- PLSI MGrid Portal: application portal targeting molecular dynamics simulation

Development of AMGA

AMGA
- An official gLite middleware component providing a metadata catalogue service
- Provides access to metadata for files distributed on the Grid
- KISTI has led AMGA development since July 2009
  - AMGA 2.0, supporting the OGF WS-DAIR standard, was released in October 2009 in collaboration with CERN and INFN
  - AMGA 2.1, released in April 2010, supports federation of metadata; an AMGA GUI client was developed as part of the 2.1 release
- KISTI is one of the EMI product teams, contributing to the evolution and maintenance of AMGA
- Application areas: drug discovery, high-energy physics, digital libraries, climate research
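As a rough illustration of the metadata catalogue service the slide describes, an interactive AMGA client session looks something like the sketch below. The directory, attribute names, and values are invented for illustration, and the exact command syntax may vary between AMGA versions.

```
Query> createdir /fkppl/geant4
Query> addattr /fkppl/geant4 events int
Query> addattr /fkppl/geant4 detector varchar(64)
Query> addentry /fkppl/geant4/run001 events 10000 detector water-phantom
Query> selectattr /fkppl/geant4:FILE /fkppl/geant4:events '/fkppl/geant4:events > 5000'
```

The key idea is that metadata lives in schema-defined directories on the server, so grid jobs can query for matching files without scanning the distributed storage itself.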

Participation in EMI with AMGA

Summary
- As in EGEE, KISTI is an official partner of the European Grid projects EGI and EMI, through its continuing contributions to production-quality Grid operation and to AMGA development, respectively
- We are moving one step further toward setting up a France-Asia VO, starting from the FKPPL VO
- With the GSDC, KISTI is expected to play the role of a Tier-1 data center in addition to its traditional role as a supercomputing center

Thank you for your attention!