ARC NOX AND THE ROADMAP TO THE UNIFIED EUROPEAN MIDDLEWARE

GRID-2010, Dubna, 2 July 2010
Oxana Smirnova (on behalf of the NorduGrid Collaboration)

Outline
- Usage of ARC in NDGF and ATLAS
- Overview of the latest ARC release
- Future of European middleware development and ARC

NDGF Facility, end of 2009
- All resources belong to local providers

NDGF cloud in ATLAS
- Includes sites from far away
- Very few errors; 99% efficiency

ATLAS storage in NDGF: data

ATLAS storage in NDGF: MC

ATLAS production in the Nordic cloud, 2009

ATLAS analysis on NDGF
- All this is using traditional ARC

ARC has changed a lot meanwhile
- Code base restructured
  - New SVN repository ("arc1") is the base for ARC Nox releases
  - Still a lot of code in arc0; gradual transition
- Architecture changes
  - Main principles are the same
  - Increased modularity: single service hosting layer, HED
  - Globus/GSI is not needed (separate plugin); standard HTTP(S) instead
  - Some new components (see later)
- True portability
  - Windows, Mac, Solaris
  - C++, with language bindings for Python and Java (in progress)
- We are in Linux distributions
  - Packages available in Debian (also Ubuntu) and Fedora (EPEL -> RedHat, CentOS)
- Interoperability improvements
  - Adherence to standards
  - Many are not practically useful: trying to fix them via the OGF PGI-WG

ARC Releases
- Latest production ARC release: 0.8.2.2
  - Bugfix release on top of 0.8.2
  - Significantly changed information system internals
  - Contains both traditional and selected Nox components (optional)
  - Came out in June 2010
  - http://wiki.nordugrid.org/index.php/arc_v0.8.2
- Latest technology preview ARC release: nox-1.1.0
  - Concise distribution of the new components from the arc1 tree
  - Not recommended for production deployment
  - Came out in May 2010
  - http://wiki.nordugrid.org/index.php/nox

ARC Nox Architecture
- Same philosophy: compute, info and data areas
- Everything is in HED
  - HED is a Web Service container
  - Handles all external interfaces
  - Built-in modularity
  - Flexible security
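
To picture what "standard HTTP(S) instead of GSI" means for services hosted in HED, below is a minimal Python sketch of contacting such a service with an ordinary TLS client credential (for example a proxy certificate). The host name, port, endpoint path and file locations are made-up assumptions, and the real A-REX interface speaks SOAP (BES) over this channel; the point is only that no Globus/GSI client stack is needed for the transport.

    # Minimal sketch: plain HTTPS with an X.509 client credential, no GSI stack.
    # Host, port, endpoint path and credential locations are assumptions.
    import http.client
    import ssl

    context = ssl.create_default_context(
        cafile="/etc/grid-security/certificates/ca-bundle.pem")
    context.load_cert_chain(certfile="/tmp/x509up_u1000")  # proxy: cert+key in one file

    conn = http.client.HTTPSConnection("arc-ce.example.org", 60000, context=context)
    conn.request("GET", "/arex")  # hypothetical service path hosted by HED
    response = conn.getresponse()
    print(response.status, response.reason)
    conn.close()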

Compute area
Two solutions are available as an ARC CE:
1) Traditional, pre-Web-Service CE
  - Proprietary interface via the GridFTP plugin
  - Grid Manager behind it (data staging, cache, LRMS interface, RTEs, etc.)
  - Resource and job info published via LDAP
2) A-REX, the Web Service CE introduced in Nox
  - Standards-compliant WS interface (extended BES, JSDL, GLUE)
  - The same Grid Manager behind it, but it benefits from the HED
  - Some extra features, e.g. RTE deployment (Janitor)
  - Resource and job info published via WS-RF and BES
Deployment: it is possible to mix the two and run several GMs
  - Deploy A-REX with a GridFTP plugin interface
  - Multiple Grid Managers can run behind a single interface
  - Available in ARC 0.8.x
  - Migration guide: wiki.nordugrid.org/index.php/gm_to_arex_migration

Compute area (accounting)
- ARC has no logging/bookkeeping service of its own
  - But there are modules that can publish information to 3rd-party services (currently SGAS)
- The ARC Computing Elements come with accounting modules
  - Prepare usage records for Grid jobs
  - Submit the records to an SGAS logging database
- Two modules are available:
  - JURA: collects records from A-REX and logs them to an SGAS server
  - Urlogger: collects records from the Grid Manager and logs them to an SGAS 3 server
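
As a rough picture of what such an accounting module prepares, here is a hedged Python sketch that assembles an OGF Usage Record-style XML document for one finished job. The element names follow the OGF UR schema as commonly used; the exact fields JURA fills in and the way records are pushed to SGAS are not taken from the slide and should be treated as assumptions.

    # Sketch of building an OGF Usage Record-like document for one finished job.
    # Field selection and values are illustrative; not the actual JURA code.
    import xml.etree.ElementTree as ET

    UR_NS = "http://schema.ogf.org/urf/2003/09/usagerecord"

    def make_usage_record(job):
        rec = ET.Element("{%s}JobUsageRecord" % UR_NS)
        ident = ET.SubElement(rec, "{%s}RecordIdentity" % UR_NS)
        ident.set("recordId", job["global_id"])
        job_id = ET.SubElement(rec, "{%s}JobIdentity" % UR_NS)
        ET.SubElement(job_id, "{%s}GlobalJobId" % UR_NS).text = job["global_id"]
        user = ET.SubElement(rec, "{%s}UserIdentity" % UR_NS)
        ET.SubElement(user, "{%s}GlobalUserName" % UR_NS).text = job["user_dn"]
        ET.SubElement(rec, "{%s}WallDuration" % UR_NS).text = job["wall"]  # ISO 8601 duration
        ET.SubElement(rec, "{%s}CpuDuration" % UR_NS).text = job["cpu"]
        ET.SubElement(rec, "{%s}MachineName" % UR_NS).text = job["ce_host"]
        ET.SubElement(rec, "{%s}Status" % UR_NS).text = job["status"]
        return ET.tostring(rec, encoding="unicode")

    record_xml = make_usage_record({
        "global_id": "gsiftp://arc-ce.example.org:2811/jobs/12345",
        "user_dn": "/O=Grid/O=NorduGrid/CN=Jane Doe",
        "wall": "PT2H30M", "cpu": "PT2H10M",
        "ce_host": "arc-ce.example.org", "status": "completed",
    })
    print(record_xml)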

Information area (1/2)
- Traditional LDAP-based solution, inspired by MDS
- Local infosys ("GRIS")
  - ARC release 0.8.0 introduced BDII into ARC, replacing the Globus LDAP backend
  - Version 0.8.2 moved to BDII version 5
- Index service ("EGIIS")
  - ARC release 0.8.0 introduced a replacement for the Globus GIIS backend
  - Reimplementation of the infoindex slapd wrapper in v0.8.2
  - Ready for roll-out

Information area (2/2)
New WS-based solution for Nox:
- LIDI, the local information system
  - WSRF interface via HED
  - Every service describes itself through the LIDI interface
  - A-REX LIDI: GLUE2 compliant
- ISIS, the information indexing service
  - WS interface
  - Peer-to-peer infosys backbone
  - A sort of service registry

Data area (1/3)
- Traditional ARC has no storage solution of its own, but is interoperable with 3rd-party SRM storages, especially dCache
- Nox comes with Chelonia
  - Self-healing, flexible storage cloud system
  - User-friendly interface: FUSE mounting
  - Automatic restoration of the number of replicas if storage components fall out
  - A set of services
  - Watch it on YouTube: www.youtube.com/watch?v=neuwzghhghc
  - Or try it out: part of the Nox release, also available in 0.8.2

Data area (2/3)
- Libarcdata library and the arc* client tools
  - Simple CLI to move files around; supports many protocols
- Data capabilities of the Grid Manager/A-REX
  - Staging (uploaders/downloaders), see the sketch after this list
    - Fair-share system for transfers, which splits transfer slots evenly between users or VOMS VOs/roles/groups
    - Intelligent retry strategy for failed data transfers, with exponential back-off for temporary errors
    - Dynamic output files
    - Multiple Grid Managers (or A-REXes) can be run under one GridFTP server to improve throughput
    - The SRM port/protocol ambiguity problem is solved by caching SRM information
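
The two staging features above can be pictured with a small Python sketch: one function splits the available transfer slots evenly between the shares (users or VOMS VOs/roles/groups) that currently have queued transfers, the other retries a failed transfer with exponential back-off on temporary errors. This is only an illustration of the two ideas, not the actual Grid Manager implementation; all names and numbers are invented.

    # Illustration of fair-share slot assignment and back-off retry only;
    # not the actual Grid Manager code.
    import random
    import time
    from collections import defaultdict

    class TemporaryError(Exception): pass
    class PermanentError(Exception): pass

    def assign_transfer_slots(queued, total_slots):
        """Split transfer slots evenly between shares that have queued transfers."""
        by_share = defaultdict(list)
        for transfer in queued:
            by_share[transfer["share"]].append(transfer)
        per_share = max(1, total_slots // max(1, len(by_share)))
        selected = []
        for transfers in by_share.values():
            selected.extend(transfers[:per_share])
        return selected[:total_slots]

    def transfer_with_backoff(do_transfer, max_attempts=5, base_delay=10):
        """Retry temporary failures with exponentially growing delays."""
        for attempt in range(max_attempts):
            try:
                return do_transfer()
            except TemporaryError:
                time.sleep(base_delay * 2 ** attempt + random.uniform(0, base_delay))
        raise PermanentError("transfer failed after %d attempts" % max_attempts)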

Data area (3/3)
- Cache
  - The Grid Manager can use "remote" caches managed by another GM on the same site
  - Added authentication caching, so continuous permission checking at the source is not needed
  - Caches can be cleanly drained before being taken offline
  - Added the ability to specify a lifetime for cache files
  - Optimizations in the cleaning tool
- BUT: there is a strong need for a new data handling system
  - An expert group is developing a new architecture
  - wiki.nordugrid.org/index.php/data_staging
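
A minimal sketch of the authentication-caching idea, assuming a positive permission check for a (user, file) pair may be reused for a limited lifetime so the source does not have to be re-queried on every cache hit. The real Grid Manager cache is file-based and considerably more involved; class and variable names here are invented.

    # Illustrative permission cache with a lifetime; not the real GM cache code.
    import time

    class AuthCache:
        def __init__(self, lifetime=3600):
            self.lifetime = lifetime      # seconds a cached decision stays valid
            self._entries = {}            # (user_dn, url) -> expiry timestamp

        def remember(self, user_dn, url):
            self._entries[(user_dn, url)] = time.time() + self.lifetime

        def is_still_allowed(self, user_dn, url):
            expiry = self._entries.get((user_dn, url))
            return expiry is not None and time.time() < expiry

    auth_cache = AuthCache(lifetime=1800)
    auth_cache.remember("/O=Grid/CN=Jane Doe", "srm://se.example.org/data/file1")
    if auth_cache.is_still_allowed("/O=Grid/CN=Jane Doe", "srm://se.example.org/data/file1"):
        print("serve from local cache without re-checking the source")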

Security area
- Traditional ARC: GSI
- In Nox, all security is handled by HED
  - Capability to support almost everything (TLS, SAML, GSI, VOMS, MyProxy, ...)
  - Quite straightforward integration with 3rd-party services
  - First steps taken with ARGUS
- arcproxy command line interface
  - Creates all kinds of proxy certificates
  - Complete re-implementation, available even on MS Windows
- Plus a large zoo of proof-of-concept services and clients developed (or re-implemented) for Nox
  - Charon, FruitFly, ARC-VOMS, etc.

Client area
- arc* commands introduced in Nox
  - MS Windows, Mac OS, Linux
- Traditional standalone client package
  - Still the fastest way to the Grid
  - Available for many Linux platforms
- And there is finally a graphical client: ArcJobTool
  - Written in Python
  - Utilizes libarcclient features
  - Has a Web portal incarnation
  - Code: sourceforge.net/projects/laportal/
  - Movie: www.youtube.com/watch?v=exgwpip8l6k

Interoperability
- Dream: offer standards-based interoperability with other popular middlewares: gLite, UNICORE, Globus, etc.
- Reality:
  - Standards are not suitable for production needs
  - Support for existing standards varies a lot among middlewares
  - Where implemented, it comes with dialects and extensions of its own
- Involvement in ongoing standardization
  - OGF is the main forum
  - Interoperability demos with "hello grid" based on BES, JSDL, ...
  - OGF tutorials
  - Production Grid Infrastructures Working Group (PGI): effort to define a better standard
- Nox modularity makes interoperability easier

Unified European middleware
- EGEE paved the way to EGI
- A common infrastructure needs a Unified Middleware Distribution (UMD)
- However, Europeans today use gLite, Globus, UNICORE and ARC
  - Unifying all of these is a huge challenge
- The European Middleware Initiative (EMI) aims at finding convergence between ARC, gLite and UNICORE
  - Define common interfaces
  - Develop common solutions where possible
  - Harmonise components, get rid of redundant ones
  - Produce bundled releases, candidates for UMD inclusion
  - Includes dCache, too

ARC in EMI
- The EMI Day 0 release will include all the components that are used today in production infrastructures
- Traditional components will be gradually phased out
  - Grid Manager to be replaced with A-REX
  - Old ng* clients and library to be replaced with the new arc* clients and library
  - Information system yet to be defined by EMI, but definitely GLUE2-based
  - Storage solution: common EMI effort, led by dCache
- We would also like to get rid of Globus and GSI completely
  - Strong desire to rely on industry standards, especially in security
- Not all Nox components are embraced by EMI (yet)
  - They will be maintained and developed by the ARC community

Community around ARC
- Headquarters in Scandinavia (NorduGrid), but contributors and users everywhere
  - The Ukrainian Academic Grid, powered by ARC, is the largest national Grid
  - The last two NorduGrid conferences were held in Hungary and Slovenia
- An open source development community
  - Mail forum, technical workshops, conferences
  - SVN, build system, Bugzilla, user support, etc.
  - Just write to nordugrid-support@nordugrid.org if you want to contribute or are having problems
- Healthy mixture of developers, sysadmins and (extra)ordinary users
  - Academic environment, lots of volunteer work
  - Crazy but good-natured people
- Patchy funding from national, regional and EU projects
  - Old: NGN, EU KnowARC, NDGF, NGIn, ...
  - New: EU EMI, EU EGI-InSPIRE, EU EDGI, EU IGE, new NDGF
  - Projects come and go, the NorduGrid community stays

Instead of a summary: challenges
- Deliver future releases in sync with EMI
- Improve systematic testing with help from EMI
- Implement common EMI interfaces
- Carry out EMI security integration
- Re-engineer the data staging
- Improve configuration of Nox components
- Finalize the prototypes and preview components
- Phase out all traditional components, get rid of the old SVN tree
- Find answers to new architecture approaches (clouds, pilot/agent jobs, etc.)
- Continue porting and Linux distribution inclusion work
- Improve documentation and distribution channels
- One day, celebrate ARC 1.0