Travelling securely on the Grid to the origin of the Universe


1 Travelling securely on the Grid to the origin of the Universe. F-Secure SPECIES 2007 conference. Wolfgang von Rüden, Head, IT Department, CERN, Geneva. 24 January 2007

2 CERN stands for over 50 years of fundamental research and discoveries, technological innovation, training and education, and bringing the world together. 1954, Rebuilding Europe: first meeting of the CERN Council. 1980, East meets West: visit of a delegation from Beijing. 2004, Global Collaboration: the Large Hadron Collider involves over 80 countries.

3 CERN's mission in science: understand the fundamental laws of nature. We accelerate elementary particles and make them collide; we observe the results and compare them with the theory; we try to understand the origin of the Universe. Provide a world-class laboratory to researchers in Europe and beyond. New: support world-wide computing using Grid technologies. A few numbers: 2500 employees (physicists, engineers, technicians, craftsmen, administrators, secretaries, ...); 8000 visiting scientists (half of the world's particle physicists), representing 500 universities and over 80 nationalities; budget ~1 billion Swiss Francs per year, with additional contributions by participating institutes.


8 How does the Grid work? It relies on advanced software, called middleware. Middleware automatically finds the data the scientist needs, and the computing power to analyse it. Middleware balances the load on different resources. It also handles security, accounting, monitoring and much more.
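The matchmaking described above can be sketched in miniature. This is an illustrative toy, not the actual gLite middleware; all site names, datasets and load figures are invented:

```python
# Toy matchmaker: send a job to the least-loaded site that already
# holds a replica of the input data (illustrative only).
from dataclasses import dataclass

@dataclass
class Site:
    name: str
    datasets: set   # dataset replicas held at this site
    running: int    # jobs currently running
    cpus: int       # total CPU slots

    @property
    def load(self):
        return self.running / self.cpus

def match_job(dataset, sites):
    """Return the least-loaded site holding the required dataset."""
    candidates = [s for s in sites if dataset in s.datasets]
    if not candidates:
        raise LookupError(f"no replica of {dataset!r} available")
    return min(candidates, key=lambda s: s.load)

sites = [
    Site("CERN", {"lhc-run1", "atlas-calib"}, running=900, cpus=1000),
    Site("RAL",  {"lhc-run1"},                running=300, cpus=500),
    Site("FNAL", {"atlas-calib"},             running=100, cpus=800),
]
print(match_job("lhc-run1", sites).name)  # RAL: has the data, lower load (0.6 vs 0.9)
```

Real middleware adds authentication, accounting and retries on top of this kind of decision, but the core matchmaking idea is the same.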

9 Why does CERN need the Grid?

10 The LHC accelerator and the four experiments

11 View of the LHC tunnel

12 View of the ATLAS detector (under construction). 150 million sensors deliver data 40 million times per second.


20 Today's installation at CERN: 8500 CPUs (Linux) in 3500 boxes; 4000 TB on 14 000 drives (NAS disk storage); 45 000 tape slots installed and 170 high-speed drives (10 PB capacity).

21-23 Massive ramp-up during 2006-08

24 LCG Distribution of Computing Services
Summary of Computing Resource Requirements, all experiments, 2008 (from the LCG TDR, June 2005):

                        CERN   All Tier-1s   All Tier-2s   Total
CPU (MSPECint2000s)      25        56            61         142
Disk (PetaBytes)          7        31            19          57
Tape (PetaBytes)         18        35             -          53

Shares: CPU: CERN 18%, Tier-1s 39%, Tier-2s 43%. Disk: CERN 12%, Tier-1s 55%, Tier-2s 33%. Tape: CERN 34%, Tier-1s 66%.
(les robertson, cern-it-lcg)
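As a quick sanity check, the percentage shares quoted on this slide follow from the resource table. A short Python check (note that plain rounding gives 54% rather than 55% for the Tier-1 disk share; the slide apparently rounds so the shares sum to 100):

```python
# Recompute the per-tier shares from the 2008 resource requirements
# (CPU in MSPECint2000s, disk and tape in PetaBytes).
requirements = {
    "CPU":  {"CERN": 25, "Tier-1s": 56, "Tier-2s": 61},
    "Disk": {"CERN": 7,  "Tier-1s": 31, "Tier-2s": 19},
    "Tape": {"CERN": 18, "Tier-1s": 35},
}

for resource, by_tier in requirements.items():
    total = sum(by_tier.values())
    shares = {tier: round(100 * v / total) for tier, v in by_tier.items()}
    print(resource, total, shares)
# CPU 142 {'CERN': 18, 'Tier-1s': 39, 'Tier-2s': 43}
# Disk 57 {'CERN': 12, 'Tier-1s': 54, 'Tier-2s': 33}
# Tape 53 {'CERN': 34, 'Tier-1s': 66}
```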

25 WLCG Collaboration
The Collaboration: 4 LHC experiments; ~120 computing centres; 12 large centres (Tier-0, Tier-1); 38 federations of smaller Tier-2 centres; growing to ~40 countries. Memorandum of Understanding: agreed in October 2005, now being signed. Resources: committed each October for the coming year, with a 5-year forward look.
(les robertson, cern-it-lcg)

26 Worldwide Grid for science: ~200 sites (some very big, some very small), 60 Virtual Organisations with >25 000 CPUs.

27 The EGEE project: Enabling Grids for E-sciencE. Started in April 2004; now in its 2nd phase with 91 partners in 32 countries. Objectives: a large-scale, production-quality grid infrastructure for e-science; attracting new resources and users from industry as well as science; maintain and further improve the gLite Grid middleware; improve Grid security. (EGEE-II INFSO-RI-031688)

28 Entering the Grid
[Diagram] Users belong to institutes (Institute A ... Institute X) and are grouped into Virtual Organizations (VO A ... VO X), which are mapped onto resources at sites (Site A ... Site X). The authorization flow runs from users, through their VO, to the sites; trust is anchored by the International Grid Trust Federation (X.509/PKI), which also peers with other Grids.

29 Entering the Grid
Users are first registered in Virtual Organisations (VOs). The IGTF brings common policies and standards to the accredited Certificate Authorities (CAs).

30 Entering the Grid
Users are authenticated using X.509 certificates issued by Certificate Authorities (CAs). According to their role in the Virtual Organization (VO), users are authorized to use the Grid services.
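The authenticate-then-authorize flow on these slides can be sketched as follows. The DNs, VO names and role grants are hypothetical, and the dictionary lookup stands in for real X.509 certificate verification:

```python
# Illustrative sketch of VO-based authorization: map a certificate
# subject (DN) to a VO membership and role, then check the role's grants.
VO_MEMBERS = {
    "/DC=ch/DC=cern/CN=Jane Doe": ("atlas", "production"),
    "/DC=ch/DC=cern/CN=John Roe": ("lhcb", "user"),
}

VO_GRANTS = {
    ("atlas", "production"): {"submit-job", "write-storage"},
    ("atlas", "user"):       {"submit-job"},
    ("lhcb", "user"):        {"submit-job"},
}

def authorize(subject_dn, action):
    """Authenticate by DN (stands in for X.509 verification),
    then authorize by VO membership and role."""
    membership = VO_MEMBERS.get(subject_dn)
    if membership is None:
        return False  # unknown certificate subject: not in any VO
    return action in VO_GRANTS.get(membership, set())

print(authorize("/DC=ch/DC=cern/CN=Jane Doe", "write-storage"))  # True
print(authorize("/DC=ch/DC=cern/CN=John Roe", "write-storage"))  # False
```

The key design point the slides make is the separation of concerns: sites trust certificates via the IGTF-accredited CAs, while the VOs, not the sites, decide who may do what.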

31 Security Collaboration in the LHC Grid
[Diagram, initial picture by Ake Edlund] The groups involved: Joint Security Policy Group (common policies for Grids); Grid Security Vulnerability Group (software vulnerabilities); Middleware Security Group (architecture, framework, interoperability); International Grid Trust Federation (trust anchor, CAs); Operational Security Coordination Team (operations, best practice, CERTs/CSIRTs).

32 Security Collaboration in the LHC Grid
The International Grid Trust Federation maintains global trust relationships between the Certificate Authorities. The Joint Security Policy Group provides a coherent set of security policies to be used by the Grids.

33 Security Collaboration in the LHC Grid
The Middleware Security Group defines the security framework and architecture of the Grid software. The Grid Security Vulnerability Group handles Grid middleware security vulnerabilities.

34 Security Collaboration in the LHC Grid
The Operational Security Coordination Team deals with operational issues, from best-practice recommendations to multi-site incident response coordination.

35 Applications on EGEE
More than 25 applications from an increasing number of domains: astrophysics, computational chemistry, earth sciences, financial simulation, fusion, geophysics, high energy physics, life sciences, multimedia, material sciences, ...

36 Example: EGEE attacks avian flu
EGEE was used to analyse 300,000 potential drug compounds against the bird flu virus H5N1. 2000 computers at 60 computer centres in Europe, Russia, Asia and the Middle East ran during four weeks in April: the equivalent of 100 years on a single computer. Potential drug compounds are now being identified and ranked.
[Image: neuraminidase, one of the two major surface proteins of influenza viruses, facilitating the release of virions from infected cells. Image courtesy Ying-Ta Wu, Academia Sinica.]

37 Example: ITU project
International Telecommunication Union, ITU/BR Radiocommunication Sector: management of the radio-frequency spectrum and satellite orbits for fixed, mobile, broadcasting and other communication services. At RRC-06 (15 May to 16 June 2006), 120 countries negotiated the new frequency plan for the introduction of digital broadcasting in the UHF (470-862 MHz) and VHF (174-230 MHz) bands. This was a demanding computing problem with short deadlines; using the EGEE grid, a planning cycle could be completed in less than 1 hour.

38 Businesses @ EGEE06 Conference (Sep 06)

39 Sustainability
Need to prepare for a permanent Grid infrastructure: ensure a high quality of service for all user communities, independent of short project funding cycles; infrastructure managed in collaboration with National Grid Initiatives (NGIs). European Grid Initiative (EGI): preparation, then implementation.

40 CERN openlab
Industry partners provide state-of-the-art technology and manpower; CERN does testing and validation in a demanding Grid environment. Platform competence centre; Grid interoperability centre; security activities; joint events.

41 CERN and F-Secure partnership (1/2)
The partnership brings together F-Secure's computer security know-how, tools and products, and CERN's expertise and complex infrastructure as a test bed. Collaboration on desktop client security and malware detection within electronic mail transport. Focus on protection of desktop and portable computers, email gateways (incoming and outgoing), and email message stores: antivirus, anti-spyware, anti-spam, anti-flood, anti-phishing. Current areas of investigation: automated installation of antivirus client software to a large number of computers (>6000) with high reliability (>99.9%); detecting and stripping blacklisted file extensions, even when contained in compressed files, on mail gateways; regular-expression content filtering in mail gateways; viewers and tools to analyse security log files.
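The compressed-attachment check mentioned above can be illustrated with Python's standard zipfile module. The extension blacklist here is an assumption for the sketch, not CERN's actual gateway policy:

```python
# Sketch of a gateway check: find blacklisted file extensions even
# inside a compressed (zip) attachment. Blacklist is illustrative.
import io
import zipfile

BLACKLISTED = {".exe", ".scr", ".pif", ".bat", ".vbs"}

def blacklisted_members(zip_bytes):
    """Return archive member names whose extension is blacklisted."""
    with zipfile.ZipFile(io.BytesIO(zip_bytes)) as zf:
        return [name for name in zf.namelist()
                if any(name.lower().endswith(ext) for ext in BLACKLISTED)]

# Build a small test archive in memory.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr("report.pdf", b"%PDF-1.4 ...")
    zf.writestr("invoice.EXE", b"MZ...")

print(blacklisted_members(buf.getvalue()))  # ['invoice.EXE']
```

A production gateway would also recurse into nested archives and cap decompressed size to avoid zip bombs; this sketch shows only the extension check itself.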

42 CERN and F-Secure partnership (2/2)
Technical contact between F-Secure specialists and the CERN mail and desktop security teams is established, with good and competent communication. All levels of skill are directly accessible: support, developers, product management, executives. F-Secure products are excellent and we are collaborating to improve them further. We aim to standardize CERN's infrastructure on F-Secure products.

43 For more information about the Grid: www.gridcafe.org. Thank you for your kind attention!