Supporting computational science in the Nordic area


Supporting computational science in the Nordic area. CSC - the Finnish IT center for science. ORAP, Paris, 8.11.2005. Kimmo Koski, Managing Director, CSC, Finland

Contents
- Nordic support structures: the cases of Norway, Sweden and Denmark
- Case CSC - the Finnish IT center for science
- Growing need for international collaboration
- Opinions on hot topics in the Nordic area and in Europe:
  - Distribution vs. centralization
  - Is there a superior computer architecture? The varying needs of top science
  - Respect for the national infrastructures
  - How to build competence to meet the grand challenges

Nordic characteristics
- Tradition of Nordic collaboration: Nordic Council of Ministers, Nordic funding, NORDUnet, ...
- Very different national systems for supporting scientific computing:
  - Finland: fully centralized (one national center)
  - Sweden and Norway: distributed (4-6 small national centers)
  - Denmark: no center at all, funding distributed to research groups
- Some challenges in fitting the national systems together
- Still, growing interest in tighter collaboration in HPC and science infrastructures

Nordic development plans
- A Nordic collaboration structure is forming: funding bodies, governments, HPC centers
- Nordic Data Grid Facility proposal for 2006 onwards
- Common interest in:
  - Resource exchange (cycles, application usage, databases, people, ...)
  - Unified coordination of international projects (CERN LHC, ESO, ITER, ...)
  - Better interaction through Nordic forums, such as HPC conferences
  - Research network development, dark fiber and lambda networks
  - Future data challenges: an opportunity for workload optimization?
  - Tighter links between the Nordic computing centers
- The Nordic collaboration aims to contribute at the European level - in some cases one or more Nordic countries together, in some cases individually

Nordic Collaboration - NDGF
- NDGF - Nordic Data Grid Facility, http://www.ndgf.org
- A long-term initiative launched in 2002 by the Joint Committee of the Nordic Natural Science Research Councils (NOS-N) to establish a Nordic Data Grid Facility and to involve the Nordic countries in European and global co-operation in data sharing in a variety of fields
- Participants: Denmark, Finland, Norway, Sweden
- National funding: prototype phase (2003-2005); funding for the year 2006 proposed
- Management infrastructure: a Nordic Steering Committee
- National test-beds for R&D in Grid technology are being set up in all the Nordic countries
  - Using the NorduGrid middleware (funded by NORDUnet2)
  - Storage elements: approx. 10 TB of publicly available storage

The eVITA Initiative (Norway)
- eScience: Infrastructure, Theory and Applications (coordinated action)
- Motivation: "computational science is one of the most important technical fields of the 21st century because it is essential to advances throughout society" (PITAC, 2005)
- Scope: 2006-2015
- Budget: ~10 MEUR/year

NOTUR - the Norwegian High Performance Computing Consortium
- Operated as a metacenter, coordinated by UNINETT Sigma (Trondheim)
- 4 universities and the meteorological office are partners
- Hardware investments are co-funded by the Research Council and the universities
- Resources are allocated by the Research Council
- Current funding from the Research Council: 4 MEUR/year
- National coordination ensures diversity of hardware and sharing of resources

SNIC - Swedish National Infrastructure for Computing, http://www.snic.vr.se/
- Six national centers: PDC Stockholm, NSC Linköping, HPC2N Umeå, UNICC Gothenburg, LUNARC Lund, UPPMAX Uppsala
- SNIC metacenter projects:
  - National helpdesk for SNIC
  - SweGrid
  - HPC portals
  - National storage solutions
  - National application support in computational chemistry
  - National application support in bioinformatics
  - EGEE Regional Operations Center
  - Grid research projects

DCSC - Danish Center for Scientific Computing, http://www.dcsc.dk/
- No national centers; funding is distributed directly to research groups, with coordination by DCSC
- Regional operation centers: Technical University of Denmark, University of Copenhagen, University of Southern Denmark, Aarhus University
- The Danish Center for Scientific Computing has been set up as a national centre under the Ministry of Science, Technology and Innovation, with government funding allocated for data processing capacity within the area of scientific computing for research assignments

Case: CSC - the Finnish IT Center for Science

What is special about the Finnish model
- In Finland there is only one national IT center for science (CSC), in which most activities are concentrated:
  - Supercomputing and data repositories
  - Research network (Funet)
  - Scientific support
  - Scientific library systems
  - Scientific applications and databases
- Strong organization for scientific support of researchers:
  - Scientists supporting researchers in using the infrastructure
  - Tools and methods
  - Multidisciplinary approach

CSC Fact Sheet
- Operated on a non-profit principle
- Since March 2005, facilities in Keilaniemi, Espoo
- Turnover in 2004: 14.0 M€; 137 employees in April 2005
- All shares transferred to the Ministry of Education of Finland in 1997
- Reorganized as a limited company, CSC - Scientific Computing Ltd., in 1993
- First supercomputer, a Cray X-MP/EA 416, in 1989
- Funet started in 1984
- Founded in 1970 as a technical support unit for a Univac 1108

CSC fields of services (diagram): Funet services; computational services; application services; information systems management; expertise in scientific computing. Customer groups: universities, polytechnics, research institutes, companies.

CSC statistics 2004, processor time per discipline (pie chart): physics, chemistry, nanoscience, biosciences, weather forecasts (FMI) and computational fluid dynamics, with shares ranging from 2 % to 48 %.

CSC statistics 2004, ten largest disciplines by number of users (chart): biosciences, physics, chemistry, nanoscience, engineering, linguistics, computational fluid dynamics, computational drug design, weather forecasts (FMI), mathematics and other, with user counts per discipline ranging from about 42 to 331.

CSC strategy for future HPC capacity updates
- 2004: IBM eServer Cluster 1600 (16 IBM p690 nodes with a theoretical peak performance of 2.2 Tflop/s); switch upgrade with Federation technology
- 2004: coordination and support of the Finnish Physics and Chemistry Material Science Grid (PC clusters in 8 laboratories, ~500 CPUs in total)
- 2005: cluster capacity at CSC (a ~700-CPU cluster)
- 2006-2008: new supercomputer purchase
  - Grant of 10 M€ from the Ministry of Education
  - RFP will be out at the turn of the year 2006
  - First phase aimed to be installed in 3Q/2006
  - Possible second (and third) phase in 2007/2008
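As a sanity check on the quoted 2.2 Tflop/s figure, the theoretical peak can be reconstructed from the usual p690 configuration. The per-node numbers below (32 POWER4 processors per node at 1.1 GHz, 4 floating-point operations per cycle) are assumptions, not stated on the slide:

```python
# Theoretical peak of a 16-node IBM eServer Cluster 1600 (p690).
# Assumed configuration, not from the slide: 32 POWER4 CPUs per node,
# 1.1 GHz clock, 4 flops/cycle (two fused multiply-add units per core).
nodes = 16
cpus_per_node = 32
clock_hz = 1.1e9
flops_per_cycle = 4

peak_flops = nodes * cpus_per_node * clock_hz * flops_per_cycle
print(f"{peak_flops / 1e12:.2f} Tflop/s")  # prints 2.25 Tflop/s
```

This agrees with the ~2.2 Tflop/s quoted on the slide, so the machine was most likely counted at these standard POWER4 figures.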

Funet - Finnish University and Research Network
- High-quality academic Internet services for Finland
- Advanced services such as IPv6 and IP multicast
- Core line speed 2.5 Gbps; Funet member access at speeds up to 1 Gbps
- International connectivity through NORDUnet
- 85 member organizations: all Finnish universities (20) and polytechnics (30), plus public research institutions and administrative organizations close to higher education (35)
- Over 350,000 end users

M-Grid - Finnish Grid in production use, http://www.csc.fi/proj/mgrid/
- The Material Sciences National Grid Infrastructure (M-grid) is a national joint project funded by the Academy of Finland under the National Research Infrastructure Program in the Grid area
- Participants: CSC, 7 Finnish universities and the Helsinki Institute of Physics (HIP); 9 sites in total
- Objective: to build a homogeneous PC-cluster environment with a theoretical computing capacity of approx. 1.5 Tflops / 410 nodes
- Environment:
  - Hardware: dual AMD Opteron 1.8-2.2 GHz nodes with 2-8 GB memory, 1-2 TB shared storage, separate 2x GE networks (communications and NFS), remote administration
  - OS: NPACI Rocks Cluster Distribution (64-bit), based on Red Hat Enterprise Linux 3
  - Grid middleware: NorduGrid ARC compiled with Globus 3.2.1 libraries; Sun Grid Engine as the LRMS
- CSC administration tasks:
  - Maintaining the operating system, LRMS, Grid middleware and certain libraries
  - A separate small test cluster for testing new software releases
  - Tools for system monitoring, integrity checking, etc.
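Jobs for a NorduGrid ARC setup like this are described in ARC's xRSL job description language. A minimal sketch is shown below; the file names, job name and resource limits are illustrative, not taken from the slides:

```
&(executable="run_sim.sh")
 (arguments="input.dat")
 (inputFiles=("run_sim.sh" "")("input.dat" ""))
 (outputFiles=("result.dat" ""))
 (cpuTime="120")
 (memory="2000")
 (jobName="mgrid-example")
```

Such a description would be submitted with the ARC client tools and executed on a site where Sun Grid Engine handles local scheduling.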

DEISA objectives
- To enable Europe's terascale science through the integration of Europe's most powerful supercomputing systems
- Enabling scientific discovery across a broad spectrum of science and technology is the only criterion for success
- DEISA is a European supercomputing service built on top of existing national services
- DEISA deploys and operates a persistent, production-quality, distributed supercomputing environment with continental scope

EU Collaboration - FP6 EMBRACE Project
- EMBRACE - a European Model for Bioinformatics Research and Community Education, http://www.embracegrid.info
- The project's objective is to draw together a wide group of experts throughout Europe who are involved in the use of information technology in the biomolecular sciences
- Funded by the EC with 8.3 M€; launched on February 1, 2005
- Designed as a Network of Excellence to accumulate the experience and professional skills of highly qualified biologists from 17 institutes in 11 European countries
- Coordinated by EMBL, the European Molecular Biology Laboratory, and its part EBI, the European Bioinformatics Institute, UK
- Approach: create APIs to databases; adjust software to use the APIs; Grid-enabled web services (EMBRACEgrid); test-problem-driven development process
- CSC: Technology Evaluation and Watch WP, PairsDB, technology assessments, test problems, workshops

HAKA - infrastructure and middleware in Finnish higher education, http://www.csc.fi/suomi/funet/middleware/english/index.phtml
- The purpose of the HAKA infrastructure is to support teaching and research activities in the Finnish universities and polytechnics by developing and maintaining a common infrastructure for user identification
- HAKA is part of the Finnish Electronic IDentification in Higher Education (FEIDHE) project, a nationally funded joint project of the academic community in Finland
- The system has been built as a collaboration between CSC and academic IT centers since March 2002; 12 different pilots were carried out in 2003-2005
- With their own user ID, users can log in to services connected to the HAKA infrastructure regardless of the organization that maintains the service; in conjunction with the login, the user's current ID data is transferred to the service provider
- The HAKA infrastructure is based on the SAML standard and the Shibboleth open-source solutions; Shibboleth technology was developed in universities in the United States

CSC Software Development: the SOMA project, http://www.csc.fi/proj/drug2000
- The aim of the SOMA project (part of the Drug2000 program) is to organize and update CSC's modeling environment to better meet the European standard in computer-aided drug design and to increase its usability by non-experts in computer modeling
- Funded by CSC and the National Technology Agency (Tekes)
- A tailored structure-based small-molecule design environment
- Consists of several molecular modeling tools (e.g. MDL, CSDS, Corina, Gold, VolSurf, Bodil); allows the design of hypothetical new ligand molecules and predicts their properties
- SOMA can be accessed from the CSC Scientist's Interface

CSC Software Development: the MIKA project (Multigrid Instead of K-spAce), http://www.csc.fi/physics/mika/
- MIKA is a real-space, multigrid-based program package for electronic structure calculations, with applications in research on quantum dots, nanostructures and positron physics
- A 3-year collaboration project between the COMP group of the Laboratory of Physics at Helsinki University of Technology and CSC to produce free academic software released as open source
- MIKA consists of several different modules for solving the Kohn-Sham equations of density-functional theory in different geometries
- (Figure: snapshots from a simulation of nanowire breaking by the ultimate jellium model. A catenoid surface (a), cluster-derived structures (b) and (d), and a uniform cylindrical shape (c) can be seen. Green rectangles mark the lead-constriction boundary.)
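For context, the Kohn-Sham equations that MIKA solves take the standard density-functional form (textbook notation in Hartree atomic units, not reproduced from the slides):

```latex
\left[-\tfrac{1}{2}\nabla^2 + v_{\mathrm{eff}}(\mathbf{r})\right]\psi_i(\mathbf{r})
  = \varepsilon_i\,\psi_i(\mathbf{r}),
\qquad
v_{\mathrm{eff}}(\mathbf{r}) = v_{\mathrm{ext}}(\mathbf{r})
  + \int \frac{n(\mathbf{r}')}{|\mathbf{r}-\mathbf{r}'|}\,\mathrm{d}\mathbf{r}'
  + v_{\mathrm{xc}}(\mathbf{r}),
\qquad
n(\mathbf{r}) = \sum_i |\psi_i(\mathbf{r})|^2
```

The project name refers to discretizing the kinetic-energy operator on real-space grids of varying resolution (multigrid) rather than expanding the orbitals in a plane-wave (k-space) basis.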

CSC Software Development: Elmer - finite element software for multiphysical problems, http://www.csc.fi/elmer
- Elmer is a finite element computational tool for multi-physics problems, developed in collaboration with Finnish universities, research laboratories and industry
- Elmer includes physical models of fluid dynamics, structural mechanics, electromagnetics, heat transfer, etc.; these are described by partial differential equations, which Elmer solves with the Finite Element Method (FEM)
- The program has been developed at CSC over the past 10 years; Elmer 5.0 is currently released as open-source software under the GNU General Public License
- Elmer consists of several different parts: the geometry, boundary conditions and physical models are defined in ElmerFront; the resulting problem definition is solved by ElmerSolver; finally, the results are visualized by ElmerPost. Additionally, the ElmerGrid utility may be used for simple mesh manipulations
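To illustrate the finite element method that Elmer is built on (a generic textbook sketch, not Elmer code): solving the 1-D Poisson problem -u'' = 1 on [0, 1] with u(0) = u(1) = 0 using piecewise-linear elements reduces to assembling and solving a small tridiagonal linear system.

```python
import numpy as np

def fem_poisson_1d(n):
    """Solve -u'' = 1 on [0, 1] with u(0) = u(1) = 0 using n linear elements."""
    h = 1.0 / n
    # Stiffness matrix for the n-1 interior nodes: (1/h) * tridiag(-1, 2, -1)
    K = (np.diag(2.0 * np.ones(n - 1))
         - np.diag(np.ones(n - 2), 1)
         - np.diag(np.ones(n - 2), -1)) / h
    # Load vector: the integral of f = 1 against each interior hat function is h
    f = h * np.ones(n - 1)
    u_inner = np.linalg.solve(K, f)
    # Reattach the Dirichlet boundary values u(0) = u(1) = 0
    return np.concatenate(([0.0], u_inner, [0.0]))

u = fem_poisson_1d(10)
x = np.linspace(0.0, 1.0, 11)
exact = 0.5 * x * (1.0 - x)  # analytic solution of -u'' = 1
print(np.max(np.abs(u - exact)))
```

For this 1-D problem, linear elements with exactly integrated loads reproduce the analytic solution at the nodes, so the printed error is at machine precision; Elmer applies the same assemble-and-solve principle to coupled multi-physics PDEs in 2-D and 3-D.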

Growing need for international collaboration in providing infrastructure
- Motivation to develop national infrastructures AND to form international networks/grids is growing in the Nordic area
- Growing need for, and impact of, computational science (PITAC, 2005)
- Research groups work increasingly internationally, so the support organizations also need to provide internationally competitive resources
- Global players are looking for strong collaborators
- Solving grand challenge problems may require investments beyond national budgets
  - Survey of Nordic grand challenges of computing
- Boosting the attractiveness of the Nordic area for top researchers:
  - Competitive infrastructure
  - Stimulating research environment

Hot topics in the Nordic area and in Europe
- Distribution vs. centralization:
  - The optimal solution may differ depending on the location
  - History is often decisive: for example, it is very difficult to close centers, and human capital is still the major asset
  - Efficient co-use of centralization and (human) networks; the solution is a combination adapted to national needs
- Is there a superior computer architecture? The varying needs of top science:
  - No single architecture solves all problems
  - Not all problems can (usually) be addressed sufficiently within the available budget
  - This requires priorities: what are the most important grand challenges (in the opinion of the funding authority)?

Hot topics in the Nordic area and in Europe (cont.)
- Respect for the national infrastructures:
  - The existing investment in hardware, software and peopleware in all the European countries cannot be bypassed
  - New infrastructures need to be built on top of the existing ones
  - Top specialists usually have a choice, and their number is limited - starting from scratch requires human capital
  - Grid projects linking national infrastructures can be very useful if they address concrete issues

How to build competence to meet the grand challenges of computing with a sustainable infrastructure
- A layered system: European HPC center(s); national/regional centers and Grid collaboration (DEISA etc.); local centers
- Major problems may require parallel computing with 100,000+ processors, and we do not yet know what new problems this will cause - how do we build the competence to solve top-end problems?
- The two lower layers are very important (the majority of person resources, applications development, etc.)
- A balanced system with smooth interoperability between the layers is important: the lower layers feed the upper layers with scalable applications