
Introduction to ECMWF resources: computing and archive services, and how to access them
Paul Dando, User Support
Paul.Dando@ecmwf.int / advisory@ecmwf.int
University of Reading - 23 January 2014

ECMWF: a few figures
- Age of ECMWF: 38 years
- Employees: about 260
- Supported by: 34 states
- Budget: contributions by Member States and Co-operating States; 55 million / 41 million per annum

ECMWF objectives
- Operational forecasting up to 15 days ahead (including waves)
- R&D activities in forecast modelling
- Data archiving and related services
- Operational forecasts for the coming month and season
- Advanced NWP training
- Provision of supercomputer resources
- Assistance to WMO programmes
- Management of the Regional Meteorological Data Communications Network (RMDCN)

Organisation of ECMWF
- Council: 20 Member States
- Committees: Policy Advisory Committee (20 members), Scientific Advisory Committee (12 members), Technical Advisory Committee (20 members), Advisory Committee on Data Policy (34 members), Finance Committee (7 members), Advisory Committee of Co-operating States (14 members)
- Director-General: A. Thorpe (UK)
- Departments: Forecast (F. Rabier, France), Computing (vacant; acting: I. Weger, Austria), Administration (N. Farrell, Ireland), Research (E. Källén, Sweden)
- Sections include: User Support, Evaluation, Networks, Operation, HR, Property Services, Atmospheric Composition, Data Development, Production, Servers and Desktops, HPC, Communication, Finance, Model, Predictability

The operational forecasting system
- High-resolution deterministic forecast (HRES): twice per day; 16 km, 137 levels, to 10 days ahead
- Ensemble forecast (ENS): twice per day; 51 members, 32/65 km, 91 levels, to 15 days ahead; Monday/Thursday 00 UTC runs extended to 1 month ahead (monthly forecast)
- Ocean waves: twice daily; global to 10 days ahead at 28 km; European waters to 5 days ahead at 10 km; ensemble to 15 days ahead at 55 km
- Seasonal forecast: once a month (System 4); 51 members, TL255 (~80 km), 91 levels, to 7 months ahead; a sub-set of 15 members is run to 13 months every quarter (30 years of hindcasts)

Operational upgrades: www.ecmwf.int/products/changes

Education and training
- Training courses: NWP (numerical methods, data assimilation, model physics, predictability); use and interpretation of ECMWF products; computer user training courses
- Webinars (remote seminars/lectures): see https://software.ecmwf.int/wiki/display/optr/ecmwf+training+activities
- Seminars: Research Seminar on the Use of Satellite Data, September 2014
- Workshops: High Performance Computing in Meteorology (biennial); Meteorological Operational Systems (biennial)
- NEW: 27 Jan - Understanding the model climate (L. Magnusson), Monthly forecasting (L. Ferranti); 28 Jan - Forecasting extreme events (I. Tsonevsky), Extra-tropical cyclone tracking (T. Hewson), Model errors and diagnostic tools (M. Rodwell); 29 Jan - Clouds and precipitation (R. Forbes)

Education and training - calendar 2014: www.ecmwf.int/newsevents/calendar/2014.html
- 27-31 January: Use and interpretation of ECMWF products
- 3-7 February: Repeat course
- 10-14 February: HPCF (use of new Cray system)
- 17-21 February: HPCF (use of new Cray system)
- 25-28 February: GRIB API: library and tools
- 3-7 March: Introduction for new users / MARS
- 10-14 March: Data assimilation and use of satellite data (now 5 days)
- 17-21 March: ECMWF/EUMETSAT NWP-SAF satellite data assimilation
- 24-28 March: Numerical methods, adiabatic formulation of models and ocean wave forecasting
- 31 March - 10 April: Parametrization of subgrid physical processes
- 23-25 April: Introduction to ecFlow
- 28 April - 2 May: Magics / Metview
- 7-16 May: Predictability, diagnostics and extended-range forecasting
- 1-4 September: Seminar on the use of satellite data
- 27-31 October: 16th Workshop on High Performance Computing in Meteorology
- plus more e-learning and webinars

HPCF: www.ecmwf.int/services/computing/hpcf/

Current HPCF: IBM POWER7
- Two clusters (C2A and C2B), each with 768 nodes; 48,000 compute cores in total
- 100 TB of memory; 3 PB of usable disk space
- Performance increase of about 3x over the previous system (POWER6): ~1.5 petaflops peak, ~70 teraflops sustained
- Provides the HPC service until mid-2014
[Chart: serial and parallel workload on the C1A and C2A clusters]

Cray XC30 HPCF
- Contract with Cray signed on 24 June 2013
- 2 compute clusters and 2 storage clusters
- 3x the sustained performance of the existing HPCF on ECMWF codes
- Performance comes from more cores rather than faster cores: ~3,500 nodes, each with 2x 12-core Intel Ivy Bridge processors and 64 GiB of memory per node; ~84,000 cores per cluster
- Ivy Bridge delivers about 20% less sustained performance than POWER7 per core, so the focus is on the scalability of applications
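The per-cluster core count quoted above follows directly from the node figures; a quick illustrative check:

```python
# Approximate Cray XC30 figures quoted above (per cluster)
nodes = 3500
sockets_per_node = 2        # two Intel Ivy Bridge processors per node
cores_per_socket = 12

cores_per_cluster = nodes * sockets_per_node * cores_per_socket
print(cores_per_cluster)    # 84000 -> the "~84,000 cores per cluster" figure
```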

Unix server ecgate: www.ecmwf.int/services/computing/ecgate/

Unix server ecgb
- 8 compute nodes, each with 2 Intel Xeon processors (Sandy Bridge-EP): 16 cores at 2.7 GHz, 128 GB of memory, 2 x 900 GB SAS HDD
- Hyper-Threading is used, providing 32 virtual CPUs per node; one of these nodes (plus one as backup) serves as a "login" node
- 4 I/O server nodes
- 8 DS3524 storage subsystems with 24 x 3 x 300 GB 10k SAS HDDs, providing 172.8 TB of raw disk space
- Red Hat Enterprise Linux Server 6.4
- Available to ~2,700 users at more than 300 institutions
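The raw-disk figure above is simply the product of the quoted drive counts and capacity; an illustrative check:

```python
# ecgb storage subsystem figures quoted above
enclosures = 8                  # DS3524 units
drives_per_enclosure = 24 * 3   # the 24 x 3 drive layout
drive_capacity_gb = 300         # 10k SAS HDDs

raw_tb = enclosures * drives_per_enclosure * drive_capacity_gb / 1000
print(raw_tb)                   # 172.8 TB of raw disk space
```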

Data Handling System (DHS): www.ecmwf.int/services/computing/overview/datahandling.html

Data archive and storage
- Data archival and retrieval system for all ECMWF data
- Archive volume of ~55 PB (January 2014), plus ~15 PB of backup data in the Disaster Recovery System
- Primary copy: Oracle SL8500s + T10000 drives, operational since early 2010; 4 Oracle SL8500 Automated Tape Libraries (ATLs), with an additional library installed in 2012; single library image with 40,000 slots; good robotic management interfaces
- Safety copy (DRS): TS3500 libraries with a mix of LTO3 and LTO5 drives; hardly any read access

Data Server: http://apps.ecmwf.int/datasets/
- A standalone system outside the firewall
- Public (non-commercial) distribution of data
- Self-registration
- Batch access possible with Python, Perl or Java
- Data available as GRIB or netCDF
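Batch access from Python is typically done with ECMWF's ecmwf-api-client package; a minimal sketch, assuming a registered API key in ~/.ecmwfapirc (the dataset, dates, parameter and target filename below are illustrative placeholders, not a tested recipe):

```python
# Sketch of batch retrieval from the ECMWF Data Server (public datasets).
# The request values are illustrative only; adjust them to the dataset
# you actually want from http://apps.ecmwf.int/datasets/.
request = {
    "dataset": "interim",               # example public dataset (ERA-Interim)
    "date": "2014-01-01/to/2014-01-31",
    "param": "167.128",                 # 2-metre temperature
    "stream": "oper",
    "time": "12",
    "type": "an",
    "target": "t2m_jan2014.grib",       # local output file
}

def retrieve(req):
    """Submit the request in batch; blocks until the data is delivered."""
    # Imported here so the sketch loads even without the package installed.
    from ecmwfapi import ECMWFDataServer
    ECMWFDataServer().retrieve(req)

# retrieve(request)  # uncomment once the ecmwf-api-client package and API key are set up
```

The same request dictionary style is used whether the job runs interactively or from a cron/batch script, which is what makes scripted bulk downloads practical.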

[Slides 17-22: screenshots of the Data Server at http://apps.ecmwf.int/datasets/]

New webmars: http://apps.ecmwf.int/services/mars/catalogue/

Access to computing and archive services
- Public user: self-registration; access to the public data server
- Web-only user: registered via the UKMO; full MARS archive via webmars; valid data (with UKMO approval)
- Fully-registered user: registered via the UKMO; full MARS archive; valid data (with UKMO approval); interactive login to ecgate
- Special Project user: project approved by the UKMO and ECMWF, registered by the PI; full MARS archive; valid data (with UKMO approval); interactive login to ecgate; access to the HPCF
Member State user registration forms are available at http://www.ecmwf.int/about/computer_access_registration/forms/ - complete, sign and send to the UK Computing Representative (Roddy Sharp) at the Met Office

Special Projects: http://www.ecmwf.int/about/computer_access_registration/special_projects.html
- A maximum of 10% of the computing resources available to Member States may be allocated to Special Projects: 'experiments or investigations of a scientific or technical nature, undertaken by one or more Member States, likely to be of interest to the general scientific community'
- For 2014: HPCF 520 million units (1 CPU hour = ~20 SBUs); data storage 1,040 terabytes
- All users within one of ECMWF's Member States can apply: complete the 'Request for a Special Project' form and send it to ECMWF via the UKMO, with an estimate of the compute resources needed for a maximum of 3 years
- Deadline: 30 June of the year preceding the year in which the project will start; late applications can be considered, and additional resources can be requested at any time if needed
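The 2014 HPCF allocation above can be translated into CPU hours with the quoted conversion factor (an order-of-magnitude illustration only):

```python
# Special Project HPCF allocation for 2014, as quoted above
sbu_allocation = 520_000_000   # System Billing Units (SBUs)
sbu_per_cpu_hour = 20          # 1 CPU hour = ~20 SBUs

cpu_hours = sbu_allocation // sbu_per_cpu_hour
print(cpu_hours)               # 26000000, i.e. roughly 26 million CPU hours
```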

ECMWF software: www.ecmwf.int/products/data/software/
- BUFRDC: encodes and decodes WMO FM-94 BUFR code messages
- ecFlow: ECMWF's workflow manager
- EMOSLIB: interpolation software and BUFR/CREX encoding and decoding routines
- GRIB API: encodes and decodes WMO FM-92 GRIB edition 1 and edition 2 messages
- Magics++: supports the plotting of contours, wind fields, observations, satellite images, symbols, text, axes and graphs (including boxplots)
- Metview 4: accesses, manipulates and visualises meteorological data
All available for free download

New software support infrastructure
- Available at http://software.ecmwf.int/
- Aims to improve support for external users, keep track of issues in a central place and spread knowledge throughout ECMWF
- Based on the Atlassian suite: JIRA (issues), Confluence (documentation wiki), Bamboo (builds)

Web2013: ECMWF's new website
- Release dates (to be confirmed): 31 Jan 2014 public beta; 31 Mar 2014 official release
- Key changes: new web content management system; new design and content organisation; new web search; new charts functionality; revised and updated content
- Release strategy: start with a minimum viable release; progressively migrate areas after release; maintain old content for one year
- User impact: bookmarks and references to URLs are not redirected; the curl/wget service will be replaced; "Your Room" service enhanced (but not migrated)

ECMWF Help & Support - who to contact?
- Urgent dissemination problems, issues with model output, generic fault reporting, general service queries: Call Desk (24h/7d) - email calldesk@ecmwf.int, tel. +44 118 9499 303
- Specific advice or specific user queries: User Support (8h/5d) - email advisory@ecmwf.int; my tel. +44 118 9499 386 or 9499 000 (switchboard); my email Paul.Dando@ecmwf.int
- Changes in dissemination requirements: User Support (8h/5d) - email data.services@ecmwf.int
- Requests for, and queries on, software: User Support (8h/5d) - email software.services@ecmwf.int
- Specific graphics queries: Development Section (8h/5d) - email metview@ecmwf.int or magics@ecmwf.int