Introduction to ECMWF resources: computing and archive services, and how to access them
Paul Dando, User Support
Paul.Dando@ecmwf.int / advisory@ecmwf.int
University of Reading - 23 January 2014, ECMWF
ECMWF: a few figures
- Age of ECMWF: 38 years
- Employees: about 260
- Supported by: 34 States
- Budget (contributions by Member States and Co-operating States): 55 million / 41 million per annum
ECMWF objectives
- Operational forecasting up to 15 days ahead (including waves)
- R&D activities in forecast modelling
- Data archiving and related services
- Operational forecasts for the coming month and season
- Advanced NWP training
- Provision of supercomputer resources
- Assistance to WMO programmes
- Management of the Regional Meteorological Data Communications Network (RMDCN)
Organisation of ECMWF
- Council (20 Member States), advised by: Policy Advisory Committee (20 members), Scientific Advisory Committee (12 members), Technical Advisory Committee (20 members), Finance Committee (7 members), Advisory Committee on Data Policy (34 members), Advisory Committee of Co-operating States (14 members)
- Director-General: A. Thorpe (UK)
- Departments: Forecast, F. Rabier (France); Research, E. Källén (Sweden); Computing, vacant (acting: I. Weger, Austria); Administration, N. Farrell (Ireland)
- Sections include: User Support, Evaluation, Networks, Operation, Servers and Desktops, HPC, Communication, Development, Production, Data, Atmospheric Composition, Model, Predictability, HR, Property Services, Finance
The operational forecasting system
- High-resolution deterministic forecast (HRES): twice per day; 16 km, 137 levels, to 10 days ahead
- Ensemble forecast (ENS): twice per day; 51 members, 32/65 km, 91 levels, to 15 days ahead; the Monday/Thursday 00 UTC runs are extended to 1 month ahead (monthly forecast)
- Ocean waves: twice daily; global to 10 days ahead at 28 km; European waters to 5 days ahead at 10 km; ensemble to 15 days ahead at 55 km
- Seasonal forecast (System 4): once a month; 51 members, TL255 (~80 km), 91 levels, to 7 months ahead; a sub-set of 15 members is run for 13 months every quarter (30 years of hindcasts)
Operational upgrades: www.ecmwf.int/products/changes
Education and training
- Training courses: NWP (numerical methods, data assimilation, model physics, predictability); use and interpretation of ECMWF products; computer user training courses; webinars (remote seminars/lectures). See https://software.ecmwf.int/wiki/display/optr/ecmwf+training+activities
- Seminars: research seminar on the use of satellite data, September 2014
- Workshops: High Performance Computing in Meteorology (biennial); Meteorological Operational Systems (biennial)
- NEW lecture series:
  - 27 Jan: Understanding the model climate (L. Magnusson); Monthly forecasting (L. Ferranti)
  - 28 Jan: Forecasting extreme events (I. Tsonevsky); Extra-tropical cyclone tracking (T. Hewson); Model errors and diagnostic tools (M. Rodwell)
  - 29 Jan: Clouds and precipitation (R. Forbes)
Education and training: calendar 2014 (www.ecmwf.int/newsevents/calendar/2014.html)
- 27-31 January: Use and interpretation of ECMWF products
- 3-7 February: repeat course
- 10-14 February: HPCF (use of the new Cray system)
- 17-21 February: HPCF (use of the new Cray system)
- 25-28 February: GRIB API: library and tools
- 3-7 March: Introduction for new users / MARS
- 10-14 March: Data assimilation and use of satellite data (now 5 days)
- 17-21 March: ECMWF/EUMETSAT NWP-SAF satellite data assimilation
- 24-28 March: Numerical methods, adiabatic formulation of models and ocean wave forecasting
- 31 March - 10 April: Parametrization of subgrid physical processes
- 23-25 April: Introduction to ecflow
- 28 April - 2 May: Magics / Metview
- 7-16 May: Predictability, diagnostics and extended-range forecasting
- 1-4 September: Seminar on the use of satellite data
- 27-31 October: 16th Workshop on High Performance Computing in Meteorology
Plus more e-learning, webinars, ...
HPCF: www.ecmwf.int/services/computing/hpcf/
Current HPCF: IBM POWER7
- Two clusters (C2A & C2B), each with 768 nodes; in total 48,000 compute cores, 100 TB of memory and 3 PB of usable disk space
- Performance increase of about 3x over the previous system (POWER6): ~1.5 petaflops peak, ~70 teraflops sustained
- Provides the HPC service until mid-2014
[Chart: serial and parallel workload on the C1A and C2A clusters]
Cray XC30 HPCF
- Contract with Cray signed on 24 June 2013
- 2 compute clusters, 2 storage clusters
- 3x the sustained performance of the existing HPCF on ECMWF codes
- Performance comes from more cores rather than faster cores: ~3,500 nodes per cluster, each with 2x 12-core Intel Ivy Bridge processors and 64 GiB of memory, giving ~84,000 cores per cluster
- Ivy Bridge delivers about 20% less sustained performance per core than POWER7, so the focus is on the scalability of applications
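As a rough back-of-envelope check (not an official benchmark), the node and per-core figures quoted above are consistent with the ~3x sustained-performance claim:

```python
# Sanity check of the quoted ~3x sustained-performance increase, using only
# the figures from the slide. Illustrative arithmetic, not a benchmark.
power7_cores = 48_000                    # total cores across the two IBM clusters
xc30_cores_per_cluster = 3_500 * 2 * 12  # ~3,500 nodes x 2 sockets x 12 cores
xc30_cores_total = 2 * xc30_cores_per_cluster
per_core_ratio = 0.8                     # Ivy Bridge ~20% less sustained per core

speedup = xc30_cores_total * per_core_ratio / power7_cores
print(f"{xc30_cores_per_cluster} cores per cluster, ~{speedup:.1f}x estimated speedup")
# -> 84000 cores per cluster, ~2.8x estimated speedup
```

With roughly 3.5x the core count at 80% of the per-core throughput, the estimate lands close to the quoted 3x, which is why application scalability matters more than single-core speed on this system.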
Unix server ecgb: www.ecmwf.int/services/computing/ecgate/
Unix server ecgb
- 8 compute nodes, each with:
  - 2 Intel Xeon processors (Sandy Bridge-EP): 16 cores at 2.7 GHz
  - 128 GB memory
  - 2 x 900 GB SAS HDD
  - Hyper-Threading, providing 32 virtual CPUs per node
- One of these nodes (plus one as backup) serves as the "login" node
- 4 I/O server nodes
- 8 DS3524 storage subsystems with 24 x 3 x 300 GB 10k SAS HDDs, providing 172.8 TB of raw disk space
- Red Hat Enterprise Linux Server 6.4
- Available to ~2,700 users at more than 300 institutions
Data Handling System (DHS): www.ecmwf.int/services/computing/overview/datahandling.html
Data archive and storage
- Data archival and retrieval system for all ECMWF data
- Archive volume of ~55 PB (January 2014); ~15 PB of backup data in the Disaster Recovery System (DRS)
- Primary copy: Oracle SL8500s + T10000 drives
  - Operational since early 2010
  - 4 Oracle SL8500 Automated Tape Libraries (ATLs); an additional library installed in 2012
  - Single library image with 40,000 slots
  - Good robotic management interfaces
- Safety copy (DRS): TS3500 libraries with a mix of LTO3 and LTO5 drives; hardly any read access
Data Server: http://apps.ecmwf.int/datasets/
- A standalone system outside the firewall
- Public (non-commercial) distribution of data
- Self-registration
- Batch access possible with Python, Perl or Java
- Data available in GRIB or NetCDF
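Batch access from Python goes through ECMWF's ecmwf-api-client package (`pip install ecmwf-api-client`), which submits a MARS-style request dictionary. The sketch below is illustrative: the dataset, parameter and target names are example values, not a tested request, and the actual retrieval call (which needs an API key from self-registration) is left commented out:

```python
# Sketch of batch access to the ECMWF public data server.
# The field values below are illustrative examples, not a verified request.
request = {
    "dataset": "interim",                # example public dataset name
    "date": "2014-01-01/to/2014-01-31",  # date range, MARS syntax
    "levtype": "sfc",                    # surface fields
    "param": "167.128",                  # example: 2-metre temperature
    "time": "12",                        # 12 UTC analysis
    "type": "an",
    "format": "netcdf",                  # GRIB is the default; NetCDF also offered
    "target": "t2m_jan2014.nc",          # local output file
}

# Requires an API key in ~/.ecmwfapirc, obtained after self-registration:
# from ecmwfapi import ECMWFDataServer
# server = ECMWFDataServer()
# server.retrieve(request)

print(sorted(request))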
New webmars: http://apps.ecmwf.int/services/mars/catalogue/
Access to computing and archive services
- Public user: self-registration; access to the public data server
- Web-only user: registered via the UKMO; full MARS archive via webmars; valid data (with UKMO approval)
- Fully-registered user: registered via the UKMO; full MARS archive; valid data (with UKMO approval); interactive login to ecgate
- Special Project user: project approved by the UKMO and ECMWF, registered by the PI; full MARS archive; valid data (with UKMO approval); interactive login to ecgate; access to the HPCF
Member State user registration forms are available at http://www.ecmwf.int/about/computer_access_registration/forms/
Complete, sign and send to the UK Computing Representative (Roddy Sharp) at the Met Office.
Special Projects: http://www.ecmwf.int/about/computer_access_registration/special_projects.html
- A maximum of 10% of the computing resources available to Member States may be allocated to Special Projects: "experiments or investigations of a scientific or technical nature, undertaken by one or more Member States, likely to be of interest to the general scientific community"
- For 2014: HPCF, 520 million units (1 CPU hour = ~20 SBUs); data storage, 1,040 terabytes
- All users within one of ECMWF's Member States can apply: complete the "Request for a Special Project" form and send it to ECMWF via the UKMO, with an estimate of the compute resources needed for a maximum of 3 years
- Deadline: 30 June of the year preceding the year in which the project will start! Late applications can be considered, and additional resources can be requested at any time if needed
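To put the 2014 HPCF allocation in more familiar units, applying the ~20 SBUs per CPU hour conversion quoted above gives:

```python
# Back-of-envelope conversion of the 2014 Special Project HPCF allocation,
# using the approximate conversion factor quoted on the slide.
sbu_allocation = 520_000_000   # 520 million system billing units (SBUs)
sbu_per_cpu_hour = 20          # 1 CPU hour = ~20 SBUs

cpu_hours = sbu_allocation / sbu_per_cpu_hour
print(f"~{cpu_hours / 1e6:.0f} million CPU hours")  # -> ~26 million CPU hours
```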
ECMWF software: www.ecmwf.int/products/data/software/
- BUFRDC: encodes and decodes WMO FM-94 BUFR code messages
- ecflow: ECMWF's workflow manager
- EMOSLIB: interpolation software and BUFR/CREX encoding and decoding routines
- GRIB API: encodes and decodes WMO FM-92 GRIB edition 1 and edition 2 messages
- Magics++: supports the plotting of contours, wind fields, observations, satellite images, symbols, text, axes and graphs (including boxplots)
- Metview 4: accesses, manipulates and visualises meteorological data
All available for free download.
New software support infrastructure, available at http://software.ecmwf.int/
- Aims: improve support for external users; keep track of issues in a central place; spread knowledge throughout ECMWF
- Based on the Atlassian suite: JIRA (issues), Confluence (documentation wiki), Bamboo (builds)
Web2013: ECMWF's new website
- Release dates (to be confirmed): 31 Jan 2014, public beta; 31 Mar 2014, official release
- Key changes: new web content management system; new design and content organisation; new web search; new charts functionality; revised and updated content
- Release strategy: start with a minimum viable release; progressively migrate areas after release; maintain old content for one year
- User impact: bookmarks and references to URLs are not redirected; the curl/wget service is to be replaced; the "Your Room" service is enhanced (but not migrated)
ECMWF Help & Support: who to contact?
- Urgent dissemination problems or issues with model output, generic fault reporting, general service queries etc.: Call Desk, 24h/7d; email calldesk@ecmwf.int, tel. +44 118 9499 303
- Specific advice or specific user queries: User Support, 8h/5d; email advisory@ecmwf.int; my contact: Paul.Dando@ecmwf.int, tel. +44 118 9499 386 or 9499 000 (switchboard)
- Changes in dissemination requirements: User Support, 8h/5d; email data.services@ecmwf.int
- Requests for, and queries on, software: User Support, 8h/5d; email software.services@ecmwf.int
- Specific graphics queries: Development, 8h/5d; email metview@ecmwf.int or magics@ecmwf.int