The CIME Case Control System

1 The CIME Case Control System: An Object-Oriented, Data-Driven Python Workflow Control System for Earth System Models. Jim Edwards, 22nd Annual Community Earth System Model Workshop, Boulder, CO, June 2017

2 What is CCS? CCS is the workflow control code for creating an experiment in CIME. It:
o Creates and configures a case (experiment)
o Handles access to input data and disposition of output data
o Configures and compiles the model and supporting software
o Provides the batch interface to HPC systems
o Tests model validity

3 Data and control in CCS. Data describing all aspects of interactions with CIME are stored in XML files. These files contain all the information required to run a given experiment on a given machine with a given compiler. Python code interprets these XML files to create a case; the code contains no client-specific information. Users can change all aspects of an experiment by manipulating the data in the XML files and should never need to modify code.

4 The CCS:
o Written in Python (2.7 core) with a coordinated, consistent look and feel
o Object-oriented approach allows much greater code reuse
o Each XML file type has a corresponding Python module
o A large and growing number of internal tests help assure that changes are backward compatible and consistent

5 User interface scripts in CCS
o create_newcase: create an experiment (case) directory for a given machine, compiler and model configuration
o case.setup: generate batch scripts in a case for a given pe layout
o case.build: compile model code and supporting libraries
o case.submit: submit a job or set of jobs to the batch system
o xmlquery: examine case data settings
o xmlchange: modify case data settings
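
Conceptually, xmlquery and xmlchange are thin wrappers that read or rewrite a named entry in one of the case XML files. A minimal sketch of that idea with the standard library, assuming an env_run.xml-style file of <entry id="..." value="..."/> elements (the function names and file layout here are illustrative, not the actual CCS implementation):

# Illustrative sketch only: look up or modify one <entry id=... value=...>
# element in a case XML file and write the file back out.
import xml.etree.ElementTree as ET

def xml_query(xml_path, var):
    """Return the value of the entry named var, or None if it is absent."""
    node = ET.parse(xml_path).getroot().find(".//entry[@id='%s']" % var)
    return None if node is None else node.get("value")

def xml_change(xml_path, var, value):
    """Set the entry named var to value and rewrite the file."""
    tree = ET.parse(xml_path)
    node = tree.getroot().find(".//entry[@id='%s']" % var)
    if node is None:
        raise KeyError("no entry with id '%s' in %s" % (var, xml_path))
    node.set("value", value)
    tree.write(xml_path)

# Example (variable name chosen for illustration):
# xml_change("env_run.xml", "STOP_N", "5")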

6 Templates for batch jobs (from config_batch.xml):
<job name="case.run">
  <template>template.case.run</template>
  <task_count>default</task_count>
  <prereq>$BUILD_COMPLETE and not $TEST</prereq>
</job>
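
Because the job definitions are plain XML, the batch layer can discover which scripts to generate by walking config_batch.xml. A rough sketch of that step, using only the element names visible in the snippet above (this is not the actual CCS code):

# Sketch: collect each <job> definition (template, task count, prerequisite)
# from a config_batch.xml-style file.
import xml.etree.ElementTree as ET

def read_job_definitions(config_batch_path):
    jobs = {}
    for job in ET.parse(config_batch_path).getroot().iter("job"):
        jobs[job.get("name")] = {
            "template": (job.findtext("template") or "").strip(),
            "task_count": (job.findtext("task_count") or "").strip(),
            "prereq": (job.findtext("prereq") or "").strip(),
        }
    return jobs

# Example: read_job_definitions("config_batch.xml")["case.run"]["prereq"]
# would return "$BUILD_COMPLETE and not $TEST" for the entry shown above.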

7 Templates generate batch scripts. The template:
#!/usr/bin/env python
{{batchdirectives}}
import os
os.chdir('{{caseroot}}')

becomes, after substitution:
#!/usr/bin/env python
# Batch system directives
#PBS -N SMS.T62_g16.C.cheyenne_intel_101428_s4bcr8.run
#PBS -r n
#PBS -j oe
#PBS -m ae
#PBS -V
#PBS -S /bin/bash
#PBS -l select=2:ncpus=36:mpiprocs=36:ompthreads=1
os.chdir('/glade/p/work/jedwards/sandboxes/cesm2_0_alpha/cime/scripts/sms.t62_g16.C.cheyenne_intel_101428_s4bcr8')
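
The {{...}} placeholders are what turn one template into a machine- and case-specific script. A stand-in for that substitution step is sketched below; CIME's actual template handling may differ, and the example values are invented for illustration:

# Sketch: replace every {{name}} placeholder in a batch template with the
# corresponding case value.
import re

def render_template(template_text, values):
    return re.sub(r"\{\{(\w+)\}\}",
                  lambda match: str(values[match.group(1)]),
                  template_text)

# Example (paths and directives invented for illustration):
# script = render_template(open("template.case.run").read(),
#                          {"batchdirectives": "#PBS -N case.run",
#                           "caseroot": "/path/to/caseroot"})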

8 CCS design overview. The Python classes and scripts fall into four groups: user interface scripts, input classes, case classes, and abstract classes. The Case class acts as the central conduit for interaction between objects. Abstract classes provide the common behavior the other classes inherit.

9 CCS user interface. All user scripts access and extend a common set of command-line options. User scripts operate on the case object and, through it, manage and interpret properties defined in XML.
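
One way to achieve that consistency is to build every script's parser from a single helper that installs the shared options first. The sketch below shows the pattern with argparse; the specific option names are illustrative rather than a list of CCS's real flags:

# Sketch of a shared command-line layer: each user script starts from the
# same base parser and then adds its own options.  Flag names illustrative.
import argparse

def standard_parser(description):
    parser = argparse.ArgumentParser(description=description)
    parser.add_argument("--debug", action="store_true",
                        help="enable debugging output")
    parser.add_argument("--verbose", action="store_true",
                        help="print extra progress information")
    parser.add_argument("--caseroot", default=".",
                        help="case directory to operate on")
    return parser

# A script such as case.submit would then extend the common parser:
# parser = standard_parser("Submit the case to the batch system")
# parser.add_argument("--job", help="submit only the named job")
# args = parser.parse_args()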

10 Input classes
o Machines: input and output directories, supported compilers and libraries, batch queueing system, MPI execution
o Compilers: options for linking and building
o Batch system: options for submitting jobs
o Components, compsets, grids, PE layouts
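
Each input class is essentially a reader for one of the config_*.xml files. As an illustration for the Machines case, the sketch below looks up one machine's settings; the <machine MACH="..."> layout and child-element names are assumptions for the example, not a specification of the real file:

# Sketch: pull one machine's settings out of a config_machines.xml-style
# file.  The MACH attribute and flat child elements are assumed here.
import xml.etree.ElementTree as ET

def machine_settings(config_machines_path, machine_name):
    root = ET.parse(config_machines_path).getroot()
    node = root.find(".//machine[@MACH='%s']" % machine_name)
    if node is None:
        raise KeyError("machine '%s' not found" % machine_name)
    # Flatten the simple child elements (directories, batch system, ...).
    return dict((child.tag, (child.text or "").strip()) for child in node)

# Example: machine_settings("config_machines.xml", "cheyenne")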

11 Case classes
o Contain variables to control configuration
o Each case XML file has a corresponding Python object
o These objects inherit common properties from abstract classes
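
A minimal sketch of that inheritance pattern is shown below. The class names and methods are illustrative, not the actual CCS classes: a generic base class owns the XML handling, and each case file gets a thin subclass.

# Illustrative only: a generic base class provides the shared XML behavior,
# and each case XML file is represented by a small subclass.
import xml.etree.ElementTree as ET

class GenericCaseFile(object):
    def __init__(self, filename):
        self.filename = filename
        self._tree = ET.parse(filename)

    def get_value(self, name):
        node = self._tree.getroot().find(".//entry[@id='%s']" % name)
        return None if node is None else node.get("value")

    def set_value(self, name, value):
        node = self._tree.getroot().find(".//entry[@id='%s']" % name)
        node.set("value", value)

    def write(self):
        self._tree.write(self.filename)

class EnvRun(GenericCaseFile):
    """Run-time settings (run length, resubmit count, ...)."""

class EnvBuild(GenericCaseFile):
    """Build-time settings (compiler options, build status, ...)."""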

12 Testing. CCS includes internal regression tests to assure functionality is maintained as the code evolves; these include pylint, xmllint, Python unit tests, and internal functionality tests. CCS also supports a wide variety of tests of the functionality of earth system models, which facilitates development of components external to CIME; these include model restart tests, baseline comparisons, and model consistency tests.

13 pylint and xmllint. pylint assures that the Python code follows a community-accepted standard and that exceptions to that standard are well documented. xmllint assures that the XML files follow the (locally) defined standards for each file format; the XML file standards for CIME are defined in cimeroot/config/xml_schemas.
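
The schema check itself can be run with the stock xmllint tool. A small wrapper, with placeholder file paths, might look like this (the real CIME test harness may invoke it differently):

# Sketch: validate an XML file against an XSD schema with xmllint.
# The file and schema paths below are placeholders.
import subprocess

def validate_xml(xml_file, schema_file):
    """Return True if xml_file validates against schema_file."""
    status = subprocess.call(
        ["xmllint", "--noout", "--schema", schema_file, xml_file])
    return status == 0

# Example:
# validate_xml("config_batch.xml",
#              "cimeroot/config/xml_schemas/config_batch.xsd")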

14 Python doctest

def parse_test_name(test_name):
    """
    Given a CIME test name
    TESTCASE[_CASEOPTS].GRID.COMPSET[.MACHINE_COMPILER[.TESTMODS]],
    return each component of the test name with machine and compiler split.

    >>> parse_test_name('ers')
    ['ERS', None, None, None, None, None, None]
    >>> parse_test_name('ers.fe12_123')
    ['ERS', None, 'fe12_123', None, None, None, None]
    >>> parse_test_name('ers.fe12_123.jgf')
    ['ERS', None, 'fe12_123', 'JGF', None, None, None]
    >>> parse_test_name('ers_d.fe12_123.jgf')
    ['ERS', ['D'], 'fe12_123', 'JGF', None, None, None]
    >>> parse_test_name('ers_d_p1.fe12_123.jgf')
    ['ERS', ['D', 'P1'], 'fe12_123', 'JGF', None, None, None]
    >>> parse_test_name('sms_d_ln9_mmpi-serial.f19_g16_rx1.a')
    ['SMS', ['D', 'Ln9', 'Mmpi-serial'], 'f19_g16_rx1', 'A', None, None, None]
    >>> parse_test_name('ers.fe12_123.jgf.machine_compiler')
    ['ERS', None, 'fe12_123', 'JGF', 'machine', 'compiler', None]
    >>> parse_test_name('ers.fe12_123.jgf.machine_compiler.test-mods')
    ['ERS', None, 'fe12_123', 'JGF', 'machine', 'compiler', 'test/mods']
    """
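
These doctests serve both as documentation and as tests: the standard doctest module executes every >>> example in the docstring and compares the output against the expected lines. A minimal way to run them from within the module itself:

# Run all doctests defined in this module's docstrings; each >>> example
# is executed and its output compared with the expected text.
if __name__ == "__main__":
    import doctest
    doctest.testmod(verbose=True)

The same checks can also be driven externally with python -m doctest -v <module>.py.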

15 System tests
o DAE: data assimilation test
o ERI: hybrid/branch/exact restart test
o ERIO: tests restart with different PIO methods
o ERP: PE counts hybrid (OpenMP/MPI) restart bit-for-bit test from startup
o ERR: restart with short-term archiving
o ERS: exact restart from startup
o NCK: multi-instance validation vs. single instance
o PEA: single-PE bit-for-bit test
o PEM: modified PE counts MPI bit-for-bit test
o PET: modified threading OpenMP bit-for-bit test (sequential tests)
o PFS: system performance test
o PRE: implementation of the CIME pause/resume test
o SMS: smoke startup test
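
Most of these tests reduce to the same recipe: run the case two different ways and require bit-for-bit identical history output. The sketch below spells out that recipe for an exact-restart (ERS-style) test; the run_case and compare_history callables are placeholders supplied by a hypothetical harness, not CIME's actual SystemTests API:

# Sketch of the exact-restart recipe: a single continuous run and a
# two-part restarted run must produce bit-for-bit identical history files.
# run_case and compare_history are placeholder callables for illustration.
def exact_restart_test(run_case, compare_history, total_steps):
    # Reference: one continuous run over the full length.
    reference = run_case(start=0, stop=total_steps, write_restart=False)

    # Candidate: run half way, write a restart file, then continue from it.
    run_case(start=0, stop=total_steps // 2, write_restart=True)
    candidate = run_case(start=total_steps // 2, stop=total_steps,
                         write_restart=False, from_restart=True)

    # A bit-for-bit comparison of the final history output decides the test.
    return compare_history(reference, candidate)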

16 Component-specific tests. CIME allows component developers to add component-specific tests to the CIME test capability while maintaining the tests alongside the component model. As an example, the CLM model includes tests specific to that component but integrated with the CIME software:
o LII: CLM initial-condition interpolation test
o SSP: CLM spinup test

17 scripts_regression_tests.py: daily CDash test output

Name                      Status  Time           Summary
N_TestUnitTest            Passed  2m 22s 670ms   Stable
Z_FullSystemTest          Passed  21m 33s        Stable
O_TestTestScheduler       Passed  1m 17s 90ms    Stable
A_RunUnitTests            Passed  3s 380ms       Stable
P_TestJenkinsGenericJob   Passed  890ms          Stable
G_TestMacrosBasic         Passed  2s 10ms        Stable
Q_TestBlessTestResults    Passed  2m 58s 810ms   Stable
I_TestCMakeMacros         Passed  10m 5s 810ms   Stable
K_TestCimeCase            Passed  1m 3s 560ms    Stable
X_TestSingleSubmit        Passed  3s 690ms       Stable
T_TestRunRestart          Passed  9m 42s 870ms   Stable
S_TestManageAndQuery      Passed  1s 250ms       Stable
R_TestUpdateACMETests     Passed  870ms          Stable
L_TestSaveTimings         Passed  7m 25s 20ms    Stable
J_TestCreateNewcase       Passed  6m 41s 130ms   Stable
H_TestMakeMacros          Passed  3s 50ms        Stable
B_CheckCode               Passed  25s 910ms      Stable
M_TestWaitForTests        Passed  29s 430ms      Stable

19 CIME source availability and contributions. CIME is open source and available from the GitHub repository at https://github.com/ESMCI/cime. Contributions, issues and bug reports are welcome. Thanks to our collaborators: Alice Bertini, Michael Deakin, Chris Fischer, Jim Foucar, Steve Goldhaber, Rob Jacob, Bill Sacks, Jason Sarich, Mariana Vertenstein, Francis Vitt, Andreas Wilke. Questions?

20 Disclaimer: a look under the hood. This talk is not intended to be a tutorial on how to use CIME/CCS; it offers a brief exploration of the code design and structure. You really don't need to know any of this to use the model.

21 What is a CIME object?
Properties: Machine, Compiler, Grid, Compset
Events: setup_complete, build_complete, run_complete
Methods: setup, build, submit, postprocess
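
Written out as a plain Python sketch (illustrative only, not the actual Case class), the object on this slide looks roughly like this:

# Illustrative sketch of the case object above: configuration properties,
# lifecycle flags ("events"), and the workflow methods that flip them.
class Case(object):
    def __init__(self, machine, compiler, grid, compset):
        # Properties
        self.machine = machine
        self.compiler = compiler
        self.grid = grid
        self.compset = compset
        # Events
        self.setup_complete = False
        self.build_complete = False
        self.run_complete = False

    # Methods
    def setup(self):
        # Generate batch scripts and the PE layout for this machine.
        self.setup_complete = True

    def build(self):
        # Compile the model and supporting libraries.
        self.build_complete = True

    def submit(self):
        # Hand the job(s) to the batch system.
        self.run_complete = True

    def postprocess(self):
        # Archive and post-process model output.
        pass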

22 CCS directory layout:
cime/
  scripts/
    Tools/
  utils/
    python/
      CIME/
        SystemTests/
        XML/
      tests/

23 What is Object Oriented Design? Compare an everyday object (a car, say) with a CIME case.
A car:
  Properties: Make, Model, Color, Year, Price
  Events: On_start, On_stop, On_parked
  Methods: Start, Stop, Park
A CIME case:
  Properties: Machine, Compiler, Grid, Compset
  Events: setup_complete, build_complete, run_complete
  Methods: setup, build, submit, postprocess

24 The CIME Case Control System: Code Structure

25 The CIME Case Control System: Introduction

26 Motivation for CCS. The CESM1 scripts:
o Written in several languages and styles
o Inconsistent look & feel of the user interface
o Little or no coordination between scripts or code reuse
o Changing an input XML file may require changes in several scripts
o Very difficult to modify, extend and support
o Tests for functionality and correctness of scripts are few and far between
o Communication between scripts is through environment variables and files

27 CESM hub-and-spoke coupling (diagram): the MCT driver/mediator sits at the hub, connected to each component and its data-model counterpart: CAM, POP, CICE, CLM, CISM, MOSART, and WW3 (each with a corresponding Data component).

28 Common Infrastructure for Modeling the Earth (CIME): freely available, open-source infrastructure code comprising the driver-coupler code, data models, the Case Control System, system/unit testing, mapping utilities, post-processing and workflow tools, and optimized math and communication utilities; supporting active component client models in CESM, ACME, and NOAA/NEMS.

29 Outline
o What is CIME?
o Case Control System (CCS)
o CCS design overview

30 The CIME Case Control System: Code Design
