Surrogate models and the optimization of systems in the absence of algebraic models


1 Surrogate models and the optimization of systems in the absence of algebraic models
Nick Sahinidis
Acknowledgments: Alison Cozad, David Miller, Zach Wilson, Atharv Bhosekar, Luis Miguel Rios, Hua Zheng

2 SIMULATION OPTIMIZATION
[Figure: pulverized coal plant; Aspen Plus simulation provided by the National Energy Technology Laboratory]

3 EXPERIMENT OPTIMIZATION
Six functional units: extraction, fermentation (twice), polymerization, extrusion, distillation
Goal: maximize PTT yield
Experimental system available in the Unit Operations Lab
Several degrees of freedom:
- Fermenter: pH, T, NaCl concentration, O2 sparging conditions
- Extruder: input concentrations, spinning conditions (discrete)

4 CHALLENGES
- No algebraic model
- Complex process alternatives (e.g., 1 reactor vs. 2 reactors vs. 3 reactors, reactor size)
- Costly simulations (seconds to minutes to hours per run)
- Scarcity of fully robust simulations
[Figure: optimizer-simulator loop; cost ($) vs. reactor size for the reactor alternatives]

5 OPTIMIZATION TECHNOLOGY
Derivative-free optimization (DFO)
- Optimizes in the absence of algebraic models; derivatives are symbolically unavailable or time-consuming to compute numerically
- Rios and Sahinidis (JOGO, 2013) review algorithms and compare 20+ codes on 500+ problems
Simulation optimization (SO)
- Addresses noise in function values
- Amaran, Sahinidis, Sharda, and Bury (AOR, 2015) review algorithms and applications
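
As a concrete illustration (not one of the codes from the comparison), the sketch below drives SciPy's Nelder-Mead implementation using only function values; run_simulation is a hypothetical stand-in for a costly black-box simulator.

```python
import numpy as np
from scipy.optimize import minimize

def run_simulation(x):
    """Hypothetical stand-in for an expensive black-box simulation
    (e.g., a flowsheet evaluation). Returns a scalar cost."""
    reactor_size, temperature = x
    return (reactor_size - 3.2) ** 2 + 0.5 * (temperature - 420.0) ** 2 / 100.0

# Derivative-free local search: no gradients are requested or estimated symbolically.
result = minimize(run_simulation, x0=np.array([1.0, 350.0]),
                  method="Nelder-Mead",
                  options={"xatol": 1e-4, "fatol": 1e-6, "maxfev": 500})
print(result.x, result.fun)
```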

6 DFO ALGORITHMS
Local search methods
- Direct local search: Nelder-Mead simplex algorithm; generalized pattern search and generating set search
- Based on surrogate models: trust-region methods; implicit filtering
Global search methods
- Deterministic global search: Lipschitzian-based partitioning; multilevel coordinate search
- Stochastic global optimization: hit-and-run; simulated annealing; genetic algorithms; particle swarm
- Based on surrogate models: response surface methods; surrogate management framework; branch-and-fit

7 DFO SOFTWARE
Local search: FMINSEARCH (Nelder-Mead), DAKOTA PATTERN (PPS), HOPSPACK (PPS), SID-PSM (simplex gradient PPS), NOMAD (MADS), DFO (trust region, quadratic model), IMFIL (implicit filtering), BOBYQA (trust region, quadratic model), NEWUOA (trust region, quadratic model)
Global search: DAKOTA SOLIS-WETS, DAKOTA DIRECT (DIRECT), TOMLAB GLBSOLVE (DIRECT), TOMLAB GLCSOLVE (DIRECT), MCS (multilevel coordinate search), TOMLAB EGO (RSM using Kriging), TOMLAB RBF (RSM using RBF), SNOBFIT (branch-and-fit), TOMLAB LGO (LGO algorithm)
Stochastic: ASA (simulated annealing), CMA-ES (evolutionary algorithm), DAKOTA EA (evolutionary algorithm), GLOBAL (clustering multistart), PSWARM (particle swarm)

8 PROBLEMS SOLVED
[Figure: comparison of problems solved by the DFO codes]

9 PROCESS DISAGGREGATION
Process simulation: disaggregate the process into process blocks (simulator and model generation for each block)
Surrogate models: build simple and accurate models with a functional form tailored for an optimization framework
Optimization model: add algebraic constraints (design specs, heat/mass balances, and logic constraints)

10 LEARNING PROBLEM
Build a model of the output variables as a function of the input variables x over a specified interval, using data from a process simulation or experiment
Independent variables: operating conditions, inlet flow properties, unit geometry
Dependent variables: efficiency, outlet flow conditions, conversions, heat flow, etc.
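
A minimal sketch of how such a training set might be assembled, assuming a hypothetical simulate function that maps operating conditions to outputs; the Latin hypercube design from scipy.stats.qmc is one convenient space-filling choice.

```python
import numpy as np
from scipy.stats import qmc

def simulate(x):
    """Hypothetical black-box simulation: inputs -> outputs (e.g., an efficiency)."""
    return np.array([np.exp(-x[0]) * x[1] ** 0.5])

# Input bounds over the specified interval (e.g., a temperature-like and a flow-like variable).
lower, upper = np.array([0.5, 1.0]), np.array([2.0, 10.0])

# Space-filling initial design, then evaluate the black box at each design point.
sampler = qmc.LatinHypercube(d=2, seed=0)
X = qmc.scale(sampler.random(n=20), lower, upper)   # n x d matrix of inputs
Z = np.vstack([simulate(x) for x in X])             # n x m matrix of outputs
```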

11 DESIRED MODEL ATTRIBUTES
We aim to build surrogate models that are
- Accurate: reflect the true nature of the simulation
- Simple: interpretable and tailored for algebraic optimization
- Generated from a minimal data set: reduce experimental and simulation requirements

12 ALAMO: Automated Learning of Algebraic Models using Optimization
Workflow: initial sampling of the black-box function, build a surrogate model, then adaptive sampling to estimate the model error; if the model has not converged, update the training data set and build a new model, otherwise stop
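
The loop below is a schematic paraphrase of this workflow, not ALAMO itself: the surrogate here is a fixed quadratic basis fit by least squares, and the adaptive step is a crude random-candidate search rather than ALAMO's MINLP-based model building and SNOBFIT-driven error maximization.

```python
import numpy as np

def build_surrogate(X, z):
    """Toy surrogate: least-squares fit over a fixed basis (1, x_j, x_j^2)."""
    B = np.column_stack([np.ones(len(X)), X, X ** 2])
    coef, *_ = np.linalg.lstsq(B, z, rcond=None)
    return lambda x: float(np.concatenate(([1.0], x, x ** 2)) @ coef)

def adaptive_sample(model, simulate, lower, upper, n_cand=50, seed=0):
    """Crude stand-in for error maximization sampling: among random candidates,
    return the point where surrogate and simulation disagree the most."""
    rng = np.random.default_rng(seed)
    cand = rng.uniform(lower, upper, size=(n_cand, len(lower)))
    errors = [abs(model(x) - simulate(x)) for x in cand]
    return cand[int(np.argmax(errors))]

def alamo_style_loop(simulate, X, lower, upper, tol=1e-3, max_iter=10):
    """Initial sampling -> build surrogate -> adaptive sampling -> repeat until converged."""
    z = np.array([simulate(x) for x in X])
    model = build_surrogate(X, z)
    for it in range(max_iter):
        x_new = adaptive_sample(model, simulate, lower, upper, seed=it)
        if abs(model(x_new) - simulate(x_new)) < tol:   # model converged -> stop
            break
        X = np.vstack([X, x_new])                       # update training data set
        z = np.append(z, simulate(x_new))
        model = build_surrogate(X, z)                   # new model
    return model
```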

13 MODEL COMPLEXITY TRADEOFF
[Plot: model accuracy vs. model complexity. Kriging [Krige, 63], neural nets [McCulloch-Pitts, 43], and radial basis functions [Buhmann, 00] are accurate but complex; a linear response surface is simple but less accurate; the preferred region balances the two.]

14 MODEL IDENTIFICATION
Goal: identify the functional form and complexity of the surrogate models
The general functional form is unknown; our method identifies models built from combinations of simple basis functions
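
A sketch of the idea, assuming a small illustrative pool of candidate basis functions (ALAMO's actual basis set and options differ):

```python
import numpy as np

def basis_matrix(X):
    """Evaluate a pool of simple candidate basis functions at the data points.
    Columns: constant, x_j, x_j^2, x_j^3, log(x_j), exp(x_j), pairwise products.
    Illustrative pool only; the log terms assume strictly positive inputs."""
    n, d = X.shape
    cols, names = [np.ones(n)], ["1"]
    for j in range(d):
        xj = X[:, j]
        for expr, name in [(xj, f"x{j}"), (xj**2, f"x{j}^2"), (xj**3, f"x{j}^3"),
                           (np.log(xj), f"log(x{j})"), (np.exp(xj), f"exp(x{j})")]:
            cols.append(expr)
            names.append(name)
    for j in range(d):
        for k in range(j + 1, d):
            cols.append(X[:, j] * X[:, k])
            names.append(f"x{j}*x{k}")
    return np.column_stack(cols), names
```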

15 OVERFITTING AND TRUE ERROR
Step 1: define a large set of potential basis functions
Step 2: model reduction
[Plot: error vs. complexity. Empirical error keeps decreasing with complexity, while true error is minimized at an ideal intermediate model; too few terms underfit, too many overfit.]

16 MODEL REDUCTION TECHNIQUES
Qualitative tradeoffs of model reduction methods:
- Stepwise regression [Efroymson, 60]: backward elimination [Oosterhof, 63], forward selection [Hamaker, 62]
- Best subset methods: enumerate all possible subsets
- Regularized regression techniques: penalize the least squares objective using the magnitude of the regressors [Tibshirani, 95]
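
For context, regularized regression is readily available off the shelf; the snippet below fits the lasso over a candidate basis matrix such as the one sketched earlier (scikit-learn and the synthetic data are assumptions of this example, not part of the talk).

```python
import numpy as np
from sklearn.linear_model import LassoCV

# B: n x K matrix of candidate basis functions, z: n response values (synthetic here).
rng = np.random.default_rng(1)
B = rng.uniform(0.5, 2.0, size=(60, 8))
z = 3.0 * B[:, 0] - 1.5 * B[:, 3] + 0.01 * rng.standard_normal(60)

lasso = LassoCV(cv=5).fit(B, z)          # L1 penalty shrinks many coefficients to zero
selected = np.flatnonzero(np.abs(lasso.coef_) > 1e-8)
print("selected basis functions:", selected)
```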

17 MODEL SIZING
Goodness-of-fit measure: some measure of error that is sensitive to overfitting (AICc, BIC, Cp)
Complexity = number of terms allowed in the model
Solve for the best one-term model, then the best two-term model, and so on; in the example, the 6th term was not worth the added complexity, so the final model includes 5 terms
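
A minimal sketch of this sizing loop, using greedy forward selection as a cheap stand-in for the exact best-subset search and BIC as the goodness-of-fit measure; the loop stops when adding a term no longer improves the criterion.

```python
import numpy as np

def bic(z, z_hat, p):
    """Bayes Information Criterion for a least-squares fit with p terms."""
    n = len(z)
    sse = float(np.sum((z - z_hat) ** 2))
    return n * np.log(sse / n) + p * np.log(n)

def size_model(B, z, max_terms=10):
    """Grow the model one term at a time (forward selection) and stop
    as soon as the best next term fails to improve BIC."""
    chosen, best_bic = [], np.inf
    for _ in range(max_terms):
        candidates = [j for j in range(B.shape[1]) if j not in chosen]
        scored = []
        for j in candidates:
            cols = chosen + [j]
            coef, *_ = np.linalg.lstsq(B[:, cols], z, rcond=None)
            scored.append((bic(z, B[:, cols] @ coef, len(cols)), j))
        new_bic, j_best = min(scored)
        if new_bic >= best_bic:          # extra term not worth the added complexity
            break
        best_bic, chosen = new_bic, chosen + [j_best]
    return chosen, best_bic
```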

18 MODEL SELECTION CRITERIA
Balance fit (sum of squared errors, SSE, over n data points) with model complexity (number of terms in the model, denoted by p):
- Corrected Akaike Information Criterion: $\mathrm{AIC_c} = n \log\left(\frac{SSE}{n}\right) + 2p + \frac{2p(p+1)}{n-p-1}$
- Mallows Cp: $C_p = \frac{SSE}{\hat{\sigma}^2} - n + 2p$
- Hannan-Quinn Information Criterion: $\mathrm{HQC} = n \log\left(\frac{SSE}{n}\right) + 2p \log(\log n)$
- Bayes Information Criterion: $\mathrm{BIC} = n \log\left(\frac{SSE}{n}\right) + p \log n$
- Mean Squared Error: $\mathrm{MSE} = \frac{SSE}{n-p}$

19 CPU TIME COMPARISON
Eight benchmarks from the UCI and CMU data sets; seventy noisy data sets were generated with multicollinearity and increasing problem size (number of bases)
[Plot: CPU time (s) vs. problem size for Cp, BIC, AIC, MSE, and HQC]
BIC solves more than two orders of magnitude faster than AIC, MSE, and HQC when optimized directly via a single mixed-integer convex quadratic model
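
One generic way to cast best-subset selection as a single mixed-integer convex quadratic program (a textbook big-M formulation sketched here for context, not necessarily the exact model used in ALAMO) is

$$
\min_{\beta,\; y} \;\; \sum_{i=1}^{n} \Big( z_i - \sum_{j=1}^{K} \beta_j X_j(x_i) \Big)^2
\quad \text{s.t.} \quad -M y_j \le \beta_j \le M y_j, \;\; \sum_{j=1}^{K} y_j \le p, \;\; y_j \in \{0,1\},
$$

where $X_j(x_i)$ is the $j$-th basis function evaluated at data point $i$; solving this model for increasing $p$ while monitoring the selection criterion reproduces the sizing procedure of slide 17.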

20 MODEL QUALITY COMPARISON
BIC leads to smaller, more accurate models because of its larger penalty for model complexity
[Plots: spurious variables included vs. problem size, and % deviation from the true model vs. problem size, for Cp, BIC, AIC, HQC, and MSE]

21 ALAMO: Automated Learning of Algebraic Models using Optimization
Same workflow as before, with the adaptive sampling step performed by error maximization sampling

22 ERROR MAXIMIZATION SAMPLING
Search the problem space for areas of model inconsistency or model mismatch: find points that maximize the error between the surrogate model and the simulation with respect to the independent variables
The subproblem is optimized using a black-box, derivative-free solver (SNOBFIT) [Huyer and Neumaier, 08]; derivative-free solvers work well in low-dimensional spaces [Rios and Sahinidis, 12]
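
A sketch of the error maximization subproblem, using multistart Nelder-Mead from SciPy as a stand-in for SNOBFIT; surrogate and simulate are placeholders in the spirit of the earlier sketches.

```python
import numpy as np
from scipy.optimize import minimize

def error_maximization_sample(surrogate, simulate, lower, upper, n_starts=5, seed=0):
    """Pick the next training point by maximizing the squared surrogate-vs-simulation
    error over the box [lower, upper] with a derivative-free local search from
    several random starts (a stand-in for the SNOBFIT solver used in ALAMO)."""
    rng = np.random.default_rng(seed)
    neg_sq_error = lambda x: -(surrogate(x) - simulate(x)) ** 2
    best_x, best_val = None, np.inf
    for _ in range(n_starts):
        x0 = rng.uniform(lower, upper)
        res = minimize(neg_sq_error, x0, method="Nelder-Mead",
                       options={"maxfev": 100, "xatol": 1e-3})
        x = np.clip(res.x, lower, upper)    # Nelder-Mead ignores bounds; clip back in
        val = neg_sq_error(x)
        if val < best_val:
            best_val, best_x = val, x
    return best_x
```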

23 COMPUTATIONAL RESULTS
Goal: compare methods on three target metrics: (1) model accuracy, (2) data efficiency, (3) model simplicity
Modeling methods compared:
- ALAMO modeler: proposed best subset methodology
- The lasso: the lasso regularization
- Ordinary regression: ordinary least squares regression
Sampling methods compared (over the same data set size):
- ALAMO sampler: proposed error maximization technique
- Single LH: single Latin hypercube (no feedback)

24 RESULTS (1): MODEL ACCURACY
[Plots: fraction of problems solved vs. normalized test error for the ALAMO modeler, the lasso, and ordinary regression, under error maximization sampling and under a single Latin hypercube]
70% of problems solved exactly; 80% of the problems had 0.5% error or less (the point (0.005, 0.80) on the curve)

25 RESULTS (2): DATA EFFICIENCY
[Plots: fraction of problems solved vs. normalized test error, comparing the ALAMO sampler against a single Latin hypercube for each modeling method (ALAMO modeler, the lasso, ordinary regression)]

26 RESULTS (3): MODEL SIMPLICITY
[Table: median excess complexity (terms beyond those required) by modeling type]
Results over a test set of 45 known functions treated as black boxes, with bases that are available to all modeling methods.

27 THEORY UTILIZATION
Combine empirical data with non-empirical information: use freely available system knowledge to strengthen the model
- Physical limits
- First-principles knowledge
- Intuition
Non-empirical restrictions can be applied to general regression problems

28 CONSTRAINED REGRESSION
Challenging due to the semi-infinite nature of the regression constraints: standard regression is easy, but enforcing restrictions on the surrogate model over the entire input domain is tough
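
In generic form (a sketch of the standard semi-infinite regression setting, not ALAMO's exact implementation), constrained regression asks for

$$
\min_{\beta} \; \sum_{i=1}^{n} \big( z_i - \hat{z}(x_i; \beta) \big)^2
\quad \text{s.t.} \quad g\big(\hat{z}(x;\beta),\, x\big) \le 0 \;\; \text{for all } x \in [x^L, x^U],
$$

where the difficulty is that the constraint must hold at every point of the domain (infinitely many constraints) even though only finitely many data points are fitted.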

29 IMPLIED PARAMETER RESTRICTIONS
Step 1: formulate the constraint in z and x space
Step 2: identify a sufficient set of β-space constraints (in the example, 1 parametric constraint maps to 4 β constraints)
Global optimization problems solved with BARON: archimedes.cheme.cmu.edu/?q=baron

30 TYPES OF RESTRICTIONS
- Response bounds, for individual responses (pressure, temperature, compositions) and multiple responses (mass and energy balances, sum-to-one constraints, state variables, physical limitations)
- Response derivatives: monotonicity, convexity, other numerical properties
- Boundary conditions: e.g., adding a no-slip condition to a velocity profile model
- Alternative domains / extrapolation zone: safe extrapolation beyond the sampled problem space

31 CARBON CAPTURE SYSTEM DESIGN
Surrogate models are built for each reactor and technology used
Discrete decisions: how many units? parallel trains? which technology for each reactor?
Continuous decisions: unit geometries; operating conditions (vessel temperature and pressure, flow rates, compositions)

32 SUPERSTRUCTURE OPTIMIZATION
Mixed-integer nonlinear programming (MINLP) model comprising:
- Economic model
- Process model: material balances, hydrodynamic/energy balances, reactor surrogate models
- Link between the economic model and the process model
- Binary variable constraints
- Bounds for variables
MINLP solved with BARON
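
Schematically (a generic statement of such a superstructure model, with symbols chosen here for illustration rather than taken from the talk), the problem has the form

$$
\min_{x,\,y} \; \mathrm{cost}(x, y)
\quad \text{s.t.} \quad
h(x) = 0 \;(\text{material/energy balances}), \;\;
z_k = \hat{f}_k(x_k;\hat{\beta}_k) \;(\text{reactor surrogate models}), \;\;
g(x, y) \le 0, \;\; A y \le b, \;\; x^L \le x \le x^U, \;\; y \in \{0,1\}^m,
$$

where the binary variables y select units, trains, and technologies, and the continuous variables x carry geometries and operating conditions.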

33 ALAMO REMARKS
Expanding the scope of algebraic optimization: low-complexity surrogate models strike a balance between optimal decision making and model fidelity
Surrogate model identification: simple, accurate model identification via MINLP
Error maximization sampling: more information found per simulated data point via DFO

34 NEW DFO ALGORITHMS
Rios and Sahinidis (2013) conclusions: deterministic solvers outperform stochastic ones, and surrogate-model-based algorithms have an edge
Solve auxiliary algebraic models to global optimality to expedite the search:
- Decide where to sample the objective function: guarantee geometry
- Construct surrogate models: higher-quality surrogates
- Solve trust-region subproblems: escape local minima; guarantee convergence
BARON is highly efficient for problems with fewer than 100 variables
Two new algorithms: Model and Search (MAS), a local search algorithm, and Branch and Model (BAM), a global search algorithm

35 MAS LOCAL SEARCH ALGORITHM
- Collect points around the current iterate (within radii r1 and r2)
- Add points if necessary: n linearly independent points; check for a positive basis
- Build and optimize a linear or quadratic interpolating model
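
A highly simplified sketch in the spirit of model-based local search (not the actual MAS algorithm): fit a linear model to points collected around the current iterate and step against its gradient within a small radius.

```python
import numpy as np

def model_step(f, x, r=0.1, seed=0):
    """One step of a toy model-based local search: sample points near x, fit a
    linear model by least squares, move against its gradient within radius r,
    and accept the move only if it improves f."""
    rng = np.random.default_rng(seed)
    d = len(x)
    P = x + r * rng.uniform(-1.0, 1.0, size=(2 * d + 1, d))   # points around the iterate
    F = np.array([f(p) for p in P])
    A = np.column_stack([np.ones(len(P)), P])                  # model: c0 + g @ x
    coef, *_ = np.linalg.lstsq(A, F, rcond=None)
    g = coef[1:]
    x_new = x - r * g / (np.linalg.norm(g) + 1e-12)            # step along -gradient of the model
    return x_new if f(x_new) < f(x) else x
```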

36 PRELIMINARY RESULTS WITH MAS
[Plot: performance of MAS on 359 test problems]

37 BAM GLOBAL SEARCH ALGORITHM
- Partition the space into a collection of hypercubes
- Eliminate potentially suboptimal hypercubes
- For the remaining hypercubes, fit surrogate models and optimize them
- Sort hypercubes by size and explore the largest; sort hypercubes by predicted optimal values and explore the best
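
A one-dimensional toy of partition-based global search, for intuition only (the real BAM algorithm fits and optimizes surrogate models on each box and applies elimination rules): repeatedly split the widest box and the box with the best sampled value.

```python
import numpy as np

def toy_partition_search(f, lo, hi, n_iter=20):
    """Keep a list of intervals with the function value at their midpoints, and
    repeatedly split (a) the widest interval and (b) the interval whose sampled
    value is lowest; return the best midpoint found."""
    boxes = [(lo, hi, f(0.5 * (lo + hi)))]
    for _ in range(n_iter):
        widest = max(range(len(boxes)), key=lambda i: boxes[i][1] - boxes[i][0])
        best = min(range(len(boxes)), key=lambda i: boxes[i][2])
        new_boxes = []
        for i, (a, b, fmid) in enumerate(boxes):
            if i in (widest, best):
                m = 0.5 * (a + b)
                new_boxes += [(a, m, f(0.5 * (a + m))), (m, b, f(0.5 * (m + b)))]
            else:
                new_boxes.append((a, b, fmid))
        boxes = new_boxes
    a, b, fmid = min(boxes, key=lambda t: t[2])
    return 0.5 * (a + b), fmid
```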

38 PRELIMINARY RESULTS WITH BAM
[Plot: performance of BAM]

39 CONCLUSIONS
ALAMO provides algebraic models that are
- Accurate and simple
- Generated from a minimal number of function evaluations
ALAMO's constrained regression facility allows modeling of
- Bounds on response variables
- Convexity/monotonicity of response variables
Extensive results with MAS and BAM will be presented at the upcoming INFORMS and AIChE meetings
ALAMO site: archimedes.cheme.cmu.edu/?q=alamo
DFO/SO site and test sets: archimedes.cheme.cmu.edu/?q=bbo
