Model Parameter Estimation


Model Parameter Estimation. Shan He, School for Computational Science, University of Birmingham. Module 06-23836: Computational Modelling with MATLAB

Outline of Topics: Concepts about model parameter estimation; Parameter estimation using the Nelder-Mead Simplex Method; Parameter estimation using the Particle Swarm Optimiser; Parameter estimation for Agent-based models; Assignments

Concepts about model parameter estimation. What is model parameter estimation? After building a model, we need to determine its parameters. Usually only a fraction of the parameters can be measured by experiments; many are hard, expensive, time-consuming, or even impossible to measure. Model parameter estimation: indirectly determining unknown parameters from measurements of experimental data. It is challenging because: experimental data are noisy and sparse, e.g., only a few time points; many models are not mathematically well defined, e.g., agent-based models.

Concepts about model parameter estimation. Methods for parameter estimation. For equation-based models, we have: Derivative approximation methods: essentially approximate derivatives by finite differences, e.g., Euler's method, and then fit the parameters by linear regression. Pros: computation is typically very fast. Cons: derivative approximations lead to inaccurate parameter estimates. Bayesian methods: essentially use Bayesian inference, e.g., Markov Chain Monte Carlo, to infer parameters from data. Pros: can handle noisy or uncertain data; can also infer the whole probability distribution of the parameters rather than a point estimate. Cons: computation is very slow because of the need to solve high-dimensional integration problems. Optimisation methods: the subject of the rest of this lecture.

Concepts about model parameter estimation. Parameter estimation as an optimisation problem. Suppose we have a system of ODEs: $\frac{dy}{dt} = f(t, y; p)$, $y \in \mathbb{R}^n$, $f \in \mathbb{R}^n$, where $p \in \mathbb{R}^m$ is the vector of parameters, and a collection of $k$ measurements of experimental data: $(t_1, y_1), (t_2, y_2), \ldots, (t_k, y_k)$. We aim to minimise the following objective function, the squared error between the model output and the experimental data: $\mathrm{obj}(p) = \sum_{j=1}^{k} \lVert y(t_j; p) - y_j \rVert^2$, where $\lVert \cdot \rVert$ denotes the standard Euclidean vector norm.
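The objective above can be sketched directly in code. A minimal Python/SciPy sketch (the lecture itself uses MATLAB): the one-parameter decay model, the measurement times, and the data values below are illustrative stand-ins, not the lecture's example.

```python
import numpy as np
from scipy.integrate import solve_ivp

def f(t, y, p):
    # Toy one-dimensional model: dy/dt = -p[0] * y
    return [-p[0] * y[0]]

# Hypothetical measurements (t_j, y_j), roughly following exp(-t)
t_data = np.array([0.0, 1.0, 2.0, 3.0])
y_data = np.array([1.0, 0.37, 0.14, 0.05])

def obj(p):
    # Solve the ODE at the measurement times and return the
    # squared error between model output and data
    sol = solve_ivp(f, (t_data[0], t_data[-1]), [y_data[0]],
                    t_eval=t_data, args=(p,))
    return np.sum((sol.y[0] - y_data) ** 2)

print(obj([1.0]))   # near zero: p = 1 matches the data's decay rate
```

An optimiser (Nelder-Mead, PSO, etc.) then simply minimises `obj` over `p`.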

Concepts about model parameter estimation. Optimisation algorithms. To solve the above optimisation problem, a variety of optimisation algorithms can be chosen: Gradient-based algorithms, e.g., Levenberg-Marquardt. Requirements: the problem must be mathematically well defined, smooth, and differentiable. Derivative-free optimisation algorithms: direct search algorithms, e.g., pattern search and the Nelder-Mead method; metaheuristic algorithms, e.g., Evolutionary Computation and the Particle Swarm Optimiser.

Concepts about model parameter estimation. Example 1: A simple model. [Figure: two observed time series, data1 and data2, plotted over t = 0 to 6.] Suppose we have constructed a simple ODE model: $\frac{dx}{dt} = ay + x + t^2 + 6t + b$, $\frac{dy}{dt} = bx + ay - (a + b)(1 - t^2)$. Task: estimate parameters $a$ and $b$ from the observations.

Parameter estimation using Nelder-Mead Simplex Method. The Nelder-Mead Simplex Method: a well-established direct search algorithm. A heuristic search method, with no guarantee of finding optimal solutions. Based on the concept of a simplex, a special polytope of N + 1 vertices in N dimensions. Derivative-free: does not use numerical or analytic gradients. Implemented in MATLAB as the function [x,fval] = fminsearch(fun,x0), where fun is the objective function to be minimised and x0 is the starting point.
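Outside MATLAB, the same method is available through SciPy. A minimal sketch of the fminsearch-style call, using the Rosenbrock function as a stand-in objective (not part of the lecture):

```python
import numpy as np
from scipy.optimize import minimize

def fun(x):
    # Rosenbrock function: minimum 0 at (1, 1)
    return (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2

x0 = np.array([-1.2, 1.0])                 # starting point, like fminsearch's x0
res = minimize(fun, x0, method='Nelder-Mead')
print(res.x, res.fun)                      # res.x close to [1, 1], res.fun close to 0
```

As the cons slide notes, restarting from a different `x0` can change which solution the simplex converges to.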

Parameter estimation using Nelder-Mead Simplex Method. Pros and cons of the Nelder-Mead Simplex Method. Pros: very fast. Cons: only suitable for low-dimensional, unimodal problems; performance is sensitive to the starting point x0.

Parameter estimation using Nelder-Mead Simplex Method. Example 2: Parameter estimation of the LV model. The fur data were collected by the Hudson Bay Company more than 100 years ago. Theoretical model for predator-prey interaction: the Lotka-Volterra (LV) model.

Parameter estimation using Nelder-Mead Simplex Method. Example: Lotka-Volterra model. $\frac{dx}{dt} = x(\alpha - \beta y)$, $\frac{dy}{dt} = -y(\gamma - \delta x)$. $\alpha$: prey population growth rate; $\beta$: prey population decline rate; $\gamma$: predator population decline rate; $\delta$: predator population growth rate.
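The LV equations above can be simulated directly. A Python sketch, assuming SciPy; the parameter values and initial populations are illustrative, not the values fitted to the Hudson Bay data:

```python
from scipy.integrate import solve_ivp

def lotka_volterra(t, state, alpha, beta, gamma, delta):
    x, y = state                          # x: prey, y: predator
    dxdt = x * (alpha - beta * y)         # prey grow, decline when eaten
    dydt = -y * (gamma - delta * x)       # predators decline, grow when fed
    return [dxdt, dydt]

# Illustrative parameters and initial populations
sol = solve_ivp(lotka_volterra, (0, 20), [10.0, 5.0],
                args=(1.0, 0.1, 1.5, 0.075), rtol=1e-8)
print(sol.y.min() > 0)   # both populations stay positive as they oscillate
```

Fitting the model means wrapping such a simulation in the squared-error objective from earlier and handing it to an optimiser.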

Parameter estimation using Nelder-Mead Simplex Method. Example: Parameter estimation of the LV model. Objective: to estimate the parameters of the LV model from the Hudson Bay Company fur data. We select the period between 1908 and 1935 and estimate the parameters by metaheuristic algorithms.

Parameter estimation using Particle Swarm Optimiser. Evolutionary Computation (EC): an umbrella term for algorithms inspired by evolutionary systems. These algorithms are known as evolutionary algorithms (EAs): Genetic Algorithms (GAs), Evolutionary Programming (EP), Evolution Strategies (ES), etc. EC also includes closely related nature-inspired approaches: the Particle Swarm Optimiser (PSO) and the Ant Colony Optimiser (ACO).

Parameter estimation using Particle Swarm Optimiser. Evolution is amazing! Examples of insect camouflage: walking leaf insect, walking stick insect, spiny rainforest katydid, sand grasshopper. http://www.environmentalgraffiti.com/featured/amazing-insect-camouflage/14128

Parameter estimation using Particle Swarm Optimiser Evolutionary Computation (EC) Two main properties: Algorithms are usually population based: maintain a set of potential solutions at any one time; Algorithms are stochastic (non-deterministic): random elements help obtain good solutions.

Parameter estimation using Particle Swarm Optimiser. General design. In order to apply EC to a problem, you need: a suitable representation of solutions to the problem; a way to evaluate solutions; a way to explore the space of solutions (variation operators); a way to select better solutions to guide the search (exploitation).

Parameter estimation using Particle Swarm Optimiser. EC terminology. Population: the set of solutions. Individual: a member of the population. Parents: reproducing individuals. Fitness function: a function summarising how close an individual (solution) is to achieving the set aims.

Parameter estimation using Particle Swarm Optimiser. Standard EC procedure:
Generate the initial population P(0) at random, and set t = 0.
repeat
  Evaluate the fitness of each individual in P(t).
  Select parents from P(t) based on their fitness.
  Obtain population P(t + 1) by applying variation operators to the parents.
  Set t = t + 1.
until the termination criterion is satisfied.
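The loop above can be sketched as a minimal evolutionary algorithm. The fitness function, truncation selection, Gaussian mutation, and all parameter values here are illustrative choices, not prescribed by the lecture:

```python
import random

def fitness(x):
    # Sphere function: minimum 0 at the origin (lower is better)
    return sum(xi ** 2 for xi in x)

def evolve(dim=2, pop_size=20, generations=100, sigma=0.3, seed=0):
    rng = random.Random(seed)
    # Generate the initial population P(0) at random
    pop = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(generations):
        # Evaluate fitness and select the better half as parents
        pop.sort(key=fitness)
        parents = pop[:pop_size // 2]
        # Variation operator: Gaussian mutation of each parent
        children = [[xi + rng.gauss(0, sigma) for xi in p] for p in parents]
        pop = parents + children          # P(t + 1)
    return min(pop, key=fitness)

best = evolve()
print(fitness(best))   # close to 0 after 100 generations
```

Keeping the parents alongside their children makes the loop elitist, so the best fitness never gets worse between generations.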

Parameter estimation using Particle Swarm Optimiser. Particle Swarm Optimiser (PSO): invented by Kennedy and Eberhart in 1995. Inspired by bird flocking and fish schooling, more precisely, by Reynolds' Boids model. Simple rules for searching for global optima. Primarily for real-valued optimisation problems. Simpler than, yet sometimes better than, GAs.

Parameter estimation using Particle Swarm Optimiser. PSO: detailed algorithm. PSO can be seen as a swarm of particles flying through the search space to find the optimal solution. The variation operator consists of only two equations: $V_i^{k+1} = \omega V_i^k + c_1 r_1 (P_i^k - X_i^k) + c_2 r_2 (P_g^k - X_i^k)$ and $X_i^{k+1} = X_i^k + V_i^{k+1}$, where $X_i^k$ and $V_i^k$ are the current position and velocity of the $i$th particle, respectively; $P_i$ is the best previous position of the $i$th particle; $P_g$ is the best position found so far by the swarm; $\omega$ is the inertia weight, typically in the range (0, 1]; $c_1$ and $c_2$ are constants, the so-called learning factors; and $r_1$ and $r_2$ are random numbers in the range (0, 1).
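The two update equations translate almost line for line into code. A minimal PSO sketch in Python; the sphere objective, swarm size, and coefficient values are typical illustrative choices, not the lecture's:

```python
import random

def sphere(x):
    # Toy objective: minimum 0 at the origin
    return sum(xi ** 2 for xi in x)

def pso(dim=2, n_particles=20, iterations=200,
        omega=0.7, c1=1.5, c2=1.5, seed=0):
    rng = random.Random(seed)
    X = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    V = [[0.0] * dim for _ in range(n_particles)]
    P = [x[:] for x in X]                  # personal bests P_i
    g = min(P, key=sphere)[:]              # global best P_g
    for _ in range(iterations):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # V_i = w*V_i + c1*r1*(P_i - X_i) + c2*r2*(P_g - X_i)
                V[i][d] = (omega * V[i][d]
                           + c1 * r1 * (P[i][d] - X[i][d])
                           + c2 * r2 * (g[d] - X[i][d]))
                # X_i = X_i + V_i
                X[i][d] += V[i][d]
            if sphere(X[i]) < sphere(P[i]):
                P[i] = X[i][:]             # update personal best
                if sphere(P[i]) < sphere(g):
                    g = P[i][:]            # update global best
    return g

best = pso()
print(sphere(best))   # close to 0
```

For parameter estimation, `sphere` would be replaced by the squared-error objective between model output and data.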

Parameter estimation using Particle Swarm Optimiser. PSO: algorithm illustration. The search direction of PSO is determined by: the autobiographical memory, which remembers the best previous position $P_i$ of each individual in the swarm; and the publicised knowledge, which is the best solution $P_g$ currently found by the population.

Parameter estimation using Particle Swarm Optimiser. PSO: practical issues. One key problem faced by EC researchers is how to choose the parameters. PSO is a plug-and-play optimisation algorithm: it has only 3 parameters, none of which is very sensitive. $\omega$ can be linearly decreased over time for better local search.

Parameter estimation using Particle Swarm Optimiser EC: further readings and MATLAB toolboxes My tutorial slides on Evolutionary Computation Evolutionary Computation Online tutorial. MATLAB Global Optimization Toolbox Genetic Algorithm Optimization Toolbox (GAOT)

Parameter estimation for Agent-based models. Estimate parameters by minimising the squared error between model output and experimental data: $\mathrm{obj}(p) = \sum_{j=1}^{k} \lVert y(t_j; p) - y_j \rVert^2$, where $y$ denotes the agent-based model output. Because of the stochastic nature of agent-based models, we need to run the model many times and compare the average output with the data: $\mathrm{obj}(p) = \sum_{j=1}^{k} \left\lVert \frac{1}{m} \sum_{i=1}^{m} y_i(t_j; p) - y_j \right\rVert^2$, where $y_i$ is the output of the $i$th run. Problem: computationally very demanding! Solution: use parallel computing or GPUs to speed up the simulation.
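The averaged objective can be sketched as follows; `noisy_model`, the data values, and the number of runs `m` are hypothetical stand-ins for an actual agent-based simulation:

```python
import random

# Hypothetical measurements (t_j, y_j)
t_data = [1, 2, 3]
y_data = [2.0, 4.0, 6.0]

def noisy_model(t, p, rng):
    # Stand-in stochastic model: output p*t plus simulation noise
    return p * t + rng.gauss(0, 0.5)

def obj(p, m=100, seed=0):
    rng = random.Random(seed)
    total = 0.0
    for t, y in zip(t_data, y_data):
        # Average the output of m independent runs at each time point
        mean_out = sum(noisy_model(t, p, rng) for _ in range(m)) / m
        total += (mean_out - y) ** 2
    return total

print(obj(2.0))   # small: the data were generated with slope 2
```

Every evaluation of `obj` costs m full simulations, which is exactly why the slide recommends parallel or GPU execution.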

Parameter estimation for Agent-based models. Future direction: automated model construction from data. Genetic Programming (GP) is a powerful tool for machine creativity: Genetic Programming has reinvented patented inventions. GP is useful for constructing models (with parameters) from experimental data: Distilling Free-Form Natural Laws from Experimental Data; Automated reverse engineering of nonlinear dynamical systems; Dialogue for Reverse Engineering Assessments and Methods (DREAM).

Assignments. Download my MATLAB code and data here, then please: 1. use the GAOT toolbox to estimate the parameters of the LV model using the Hudson Bay Company fur data from 1860 to 1880; 2. try PSO to find the starting point for the Nelder-Mead Simplex Method.