Model Parameter Estimation Shan He School for Computational Science University of Birmingham Module 06-23836: Computational Modelling with MATLAB
Outline of Topics: Concepts about model parameter estimation; Parameter estimation using the Nelder-Mead Simplex Method; Parameter estimation using the Particle Swarm Optimiser; Parameter estimation for agent-based models; Assignments
Concepts about model parameter estimation What is model parameter estimation? After building a model, we need to determine its parameters. Usually only a fraction of the parameters can be measured by experiments; many are hard, expensive, time-consuming, or even impossible to measure. Model parameter estimation: indirectly determining unknown parameters from measurements of experimental data. It is challenging because: experimental data are noisy and sparse, e.g., only a few time points; many models are not mathematically well defined, e.g., agent-based models.
Concepts about model parameter estimation Methods for parameter estimation For equation-based models, we have: Derivative approximation methods: essentially approximate derivatives by finite differences, e.g., Euler's method, and then fit the parameters by linear regression. Pros: computation is typically very fast. Cons: derivative approximations lead to inaccurate parameters. Bayesian methods: essentially use Bayesian inference, e.g., Markov Chain Monte Carlo, to infer parameters from data. Pros: can handle noisy or uncertain data; can also infer the whole probability distribution of each parameter rather than a point estimate. Cons: computation is very slow because of the need to solve high-dimensional integration problems. Optimisation methods.
Concepts about model parameter estimation Parameter estimation as an optimisation problem Suppose we have a system of ODEs: $\frac{dy}{dt} = f(t, y; p)$, $y \in \mathbb{R}^n$, $f \in \mathbb{R}^n$, where $p \in \mathbb{R}^m$ is the vector of parameters, and a collection of $k$ measurements of experimental data: $(t_1, y_1), (t_2, y_2), \ldots, (t_k, y_k)$. We aim to minimise the following objective function, the squared error between the model output and the experimental data: $\mathrm{obj}(p) = \sum_{j=1}^{k} \lVert y(t_j; p) - y_j \rVert^2$, where $\lVert \cdot \rVert$ denotes the standard Euclidean vector norm.
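The objective function above can be sketched in MATLAB as follows; `objFun`, `odeFun`, `tData`, `yData` and `y0` are hypothetical names, and the sketch assumes the model is integrated with `ode45`:

```matlab
% Sum-of-squared-errors objective for ODE parameter estimation (sketch).
% p      - candidate parameter vector
% odeFun - model right-hand side f(t, y, p)
% tData  - k x 1 vector of measurement times (k >= 2)
% yData  - k x n matrix of measurements
% y0     - initial condition
function err = objFun(p, odeFun, tData, yData, y0)
    % Integrate the model at the measurement times with candidate p
    [~, yModel] = ode45(@(t, y) odeFun(t, y, p), tData, y0);
    % Sum of squared Euclidean distances between model output and data
    err = sum(sum((yModel - yData).^2));
end
```

The resulting handle can then be passed to any optimiser, e.g. `fminsearch(@(p) objFun(p, odeFun, tData, yData, y0), p0)`.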
Concepts about model parameter estimation Optimisation algorithms To solve the above optimisation problem, a variety of optimisation algorithms can be chosen: Gradient-based, e.g., Levenberg-Marquardt. Requirements: the problem must be mathematically well defined, smooth and differentiable. Derivative-free optimisation algorithms: Direct search algorithms, e.g., pattern search, the Nelder-Mead method; Metaheuristic algorithms, e.g., Evolutionary Computation and the Particle Swarm Optimiser.
Concepts about model parameter estimation Example 1: A simple model Observation: [figure: two observed time series, data1 and data2, plotted over t = 0 to 6] Suppose we have constructed a simple ODE model: $\frac{dx}{dt} = ay + x + t^2 + 6t + b$, $\frac{dy}{dt} = bx + ay - (a + b)(1 - t^2)$. Task: Estimate the parameters a and b from the observation.
Parameter estimation using Nelder-Mead Simplex Method Nelder-Mead Simplex Method A well-established direct search algorithm. A heuristic search method: no guarantee of finding optimal solutions. Based on the concept of a simplex, a special polytope with N + 1 vertices in N dimensions. Derivative-free: does not use numerical or analytic gradients. Implemented in MATLAB as the function [x,fval] = fminsearch(fun,x0), where fun is the objective function to be minimised and x0 is the starting point.
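A minimal usage example, minimising the classic Rosenbrock "banana" function (a standard test function, not one of the models in these slides):

```matlab
% Minimise the 2-D Rosenbrock function with Nelder-Mead
rosen = @(x) 100*(x(2) - x(1)^2)^2 + (1 - x(1))^2;
x0 = [-1.2, 1];                     % starting point; the result depends on it
[x, fval] = fminsearch(rosen, x0);  % x approaches [1, 1], fval approaches 0
```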
Parameter estimation using Nelder-Mead Simplex Method Pros and Cons of Nelder-Mead Simplex Method Pros: very fast. Cons: only suitable for low-dimensional, unimodal problems; performance is sensitive to the starting point x0.
Parameter estimation using Nelder-Mead Simplex Method Example 2: Parameter estimation of the LV model The fur data were collected by the Hudson Bay Company more than 100 years ago. Theoretical model for predator-prey interaction: the Lotka-Volterra (LV) model.
Parameter estimation using Nelder-Mead Simplex Method Example: Lotka-Volterra model $\frac{dx}{dt} = x(\alpha - \beta y)$, $\frac{dy}{dt} = -y(\gamma - \delta x)$. α: prey population growth rate; β: prey population decline rate; γ: predator population decline rate; δ: predator population growth rate.
Parameter estimation using Nelder-Mead Simplex Method Example: Parameter estimation of the LV model Objective: to estimate the parameters of the LV model from the Hudson Bay Company fur data. We select the period 1908-1935. We estimate the parameters using metaheuristic algorithms.
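A sketch of the whole fitting workflow with fminsearch, using synthetic data in place of the fur records; all names, parameter values and the starting point are illustrative, and local functions in scripts require R2016b or later:

```matlab
% lv_fit.m -- LV parameter-fitting sketch; synthetic data stands in
% for the real fur records
pTrue = [0.6 0.025 0.8 0.025];                 % alpha beta gamma delta
tData = (0:20)'; z0 = [30; 4];                 % yearly samples, initial populations
[~, zData] = ode45(@(t,z) lvRhs(t,z,pTrue), tData, z0);  % synthetic "measurements"

obj = @(p) lvError(p, tData, z0, zData);       % sum-of-squares objective
p0  = [0.5 0.02 0.7 0.02];                     % starting point (a rough guess)
[pEst, errBest] = fminsearch(obj, p0);

function dz = lvRhs(~, z, p)
    dz = [ z(1)*(p(1) - p(2)*z(2));            % dx/dt = x(alpha - beta*y)
          -z(2)*(p(3) - p(4)*z(1)) ];          % dy/dt = -y(gamma - delta*x)
end

function err = lvError(p, tData, z0, zData)
    [~, zModel] = ode45(@(t,z) lvRhs(t,z,p), tData, z0);
    err = sum(sum((zModel - zData).^2));
end
```

With real data, `zData` would instead hold the hare and lynx counts for the chosen years.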
Parameter estimation using Particle Swarm Optimiser Evolutionary Computation (EC) An umbrella term for algorithms inspired by evolutionary systems. Such algorithms are known as evolutionary algorithms (EAs): Genetic Algorithms (GAs), Evolutionary Programming (EP) and Evolution Strategy (ES), etc. Also includes closely related nature-inspired approaches: the Particle Swarm Optimiser (PSO) and the Ant Colony Optimiser (ACO).
Parameter estimation using Particle Swarm Optimiser Evolution is amazing! Walking leaf insect Walking Stick insect Spiny Rainforest Katydid Sand grasshopper http://www.environmentalgraffiti.com/featured/amazing-insect-camouflage/14128
Parameter estimation using Particle Swarm Optimiser Evolutionary Computation (EC) Two main properties: Algorithms are usually population based: maintain a set of potential solutions at any one time; Algorithms are stochastic (non-deterministic): random elements help obtain good solutions.
Parameter estimation using Particle Swarm Optimiser General design In order to apply an EC algorithm to a problem, you need: A suitable representation of solutions to the problem; A way to evaluate solutions; A way to explore the space of solutions (variation operators); A way to select better solutions to guide the search (exploitation).
Parameter estimation using Particle Swarm Optimiser EC terminology Population: the set of solutions; Individual: a member of the population; Parents: reproducing individuals; Fitness function: a function summarising how close an individual (solution) is to achieving the set aims.
Parameter estimation using Particle Swarm Optimiser Standard EC procedure Generate the initial population P(0) at random, and set t = 0. repeat: Evaluate the fitness of each individual in P(t). Select parents from P(t) based on their fitness. Obtain population P(t + 1) by applying variation operators to the parents. Set t = t + 1. until the termination criterion is satisfied.
Parameter estimation using Particle Swarm Optimiser Particle Swarm Optimiser (PSO) Invented by Kennedy and Eberhart in 1995. Inspired by bird flocking and fish schooling, more precisely, the Boids model. Simple rules for searching for global optima. Primarily for real-valued optimisation problems. Simpler than, but sometimes better than, GAs.
Parameter estimation using Particle Swarm Optimiser PSO: detailed algorithm Can be seen as a swarm of particles flying through the search space to find the optimal solution. The variation operator consists of only two equations: $V_i^{k+1} = \omega V_i^k + c_1 r_1 (P_i^k - X_i^k) + c_2 r_2 (P_g^k - X_i^k)$, $X_i^{k+1} = X_i^k + V_i^{k+1}$, where $X_i^k$ and $V_i^k$ are the current position and velocity of the $i$th particle, respectively; $P_i$ is the best previous position of the $i$th particle; $P_g$ is the global best position of the swarm; $\omega$ is the inertia weight, typically in the range $(0, 1]$; $c_1$ and $c_2$ are constants, the so-called learning factors; $r_1$ and $r_2$ are random numbers in the range $(0, 1)$.
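The two update equations can be implemented directly; below is a minimal PSO sketch. The population size, iteration count, and the values ω = 0.7, c1 = c2 = 1.5 are typical choices, not prescribed by the slides (implicit expansion in the velocity update requires R2016b+):

```matlab
% Minimal PSO sketch for minimising fun over [lb, ub]^dim
function [gBest, gBestVal] = psoSketch(fun, dim, lb, ub)
    nP = 30; maxIter = 200;                      % swarm size, iterations
    w = 0.7; c1 = 1.5; c2 = 1.5;                 % inertia weight, learning factors
    X = lb + (ub - lb).*rand(nP, dim);           % random initial positions
    V = zeros(nP, dim);                          % initial velocities
    P = X;                                       % personal best positions
    Pval = arrayfun(@(i) fun(X(i,:)), (1:nP)');  % personal best values
    [gBestVal, g] = min(Pval); gBest = P(g,:);   % global best
    for k = 1:maxIter
        r1 = rand(nP, dim); r2 = rand(nP, dim);  % fresh random numbers
        V = w*V + c1*r1.*(P - X) + c2*r2.*(gBest - X);  % velocity update
        X = X + V;                               % position update
        for i = 1:nP                             % update personal bests
            fi = fun(X(i,:));
            if fi < Pval(i), Pval(i) = fi; P(i,:) = X(i,:); end
        end
        [gBestVal, g] = min(Pval); gBest = P(g,:);
    end
end
```

For parameter estimation, `fun` would be the least-squares objective over the model parameters.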
Parameter estimation using Particle Swarm Optimiser PSO: algorithm illustration The search direction of PSO is determined by: The autobiographical memory, which remembers the best previous position of each individual P i in the swarm The publicized knowledge, which is the best solution P g currently found by the population
Parameter estimation using Particle Swarm Optimiser PSO: practical issues One key problem faced by EC researchers is how to choose the parameters. PSO: a plug-and-play optimisation algorithm. Only 3 parameters, none of which is very sensitive. ω can be linearly decreased over time for better local search.
Parameter estimation using Particle Swarm Optimiser EC: further readings and MATLAB toolboxes My tutorial slides on Evolutionary Computation Evolutionary Computation Online tutorial. MATLAB Global Optimization Toolbox Genetic Algorithm Optimization Toolbox (GAOT)
Parameter estimation for Agent-based models Parameter estimation for Agent-based models Estimating parameters by minimising the squared error between model output and experimental data: $\mathrm{obj}(p) = \sum_{j=1}^{k} \lVert y(t_j; p) - y_j \rVert^2$, where $y$ denotes the agent-based model output. Because of the stochastic nature of agent-based models, we need to run the model many times and average the output: $\mathrm{obj}(p) = \sum_{j=1}^{k} \left\lVert \frac{1}{m} \sum_{i=1}^{m} y_i(t_j; p) - y_j \right\rVert^2$. Problem: computationally very demanding! Solution: parallel computing or GPUs to speed up simulation.
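The averaged objective can be sketched as below; `runABM` is a hypothetical function that simulates the agent-based model once with parameters p and returns its output at the measurement times:

```matlab
% Averaged sum-of-squares objective for a stochastic agent-based model
% p      - candidate parameter vector
% runABM - handle: one stochastic simulation, returns k x n output
% tData  - measurement times, yData - k x n measurements
% m      - number of repeated runs to average over
function err = abmObj(p, runABM, tData, yData, m)
    yMean = zeros(size(yData));
    for i = 1:m                          % m independent stochastic runs
        yMean = yMean + runABM(p, tData);
    end
    yMean = yMean / m;                   % average model output
    err = sum(sum((yMean - yData).^2));  % squared distance to the data
end
```

With Parallel Computing Toolbox available, the loop could be a `parfor` to address the computational cost noted above.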
Parameter estimation for Agent-based models Future direction: automated model construction from data Genetic Programming (GP) is a powerful tool for machine creativity: Genetic Programming for reinventing patented inventions. GP is useful for constructing models (with parameters) from experimental data: Distilling Free-Form Natural Laws from Experimental Data; Automated reverse engineering of nonlinear dynamical systems; Dialogue for Reverse Engineering Assessments and Methods (DREAM).
Assignments Assignments Download my MATLAB code and data here, please: 1. use the GAOT toolbox to estimate the parameters of the LV model using the Hudson Bay Company fur data from year 1860 to 1880; 2. try PSO to find the starting point for the Nelder-Mead Simplex Method.