The Cross-Entropy Method for Mathematical Programming


Dirk P. Kroese, Department of Mathematics, The University of Queensland, Australia
Reuven Y. Rubinstein, Faculty of Industrial Engineering and Management, Technion, Israel

Contents

1. Introduction
2. CE Methodology
3. Applications
4. Some Theory on CE
5. Conclusion + Discussion

CE Matters

Book: R.Y. Rubinstein and D.P. Kroese, The Cross-Entropy Method: A Unified Approach to Combinatorial Optimization, Monte Carlo Simulation and Machine Learning, Springer-Verlag, New York, 2004.

Special Issue: Annals of Operations Research (Jan 2005).

The CE home page:

Introduction

The Cross-Entropy Method was originally developed as a simulation method for the estimation of rare-event probabilities:

Estimate $P(S(X) \geq \gamma)$,

where $X$ is a random vector/process taking values in some set $\mathcal{X}$, and $S$ is a real-valued function on $\mathcal{X}$.

It was soon realised that the CE method could also be used as an optimization method:

Determine $\max_{x \in \mathcal{X}} S(x)$.
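As a minimal illustration of this rare-event setting (a toy sketch, not from the slides), the crude Monte Carlo estimator of $\ell = P(S(X) \geq \gamma)$ is simply the fraction of sampled states whose performance reaches $\gamma$. Here we assume, purely for illustration, $X \sim N(0, I_5)$ and $S(x) = \sum_j x_j$:

% Crude Monte Carlo estimate of ell = P(S(X) >= gamma).
% Toy assumptions: X ~ N(0, I_5), S(x) = sum_j x_j, gamma = 8.
N = 1e6; gamma_ = 8;            % gamma_ avoids shadowing MATLAB's gamma()
X = randn(N, 5);                % N realisations of the random vector
SX = sum(X, 2);                 % performances S(X_i)
ell = mean(SX >= gamma_);       % fraction of hits
fprintf('ell = %g\n', ell)

For genuinely rare events the hit count collapses to zero, which is what motivates the importance-sampling view of CE discussed later.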

Some Applications

Combinatorial Optimization (e.g., Travelling Salesman, Maximal Cut and Quadratic Assignment Problems)
Noisy Optimization (e.g., Buffer Allocation, Financial Engineering)
Multi-Extremal Continuous Optimization
Pattern Recognition, Clustering and Image Analysis
Production Lines and Project Management
Network Reliability Estimation
Vehicle Routing and Scheduling
DNA Sequence Alignment

A Multi-extremal Function (figure)

A Maze Problem

The Optimal Trajectory (figure)

Iterations 1-4 (figures)

The Optimisation Setting

Many real-world optimisation problems can be formulated mathematically as either (or both) of the following:

1. Locate an element $x^*$ in a set $\mathcal{X}$ such that $S(x^*) \geq S(x)$ for all $x \in \mathcal{X}$, where $S$ is an objective function or performance measure defined on $\mathcal{X}$.
2. Find $\gamma^* = S(x^*)$, the globally maximal value of the function.

$\mathcal{X}$ discrete: combinatorial optimisation problem.
$\mathcal{X}$ continuous: continuous optimisation problem.

A New Approach: CE

Instead of locating optimal solutions to a particular problem directly, the CE method aims to locate an optimal sampling distribution. The sampling distribution is optimal if only optimal solutions can be sampled from it.

General CE Procedure

First, specify the state space $\mathcal{X}$ (whether the states are coordinates, permutations, or some other mathematical entities) over which the problem is defined, along with a performance measure $S$ on this space. Second, formulate a parameterized sampling distribution with which to generate objects $X \in \mathcal{X}$. Then iterate the following steps:

1. Generate a random sample of states $X_1, \ldots, X_N \in \mathcal{X}$.
2. Update the parameters of the random mechanism (obtained via CE minimization), in order to produce a better sample in the next iteration.

General CE Algorithm

1. Initialise the parameters of the algorithm.
2. Generate a random sample from the sampling distribution.
3. Update the parameters of the sampling distribution on the basis of the best-scoring samples, the so-called elite samples. This step involves the cross-entropy distance.
4. Repeat from Step 2 until some stopping criterion is met.

Degenerate Sampling Distribution

Suppose that there is a unique solution $x^*$ to the problem. Then the degenerate distribution assigns all of the probability mass (in the discrete case) or density (in the continuous case) to this point in $\mathcal{X}$. It is exactly this degenerate distribution from which we ultimately wish to sample, in order to obtain an optimal solution.

Detailed Generic CE Algorithm

1. Choose an initial parameter vector $v_0$. Set $t = 1$.
2. Generate $X_1, X_2, \ldots, X_N$ from the density $f(\cdot\,; v_{t-1})$.
3. Compute the sample $(1-\rho)$-quantile of the performances, $\hat{\gamma}_t = S_{(\lceil (1-\rho)N \rceil)}$, locate the elite samples $\mathcal{E} = \{X_i : S(X_i) \geq \hat{\gamma}_t\}$, and determine
$$\tilde{v}_t = \operatorname*{argmax}_{v} \sum_{X_i \in \mathcal{E}} \ln f(X_i; v).$$
4. Set the current estimate for the optimal parameter vector to the smoothed update $v_t = \alpha \tilde{v}_t + (1-\alpha) v_{t-1}$.
5. If the stopping criterion is met, stop; otherwise set $t = t + 1$ and return to Step 2.
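A minimal MATLAB sketch of this generic loop, assuming a normal sampling density $N(\mu, \sigma^2)$ on $\mathbb{R}$ and (for concreteness) the toy objective used later in these slides; the smoothing parameter $\alpha$, iteration cap and stopping rule are illustrative choices, not prescriptions:

S = @(x) exp(-(x-2).^2) + 0.8*exp(-(x+2).^2);  % objective from a later slide
N = 100; rho = 0.1; alpha = 0.7;               % sample size, elite fraction, smoothing
mu = -10; sigma = 10;                          % initial parameter vector v_0
for t = 1:100
    X = mu + sigma*randn(N,1);                 % Step 2: sample from f(.; v_{t-1})
    [~, idx] = sort(S(X), 'descend');
    E = X(idx(1:ceil(rho*N)));                 % Step 3: elite samples
    mu = alpha*mean(E) + (1-alpha)*mu;         % Step 4: smoothed MLE update of mu...
    sigma = alpha*std(E) + (1-alpha)*sigma;    % ...and of sigma
    if sigma < 1e-3, break; end                % Step 5: stop once f degenerates
end
fprintf('t = %d, mu = %g, S(mu) = %g\n', t, mu, S(mu))

For the normal family the argmax in Step 3 is available in closed form: the MLEs are the sample mean and standard deviation of the elites, which is exactly what the two update lines compute.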

Maximum Likelihood

There is a close connection between the cross-entropy method and maximum likelihood estimation, which implies a very simple way of updating the parameter vector: take the maximum likelihood estimate of the parameter vector based on the elite samples. In this way the sampling distribution moves closer at each iteration to the degenerate distribution, which is the optimal sampling distribution for the problem.

Example: The Max-Cut Problem

Consider a weighted graph $G$ with node set $V = \{1, \ldots, n\}$. Partition the nodes of the graph into two subsets $V_1$ and $V_2$ such that the sum of the weights of the edges going from one subset to the other is maximised.

Example:

Cost matrix (assumed symmetric):
$$C = \begin{pmatrix}
0 & c_{12} & c_{13} & 0 & 0 & 0\\
c_{21} & 0 & c_{23} & c_{24} & 0 & 0\\
c_{31} & c_{32} & 0 & c_{34} & c_{35} & 0\\
0 & c_{42} & c_{43} & 0 & c_{45} & c_{46}\\
0 & 0 & c_{53} & c_{54} & 0 & c_{56}\\
0 & 0 & 0 & c_{64} & c_{65} & 0
\end{pmatrix}$$

$\{V_1, V_2\} = \{\{1, 3, 4\}, \{2, 5, 6\}\}$ is a possible cut. The cost of this cut is $c_{12} + c_{32} + c_{35} + c_{42} + c_{45} + c_{46}$.
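For concreteness (not on the slide): with a symmetric cost matrix $C$ and a 0/1 cut vector $x$, the cut cost is $\sum_{i \in V_1, j \in V_2} c_{ij} = x^\top C (1 - x)$, which makes the performance function a one-liner. A small numeric sketch with arbitrary illustrative weights:

% Cut cost S(x) = x' * C * (1-x), where x_i = 1 iff node i is in the
% same part of the partition as node 1. C is an arbitrary symmetric example.
n = 6;
A = triu(rand(n), 1);            % illustrative upper-triangular weights
C = A + A';                      % symmetric cost matrix
x = [1 0 1 1 0 0]';              % the cut {{1,3,4},{2,5,6}} from the slide
Sx = x' * C * (1 - x);           % total weight of edges crossing the cut
fprintf('cut cost = %g\n', Sx)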

Random Cut Vector

We can represent a cut via a cut vector $x = (x_1, \ldots, x_n)$, where $x_i = 1$ if node $i$ belongs to the same partition as node 1, and $x_i = 0$ otherwise.

For example, the cut $\{\{1, 3, 4\}, \{2, 5, 6\}\}$ can be represented via the cut vector $(1, 0, 1, 1, 0, 0)$.

Let $\mathcal{X}$ be the set of all cut vectors $x = (1, x_2, \ldots, x_n)$, and let $S(x)$ be the corresponding cost of the cut. We wish to maximise $S(x)$ via the CE method.

Generation and Updating Formulas

Generation of cut vectors: the most natural and easiest way to generate the cut vectors $X_i = (1, X_{i2}, \ldots, X_{in})$ is to let $X_{i2}, \ldots, X_{in}$ be independent Bernoulli random variables with success probabilities $p_2, \ldots, p_n$.

Updating formulas: from the CE minimization, the updated probabilities are the maximum likelihood estimates based on the $|\mathcal{E}| = \rho N$ elite samples:
$$\hat{p}_{t,j} = \frac{\sum_{X_i \in \mathcal{E}} X_{ij}}{|\mathcal{E}|}, \qquad j = 2, \ldots, n.$$

Algorithm

1. Start with $\hat{p}_0 = (1, 1/2, \ldots, 1/2)$. Set $t := 1$.
2. Generate: draw $X_1, \ldots, X_N$ from $\mathrm{Ber}(\hat{p}_{t-1})$. Let $\hat{\gamma}_t$ be the worst of the elite performances.
3. Update $\hat{p}_t$: use the same elite sample $\mathcal{E}$ to calculate
$$\hat{p}_{t,j} = \frac{\sum_{X_i \in \mathcal{E}} X_{ij}}{|\mathcal{E}|}, \qquad j = 2, \ldots, n,$$
where $X_i = (1, X_{i2}, \ldots, X_{in})$.
4. If the stopping criterion is met, stop; otherwise set $t := t + 1$ and reiterate from Step 2.
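A compact MATLAB sketch of this algorithm on a random instance; the instance, the iteration cap and the degeneracy-based stopping rule are illustrative assumptions:

n = 50; N = 1000; rho = 0.1;                   % illustrative sizes
A = triu(rand(n), 1); C = A + A';              % random symmetric cost matrix
p = [1, 0.5*ones(1, n-1)];                     % Step 1: p_0 = (1, 1/2, ..., 1/2)
for t = 1:1000
    X = double(rand(N, n) <= repmat(p, N, 1)); % Step 2: X_i ~ Ber(p_{t-1})
    SX = sum((X * C) .* (1 - X), 2);           % cut costs S(X_i) = X_i C (1 - X_i)'
    [~, idx] = sort(SX, 'descend');
    E = X(idx(1:ceil(rho*N)), :);              % elite samples
    p = mean(E, 1);                            % Step 3: MLE update (p_1 stays 1)
    if all(p.*(1-p) < 1e-4), break; end        % Step 4: stop when p has degenerated
end
fprintf('best cut cost found = %g\n', max(SX))

The stopping test checks that every $\hat{p}_{t,j}$ is close to 0 or 1, i.e. that the Bernoulli sampling distribution has (nearly) degenerated onto a single cut vector.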

Example

Results for a synthetic case with $n = 400$ nodes ($m = 200$ of which lie in one part of the optimal partition) are given next.

Parameters: $\rho = 0.1$, $N = 1000$ (thus $|\mathcal{E}| = 100$).

The CPU time was only 100 seconds (Matlab, Pentium III, 500 MHz).

The CE algorithm converges quickly, yielding the exact optimal solution in 22 iterations.

Max-Cut

(Sequence of figures, one per iteration; the graphics were not preserved in the transcription.)

Example: Continuous Optimization

$$S(x) = e^{-(x-2)^2} + 0.8\, e^{-(x+2)^2}$$

(Figure: plot of $S(x)$, with a global maximum near $x = 2$ and a local maximum near $x = -2$.)

Matlab Program

S = @(x) exp(-(x-2).^2) + 0.8*exp(-(x+2).^2);
mu = -10; sigma = 10; rho = 0.1; N = 100; epsilon = 1E-3;
t = 0;                                   % iteration counter
while sigma > epsilon
    t = t + 1;
    x = mu + sigma*randn(N,1);           % sample from N(mu, sigma^2)
    SX = S(x);                           % compute the performances
    sortSX = sortrows([x SX], 2);        % sort by performance (ascending)
    elite = sortSX(ceil((1-rho)*N):N, 1);  % the best rho*N samples
    mu = mean(elite);                    % update mu and sigma as the MLEs
    sigma = std(elite);                  % (mean and std) of the elite samples
    fprintf('%g %6.9f %6.9f %6.9f\n', t, S(mu), mu, sigma)
end

Numerical Result

(Table of per-iteration values of $t$, $S(\mu_t)$, $\mu_t$ and $\sigma_t$ over iterations 1-4; the numerical entries were not preserved in the transcription.)

Example: Constrained Optimization

The nonlinear programming problem is to find $\mathbf{x}$ that minimizes the objective function
$$S(\mathbf{x}) = \sum_{j=1}^{10} x_j \left( c_j + \ln \frac{x_j}{x_1 + \cdots + x_{10}} \right)$$
(the $c_j$ are constants), subject to the following set of constraints:
$$x_1 + 2x_2 + 2x_3 + x_6 + x_{10} - 2 = 0,$$
$$x_4 + 2x_5 + x_6 + x_7 - 1 = 0,$$
$$x_3 + x_7 + x_8 + 2x_9 + x_{10} - 1 = 0,$$
$$x_j \geq 0.000001, \qquad j = 1, \ldots, 10.$$

The best known solution given in Hock & Schittkowski (Test Examples for Nonlinear Programming Codes) was a point $x^*$ with a certain value $S(x^*)$. Using genetic algorithms, Michalewicz (Genetic Algorithms + Data Structures = Evolution Programs) found a better solution, and using the CE method we can find an even better solution, in less time (using $\varepsilon = 10^{-4}$). (The numerical values of the solutions did not survive the transcription.)

How was it done?

The first step is to reduce the search space by expressing $x_1$, $x_4$ and $x_8$ in terms of the other seven variables:
$$x_1 = 2 - (2x_2 + 2x_3 + x_6 + x_{10}),$$
$$x_4 = 1 - (2x_5 + x_6 + x_7),$$
$$x_8 = 1 - (x_3 + x_7 + 2x_9 + x_{10}).$$
Hence we have reduced the original problem of ten variables to one of seven variables, which are subject to the constraints $x_2, x_3, x_5, x_6, x_7, x_9, x_{10} \geq 0.000001$, and $2 - (2x_2 + 2x_3 + x_6 + x_{10}) \geq 0.000001$, etc.

Next, choose an appropriate rectangular trust region $R$ that contains the optimal solution, and draw the samples from a truncated normal distribution (with independent components) on this region $R$. Reject a sample if the constraints are not satisfied (the acceptance-rejection method). The updating formulas remain the same as for untruncated normal sampling: update via the mean and variance of the elite samples. An alternative is to add a penalty term to $S$.
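A MATLAB sketch of this scheme, under stated assumptions: the constants $c_j$ are the values commonly reproduced in the literature for this Hock & Schittkowski test problem (treat them as an assumption of this sketch), the trust region $R = [0.000001, 1]^7$, the initial parameters and the stopping threshold are illustrative, and the truncated-normal sampling is implemented by plain acceptance-rejection:

c = [-6.089 -17.164 -34.054 -5.914 -24.721 ...
     -14.986 -24.100 -10.708 -26.662 -22.179]; % assumed values of the constants c_j
lb = 0.000001; ub = 1;                         % assumed rectangular trust region R
N = 1000; rho = 0.1; epsilon = 1e-4;
mu = 0.2*ones(1,7); sigma = 0.3*ones(1,7);     % illustrative initial parameters

% Y holds the seven free variables (x2,x3,x5,x6,x7,x9,x10);
% x1, x4, x8 are recovered from the equality constraints.
full_x = @(Y) [2 - (2*Y(:,1) + 2*Y(:,2) + Y(:,4) + Y(:,7)), Y(:,1:2), ...
               1 - (2*Y(:,3) + Y(:,4) + Y(:,5)), Y(:,3:5), ...
               1 - (Y(:,2) + Y(:,5) + 2*Y(:,6) + Y(:,7)), Y(:,6:7)];

while max(sigma) > epsilon
    Y = zeros(0,7);
    while size(Y,1) < N                        % truncated-normal sampling on R by
        Yc = repmat(mu,N,1) + repmat(sigma,N,1).*randn(N,7);  % acceptance-rejection
        ok = all(Yc >= lb & Yc <= ub, 2) & all(full_x(Yc) >= lb, 2);
        Y = [Y; Yc(ok,:)];                     %#ok<AGROW>
    end
    Y = Y(1:N,:); X = full_x(Y);
    SX = sum(X .* (repmat(c,N,1) + log(X ./ repmat(sum(X,2),1,10))), 2);
    [~, idx] = sort(SX, 'ascend');             % minimization: smallest S is best
    E = Y(idx(1:ceil(rho*N)), :);              % elite samples
    mu = mean(E,1); sigma = std(E,0,1);        % same updates as the untruncated case
end
fprintf('approximate minimum S = %g\n', min(SX))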

Cross-Entropy: Some Theory

Let $X_1, \ldots, X_N$ be a random sample from $f(\cdot\,; v_{t-1})$, and let $\gamma_t$ be the $(1-\rho)$-quantile of the performance:
$$\rho = P_{v_{t-1}}(S(X) \geq \gamma_t), \qquad X \sim f(\cdot\,; v_{t-1}).$$
Consider now the rare-event estimation problem in which we estimate $\rho$ via importance sampling:
$$\hat{\rho} = \frac{1}{N} \sum_{i=1}^{N} I_{\{S(X_i) \geq \gamma_t\}} \, \frac{f(X_i; v_{t-1})}{g(X_i)}, \qquad X_1, \ldots, X_N \sim g(\cdot).$$
A zero-variance estimator is obtained with
$$g^*(x) := \frac{I_{\{S(x) \geq \gamma_t\}} \, f(x; v_{t-1})}{\rho}.$$
Problem: $g^*$ depends on the unknown $\rho$.
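As a small self-contained illustration of the estimator above (a toy example, not from the slides): estimate $\rho = P(X \geq \gamma)$ for $X \sim \mathrm{Exp}(1)$, sampling from the exponential density $g$ with mean $\gamma$, so that the rare region is hit often and the likelihood ratio does the correcting:

% Importance-sampling estimate of rho = P(X >= gamma) for X ~ Exp(1),
% using g = exponential with mean gamma (an illustrative tilting choice).
gamma_ = 20; N = 1e5;                % gamma_ avoids shadowing gamma()
X = -gamma_ * log(rand(N,1));        % draws from g
W = exp(-X) ./ ((1/gamma_) * exp(-X/gamma_));   % likelihood ratios f(X)/g(X)
rho = mean((X >= gamma_) .* W);      % the IS estimator, with f in place of f(.;v)
fprintf('rho = %g (exact %g)\n', rho, exp(-gamma_))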

Idea: update $v_t$ such that the distance between the densities $g^*$ and $f(\cdot\,; v_t)$ is minimal. The Kullback-Leibler or cross-entropy distance is defined as:
$$D(g, h) = E_g \log \frac{g(X)}{h(X)} = \int g(x) \log g(x)\, dx - \int g(x) \log h(x)\, dx.$$
Determine the optimal $v_t$ from $\min_v D(g^*, f(\cdot\,; v))$. Since the first term of $D$ does not depend on $v$, this is equivalent to solving
$$\max_v \; E_{v_{t-1}} I_{\{S(X) \geq \gamma_t\}} \log f(X; v).$$

We may estimate the optimal solution $v_t$ by solving the following stochastic counterpart:
$$\max_v \frac{1}{N} \sum_{i=1}^{N} I_{\{S(X_i) \geq \gamma_t\}} \log f(X_i; v),$$
where $X_1, \ldots, X_N$ is a random sample from $f(\cdot\,; v_{t-1})$. Alternatively, solve:
$$\frac{1}{N} \sum_{i=1}^{N} I_{\{S(X_i) \geq \gamma_t\}} \, \nabla \log f(X_i; v) = 0,$$
where the gradient is with respect to $v$.

The solution to the CE program can often be calculated analytically (it is a Maximum Likelihood Estimator!).

Note that the CE program is useful only when, under $v_{t-1}$, the event $\{S(X) \geq \gamma_t\}$ is not too rare, say $\rho \geq 10^{-2}$.

By iterating over $t$ we obtain a sequence of reference parameters $\{v_t, t \geq 0\} \to v^*$ and a sequence of levels $\{\gamma_t, t \geq 1\} \to \gamma^*$.

Updating Example

Suppose $X$ has iid exponential components:
$$f(\mathbf{x}; \mathbf{v}) = \exp\left( -\sum_{j=1}^{5} \frac{x_j}{v_j} \right) \prod_{j=1}^{5} \frac{1}{v_j}.$$
The optimal $\mathbf{v}$ follows from the system of equations
$$\sum_{i=1}^{N} I_{\{S(X_i) \geq \gamma\}} \, W(X_i; u, w) \, \nabla \log f(X_i; \mathbf{v}) = 0.$$
Since
$$\frac{\partial}{\partial v_j} \log f(\mathbf{x}; \mathbf{v}) = \frac{x_j}{v_j^2} - \frac{1}{v_j},$$

we have for the $j$th equation
$$\sum_{i=1}^{N} I_{\{S(X_i) \geq \gamma\}} \left( \frac{X_{ij}}{v_j^2} - \frac{1}{v_j} \right) = 0,$$
whence
$$v_j = \frac{\sum_{i=1}^{N} I_{\{S(X_i) \geq \gamma\}} X_{ij}}{\sum_{i=1}^{N} I_{\{S(X_i) \geq \gamma\}}}.$$
Thus the updating rule here is to take the mean of the elite samples (the MLE).
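A quick numerical sanity check of this updating rule (toy assumptions: $v = (1, \ldots, 1)$, $S(x) = \min_j x_j$, and $\gamma$ taken as the 0.9 sample quantile of the performances):

% Verify that the CE update for iid exponential components is the
% componentwise mean of the elite samples.
N = 1e5; n = 5;
X = -log(rand(N, n));                % X_i with iid Exp(1) components
SX = min(X, [], 2);                  % performances S(X_i) = min_j X_ij
s = sort(SX); gam = s(ceil(0.9*N));  % level: the (1-rho)-quantile, rho = 0.1
elite = SX >= gam;                   % indicators I{S(X_i) >= gamma}
v = sum(X(elite,:), 1) / sum(elite); % the analytic solution derived above
disp(v)                              % = mean of the elites in each component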

Conclusions

The CE method optimizes the sampling distribution.

Advantages:
Universally applicable (discrete/continuous/mixed problems)
Very easy to implement
Easy to adapt, e.g. when constraints are added
Generally works well

Disadvantages:
It is a generic method
The performance function should be relatively cheap to evaluate
Tweaking (modifications) may be required

Discussion + Further Research

Synergies with related fields need to be further developed:

Evolutionary Algorithms
Stochastic Approximation
Simulated Annealing
Probability Collectives
Reinforcement Learning
Bayesian Inference
(Multi-actor) game theory

Discussion + Further Research

Convergence properties need to be further examined:

Continuous optimization (e.g., Thomas Taimre's thesis)
Rare-event simulation (Homem-de-Mello, Margolin)
Significance of various modifications
Role of the sampling distribution
Noisy optimization: very little needs to be changed!

Test problems and best code need to be made available:

CE Toolbox:

Acknowledgements

Zdravko Botev, Jenny Liu, Sho Nariai, Thomas Taimre
Phil Pollett

Thank You!
