March 19, Heuristics for Optimization. Outline. Problem formulation. Genetic algorithms


Olga Galinina olga.galinina@tut.fi ELT-53656 Network Analysis and Dimensioning II Department of Electronics and Communications Engineering Tampere University of Technology, Tampere, Finland March 19, 2014

Outline: 1. Problem formulation. 2. Genetic algorithms. 3. Simulated annealing.

Preamble. Branch and bound guarantees that the optimum solution will be found. But it is combinatorially explosive, and problems common in practice are often large-scale. We are therefore not looking for the exact optimal point, but for an approximate solution within the limits of the time and computer memory available.

Heuristic methods. A heuristic is a method that is not guaranteed to find the optimum but usually gives a very good solution, though it cannot guarantee to do even that every time. Heuristics are "quick and dirty" methods: generally relatively fast and relatively good.

Stochastic optimization. Refers to the minimization (or maximization) of a function in the presence of randomness in the optimization process. The randomness may come from noise in the measurements or from Monte Carlo randomness in the search procedure itself. Common methods of stochastic optimization: direct search methods (such as the Nelder-Mead method), stochastic approximation, stochastic programming, and miscellaneous methods such as simulated annealing and genetic algorithms.
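As a minimal illustration of randomness in the search procedure itself, here is pure random search, the crudest stochastic method, which the techniques listed above all refine. The test function and bounds are illustrative choices, not from the lecture.

```python
import random

def random_search(f, bounds, samples=5000, seed=0):
    """Sample random points in the box and keep the best one seen."""
    rng = random.Random(seed)
    best_x, best_val = None, float("inf")
    for _ in range(samples):
        x = [rng.uniform(lo, hi) for lo, hi in bounds]
        if f(x) < best_val:                 # keep the best point seen so far
            best_x, best_val = x, f(x)
    return best_x, best_val

x_best, f_best = random_search(lambda x: (x[0] - 1) ** 2 + (x[1] + 2) ** 2,
                               bounds=[(-5, 5), (-5, 5)])
print(x_best, f_best)  # near the minimum at (1, -2)
```

Random search never uses gradients, which is exactly why methods of this family apply to the "quick and dirty" setting described above.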

Mental break. What is the probability of meeting your friend if you agreed to meet between 13:00 and 14:00 and each of you waits for the other exactly 5 minutes? Let x, y ∈ [0, 60] be the arrival times in minutes; the friends meet iff |x − y| < 5. Geometrically, P(A) = S_g / S = (60² − 55²) / 60² = 575 / 3600 = 23/144 ≈ 0.16.
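A quick Monte Carlo check of the geometric answer: both arrival times are uniform on [0, 60] minutes, and the friends meet iff they arrive within 5 minutes of each other.

```python
import random

random.seed(0)
trials = 200_000
# Count trials where the two arrival times differ by less than 5 minutes.
meet = sum(abs(random.uniform(0, 60) - random.uniform(0, 60)) < 5
           for _ in range(trials))
estimate = meet / trials
print(round(estimate, 3))  # should land near 23/144 ≈ 0.160
```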


Genetic algorithms. Formally introduced in the United States in the 1970s by John Holland at the University of Michigan. They loosely mimic the process of evolution of organisms, where a problem solution stands in for an organism's genetic string. Based on: selection, crossover (recombination of solutions), mutation, and comparison of fitness. They can be applied to any problem where a solution can be expressed as a string (genome/chromosome) and a value representing the worth of the string can be calculated.

The algorithm repeatedly modifies a population of individual solutions. At each step, the genetic algorithm randomly selects individuals from the current population and uses them as parents to produce the children for the next generation. Over successive generations, the population evolves toward an optimal solution. Genetic algorithms may solve problems that are not well suited for standard optimization, including problems in which the objective function is discontinuous, non-differentiable, stochastic, or highly nonlinear.

Three main operators in a basic genetic algorithm: reproduction, crossover, and mutation. Initial population of solutions: the simplest (but probably not the best) way to create an initial population is to generate it randomly. The size of the population has to be large enough to support sufficient genetic variation, but not so large that calculations take an inordinate amount of time. In practice, the population size is often determined by experimentation.

Classical algorithms vs. genetic algorithms. A classical algorithm generates a single point at each iteration, the sequence of points approaches an optimal solution, and the next point in the sequence is selected by a deterministic computation. A genetic algorithm generates a population of points at each iteration, the best point in the population approaches an optimal solution, and the next population is selected by a computation that uses random number generators.

Example: person-job assignment problem. We are assigning salespeople to regions, given a table of the expected number of units sold when each salesperson is assigned to each region. The objective is to maximize the number of units sold; we assign just 4 of the 5 salespeople, and each salesperson can handle only one region. (A small problem like this should really be solved exactly by linear programming; it serves here only as an illustration.)

Example: person-job assignment problem. Can a genetic algorithm be applied to this problem? Can a solution be expressed as a string? Yes: a solution such as CDAB can represent the assignment of C to region 1, D to region 2, A to region 3, and B to region 4. Can a value be assigned to a string to represent its worth? Yes: simply add up the expected units sold for the solution (e.g., CDAB would be 18 + 33 + 15 + 29 = 95). At all times we will have a population consisting of numerous solution strings (genetic strings of chromosomes).
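This string evaluation can be sketched as a small function. Only the four payoff values used in the CDAB example (18, 33, 15, 29) come from the text; the rest of the table below is invented for illustration.

```python
# Hypothetical payoff table: expected units sold per (salesperson, region).
# Only the (C,1), (D,2), (A,3), (B,4) entries are from the lecture example.
PAYOFF = {
    ("A", 1): 20, ("A", 2): 25, ("A", 3): 15, ("A", 4): 22,
    ("B", 1): 17, ("B", 2): 28, ("B", 3): 21, ("B", 4): 29,
    ("C", 1): 18, ("C", 2): 24, ("C", 3): 19, ("C", 4): 26,
    ("D", 1): 23, ("D", 2): 33, ("D", 3): 27, ("D", 4): 16,
    ("E", 1): 14, ("E", 2): 30, ("E", 3): 25, ("E", 4): 20,
}

def fitness(solution):
    """Value of a string such as 'CDAB': position i assigns that
    salesperson to region i+1."""
    return sum(PAYOFF[(person, region)]
               for region, person in enumerate(solution, start=1))

print(fitness("CDAB"))  # 18 + 33 + 15 + 29 = 95
```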

Reproduction operator. Equivalent to the survival of the fittest: the probability of survival of a solution is proportional to its solution value (its fitness). How will we decide which ones survive?
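One common way to implement this fitness-proportional survival rule (the slides do not prescribe a specific one) is roulette-wheel selection, sketched here with made-up fitness values:

```python
import random

def reproduce(population, fitness, n, rng=random):
    """Roulette-wheel selection: draw a mating pool of size n, with each
    string's chance of being drawn proportional to its fitness."""
    weights = [fitness(s) for s in population]
    return rng.choices(population, weights=weights, k=n)

values = {"CDAB": 95, "ABCD": 80, "DCBA": 60, "BADC": 70}  # made-up fitnesses
random.seed(1)
pool = reproduce(list(values), fitness=values.get, n=4)
print(pool)  # CDAB is the most likely string to appear, possibly repeatedly
```

Sampling with replacement is what lets fit strings appear several times in the mating pool while weak ones die out.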

Reproduction Operator After the reproduction operation, we have an intermediate population known as the mating pool that is ready to mix

Crossover operator. Two parent solution strings from the mating pool combine to create two new child solution strings: 1. Randomly select two parent strings from the mating pool. 2. Randomly select a crossover point in the solution string: a point between any two positions in the string. 3. Swap the ends of the two parent strings, from the crossover point to the end of the string, to create two new child strings.
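The three steps above can be sketched as single-point crossover. (Note: on assignment strings such as CDAB a plain tail swap can duplicate a salesperson, so permutation problems usually add a repair step or an order-preserving crossover; that refinement is not covered here.)

```python
import random

def crossover(parent1, parent2, rng=random):
    """Steps 2-3: pick a random cut point and swap the parents' tails."""
    point = rng.randrange(1, len(parent1))   # a point between two positions
    return (parent1[:point] + parent2[point:],
            parent2[:point] + parent1[point:])

random.seed(2)
print(crossover("AAAA", "BBBB"))  # e.g. a pair like ('AABB', 'BBAA')
```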

Mutation operator. Used to randomly alter the values of some of the positions in some of the strings, based on a parameter that determines the level of mutation. The motivation behind mutation is to sample the solution space widely: reproduction and crossover exploit better solutions, while mutation samples the solution space to broaden the search. It is even possible (but less efficient) to solve problems using only the mutation operator.
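A minimal sketch of such an operator, assuming each position is independently replaced by a random symbol with probability equal to the mutation rate:

```python
import random

def mutate(string, alphabet, rate, rng=random):
    """Replace each position with a random symbol with probability `rate`."""
    return "".join(rng.choice(alphabet) if rng.random() < rate else ch
                   for ch in string)

random.seed(3)
print(mutate("CDAB", alphabet="ABCDE", rate=0.5))
```

In practice the rate is small (a few percent), so most strings pass through unchanged; rate=0.5 here only makes the effect visible.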

Overview of the basic algorithm. 1. Design the algorithm: choose the population size n and the mutation rate; choose the operators and the stopping conditions. 2. Randomly generate an initial population and calculate the fitness value for each string; set the incumbent solution to the solution with the best fitness value in the initial population. 3. Apply the reproduction operator to the current population to generate a mating pool of size n. 4. Apply the crossover operator to the strings in the mating pool to generate a tentative new population of size n. 5. Apply the mutation operator to create the final population. 6. Calculate the fitness values and update the incumbent if a better solution has been found. 7. If the stopping conditions are met, exit with the incumbent solution as the final solution; otherwise go to Step 3.
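The whole loop can be sketched end to end. Since the payoff table is not reproduced here, this toy run maximizes the number of 1s in a bit string; the operator choices and parameter values are illustrative, not prescribed by the lecture.

```python
import random

def run_ga(length=20, n=30, rate=0.02, generations=60, seed=0):
    rng = random.Random(seed)
    fitness = lambda s: s.count("1")          # toy objective: count the 1s
    pop = ["".join(rng.choice("01") for _ in range(length)) for _ in range(n)]
    incumbent = max(pop, key=fitness)                      # step 2: incumbent
    for _ in range(generations):
        weights = [fitness(s) + 1 for s in pop]
        pool = rng.choices(pop, weights=weights, k=n)      # step 3: reproduce
        children = []
        for i in range(0, n - 1, 2):                       # step 4: crossover
            cut = rng.randrange(1, length)
            children += [pool[i][:cut] + pool[i + 1][cut:],
                         pool[i + 1][:cut] + pool[i][cut:]]
        children += pool[len(children):]                   # pad if n is odd
        pop = ["".join(rng.choice("01") if rng.random() < rate else ch
                       for ch in s) for s in children]     # step 5: mutate
        incumbent = max(pop + [incumbent], key=fitness)    # step 6: update best
    return incumbent                                       # step 7: exit

best = run_ga()
print(len(best), best.count("1"))
```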

Stopping conditions: stop after a prespecified number of generations have been created; stop when there is very little change between generations (the evolutionary process has reached a plateau); or stop when even the worst solution string fitness in the population is close to the best (the population has become homogeneous).

Initial solution: example. Two ways to build a semi-random but reasonably good solution. Either: 1. Randomly select a salesperson and randomly assign that salesperson to a region. 2. Select the best unassigned salesperson-region combination and make that assignment. 3. Continue with Step 2 until sufficient salespeople have been assigned. Or: 1. Randomly choose a region and assign the best salesperson for that region. 2. Randomly choose an unassigned region and assign the best remaining salesperson for that region. 3. Continue with Step 2 until there are no more regions needing a salesperson.
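The second recipe can be sketched as follows; since the real payoff table is not reproduced in the text, the values here are generated at random for illustration.

```python
import random

def semi_random_solution(people, regions, payoff, rng):
    """Visit regions in random order; give each the best remaining person."""
    assignment = {}
    free = set(people)
    for region in rng.sample(regions, k=len(regions)):
        best = max(free, key=lambda p: payoff[(p, region)])
        assignment[region] = best
        free.remove(best)
    return "".join(assignment[r] for r in regions)

rng = random.Random(4)
people, regions = "ABCDE", [1, 2, 3, 4]
# Made-up expected units sold for each (salesperson, region) pair.
payoff = {(p, r): rng.randint(10, 35) for p in people for r in regions}
print(semi_random_solution(people, regions, payoff, rng))
```

Seeding a GA population with strings like this trades some diversity for a better starting fitness than purely random strings.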

Mental break. Two boxes contain 4 white and 6 black balls, and 5 white and 3 black balls, respectively. You take one ball at random from each box, and then pick one of those two balls at random. What is the probability that this last ball is white? Hypotheses on the two drawn balls: H1 = (1st white, 2nd black), H2 = (1st white, 2nd white), H3 = (1st black, 2nd black), H4 = (1st black, 2nd white). P(H1) = (4/10)(3/8) = 12/80, P(H2) = (4/10)(5/8) = 20/80, P(H3) = (6/10)(3/8) = 18/80, P(H4) = (6/10)(5/8) = 30/80. P(A|H1) = P(A|H4) = 1/2, P(A|H2) = 1, P(A|H3) = 0. By the total probability formula, P(A) = (12/80)(1/2) + (20/80)(1) + (18/80)(0) + (30/80)(1/2) = 41/80 ≈ 0.51.
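A quick Monte Carlo check of the derivation, under the same setup (box 1: 4 white, 6 black; box 2: 5 white, 3 black; the exact answer is 41/80 = 0.5125):

```python
import random

random.seed(5)
box1 = ["w"] * 4 + ["b"] * 6   # box 1: 4 white, 6 black
box2 = ["w"] * 5 + ["b"] * 3   # box 2: 5 white, 3 black
trials = 200_000
# Draw one ball from each box, then pick one of the two drawn balls.
hits = sum(random.choice([random.choice(box1), random.choice(box2)]) == "w"
           for _ in range(trials))
estimate = hits / trials
print(round(estimate, 4))  # should land near 41/80 = 0.5125
```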


Simulated annealing. Another popular heuristic for both discrete and continuous, unconstrained and bound-constrained optimization problems (described in 1983-1985). It mimics the physical process of heating a material and then slowly lowering the temperature to decrease defects, thus minimizing the system energy. At each iteration, a new point is randomly generated. The distance of the new point from the current point (the extent of the search) is based on a probability distribution with a scale proportional to the temperature. The algorithm accepts all new points that lower the objective but also, with a certain probability, points that raise the objective. By accepting points that raise the objective, the algorithm avoids being trapped in local minima in early iterations and is able to explore globally for better solutions.

A simple version of the algorithm for minimization of a cost function: 1. Start-up: find an initial solution S (possibly randomly), an initial (high) temperature T > 0, and a cooling-rate parameter r. 2. Choose a random neighbor of S and call it S′. 3. Calculate the difference in costs: Δ = cost(S′) − cost(S). 4. Decide whether to accept the new solution: if Δ ≤ 0 (S′ is better), then set S = S′; else set S = S′ with probability e^(−Δ/T). 5. If the stopping conditions are met, exit with S as the final solution; else reduce the temperature by setting T = rT and go to Step 2. A typical stopping condition is that S is "frozen" (it has not changed value for several iterations).
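The five steps can be written directly as code. The cost function, the integer ±1 neighborhood, the fixed iteration budget, and the cooling parameters below are illustrative choices, not prescribed by the lecture.

```python
import math
import random

def simulated_annealing(cost, start, T=10.0, r=0.95, steps=2000, seed=0):
    rng = random.Random(seed)
    S = start
    for _ in range(steps):
        S_new = S + rng.choice([-1, 1])        # step 2: random neighbor
        delta = cost(S_new) - cost(S)          # step 3: difference in costs
        if delta <= 0 or rng.random() < math.exp(-delta / T):
            S = S_new                          # step 4: accept (maybe uphill)
        T *= r                                 # step 5: cool down, repeat
    return S

best = simulated_annealing(lambda x: (x - 7) ** 2, start=-50)
print(best)  # near the minimum at x = 7
```

Early on, T is large and uphill moves are often accepted; as T shrinks, e^(−Δ/T) vanishes and the search becomes a greedy descent that settles into a minimum.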

Summary. Strengths of these heuristics: they can escape local optima; they handle mixed discrete/continuous problems and various data representations; they need no gradient or fancy math; and they are easily parallelized. Weaknesses: they are stochastic, and implementation is still an art; designing the objective function may be difficult; and they are computationally expensive.