Variable Neighborhood Search

Hansen and Mladenović, "Variable neighborhood search: Principles and applications", EJOR 130 (2001).

Basic notions of VNS
- Systematic change of the neighborhood during the search.
- Does not follow a single trajectory, but explores increasingly distant neighbors of the incumbent solution.
- Jumps from the incumbent to a new solution if and only if there is an improvement.
- In this way it keeps good (maybe optimal) variables in the incumbent while obtaining promising neighbors.
- Uses local search to get from these neighbors to local optima.

Basic VNS algorithm
- Initialization: select a set of neighborhood structures N_k (k = 1, ..., k_max), find an initial solution x, set k = 1, and choose a stopping condition.
- Step 1 (shaking): generate x' ∈ N_k(x) at random.
- Step 2 (local search): apply a local search method starting from x' to obtain a local optimum x''.
- Step 3 (move or not): if x'' is better than the incumbent, set x = x'' and k = 1; otherwise set k = k + 1 (or, if k = k_max, set k = 1). Go back to Step 1.

Basic VNS algorithm (cont.)
- Basic VNS is a random-descent, first-improvement method (see the sketch below).
- In Step 1, x' is generated at random to avoid cycling.
- Successive neighborhoods N_k are often nested.
- In Step 3, if the incumbent changes, the search restarts from N_1; otherwise it continues in N_{k+1}, starting from the local optimum found in N_k.
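To pin down the control flow, here is a minimal Python sketch of this loop; shake, local_search, and cost are hypothetical problem-specific callables, and a fixed iteration budget stands in for the stopping condition.

```python
def basic_vns(x0, shake, local_search, cost, k_max, iters=1000):
    """Basic VNS skeleton: shake(x, k) draws a random point of N_k(x),
    local_search(x) returns a local optimum, cost(x) the objective value."""
    x, fx = x0, cost(x0)
    k = 1
    for _ in range(iters):                 # stopping condition: iteration budget
        x1 = shake(x, k)                   # Step 1 (shaking): x' in N_k(x), at random
        x2 = local_search(x1)              # Step 2 (local search): local optimum x''
        if cost(x2) < fx:                  # Step 3 (move or not)
            x, fx = x2, cost(x2)
            k = 1                          # improvement: restart from N_1
        else:
            k = k + 1 if k < k_max else 1  # no improvement: next (or first) N_k
    return x, fx
```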

Variable neighborhood descent
- Initialization: same as before.
- Step 1: find the best x' ∈ N_k(x).
- Step 2: if x' is better than x, set x = x'; otherwise set k = k + 1. Go back to Step 1.
- This implies a change of neighborhood during the local search itself (a sketch follows below).
- It is meaningful because a local optimum of one neighborhood is not necessarily a local optimum of another.

Variants of the basic VNS
- In Step 3, also set x = x'' with some probability even when x'' is worse than the incumbent (descent-ascent, as in simulated annealing).
- In Step 1, generate a solution from each of the k_max neighborhoods and move to the best of them (best improvement, as in variable depth search).
- In Step 1, choose the best of l randomly generated solutions from N_k.
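The VND sketch referenced above; best_neighbor(x, k) is a hypothetical problem-specific oracle returning the best solution in N_k(x). Returning to N_1 after an improvement is the common convention and is assumed here, since the slide leaves it implicit.

```python
def vnd(x0, best_neighbor, cost, k_max):
    """Variable neighborhood descent; best_neighbor(x, k) is a hypothetical
    problem-specific oracle returning the best solution in N_k(x)."""
    x, fx = x0, cost(x0)
    k = 1
    while k <= k_max:                 # stops once no neighborhood improves x
        x1 = best_neighbor(x, k)      # Step 1: best x' in N_k(x)
        if cost(x1) < fx:             # Step 2: move, and (by convention)
            x, fx = x1, cost(x1)      #   restart from the first neighborhood,
            k = 1
        else:
            k += 1                    #   or try the next neighborhood
    return x, fx                      # local optimum w.r.t. all k_max neighborhoods
```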

Variants of the basic VNS (cont.)
- In Step 3, set k = k_min instead of k = 1, and k = k + k_step instead of k = k + 1 (sketched below).
- Choosing k_min and k_step large implies diversification; choosing them small implies intensification (assuming the neighborhoods are nested).

VNS decisions
- Number and types of neighborhoods to be used
- Order of their use in the search
- Strategy for changing neighborhoods
- Local search method
- Stopping condition
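A one-function sketch of this modified neighborhood-change rule; wrapping around to k_min at k_max is an assumption that mirrors the reset in the basic scheme.

```python
def next_k(k, improved, k_min, k_step, k_max):
    """Neighborhood-change rule of the k_min/k_step variant: large values
    diversify the search, small values intensify it (nested neighborhoods)."""
    if improved:
        return k_min                   # instead of k = 1
    nxt = k + k_step                   # instead of k = k + 1
    return nxt if nxt <= k_max else k_min
```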

VNS for TSP
VNS-1:
- The neighborhood definition is k-opt with k_max = n, i.e. N_k(x) is the set of tours having k edges different from x.
- The local search method used in Step 2 is 2-opt (sketched below).
VNS-2:
- The neighborhood definition is the same.
- The local search method is 2-opt restricted to a (1-r)% nearest-neighbor (NN) candidate subgraph; in the outer loop of 2-opt, a link (i, j) is not deleted if j is not among the NNs of i (long edges are still considered for deletion when selecting points from N_k).

Results for TSP over 2-opt
- The 2-opt solution is the best of two independent trials.
- The VNS stopping condition is the CPU time of the two 2-opt trials.
(The results table itself did not survive transcription.)
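A sketch of the two VNS-1 ingredients for a symmetric distance matrix d (a list of lists): a first-improvement 2-opt descent for Step 2, and a crude shake that applies k random segment reversals. Each reversal changes two edges, so this only approximates the "k edges different" definition of N_k.

```python
import random

def tour_length(tour, d):
    """Length of the closed tour under distance matrix d."""
    n = len(tour)
    return sum(d[tour[i]][tour[(i + 1) % n]] for i in range(n))

def two_opt(tour, d):
    """First-improvement 2-opt descent (the Step 2 local search of VNS-1)."""
    n, improved = len(tour), True
    while improved:
        improved = False
        for i in range(n - 1):
            # j = n - 1 together with i = 0 would merely reverse the whole tour
            for j in range(i + 2, n if i > 0 else n - 1):
                # gain of replacing edges (i, i+1), (j, j+1) by (i, j), (i+1, j+1)
                delta = (d[tour[i]][tour[j]] + d[tour[i + 1]][tour[(j + 1) % n]]
                         - d[tour[i]][tour[i + 1]] - d[tour[j]][tour[(j + 1) % n]])
                if delta < -1e-12:
                    tour[i + 1:j + 1] = reversed(tour[i + 1:j + 1])
                    improved = True
    return tour

def shake(tour, k):
    """Shake: k random segment reversals, a stand-in for 'k edges differ'."""
    t = tour[:]
    for _ in range(k):
        i, j = sorted(random.sample(range(len(t)), 2))
        t[i:j + 1] = reversed(t[i:j + 1])
    return t
```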

Alternative VNS for TSP
- GENIUS, which is based on a sophisticated node deletion/insertion procedure, defines the neighborhood as the p cities closest to the city under consideration for deletion/insertion.
- VNS with the same neighborhood definition gives a 0.75% average improvement over GENIUS in similar CPU time.

VNS for the p-median problem
- p-median problem: given m discrete locations, locate p (uncapacitated) facilities to serve n customers (with known locations) so as to minimize the total distance from customers to their facilities.
- N_k(x) is the set of solutions having k facilities located at different locations than in x, with k_max = p (a sketch of the shake follows below).
- The local search method used in Step 2 is fast interchange (FI) descent.
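A sketch of the p-median pieces: the cost of a set of open facilities under a hypothetical customer-by-location distance matrix d, and the N_k shake that swaps k open facilities for k closed candidate locations. The FI descent itself is omitted for brevity.

```python
import random

def pmedian_cost(open_facilities, d):
    """Total distance with each customer served by its nearest open facility;
    d[i][j] is the (hypothetical) distance from customer i to location j."""
    return sum(min(row[j] for j in open_facilities) for row in d)

def shake_pmedian(open_facilities, m, k):
    """N_k shake for p-median: close k of the p open facilities and open
    k of the m - p closed candidate locations, both at random (k <= p)."""
    closed = [j for j in range(m) if j not in open_facilities]
    keep = random.sample(open_facilities, len(open_facilities) - k)
    return keep + random.sample(closed, k)
```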

Results for the p-median problem (deviation from best known)
(Only the table legend survived transcription: FI = fast interchange descent, HC = heuristic concentration, TS = tabu search, CSTS = chain substitution TS.)

VNS for the multi-source Weber problem
- Multi-source Weber problem: the continuous counterpart of the p-median problem, i.e. the p facilities can be located anywhere in the plane.
- The choice of neighborhood is crucial; moving facilities is more effective than reassigning customers to facilities.
- Neighborhood structure: the known customer locations are also considered as candidate facility locations.

Results for the multi-source Weber problem
- Various heuristics were compared under equivalent CPU times.
- On a series of 20 problems with 1060 customers, the average deviation from the best known solution was:
  0.02% for the best of four VNS variants,
  0.13% for the best of three TS variants,
  1.27% for a GA,
  20% or more for some well-known heuristics.

VNS for minimum sum-of-squares clustering
- Minimum sum-of-squares clustering problem: partition n objects, each in q-dimensional Euclidean space, into m clusters so that the sum of squared distances from each object to the centroid of its cluster is minimized.
- K-means-style local search: given an initial partition, try to move each object j (currently in cluster l) to each of the other clusters i; the neighborhood is defined by all such (i, j) pairs; find the best neighbor and move if it is better than the current solution (sketched below).
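A direct, deliberately unoptimized sketch of one best-improvement step over this reassignment neighborhood; a real implementation would update the cost incrementally instead of recomputing centroids for every candidate move.

```python
import numpy as np

def kmeans_move(X, labels):
    """One best-improvement step: try moving each object j into every other
    cluster i, apply the best improving move, and return the new labels."""
    def cost(lab):
        return sum(((X[lab == c] - X[lab == c].mean(axis=0)) ** 2).sum()
                   for c in np.unique(lab))
    m = labels.max() + 1
    best, best_cost = labels, cost(labels)
    for j in range(len(X)):
        for i in range(m):
            if i == labels[j]:
                continue
            cand = labels.copy()
            cand[j] = i
            if len(np.unique(cand)) < m:      # don't empty a cluster
                continue
            c = cost(cand)
            if c < best_cost:
                best, best_cost = cand, c
    return best
```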

VNS for minimum sum-of-squares clustering (cont.)
- H-means: similar to Cooper's alternating heuristic for the Weber problem (solve the location and allocation problems alternately until convergence).
- J-means: the centroid of cluster i is relocated to some object. This corresponds to a reallocation of all objects in that cluster and is called a jump (hence the name J-means). The solution obtained with the jump neighborhood can then be improved by H-means and/or K-means.

VNS for minimum sum-of-squares clustering (cont.)
VNS-1:
- Uses the K-means reassignment neighborhoods with k_max = m.
- Uses H+K-means as the local search in Step 2 of the basic VNS algorithm.
VNS-2:
- Uses the jump (centroid relocation) neighborhood with k_max = m (a sketch of the shake follows below).
- Uses J+H+K-means as the local search in Step 2.
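A NumPy sketch of the jump shake used in VNS-2: relocate k centroids onto randomly chosen objects and reassign every object to its nearest centroid. It assumes all m clusters are non-empty, and it is one reading of the slide, not the authors' exact procedure.

```python
import numpy as np

def jump_shake(X, labels, k, rng=None):
    """Jump shake: move k randomly chosen centroids onto random objects,
    then reassign all objects to their nearest centroid."""
    rng = rng or np.random.default_rng()
    m = labels.max() + 1
    centroids = np.array([X[labels == c].mean(axis=0) for c in range(m)])
    for c in rng.choice(m, size=k, replace=False):
        centroids[c] = X[rng.integers(len(X))]   # the "jump": centroid -> object
    # reassign by squared Euclidean distance to the (partly jumped) centroids
    d2 = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
    return d2.argmin(axis=1)
```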

Results for minimum sum-of-squares clustering
(Only the table legend survived transcription: comparison against multi-start versions of K-means, H-means, and H+K-means under equivalent CPU times.)

VNS for bilinear programming
- The bilinear programming problem has three sets of variables: x, y, z.
- When all y's are fixed, it becomes an LP in x and z.
- When all z's are fixed, it becomes an LP in x and y.

VNS for bilinear programming (cont.)
Local search ALT (alternate), used in Step 2 of the basic VNS (sketched below):
- Step 1: choose values for the z (or y) variables.
- Step 2: solve LP1 in x and y (or in x and z).
- Step 3: for the y (or z) found in Step 2, solve LP2 in x and z (or in x and y).
- Step 4: if convergence is not reached within a given tolerance, return to Step 2.
The neighborhood N_k(x, y, z) for VNS corresponds to k pivots of the LP in x and y, or in x and z, for k = 1, ..., k_max.

Results for bilinear programming (deviation from best)
(Only table-legend fragments survived transcription: "18 linear" and "18 quadratic" constraints, the numbers of x, y, z variables, and multi-start of ALT as the comparison.)
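A sketch of ALT with hypothetical LP oracles: lp_xy(z) returns an optimal (x, y) for fixed z, lp_xz(y) an optimal (x, z) for fixed y, and f(x, y, z) evaluates the bilinear objective.

```python
def alt_local_search(z0, lp_xy, lp_xz, f, tol=1e-8, max_rounds=100):
    """ALT descent for a bilinear program, alternating between the two LPs
    until the objective stops changing within the given tolerance."""
    z, prev = z0, float("inf")
    for _ in range(max_rounds):
        x, y = lp_xy(z)                # Step 2: solve LP1 in x and y, z fixed
        x, z = lp_xz(y)                # Step 3: solve LP2 in x and z, y fixed
        val = f(x, y, z)
        if abs(prev - val) < tol:      # Step 4: stop at convergence
            break
        prev = val
    return x, y, z
```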

Extensions: RVNS
- RVNS (reduced VNS): the local search in Step 2 is dropped to save time (sketched below).
- In Step 1, random solutions are generated from increasingly distant neighborhoods of the incumbent.
- In Step 3, move iff the new solution is better.
- For large p-median instances with 3038 customers: RVNS matches the solution quality of FI while using 18 times less CPU time, and is 0.53% worse than the basic VNS.

Extensions: RVNS (cont.)
(Only the table legend survived transcription: deviations from the basic VNS; FI = fast interchange descent. The basic VNS uses five times the CPU time of FI.)
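RVNS is the basic VNS skeleton from earlier with Step 2 removed; a minimal sketch:

```python
def rvns(x0, shake, cost, k_max, iters=10000):
    """Reduced VNS: basic VNS with the local-search step dropped; move
    iff the randomly drawn neighbor itself improves the incumbent."""
    x, fx = x0, cost(x0)
    k = 1
    for _ in range(iters):                 # stopping condition: iteration budget
        x1 = shake(x, k)                   # random solution from N_k(x)
        if cost(x1) < fx:
            x, fx, k = x1, cost(x1), 1
        else:
            k = k + 1 if k < k_max else 1
    return x, fx
```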

Extensions: VNDS
- VNDS: VNS combined with decomposition.
- N_k(x): all but k variables of the solution x are fixed; the k free variables are chosen at random in Step 1.
- Local search in Step 2: solve a k-dimensional subproblem in the space of the unfixed variables (sketched below).
- Step 3 is the same as in the basic VNS.
- Basic VNS itself can be used as the local search heuristic, giving a two-level, recursive VNS.

Extensions: VNDS (cont.)
- May return to k = 1 when the maximum size of, or the time allocated to, the subproblem exceeds a limit.
- For large p-median instances with 3038 customers: VNDS outperforms FI in similar CPU time, and is 0.28% better than the basic VNS while using five times less CPU time.
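A sketch of the VNDS loop for solutions stored as lists of variables; subsolver(x, idx) is a hypothetical oracle (possibly the basic VNS itself) that returns a solution optimized over the positions in idx with all other variables held fixed.

```python
import random

def vnds(x0, subsolver, cost, k_max, iters=100):
    """VNDS sketch: fix all but k randomly chosen variables and solve the
    k-dimensional subproblem with the supplied subsolver."""
    x, fx = list(x0), cost(x0)
    k = 1
    for _ in range(iters):
        idx = random.sample(range(len(x)), k)  # Step 1: pick k free variables
        x1 = subsolver(x, idx)                 # Step 2: solve the k-dim subproblem
        if cost(x1) < fx:                      # Step 3: as in basic VNS
            x, fx, k = x1, cost(x1), 1
        else:
            k = k + 1 if k < k_max else 1
    return x, fx
```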

Extensions: VNDS (cont.)
(As on the RVNS slide, only the table legend survived transcription: deviations from the basic VNS; FI = fast interchange descent. The basic VNS uses five times the CPU time of FI.)

Extensions: SVNS
- SVNS (skewed VNS): accounts for the topology of local optima (valleys and mountains).
- Steps 1 and 2 are the same as in the basic VNS.
- For the move decision, instead of the objective function alone, use an evaluation function that also takes into account the distance ρ(x, x'') of x'' from the incumbent x.
- Step 3 (improve or not): if f(x'') < f(x_best), set x_best = x''.
- Step 4 (move or not): if f(x'') − α·ρ(x, x'') < f(x), set x = x'' and k = 1; otherwise set k = k + 1. Go back to Step 1.
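The skewed acceptance test in one function; rho is a problem-specific distance between solutions, and alpha > 0 controls how strongly solutions far from the incumbent are favored.

```python
def svns_accept(x, x2, f, rho, alpha):
    """SVNS move test (Step 4): accept x'' when
    f(x'') - alpha * rho(x, x'') < f(x)."""
    return f(x2) - alpha * rho(x, x2) < f(x)
```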

Extensions: SVNS (cont.)
- WMAXSAT: the weighted maximum satisfiability problem of logic.
- Definition: find a truth assignment to the variables that maximizes the total weight of the satisfied clauses (disjunctions of literals).
- GRASP: greedy randomized adaptive search procedure(s), typically used as construction heuristics.

Conclusions
VNS has many desirable features of a metaheuristic:
- Simple and widely applicable
- Coherent: the steps follow naturally from its principles; it is not a hybrid
- Efficient and effective: very good solution quality in moderate CPU time
- Robust: performs well across a variety of problems
- User friendly: easy to understand and implement
- Innovative: lends itself to the development of variants and extensions