Hybrid Constraint Programming and Metaheuristic methods for Large Scale Optimization Problems


Hybrid Constraint Programming and Metaheuristic methods for Large Scale Optimization Problems Fabio Parisini Tutor: Paola Mello Co-tutor: Michela Milano Final seminars of the XXIII cycle of the doctorate course in Electronics, Computer Science and Telecommunications Parisini (UniBo) Hybrid methods Final seminars 1 / 27

Combinatorial optimization problems. Combinatorial optimization (CO) is a topic in theoretical computer science and applied mathematics that consists of finding the least-cost solution to a mathematical problem in which each solution is associated with a numerical cost (definition adapted from Wikipedia). Combinatorial optimization problems arise in many application areas: vehicle routing; logistics; packing and cutting stock applications; resource allocation; scheduling; ...

Complete and heuristic methods. Two categories of solution approaches to CO problems exist: complete methods find the optimal solution and prove optimality, at the cost of a high computational effort; heuristic methods find good solutions without any optimality guarantee. Complete methods are impracticable when dealing with large scale optimization problems.

Feasibility and optimality components. Two aspects coexist within CO problems: feasibility component, where the constraints and the size of the problem make it computationally expensive to find any feasible solution; optimality component, where it is easy in practice to find a feasible solution, but very difficult to find the optimal one.

Solution techniques. Constraint Programming (CP) is particularly effective when dealing with the feasibility component of a CO problem; on the other hand, CP may present some limitations when the optimality component is strong. Metaheuristic methods, instead, are used in the literature to solve large scale optimization problems in an incomplete way, i.e. by finding feasible sub-optimal solutions; metaheuristic techniques can thus effectively deal with CO problems where the optimality component is dominant.

Constraint programming. A general technique based on tree search; it exploits variable and value selection heuristics to guide the search; filtering and constraint propagation considerably reduce the size of the search space; it is suitable for complete approaches to the solution of CO problems, but impracticable for large scale optimization problems.
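The tree-search-plus-propagation scheme can be sketched as follows. This is a minimal illustration only (first-fail variable selection, increasing value ordering, forward checking restricted to binary not-equal constraints); the constraint type and data layout are assumptions of this sketch, not the solver used in the thesis.

```python
def search(domains, neq):
    """Minimal CP-style tree search (a sketch): first-fail variable
    selection, increasing value ordering, and forward-checking
    propagation for binary not-equal constraints in `neq`."""
    if all(len(d) == 1 for d in domains.values()):
        sol = {v: next(iter(d)) for v, d in domains.items()}
        # final check: forward checking alone does not re-propagate
        # variables that became singletons through pruning
        return sol if all(sol[a] != sol[b] for a, b in neq) else None
    # first-fail heuristic: branch on the unfixed variable with the
    # smallest domain, so failures are discovered early
    var = min((v for v in domains if len(domains[v]) > 1),
              key=lambda v: len(domains[v]))
    for val in sorted(domains[var]):
        new = {v: set(d) for v, d in domains.items()}
        new[var] = {val}
        # propagation: prune `val` from the domains of neighbours of var
        feasible = True
        for a, b in neq:
            for other in ((b,) if a == var else (a,) if b == var else ()):
                new[other].discard(val)
                feasible = feasible and bool(new[other])
        if feasible:
            sol = search(new, neq)
            if sol is not None:
                return sol
    return None
```

For instance, three pairwise-different variables over the domain {0, 1, 2} are solved after pruning most of the tree at the first branching.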

Metaheuristic methods. Many techniques exist, sharing common concepts: intensification and diversification techniques; neighborhood exploration methods; definition of problem-specific local search moves; usage of a set of elite solutions; adaptation of the search strategy to the evolution of the search process itself; ...

Motivation. Thesis aim: constraint programming and metaheuristic methods show complementary strengths and weaknesses, so hybrid search techniques can be designed to exploit the advantages of both approaches. The aim of my thesis is to integrate metaheuristic concepts, such as neighborhood exploration, intensification, diversification and restarting, within a CP-based tree search.

CP modeling of a CO problem. An optimization problem P; a model for P defined on a set of finite domain integer variables x = [x_1, x_2, ..., x_n]; a set of constraints posted on the problem variables; an incumbent solution x̄ = [x̄_1, x̄_2, ..., x̄_n], which is a feasible assignment of values to the problem variables; the discrepancy between x and x̄, which can be computed as:

∆(x, x̄) = Σ_{i=1}^{n} d_i, where d_i = 1 if x_i ≠ x̄_i and d_i = 0 otherwise. (1)
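As a minimal sketch, the discrepancy of Eq. (1) is simply a Hamming distance between the two assignments:

```python
def discrepancy(x, x_bar):
    """Number of variables on which assignment x differs from the
    incumbent x_bar (the Hamming distance between the two vectors)."""
    return sum(1 for xi, xbi in zip(x, x_bar) if xi != xbi)

# e.g. two of the six variables changed:
# discrepancy([2, 4, 1, 5, 3, 8], [2, 4, 9, 5, 2, 8]) == 2
```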

A known technique: limited discrepancy search (LDS). Wrong turn: when the heuristic guides the search to a failure state, backtracking is performed up to an open decision point, where the choice suggested by the heuristic is reversed; this alternate choice is called a "wrong turn". Limited Discrepancy Search (LDS) limits the number of wrong turns (i.e. discrepancies) along the way; it explores regions at increasing discrepancy value k, the k-distance neighborhoods of the solution proposed by the heuristic.
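As an illustration (a sketch, with the heuristic's preferred branch encoded as 0 and a wrong turn as 1), LDS enumerates the leaves of a binary search tree level by level of increasing discrepancy:

```python
from itertools import combinations

def lds_leaves(n, k):
    """Leaves of a depth-n binary search tree whose paths take exactly k
    wrong turns, i.e. deviate from the heuristic choice in k positions."""
    for turns in combinations(range(n), k):
        leaf = [0] * n
        for i in turns:
            leaf[i] = 1          # reverse the heuristic's choice here
        yield tuple(leaf)

def lds(n, max_k):
    """Iterative LDS: visit the k-distance neighborhoods of the
    heuristic path for increasing discrepancy k = 0, 1, ..., max_k."""
    for k in range(max_k + 1):
        yield from lds_leaves(n, k)
```

With n variables there are C(n, k) leaves at discrepancy k, which is why exhaustive exploration of high-discrepancy levels quickly becomes impracticable.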

LDS in optimization problems. LDS tries to reduce the cost of the incumbent solution x̄ by exploring the k-distance neighborhood of x̄ via tree search. In practice, it is impossible to reach high discrepancy values.

Large neighborhood search (LNS). Large neighborhood search is commonly used to improve the quality of a given solution by exploring its neighborhood: it iteratively relaxes a fragment of the current solution x̄ and then re-optimizes it using CP-aided tree search; at a high level it can be read as a hill climber which is executed until some time limit is exhausted; it is a problem-dependent strategy. Key components of LNS are the methods used to choose the fragments to relax and the methods used to re-optimize them.
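The hill-climbing loop above can be sketched generically. Here `reoptimize` is a hypothetical stand-in for the CP-based sub-solver: it receives the current solution and the set of frozen variable indices, and may change only the remaining variables. A real implementation would run until a time limit rather than a fixed iteration count.

```python
import random

def lns(x0, cost, relax_size, reoptimize, iters=200, rng=random.Random(0)):
    """Generic LNS loop (a sketch). Repeatedly relax a random fragment
    of `relax_size` variables, ask `reoptimize` for a completion, and
    accept only improving candidates (hill climbing)."""
    best, best_cost = list(x0), cost(x0)
    for _ in range(iters):
        # freeze everything except a random fragment of relax_size vars
        frozen = set(rng.sample(range(len(best)), len(best) - relax_size))
        candidate = reoptimize(best, frozen)
        if candidate is not None and cost(candidate) < best_cost:
            best, best_cost = list(candidate), cost(candidate)
    return best, best_cost
```

A toy sub-solver that simply zeroes the free variables is enough to see the loop drive a sum-cost solution to its minimum.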

Hybrid methods for neighborhood exploration. Complete neighborhood exploration is impracticable for large scale optimization problems: we introduce methods to explore slices of large-discrepancy neighborhoods efficiently, using restarts and randomization.

Sliced neighborhood search (SNS). SNS iteratively explores neighborhoods of an incumbent solution x̄ by triggering a sequence of restarts; at each restart a transversal slice of a k-distance neighborhood is randomly selected and explored; transversal slices are identified by randomly selecting which variables have to change and which variables have to keep the same value. SNS works by posting extra constraints (equality and difference constraints) on subsets of variables, thus setting at-least and at-most discrepancy bounds.

Sliced neighborhood search (SNS). Definition (neighborhood slice): a slice of a k-distance neighborhood of a given reference solution x̄ is defined by three parameters: the incumbent solution x̄, a set E of indices of variables that keep the same value, and a set D of indices of variables that have to change. The cardinality of E is n − k:

N_S(x̄, E, D) = {x ∈ P : x_i = x̄_i for all i ∈ E, and x_i ≠ x̄_i for all i ∈ D}

SNS randomly chooses the indices in sets E and D and iteratively explores the corresponding neighborhood slice.
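Drawing a random slice can be sketched as below. This is a simplification under stated assumptions: D is drawn from the k non-fixed variables, and posting the actual equality/difference constraints is left to the CP solver.

```python
import random

def random_slice(n, k, least, rng=random):
    """Draw a random transversal slice of a k-distance neighborhood:
    E (|E| = n - k) indexes variables fixed to the incumbent value,
    D (|D| = least) indexes variables forced to change. Drawing D from
    the non-fixed variables is an assumption of this sketch."""
    free = rng.sample(range(n), k)     # variables allowed to change
    D = set(free[:least])              # at-least bound: must change
    E = set(range(n)) - set(free)      # equality constraints x_i = x_bar_i
    return E, D
```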

A SNS example. Let's take the incumbent solution x̄ = [2, 4, 9, 5, 2, 8]. LDS first explores exhaustively the search space at discrepancy value 1, where just 1 of the 6 variables can change at a time, then at discrepancy 2, 3 and so on. SNS instead first fixes a certain number of variables to the incumbent solution value, then starts the real search: for example, SNS could set x_2 = 4, x_3 = 9 and x_6 = 8 and perform a standard tree search just on x_1, x_4 and x_5. SNS performs many randomized iterations, choosing a different set of variables each time and using small time limits for each iteration.
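The example iteration can be spelled out concretely. The small integer domain below is an assumption of this sketch (the real domains come from the CP model), and indices are 0-based, so the fixed variables x_2, x_3, x_6 of the slide are positions 1, 2 and 5 here.

```python
from itertools import product

x_bar = [2, 4, 9, 5, 2, 8]            # incumbent solution
fixed = {1: 4, 2: 9, 5: 8}            # x_2, x_3, x_6 kept at incumbent values
free = [i for i in range(6) if i not in fixed]   # x_1, x_4, x_5

def slice_assignments(domain):
    """Enumerate the assignments visited in this slice: fixed variables
    keep the incumbent value, free variables range over a toy domain."""
    for vals in product(domain, repeat=len(free)):
        x = list(x_bar)
        for i, v in zip(free, vals):
            x[i] = v
        yield x
```

With a 3-value domain the slice holds 3^3 = 27 assignments, against the full search space of 3^6 = 729: the sub-search stays cheap, which is what allows many restarts under small time limits.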

Application of SNS. Usage of SNS as a stand-alone search strategy: SNS is given an initial solution x̄; if x̄ is distant from the optimal solution, SNS can quickly improve on it by performing large neighborhood search. SNS within a heuristic framework: SNS allows the partial exploration of very large neighborhoods, and it includes randomization elements and both intensification and diversification behaviors; by suitably setting the at-most k and at-least k bounds it is possible to constrain the solution x to be found to have the desired minimum and maximum discrepancy with respect to x̄, performing either intensification or diversification.
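The intensification/diversification dial reduces to a band check on the discrepancy, as in this small sketch:

```python
def discrepancy(x, x_bar):
    # Hamming distance between an assignment and the incumbent
    return sum(1 for a, b in zip(x, x_bar) if a != b)

def within_bounds(x, x_bar, at_least, at_most):
    """True iff x lies in the requested discrepancy band around the
    incumbent: a tight band near 0 gives intensification, a band of
    high discrepancy values gives diversification."""
    return at_least <= discrepancy(x, x_bar) <= at_most
```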

Experimental results. Problem of choice: the Asymmetric Travelling Salesman Problem with Time Windows (ATSPTW), i.e. finding a minimum cost path visiting a set of cities exactly once, where each city must be visited within a specific time window. Two main components coexist, a Travelling Salesman Problem (TSP) and a scheduling problem: in TSPs, optimization is usually the most difficult issue; scheduling problems with release dates and due dates usually pose serious feasibility issues.

SNS performance elements 1/3. Search effectiveness in the sub-trees: after the discrepancy bounds are enforced by posting equality and difference constraints, search in the sub-trees takes place. Search effectiveness in the sub-trees strongly depends on the propagation performed by such constraints: while equality constraints enable strong propagation, we expect difference constraints to be less effective.

SNS performance elements 2/3. Solution density in the selected discrepancy range: regardless of how efficiently the selected discrepancy range is explored, the success of SNS depends on the actual presence of improving solutions in that range, and on their number. This in turn depends on the problem structure and on the selected at-least and at-most discrepancy bounds.

SNS performance elements 3/3. Effectiveness of the sample space exploration: SNS is essentially sampling the LDS search space; the sampling effectiveness is measured by the number of collected samples (i.e. SNS iterations) over the size of the sample space. Let least be the at-least discrepancy bound and most the at-most discrepancy bound; then the size of the sample space (i.e. the overall number of third-level sub-trees) is given by:

C(n, most) · C(n − most, least)

where C(a, b) denotes the binomial coefficient.
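Taking the product-of-binomials count above at face value, it can be evaluated directly with Python's math.comb (a quick sketch):

```python
from math import comb

def sample_space_size(n, most, least):
    """Overall number of third-level sub-trees for the given at-most /
    at-least discrepancy bounds, as a product of two binomials."""
    return comb(n, most) * comb(n - most, least)
```

For instance, with n = 6, most = 3 and least = 2 the sample space holds C(6, 3) · C(3, 2) = 20 · 3 = 60 sub-trees.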

SNS stand-alone 1/2

Table: big instances (more than 50 cities), 300 CPU seconds time limit.

                   basic LDS                     max = 40%
Instance     Cost   Sol Disc   % Impr      Cost   Sol Disc   % Impr
rbg050a       430      -         0.00       424      5       37.50
rbg050b       570      -         0.00       570      -        0.00
rbg050c       563      -         0.00       545      7       66.67
rbg055a       814      -         0.00       814      -        0.00
rbg067a      1051      4        62.50      1051      7       62.50
rbg092a      1208      3        24.34      1150      8       62.50
rbg125a      1706      3        15.86      1632     20       36.83
rbg132       1882      3         9.06      1815     13       20.73
rbg132.2     2152      3         1.11      2040     31       11.47
rbg152       2371      3         6.96      2336     23       12.50
rbg152.3     2570      -         0.00      2548     42        2.13
rbg172a      2942      3         4.11      2873     31        9.90
rbg193       3360      3         4.15      3266     31       13.68
rbg193.2     3290      3         3.23      3233     45        7.84
rbg201a      3694      3         3.46      3650     24        6.29
rbg233.2     4103      3         2.12      4059     49        4.52

SNS stand-alone 2/2

Table: big instances (more than 50 cities), 300 CPU seconds time limit.

             (0%, 40%)        (10%, 50%)       (10%, 40%)       (25%, 55%)
Instance    Cost   % Impr    Cost   % Impr    Cost   % Impr    Cost   % Impr
rbg050a      424    37.50     429     6.25     429     6.25     430     0.00
rbg050b      570     0.00     570     0.00     570     0.00     570     0.00
rbg050c      545    66.67     563     0.00     563     0.00     563     0.00
rbg055a      814      -       814      -       814      -       814      -
rbg067a     1051    62.50    1048   100.00    1051    62.50    1051    62.50
rbg092a     1150    62.50    1203    27.63    1203    27.63    1217    18.42
rbg125a     1632    36.83    1682    22.66    1721    11.61    1762     0.00
rbg132      1815    20.73    1925     1.57    1916     3.14    1934     0.00
rbg132.2    2040    11.47    2063     9.34    2035    11.93    2108     5.18
rbg152      2336    12.50    2377     6.01    2365     7.91    2410     0.79
rbg152.3    2548     2.13    2555     1.45    2499     6.89    2528     4.07
rbg172a     2873     9.90    2918     6.12    2932     4.95    2991     0.00
rbg193      3266    13.68    3278    12.46    3401     0.00    3401     0.00
rbg193.2    3233     7.84    3330     0.00    3330     0.00    3330     0.00
rbg201a     3650     6.29    3748     0.00    3748     0.00    3748     0.00
rbg233.2    4059     4.52    4142     0.00    4142     0.00    4142     0.00

SNS within CP-based local branching

Table: big instances, 7,200 CPU seconds time limit, SNS used for neighborhood exploration and diversification.

                          LB Conf             SNS Conf
Instance    Ref value   Value   % Impr      Value   % Impr
rbg125a        2346      1762    62.33       1484    91.99
rbg132.2       2276      1883    32.94       1310    80.97
rbg132         2122      1934    24.67       1410    93.44
rbg152.3       2675      2397    24.47       1981    61.09
rbg152         2771      2281    49.59       1889    89.27
rbg172a        3010      2748    21.63       2198    67.05
rbg193.2       3365      3143    17.45       2739    49.21
rbg193         3440      3217    21.73       2876    54.97
rbg201a        3780      3562    13.70       3039    46.57
rbg233.2       4219      3886    17.39       3768    23.55

Conclusions and future work. SNS is a general and effective search technique to heuristically explore the neighborhood of an incumbent solution x̄ up to high discrepancy values, incorporating elements coming from LDS and LNS; experimental results support the idea that the best SNS configurations consistently obtain better results than LDS; SNS can be used both as a stand-alone search strategy and as an intensification and diversification method within a heuristic framework. Further developments: use of sampling techniques to derive promising slices to explore; introduction of learning processes to automatically tune the SNS parameters during the search process; adoption of SNS as a general neighborhood exploration tool within metaheuristic frameworks.

List of publications I

Z. Kiziltan, A. Lodi, M. Milano, and F. Parisini. CP-based local branching. In Proc. of CP 2007, LNCS 4741, pages 847–855, 2007.

Z. Kiziltan, A. Lodi, M. Milano, and F. Parisini. Bounding, filtering and diversification in CP-based local branching. Technical Report OR/10/20, DEIS, Università di Bologna, 2010.

F. Parisini. Bi-dimensional domains for the non-overlapping rectangles constraint. In ICLP, pages 811–812, 2008.

F. Parisini. Local branching in a constraint programming framework. In ICLP (Technical Communications), pages 286–288, 2010.

List of publications II

F. Parisini, M. Lombardi, and M. Milano. Discrepancy-based sliced neighborhood search. In AIMSA, pages 91–100, 2010.

F. Parisini and M. Milano. Improving CP-based local branching via sliced neighborhood search. Accepted for publication in SAC '11: Proceedings of the 2011 ACM Symposium on Applied Computing.

F. Parisini and M. Milano. Sliced neighborhood search. Submitted for publication to Expert Systems with Applications.