Parallel Machine Scheduling: A (Meta)Heuristic Computational Evaluation
MIC 2001, Porto, Portugal
Marc Sevaux, Philippe Thomin
{marc.sevaux, Philippe.Thomin}@univ-valenciennes.fr
University of Valenciennes, Dept. Production Systems, France
The parallel machine scheduling problem

Set of n jobs {J_1, ..., J_n} to be sequenced on m parallel machines.

Characteristics of job J_j
- release date r_j
- processing time p_j
- due date d_j
- weight w_j

Variables
- start time t_j (t_j >= r_j)
- completion time C_j (C_j = t_j + p_j)
- binary variable U_j (job J_j is late or not)
Conditions
- If C_j <= d_j, J_j is on time (or early, U_j = 0).
- If C_j > d_j, J_j is late (or tardy, U_j = 1).
- Each job has to be scheduled on exactly one machine.

Objective
Minimize the weighted number of late jobs, Σ_j w_j U_j, i.e., solve Pm | r_j | Σ w_j U_j.

Key remark
Late jobs can be scheduled arbitrarily after the on-time jobs, i.e., practically, not scheduled at all.
Example r j p j d j j w Job 1 2 3 4 5 0 0 2 3 4 3 6 3 4 1 5 7 8 9 7 2 5 5 4 3 M2 J2 J 4 M1 J J J 1 5 3 0 Objective value = 4 5 9 t Parallel Machine Scheduling: A (Meta)Heuristic Computational Evaluation p.4/20
Status

Complexity results
- 1 || Σ w_j U_j: NP-hard [Garey, Johnson, 79]
- Pm || Σ U_j: NP-hard [Garey, Johnson, 79]
- 1 | r_j | Σ w_j U_j: NP-hard [Lenstra et al., 77]

Previous works on Pm || Σ U_j
- Specific heuristics [Ho, Chang, 95]
- Genetic algorithm [Liu et al., 98]
- Column generation & B&B [Chen, Powell, 99]

Practical approaches for 1 | r_j | Σ w_j U_j
- Constraint programming [Baptiste, 99]
- Lagrangean relaxation [Dauzère-Pérès, Sevaux, 99]
- Genetic local search [Sevaux, Dauzère-Pérès, 00]
Floating jobs

General assumption
When J_j is scheduled on time, t_j ∈ [r_j, d_j − p_j].

Main idea of the method
t_j is not fixed, but computed when necessary according to the partial sequence on the same machine.

[Figure: J_j floats between its neighbors J_i and J_k, inside the window [r_j, d_j].]
Inserting a job

Assumption
Jobs are inserted only if they remain on time.

Earliest end time: eet_i = max(r_i, eet_pred(i)) + p_i
Latest start time: lst_k = min(d_k, lst_succ(k)) − p_k

Checking for insertion
J_j can be inserted between J_i and J_k only if max(r_j, eet_i) + p_j <= min(d_j, lst_k) holds.

[Figure: insertion window between eet_i and lst_k on the machine.]
Best insertion

Insertion
- If max(r_j, eet_pred(j)) + p_j <= min(d_j, lst_succ(j)) holds, J_j can be inserted at this position.

Best insertion
- All candidate positions are scanned,
- J_j is inserted where the remaining idle time between pred(j) and succ(j) is minimum.
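The eet/lst feasibility test can be sketched as follows (a hedged illustration, not the authors' implementation; `eet_lst`, `can_insert`, and the example job data are assumed names). A forward pass gives earliest end times, a backward pass gives latest start times, and insertion at a position is feasible exactly when the slide's inequality holds.

```python
# Job data from the example slide: job -> (r_j, p_j, d_j, w_j).
jobs = {
    1: (0, 3, 5, 2), 2: (0, 6, 7, 5), 3: (2, 3, 8, 5),
    4: (3, 4, 9, 4), 5: (4, 1, 7, 3),
}

def eet_lst(seq, jobs):
    """Earliest end times (forward) and latest start times (backward)."""
    eet, t = [], 0
    for j in seq:
        r, p, d, w = jobs[j]
        t = max(t, r) + p
        eet.append(t)
    lst, s = [0] * len(seq), float("inf")
    for idx in range(len(seq) - 1, -1, -1):
        r, p, d, w = jobs[seq[idx]]
        s = min(d, s) - p
        lst[idx] = s
    return eet, lst

def can_insert(j, pos, seq, jobs):
    """True if J_j fits at position pos (0 = front) while staying on time."""
    eet, lst = eet_lst(seq, jobs)
    r, p, d, w = jobs[j]
    left = eet[pos - 1] if pos > 0 else 0
    right = lst[pos] if pos < len(seq) else float("inf")
    return max(r, left) + p <= min(d, right)
```

With the example data, J5 fits between J1 and J3 on M1 (`can_insert(5, 1, [1, 3], jobs)` is `True`), while appending J4 after [J1, J5, J3] is infeasible, matching the slide's schedule where J4 is late. The best-insertion rule would then rank the feasible positions by remaining idle time.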
Heuristics

Use a specific sort of the jobs, then apply the best-insertion procedure.
- WSPT (weighted shortest processing time): jobs sorted by the ratio w_i/p_i (BWS).
- RS (random shuffle): random order (BRS).
- Other heuristics: EDD, LPT, SPT, etc.

Only BWS is really efficient.
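A one-line sketch of the WSPT ordering on the example data, assuming the standard non-increasing w_i/p_i order (the slide's exact sort direction is not fully recoverable from the source):

```python
# Job data from the example slide: job -> (r_j, p_j, d_j, w_j).
jobs = {
    1: (0, 3, 5, 2), 2: (0, 6, 7, 5), 3: (2, 3, 8, 5),
    4: (3, 4, 9, 4), 5: (4, 1, 7, 3),
}

# WSPT: largest w_i/p_i first (assumed standard direction).
wspt = sorted(jobs, key=lambda j: jobs[j][3] / jobs[j][1], reverse=True)
print(wspt)  # [5, 3, 4, 2, 1]
```

The BWS heuristic would then feed this order to the best-insertion procedure of the previous slide.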
A common neighborhood

Only on-time jobs are scheduled on the machines.
Neighborhood: exchanges between early jobs and late jobs.

Parameters
- Removing one early job: always feasible.
- One late job ↔ 0 to l early jobs.

Method
- Scan all possible exchange points,
- Put them in an ordered list.
Complexity: O(n^2).
Different approaches

Using the common neighborhood, heuristics and metaheuristics are evaluated.

Initial solutions: EMP (empty), BWS, BRS.

Greedy heuristics
- Descent
- Deepest descent
- Multistart deepest descent

Metaheuristics
- Simulated annealing
- Tabu search
Descent heuristics

Starting from an initial solution or from an empty solution.

Simple descent: choose the first neighbor that decreases the objective value.
Deepest descent: choose the neighbor that most decreases the objective value. Three runs of the descent method are done.
Multistart descent: the BRS heuristic is used to generate starting points, 1000 times.
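The two descent rules differ only in which improving neighbor they accept. A minimal generic sketch (the `descent` helper and its toy usage are illustrative, not the authors' code; the neighborhood and cost function are assumed given):

```python
def descent(sol, neighbors, cost, deepest=False):
    """Repeat until no neighbor improves the objective.
    Simple descent takes the first improving neighbor;
    deepest descent takes the most improving one."""
    while True:
        improving = [n for n in neighbors(sol) if cost(n) < cost(sol)]
        if not improving:
            return sol          # local optimum reached
        sol = min(improving, key=cost) if deepest else improving[0]

# Toy usage: minimize x^2 over the integers, neighbors are x-1 and x+1.
best = descent(5, lambda x: [x - 1, x + 1], lambda x: x * x)
print(best)  # 0
```

Multistart descent would simply call `descent` from many random (BRS) starting solutions and keep the best result.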
Simulated annealing

Stopping condition
10000 iterations without improvement.

Initial temperature
Init = 20; every 100 iterations, decrease by 5%.
Automatic initial temperature (experimental).

Reheating
After 100 iterations with no improving neighbors, reheat the temperature to 50% of the initial value.
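The cooling-plus-reheating schedule above can be sketched as a small state machine (a hedged illustration with assumed names and defaults taken from the slide; the authors' implementation details are not given):

```python
class Schedule:
    """Geometric cooling: start at 20, multiply by 0.95 every 100 iterations;
    reheat to 50% of the initial temperature after 100 non-improving moves."""

    def __init__(self, t_init=20.0, period=100, alpha=0.95,
                 reheat_after=100, reheat_frac=0.5):
        self.t = t_init
        self.t_init, self.period, self.alpha = t_init, period, alpha
        self.reheat_after, self.reheat_frac = reheat_after, reheat_frac
        self.it = self.stagnant = 0

    def step(self, improved):
        """Advance one SA iteration and return the current temperature."""
        self.it += 1
        self.stagnant = 0 if improved else self.stagnant + 1
        if self.it % self.period == 0:
            self.t *= self.alpha                     # periodic cooling
        if self.stagnant >= self.reheat_after:
            self.t = self.reheat_frac * self.t_init  # reheating
            self.stagnant = 0
        return self.t
```

After 100 improving iterations the temperature is 20 × 0.95 = 19; a further 100 stagnant iterations trigger the reheat back to 10.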
Tabu search

Stopping condition
10000 iterations without improvement.

Tabu criteria
- complete schedule,
- weights of the late jobs,
- triplet (weights, processing times, indices) of the late jobs.

Cycle detection and dynamic tenure
- tenure too small → cycles; tenure too large → blocking,
- block detected → decrease tenure by one,
- cycle detected → increase tenure by the cycle length.
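The dynamic-tenure rule reduces to two adjustments; how cycles and blocks are detected is not detailed on the slide, so the sketch below assumes the detection outcome is given (function and parameter names are illustrative):

```python
def adjust_tenure(tenure, cycle_len=0, blocked=False, t_min=1):
    """Dynamic tabu tenure per the slide's rule:
    a detected cycle grows the tenure by the cycle length;
    a blocked search (tenure too large) shrinks it by one."""
    if cycle_len > 0:
        tenure += cycle_len
    elif blocked:
        tenure = max(t_min, tenure - 1)
    return tenure

print(adjust_tenure(7, cycle_len=3))   # 10
print(adjust_tenure(7, blocked=True))  # 6
```

A floor of `t_min` keeps the tenure from vanishing entirely during long blocked phases.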
Numerical experiments

A set of 2700 instances is given by Baptiste et al.
About 1200 optimal solutions are known.
No solution is given by Baptiste et al. for instances with strictly more than 50 jobs within 10 minutes.
Additional optimal solutions are found using MIP formulations.
Descent heuristics

                     Simple        Deepest descent          MD
                     BRS       EMP      BWS      BRS
Opt. hits (%)        46        45       49       45         78
First pos. (%)       33        33       35       32         63
Gap to opt. (%)      19.5      21.5     19.6     20.3       3.9
Av. CPU time (s)     < 0.01    0.02     < 0.01   < 0.01     4.58
Max. CPU time (s)    0.05      0.12     0.02     0.05       41.1
Simulated annealing vs tabu search

                     Simulated annealing        Tabu search
                     EMP     BWS     BRS        EMP      BWS      BRS
Opt. hits (%)        70      70      70         89       88       88
First pos. (%)       55      54      55         79       76       79
Gap to opt. (%)      6.72    6.59    5.98       1.94     2.00     1.95
Av. CPU time (s)     9.10    7.68    8.87       10.63    9.42     10.16
Max. CPU time (s)    60.3    53.2    63.3       101.9    112.0    107.5
Overall comparison - Init. sol. = BWS

[Plot: objective value f(x) (60-200) vs. CPU time (0-20 s); curves DS (62), SA (57), TS (53).]
Overall comparison - Best solutions - Zoom

[Plot: objective value f(x) (52-70) vs. CPU time (0-10 s); curves DS (62), SA (57), TS (53).]
Perspectives and conclusion

Improve the method
- Find new lower bounds (linear relaxation of MIP models),
- Develop the automatic SA initial temperature.

Develop an exact method
- Lagrangean relaxation algorithms,
- based on the master sequence.

Valid extensions of the model
- Uniform parallel machines, Qm | r_j | Σ w_j U_j
- Unrelated parallel machines, Rm | r_j | Σ w_j U_j