Evaluation of School Timetabling Algorithms


Evaluation of School Timetabling Algorithms

Viktor Lindberg

June 8, 2016

Master's Thesis in Computing Science, 30 credits
Supervisor at CS-UmU: Mikael Rännar
Examiner: Henrik Björklund

Umeå University
Department of Computing Science
SE UMEÅ SWEDEN


Abstract

Most schools face the problem of organising the meetings between students and teachers into lectures and placing these lectures in a timetable. Four different algorithms that can be used to solve this problem are evaluated in this thesis: Simulated Annealing, Particle Swarm Optimisation, Hyper-Heuristic Genetic Algorithm and Iterated Local Search. The thesis describes the algorithms and evaluates them by running them on a set of known timetabling problems and comparing their results with each other, to find out which algorithm is best suited for use in a potential end-user application. Simulated Annealing combined with Iterated Local Search gave the best results in this thesis.


Acknowledgements

I would like to thank Rickard Lindberg at Dohi for presenting the interesting subject of the timetabling problem to me and giving me the opportunity to do my thesis work at Dohi. I would also like to thank Karl A-Thunberg at Dohi and my Umeå University supervisor Mikael Rännar for giving valuable feedback during this project.


Contents

1 Introduction
  1.1 Report outline
  1.2 Problem description
    School timetabling problem
    Successful timetabling
    Complexity
  1.3 Goal
  1.4 Methods
  1.5 Related Work
    Simulated Annealing
    Hyper-heuristics
    Swarm algorithms
    Matheuristics
    Earlier work - Conclusions

2 Method
  2.1 XHSTT
    Time
    Resources
    Events
    Groups
    Constraints
    Solutions
  2.2 Algorithm descriptions
    Heuristics
    Simulated Annealing
    Particle Swarm Optimisation
    Hyper-Heuristic Genetic Algorithm
    Iterated Local Search
  2.3 Implementation of algorithms
    Initial solution
    Simulated Annealing
    Particle Swarm Optimisation
    Hyper-Heuristic Genetic Algorithm
    Iterated Local Search

3 Results
  Fitness
  Testing environment
  The timetable problems
  Simulated Annealing
  Particle Swarm Optimisation
  Hyper-Heuristic Genetic Algorithm
  Iterated Local Search
  Comparison

4 Conclusions
  Goals
  Limitations
  Future work

References

Appendices

A Benchmark results
  A.1 Simulated Annealing
  A.2 Particle Swarm Optimisation
  A.3 Hyper-Heuristic Genetic Algorithm
  A.4 Iterated Local Search

Chapter 1

Introduction

In all schools, from small to large, timetables that organise the meetings between teachers and students are a requirement for a functioning school environment. Timetables are used beyond schools as well, for example for university courses, exam scheduling and in other industries. They are an important tool for many organisational matters but can be difficult to create manually when many resources (e.g. students and teachers) come into play. The problem of generating timetables for schools is known as the school timetabling problem and is known to be NP-complete for most larger practical instances[9, 6]. Because of its complexity, researchers are trying different techniques to solve this problem, and today most of them lie within the artificial intelligence realm. This thesis evaluates some of the algorithms used to solve the school timetabling problem and tries to find the most suitable one for generating a timetable from a set of teachers, students, rooms and lectures with a set of constraints applied to it.

1.1 Report outline

This section gives a general outline of the chapters in the report.

Chapter 1: An introduction to the timetabling problem along with some of the earlier research on the subject.

Chapter 2: A description of the XML format used to describe the timetabling problems, and descriptions of the tested algorithms together with their implementations.

Chapter 3: The results of applying the algorithms to a set of timetabling problems.

Chapter 4: Conclusions and thoughts about the results together with limitations and possible future work.

Appendices: The appendices contain the complete results from the tests performed on the algorithms described in this thesis.

1.2 Problem description

As mentioned earlier, the focus of this thesis is the creation of timetables starting from a number of known resources and constraints. The problem of creating timetables is generally

divided into three different types[21]:

(High) school timetabling: Scheduling the classes of a school such that teachers and classes do not occupy the same time slot more than once.

Exam timetabling: Assigning exam times for university students and spreading out the exams for each student.

Course timetabling: Scheduling the courses of university students such that a student's different courses do not share time slots.

School timetabling problem

The problem for this thesis fits into the school timetabling type. It usually starts with a list of resources such as teachers, students and rooms. Then a set of events (e.g. lectures) is given, each containing some resources, and the problem is to assign to the events any missing resources and a time slot in the timetable. The assignment is governed by constraints, which are limitations on different factors. For example, one constraint can be that no resource may occupy the same time slot more than once (a teacher cannot teach two classes at the same time).

Successful timetabling

If all events have been assigned a time slot, have all their required resources and do not violate any constraints, the result is an optimal timetable. If some of the non-required constraints are violated, it will still be a feasible timetable, but of worse quality. The goal for a successful timetable is to have all its required constraints fulfilled and as many of the optional constraints as possible.

Complexity

As stated earlier, the timetabling problem is a hard problem that is in many real instances NP-complete[9, 6]. Cooper and Kingston[6] showed that the timetabling problem is NP-complete in five independent ways that occur in practice. They go on to say that, because of the limited size of schools, the construction of timetables should become feasible with time as new ways of dealing with the problem are invented and computing power increases. Because of this complexity, the problem often becomes an optimisation problem: to find the best solution possible within a feasible time frame.

1.3 Goal

The aim of this thesis is to find and evaluate algorithms that solve the timetabling problem, to see if any of the algorithms can be combined or improved, and to determine which algorithm would work best in an application where a school wants to create timetables for its students and teachers.

1.4 Methods

To evaluate the algorithms, existing data from various schools were used. This data is provided by a benchmarking project for school timetabling[1]. The data is contained in XML files that use a somewhat complicated structure that will be explained in more detail in Section 2.1. The algorithms were implemented in C# and evaluated by having them solve several different timetabling problems and examining how good the results were and how fast they were able to reach these results. More information and documentation can be found on the homepage of the project[1]. A timetable problem basically contains a list of resources, events and constraints. The relationships between these govern how a correct timetable will be created.

1.5 Related Work

While the school timetabling problem is not as popular with researchers as the other timetabling types, quite a few studies have been made. Many of them research specific algorithms for solving the school timetabling problem. Some of these algorithms are Simulated Annealing (SA)[20], Tabu Search[21], Genetic Algorithms[20], Evolutionary Algorithms[8], Iterated Local Search[7] and Neural Networks.

Simulated Annealing

Several studies have compared these algorithms and reached different conclusions. For example, competitions have been held for the timetabling problem, the third and latest of which[18] had five teams competing against each other using different techniques and algorithms. The winner was GOAL by Fonseca et al.[7], which started out by using an open source timetabling solver called KHE[3] to get an initial solution, then used SA and Iterated Local Search to search for a solution around this initial solution. These algorithms used a range of heuristics, such as swapping two events or swapping two resources for an event, and which heuristic to use was chosen based on preset probabilities. Since they were the winners and found the best solutions for most of the problems, it can be said that their solver performed satisfactorily.

Zhang et al.[24] created a modified SA algorithm that differs in how it selects its neighbour solutions (see the section on how SA works). They compared their algorithm to another version of SA, a neural network algorithm and a local search heuristic using five different datasets. While all of them managed to find optimal solutions, the modified SA algorithm found the solutions much quicker than the rest.

Hyper-heuristics

The runner-up in the third timetabling competition was HySST[13], which used a hyper-heuristic algorithm. It also starts by generating an initial solution using the KHE[3] solver. It then searches starting from this solution using 9 different mutational operators. If one of these finds a better solution it continues to the next iteration, otherwise it uses hill-climbing algorithms[20] to find a local optimum before continuing. It did not perform as well as GOAL but beat the other candidates and got the best solution on some of the problems.

Another solution by Pillay[16] combined hyper-heuristics with an evolutionary algorithm. They managed to find feasible solutions to the two problems they

tested their algorithm on. It performed better than implementations of tabu search and greedy search on the same problem, and about the same as neural networks and SA.

A newer Hyper-Heuristic Genetic Algorithm was developed by Raghavjee and Pillay[19]. They looked at earlier research into genetic algorithms and found that the most effective mutation operators were mostly problem specific. Because of this, they tested whether a hyper-heuristic algorithm could help in automatically selecting the best low-level heuristics for a given problem, letting a combination of heuristics evolve to best solve the timetabling problem. They tested their implementation on 14 different data sets and managed to find feasible solutions for all of them. While they produced timetables of better quality than a genetic algorithm, they were outperformed by an SA algorithm by Zhang et al.[24] and a Particle Swarm Optimisation (PSO) algorithm by Katsaragakis et al.[11], with the PSO algorithm performing the best.

Swarm algorithms

In a more recent study by Katsaragakis et al.[11], Particle Swarm Optimisation (PSO) and artificial fish swarm (AFS) were tested on the same school timetabling problem and compared to each other. They found that both performed very well and found new best solutions to some of the problems they chose. The two algorithms were also compared to other algorithms that had been tested on the same datasets: a modified SA algorithm by Zhang et al.[24], evolutionary computation by Beligiannis et al.[4] and a genetic algorithm by Raghavjee et al.[19]. Both PSO and AFS performed better than these algorithms on the same datasets.

Chu et al.[5] applied PSO to the examination timetabling problem. Their implementation was successful in creating feasible timetables for their datasets. They suggest that parallelising PSO or combining it with tabu search could yield better results.

Matheuristics

Recently, a new approach to solving the timetabling problem has been investigated by Fonseca et al.[10]. They started with an initial solution obtained by the aforementioned KHE solver, and then proceeded to improve the solution with a variable neighbourhood search (VNS) algorithm. When the VNS algorithm started to stagnate, the solver switched to a matheuristic algorithm to improve the solution even further. This process proved very successful and managed to improve the best known solutions for several timetabling problems from the XHSTT web page[1].

Earlier work - Conclusions

SA is the most frequently occurring algorithm in earlier work and seems to work well on many different timetabling problems. Among all the SA implementations, the one by the GOAL team seems most promising because of its good results and its use of more complicated datasets. The only caveat is that they do not use SA alone: they start with another solver, then use SA and finish up with Iterated Local Search. So their results do not show how good SA is on its own, but how good it is in collaboration with other algorithms. Because the algorithms have access to many different heuristics when creating neighbourhood solutions, it might be difficult to know how much to use each heuristic. Because of this, the Hyper-Heuristic Genetic Algorithm could work very well as opposed to dividing the heuristics at set percentages.

The PSO algorithm presents an interesting modern approach that has yielded good results, as evidenced by more than one research paper.


Chapter 2

Method

The focus of the thesis is to find suitable algorithms that can solve the school timetabling problem. This chapter gives a more in-depth description of four candidate algorithms and the methods used to evaluate them. All of these algorithms create neighbouring solutions with the same set of heuristics, described later in this chapter. The Hyper-Heuristic Genetic Algorithm uses these heuristics differently from the rest of the algorithms by building genes of heuristics to apply to the timetables.

2.1 XHSTT

In order to benchmark the different algorithms, example timetables were needed. Luckily, there exists a benchmarking project for school timetabling that provides many different datasets for just this occasion. This project is located on a website[1] hosted by the University of Twente and maintained by Gerhard Post. Here, researchers in the timetabling field can exchange datasets with each other and upload solutions to the datasets. The datasets are contained within an XML format called XHSTT (XML High School Timetabling) that specifies the resources and lectures of a timetable, as well as the constraints and solutions of that timetable. The main components in the format are times, resources, events and constraints.

Time

The timetable contains a fixed number of time slots available to assign events to. These time slots can contain several events in parallel as long as none of these events contain the same resources. There are also other constraints that specify which events can be contained in the same time slot. The times usually make up one week, but this is up to the school, and there are instances where, for example, two weeks make up a timetable so that odd and even weeks have different schedules.

Resources

The resources are typically teachers, classes and rooms. These resources can be of different types and some events might need specialised types. For example, a chemistry class might need a chemistry room in order to do experiments, or a math class needs a math teacher.

These types are specified with roles that are used by constraints to force events to use these specialised types.

Events

The events are usually lectures that contain resources and are, or should be, assigned to one or more time slots. They are essentially containers of resources that need to adhere to specific constraints. A finished timetable is a list of events contained inside time slots. The events specify a list of preassigned resources and unassigned resources, where the unassigned resources are given a resource type and a role. The role is there so that constraints can specify which resources are valid for the given roles, for instance if an event has a preferred resource.

Groups

Times, resources and events can be part of different groups (e.g. days or weeks for times, or classes for students) in order to more easily refer to many items in the XML documents.

Constraints

The constraints are divided into hard constraints, which have to be fulfilled in order for a timetable to be feasible, and soft constraints, which should be fulfilled as far as possible. A hard constraint can be, for example, that all events must be assigned one time slot, or that one teacher cannot be part of more than one event that occupies the same time slot (i.e. cannot teach two classes at the same time). An example of a soft constraint can be a teacher's preferred working hours, which should be fulfilled if possible but might not be necessary to get a working timetable. There are currently sixteen constraints specified in the XHSTT format and twelve of them were implemented for this thesis. The twelve that were implemented were chosen because they covered the most data sets on the XHSTT project website. Each constraint specifies a cost function and a weight that are used to calculate the cost of not meeting the constraint. The twelve implemented constraints are listed below.

Assign times constraint: All events must be assigned to some time.

Avoid clashes constraint: The specified resources cannot occupy one time more than once.

Split events constraint: How much an event can be divided into smaller events and what minimum or maximum duration each sub-event can have.

Prefer resources constraint: Which resource (that is not preassigned) the specified events prefer.

Prefer times constraint: Specified events prefer some times to others.

Spread events constraint: Events should be spread out in time (e.g. not all math classes on the same day).

Assign resource constraint: Events should not have any unassigned resources.

Avoid unavailable times constraint: Some resources can be unavailable at certain times.

Limit idle times constraint: Places limitations on how often resources can be idle. A resource is idle during a time group if it occupies an earlier and a later time during that time group.

Limit busy times constraint: Limits how many times a resource can occupy during one time group.

Cluster busy times constraint: Limits the number of time groups during which a resource can be busy.

Link events constraint: Specifies that some events should be assigned to the same time slot.

Solutions

The XML documents can also specify solutions for the given problem. These contain a list of all the events with their assigned times and any resource assignments that they require. The solutions can be checked for correctness by other researchers' evaluators, or simply shared.

2.2 Algorithm descriptions

This section gives a more in-depth explanation of the more interesting algorithms and the heuristics they use. The algorithms were chosen based on how well they performed in earlier work described in Section 1.5 and on how well they could be implemented in a limited amount of time. All of the algorithms in this section are iterative optimisation algorithms that follow the same base premise: seek a better solution starting from some initial solution. As they find better solutions they might get stuck in local optima, so they sometimes have to accept worse solutions that can, in the long run, lead to better ones. The algorithms always assume that there are better solutions as long as the timetable violates any of the constraints. There is, however, the possibility that a timetable's best solution violates some constraints, but the algorithms have no such knowledge. Because they might never find a better solution while heading off in a bad direction, they always save the best solution found so far to have something to fall back on. This problem can be imagined with the following metaphor:

Old man Willy

A blind old man called Willy wants to climb a mountain and can only check his progress at each step by sensing, with his height-sensing ability, whether his altitude has changed. Now he might think he has reached the top because every step he takes changes his altitude negatively, but he is in fact standing on a plateau much further down from the top. To reach the top he would sometimes have to chance moving down a slope to find an incline that can lead him even higher.

Heuristics

Below is listed a range of heuristics that the algorithms can use to generate different candidate timetables. These make up the basis for all algorithms since these heuristics are

the only way for them to change the timetables. If not stated otherwise, the algorithms randomly choose one of these heuristics each time they create a neighbour solution.

Move event: Moves a random event to a random valid time slot.

Switch events: Switches the time slots of two random events if they have time slots that can be interchanged.

Split event: Splits an event with a duration larger than one time unit and moves the parts into random valid time slots.

Merge events: Takes smaller parts of the same event, merges them into one big event and moves this event into a valid time slot.

Change resource: Changes a randomly chosen resource of an event into a new random resource.

Swap resources: Swaps the resources of two random events, where the resource must fill the same role for the two events.

Kempe move: Creates a conflict graph between the lectures in two random time slots, finds a Kempe chain within the graph, then swaps events according to this chain.

When creating a neighbour solution the heuristics are chosen based on predetermined percentages. Not all heuristics are used on every type of timetable, depending on its constraints. For example, timetables that do not require resources to be assigned to lectures do not use the swap resources and change resource heuristics, and timetables that do not contain split or spread events constraints do not use the split and merge events heuristics. For example, a problem that contains split constraints but has no required resource assignments might choose heuristics according to the following percentages: 50% Switch events, 20% Move event, 15% Split event, 10% Merge events and 5% Kempe move.

Kempe chain

As described by Fonseca et al.[7], the Kempe move heuristic is much more complex than the other heuristics used in this thesis. Sometimes a better solution might lie further away from the current solution than a normal heuristic can take it. One attempt to find such a solution is to use a chain of moves (event swaps) called a Kempe chain. It starts by choosing two random time slots and creating two sets of vertices, where the vertices are the lectures contained in the time slots. It then creates an undirected bipartite conflict graph between these two sets of vertices, where there is an edge between two vertices in different sets if they share a resource. For example, in Figure 2.1 there are two chains starting in lecture EngA: EngA-CheB and EngA-BioA-MathB-EngB. Here the second chain is the longest and is chosen to have its events interchanged, as seen in the second part of Figure 2.1. The longest chains are found using a breadth-first search starting in each lecture, and when a chain is picked it is used to create a new timetable that is evaluated to see how good this chain is. This is repeated for all the lectures in the two time slots and the best resulting timetable is saved.
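The conflict graph and chain search just described can be sketched as follows. This is an illustrative Python sketch, not the thesis's C# code; the lecture names and the shared teacher resources (T1-T4) are hypothetical data chosen so that the edges reproduce the chains of Figure 2.1, and the breadth-first search is a simplification that visits each vertex once.

```python
from collections import deque

def conflict_graph(slot_a, slot_b):
    """Bipartite conflict graph between two time slots: lectures are
    (name, resources) pairs; an edge links lectures in different slots
    that share a resource."""
    edges = {name: [] for name, _ in slot_a + slot_b}
    for a_name, a_res in slot_a:
        for b_name, b_res in slot_b:
            if set(a_res) & set(b_res):
                edges[a_name].append(b_name)
                edges[b_name].append(a_name)
    return edges

def longest_chain(edges, start):
    """Breadth-first search from `start`, keeping the longest path seen."""
    seen = {start}
    best = [start]
    queue = deque([[start]])
    while queue:
        path = queue.popleft()
        if len(path) > len(best):
            best = path
        for nxt in edges[path[-1]]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return best

# Hypothetical resource assignments reproducing the chains of Figure 2.1.
slot_1 = [("EngA", {"T1", "T2"}), ("MathB", {"T3", "T4"})]
slot_2 = [("CheB", {"T1"}), ("BioA", {"T2", "T3"}), ("EngB", {"T4"})]
```

Running `longest_chain(conflict_graph(slot_1, slot_2), "EngA")` returns the chain EngA-BioA-MathB-EngB, matching the longer chain of Figure 2.1; the events along it would then be swapped between the two time slots and the resulting timetable evaluated.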

Figure 2.1: Finding the longest conflict chain of lecture EngA.

Simulated Annealing

Simulated Annealing is a heuristic local-search algorithm that tries to mimic the process of annealing of metals. In computer science it is used to solve optimisation problems where a perfect solution might not be possible to find. The process of annealing (in the real world) is the heating of a material followed by slow cooling, in order to change its chemical and physical properties. The algorithm simulates this by starting with an initial solution and temperature. The initial solution can be completely random or generated by some other algorithm or method. At each iteration of the algorithm, a random neighbour solution is generated, and this new solution is selected as the current one if it is better. If this were all it did, it would work exactly like hill-climbing search[20, p. 111] and could easily get stuck in a local optimum. In Figure 2.2 the hill-climbing search tries to find the highest point of a mountain and simply moves up until it cannot move any higher. As seen in the figure, the highest point it reached is not the highest point of the mountain but only a locally highest point. This is where the annealing analogy comes in: when selecting the best neighbour solution, there is a chance the algorithm will pick a worse solution, based on the temperature (which is lowered each iteration). This way the algorithm can escape local optima and gets a higher chance of finding a better solution or the global optimum. If, in Figure 2.2, it was the SA algorithm that did the mountain climbing and reached the locally highest top, it would have a possibility of moving down the mountainside in order to start climbing towards the globally highest point of the mountain.

Figure 2.2: Hill-climbing: visualising its flaw of getting stuck in a local optimum.

When checking whether it should select a worse solution, SA uses an acceptance function that can differ between implementations. For example, the acceptance function can be e^(-Δ/temperature), where Δ is the difference in fitness between the neighbour solution and the current solution. This is compared to a random number, usually between 0 and 1, and if this number is lower than the acceptance function the algorithm will accept the worse solution. This way, the higher the temperature, the higher the chance that a worse solution will be selected. Simulated Annealing is a popular algorithm when it comes to solving the school timetabling problem. It has been used successfully a number of times in earlier work[7, 24] with good results.

Old man Willy

Old blind man Willy started out his journey by drinking a bottle of whisky. Being very drunk, Willy sometimes starts walking in a random direction and might walk down a slope instead of up. This might actually help Willy when he is standing on a plateau that is not the top of the mountain. Eventually Willy sobers up and starts to only walk in directions where his altitude changes positively.

Particle Swarm Optimisation

Particle Swarm Optimisation is an increasingly popular[17] algorithm first invented by Eberhart and Kennedy[12], aimed at creating computational intelligence based on social interaction instead of individual work. It was based on how birds move in a flock to search for corn, and later they realised how well it worked on optimisation problems. Continuing the bird flock analogy, birds in a flock that search for food will circle an area, and when one bird smells food it will call the other birds so the flock converges towards the source of the food. This continues each time a bird gets closer to the food, until one bird finds the source. This is what PSO tries to simulate. The particles in PSO search the search space of a problem individually. Each particle also keeps track of where its personal best solution is in the search space. When they are done searching, they update their positions based on the current result, their previous personal best result and the global best result found among all other particles. This way all particles fly through the search space and follow the particles containing

the best results. They work together to update the globally best result so that the swarm converges towards an optimal solution. Each particle starts at a different position in the search space in order to have a better chance of finding the global optimum. These starting positions are random and might not be completely spread out through the search space, but according to Venter and Sobieszczanski-Sobieski[23] the positions do not affect the effectiveness of PSO, since the swarm changes dynamically during the search. So that the swarm does not get stuck in local optima, the particles do not always follow the global best result, letting each particle explore its own region of the search space.

Old man Willy

If blind man Willy had a group of friends (who are unfortunately also blind) who wish to climb the mountain with him, they could help each other out. They all start walking in different random directions, and when one blind person senses with his height-sensing ability that he has reached a higher point, he calls out to the others, who will now start to move in his direction. But since old men are stubborn, they might not always listen and instead continue in their own direction. This goes on until another person finds an even higher point, and they finally reach the top or get tired and give up.

Hyper-Heuristic Genetic Algorithm

A hyper-heuristic is a heuristic search method that tries to select or combine lower-level heuristics to better solve a problem. The other algorithms in this chapter select their low-level heuristics by randomly picking one each time they want to generate a new solution. This algorithm instead seeks to evolve which low-level heuristics to use with a genetic algorithm (GA). The GA is used to evolve chromosomes that represent a list of heuristics. Each chromosome is a list of letters where each letter represents a heuristic.
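Such a chromosome can be sketched as a string of letters together with a table mapping each letter to a low-level heuristic. The letter codes and the stub heuristic and fitness functions below are hypothetical illustrations; the thesis does not specify its exact encoding, and the real implementation is in C#.

```python
import random

# Hypothetical gene letters for the low-level heuristics of this chapter.
GENES = {"m": "move event", "s": "switch events", "p": "split event",
         "g": "merge events", "c": "change resource",
         "w": "swap resources", "k": "kempe move"}

def random_chromosome(length, rng):
    """A chromosome is simply a string of heuristic letters."""
    return "".join(rng.choice(sorted(GENES)) for _ in range(length))

def evaluate(chromosome, timetable, apply_heuristic, fitness):
    """Apply each gene's heuristic in order, then score the result."""
    for gene in chromosome:
        timetable = apply_heuristic(GENES[gene], timetable)
    return fitness(timetable)
```

With real heuristic and fitness functions plugged in, `evaluate("msk", timetable, apply_heuristic, fitness)` would apply a move, a switch and a Kempe move to the timetable before scoring it, which is how the quality of a chromosome is judged.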
The chromosomes are evaluated by applying them to a solution and then evaluating the resulting solution. The evolutionary process starts with an initial population of chromosomes that are evaluated. Then, based on the evaluation, parents are selected using a selection process, and genetic operators are applied to these parents to produce offspring that will make up the next generation of chromosomes.

Old man Willy

Blind old man Willy brings his whole village with him on the mountain climb. Everyone in the village has the same problem of being blind, and while they make their way up the mountain some people get lost and are never seen again. The ones left have children with each other, and this new generation is less likely to get lost because of the better height-sensing ability inherited from their parents. This is repeated while the whole village makes its way to the top of the mountain.

Selection

There are different selection processes that can be used for selecting new parents in a genetic algorithm. The selection processes use selection pressure to favour the better chromosomes

to be picked as parents. The higher the selection pressure, the more the better chromosomes are favoured. The convergence rate is also tied to the selection pressure: a low selection pressure can lead to very slow convergence, while a high selection pressure can cause the GA to converge too fast and settle on a bad solution. One of the more popular selection processes is tournament selection[15], which performs the following steps to select the parents for the new generation:

Select x random competitors from the chromosomes for the tournament.

Generate a random number z between 0 and 1. If the number is above a set threshold, the best chromosome is chosen; otherwise a worse chromosome is chosen.

Add the winner to a mating pool and start a new tournament. This continues until the mating pool is big enough.

The threshold for choosing the best chromosome is usually larger than 0.5 to favour the better chromosomes. The threshold also influences the selection pressure.

Crossover

When two parents have been selected to produce offspring, crossover can be used to create the offspring. Crossover is the combination of data from two chromosomes into two new chromosomes. First, one or more crossover points are randomly selected; then the data between the crossover points in the parents are interchanged to create the offspring. A visualisation of this can be seen in Figure 2.3 for a crossover with one crossover point and in Figure 2.4 for two crossover points. Multiple crossover points are used when some combinations cannot be achieved with a single crossover point[14].

Figure 2.3: Crossover with one crossover point.

Figure 2.4: Crossover with two crossover points.
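The crossover of Figures 2.3 and 2.4 can be sketched as follows, assuming (as an illustration, not the thesis's C# data model) that chromosomes are strings of heuristic letters:

```python
import random

def crossover(parent_a, parent_b, points, rng):
    """Swap the segments between `points` randomly chosen crossover
    points, producing two offspring (one point: Figure 2.3; two
    points: Figure 2.4)."""
    cuts = sorted(rng.sample(range(1, len(parent_a)), points))
    child_a, child_b = list(parent_a), list(parent_b)
    swapping = False
    prev = 0
    for cut in cuts + [len(parent_a)]:
        if swapping:  # exchange this segment between the two children
            child_a[prev:cut], child_b[prev:cut] = \
                child_b[prev:cut], child_a[prev:cut]
        swapping = not swapping
        prev = cut
    return "".join(child_a), "".join(child_b)
```

With one crossover point each child keeps its own prefix and takes the other parent's suffix; with two points only the middle segment is exchanged, which covers combinations a single cut cannot produce.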

Mutation

To diversify the population, mutation can be introduced to make random changes to certain individual chromosomes. The chance for a mutation to occur is usually set low. If a mutation occurs, the selected heuristic in the chromosome is replaced with a new randomly chosen heuristic.

Iterated Local Search

Iterated Local Search (ILS)[22] is a local optimisation algorithm that, like the previous algorithms in this chapter, tries to escape local optima. It starts by modifying the best solution so that the search can continue from a different location. This modification can be done in different ways; one way is to iteratively find solutions further and further away from the best solution. When the solution has been modified, a local search is performed on the modified solution. This search can also vary between implementations, and is used to find better solutions around the modified solution. As described earlier in Figure 2.2, the hill climbing algorithm can easily get stuck in a local optimum, so ILS, as seen in Figure 2.5, tries to escape this optimum by making a big leap that will hopefully land it in the vicinity of a better solution.

Figure 2.5: Visualising the ability of Iterated Local Search to escape local optima.

Old man Willy

Old man Willy has grown wings. Being blind, he is very hesitant to use them and only flies a short distance before continuing on foot. When he does not manage to get any higher he gets frustrated and flies an even longer distance. Eventually he may hit a point on the mountain where he can get higher and escape any plateau he might have been stuck on.

2.3 Implementation of algorithms

The algorithms, as well as everything related to the XHSTT format, were implemented in C#. All algorithms share the same XHSTT implementation, so they all use the same functions for calculating the fitness of the generated timetables. They also share functions for generating

random neighbours, except for the genetic algorithm, which generates neighbours based on its chromosomes.

Initial solution

Before the algorithms start to mold a problem into an acceptable solution, an initial solution is constructed randomly by first assigning random resources to the events and then assigning the events to random time slots. This initial solution is the starting point from which the algorithms begin their work. Neighbour solutions are created randomly with the heuristics described in the Heuristics section.

Simulated Annealing

The implementation used in this thesis is based on the algorithm devised by Fonseca et al.[7]. The algorithm works as described earlier, but for each iteration and temperature it creates a set number of neighbours. Each neighbour is evaluated and selected as described in the algorithm description. The implemented algorithm is described by the pseudo code seen in Algorithm 1.

Algorithm 1 Simulated Annealing
1: procedure SimulatedAnnealing
2:   bestSolution, currentSolution ← randomSolution
3:   temperature ← initialTemperature
4:   reheats ← 0
5:   while reheats < maxReheats do
6:     for k ← 0 until maxIterations do
7:       neighbour ← randomNeighbour of currentSolution
8:       Δ ← fitness(neighbour) − fitness(currentSolution)
9:       x ← random number 0 to 1
10:      if Δ < 0 then
11:        currentSolution ← neighbour
12:      else if x < e^(−Δ/temperature) then
13:        currentSolution ← neighbour
14:      end if
15:    end for
16:    temperature ← alpha · temperature
17:    if temperature < 0.1 then
18:      temperature ← initialTemperature
19:      reheats ← reheats + 1
20:    end if
21:  end while
22: end procedure
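The acceptance-and-reheating scheme of Algorithm 1 can be sketched as follows. This is an illustrative Python sketch (the thesis code is C#), assuming generic `fitness` and `neighbour` functions; the parameter values are placeholders for the example, not the ones used in the thesis.

```python
import math
import random

def simulated_annealing(initial, fitness, neighbour,
                        initial_temperature=10.0, alpha=0.97,
                        max_iterations=100, max_reheats=3):
    # Metropolis acceptance with geometric cooling: always accept an
    # improving neighbour (delta < 0), accept a worsening one with
    # probability e^(-delta / temperature), and reheat once the
    # temperature has effectively frozen.
    best = current = initial
    temperature = initial_temperature
    reheats = 0
    while reheats < max_reheats:
        for _ in range(max_iterations):
            candidate = neighbour(current)
            delta = fitness(candidate) - fitness(current)
            if delta < 0 or random.random() < math.exp(-delta / temperature):
                current = candidate
                if fitness(current) < fitness(best):
                    best = current
        temperature *= alpha            # geometric cooling
        if temperature < 0.1:           # frozen: reheat and count it
            temperature = initial_temperature
            reheats += 1
    return best
```

On a toy problem such as minimising x² with a ±1 neighbour move, the procedure walks toward zero while occasionally accepting uphill moves early on, when the temperature is high.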

Particle Swarm Optimisation

The pseudo code for the PSO algorithm can be seen in Algorithm 2.

Algorithm 2 Particle Swarm Optimisation
1: procedure ParticleSwarmOptimisation
2:   particleList ← GenerateParticles
3:   generation ← 0
4:   while generation < maxGenerations do
5:     for all particles in particleList do
6:       if particle.currentFitness <= particle.bestFitness then
7:         particle.bestFitness ← particle.currentFitness
8:         if particle.bestFitness <= globalBestFitness then
9:           globalBestFitness ← particle.bestFitness
10:        end if
11:      end if
12:      particle.timetable ← randomNeighbour
13:      auxiliaryParticle ← particle
14:      count ← 0
15:      while particle.currentFitness > globalBestFitness do
16:        if count > 10 then
17:          x ← random number 0 to 1
18:          if x < 0.2 then
19:            exit while loop
20:          end if
21:        end if
22:        particle.timetable ← randomNeighbour
23:        count ← count + 1
24:      end while
25:      if auxiliaryParticle.currentFitness < particle.currentFitness then
26:        particle.timetable ← auxiliaryParticle.timetable
27:      end if
28:    end for
29:    generation ← generation + 1
30:  end while
31: end procedure

The GenerateParticles method initialises a list of particles, where each particle contains a randomly generated timetable. Each particle's timetable is randomised independently so that all the particles have different starting points. An alternative multithreaded version of the code in Algorithm 2 was also constructed. This version is identical to the previous one except that each particle completes its work in its own thread while sharing its result through a common variable.
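The per-particle update in Algorithm 2 can be sketched as below. This is an illustrative Python sketch (the thesis code is C#); the `Particle` class, the `max_stuck` and `escape_probability` names, and the toy solution encoding are assumptions introduced for the example.

```python
import random

class Particle:
    # A candidate solution plus the best fitness it has seen so far.
    def __init__(self, solution, fitness):
        self.solution = solution
        self.fitness = fitness
        self.best_fitness = fitness(solution)

    def current_fitness(self):
        return self.fitness(self.solution)

def particle_step(particle, global_best, neighbour,
                  max_stuck=10, escape_probability=0.2):
    # One particle update from Algorithm 2: record personal and global
    # bests, then walk through random neighbours until the particle
    # beats the global best, with a chance per try (after max_stuck
    # tries) to give up; finally revert to the pre-walk copy if the
    # walk ended worse than it started.
    if particle.current_fitness() <= particle.best_fitness:
        particle.best_fitness = particle.current_fitness()
        global_best = min(global_best, particle.best_fitness)
    auxiliary = particle.solution                 # pre-walk copy
    particle.solution = neighbour(particle.solution)
    count = 0
    while particle.current_fitness() > global_best:
        if count > max_stuck and random.random() < escape_probability:
            break                                  # escape hatch
        particle.solution = neighbour(particle.solution)
        count += 1
    if particle.fitness(auxiliary) < particle.current_fitness():
        particle.solution = auxiliary              # revert the walk
    return global_best
```

The escape hatch keeps a particle from walking forever when the global best is unreachable from its current region, mirroring lines 16-21 of the pseudo code.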

Hyper-Heuristic Genetic Algorithm

The implementation of the genetic algorithm is based on the general idea of the technique: parents are selected from the population using tournament selection and offspring are created from the parents using crossover. When selecting heuristics for the genes, an additional heuristic is available that does nothing. This no-op is included so that HHGA can vary the effective length of the chromosomes, for the cases where better solutions lie closer to or further away from the current solution. The crossover function uses two crossover points to create two offspring from two parents. Mutation is then applied to the offspring, and the offspring are added to the new generation. During the crossover/mutation phase each offspring chromosome is applied to the best timetable found so far, and the generated timetable is examined to see if it is better. The pseudo code for the algorithm can be seen in Algorithm 3.

Algorithm 3 Hyper-Heuristic Genetic Algorithm
1: procedure HyperHeuristicGeneticAlgorithm
2:   population ← GenerateInitialPopulation
3:   while time < maxTime do
4:     for i ← 0 until population.count do
5:       parents ← TournamentSelection(population)
6:       offspring ← Crossover(parents)
7:       Mutate(offspring)
8:       nextGeneration.Add(offspring)
9:     end for
10:    population ← nextGeneration
11:  end while
12: end procedure
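The hyper-heuristic idea of a chromosome as a sequence of low-level heuristics, including the no-op gene, can be sketched as follows. This is an illustrative Python sketch (the thesis code is C#); the heuristic names `move_event` and `swap_events` and the toy list-based timetable encoding are hypothetical, introduced only for the example.

```python
import random

rng = random.Random(3)

# Hypothetical low-level heuristics acting on a toy "timetable"
# (a list of time-slot assignments).
def move_event(t):
    t = list(t)
    t[rng.randrange(len(t))] = rng.randrange(10)  # reassign one event
    return t

def swap_events(t):
    t = list(t)
    i, j = rng.sample(range(len(t)), 2)
    t[i], t[j] = t[j], t[i]                       # swap two events
    return t

def nothing(t):
    return t  # the no-op gene that varies the effective length

HEURISTICS = [move_event, swap_events, nothing]

def apply_chromosome(chromosome, timetable):
    # A chromosome is a sequence of heuristics; applying it to the
    # best timetable found so far yields the candidate to evaluate.
    for heuristic in chromosome:
        timetable = heuristic(timetable)
    return timetable

def mutate(chromosome, rate=0.05):
    # Replace each gene by a random heuristic with a small probability.
    return [rng.choice(HEURISTICS) if rng.random() < rate else gene
            for gene in chromosome]
```

A chromosome consisting mostly of `nothing` genes searches close to the current best timetable, while one full of move and swap genes leaps further away.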

Iterated Local Search

The implementation of ILS can be seen in Algorithm 4. It starts by moving to neighbour solutions in a random direction: it iteratively generates random neighbours a set number of times equal to the search length. It then performs a random search from the final neighbour and saves the best solution it finds. When the random search is done, the process is repeated a set number of times, after which the search length is increased to allow the algorithm to find solutions that lie further away from the best solution.

Algorithm 4 Iterated Local Search
1: procedure IteratedLocalSearch
2:   currentSolution ← initialSolution
3:   while time < maxTime do
4:     for i ← 0 until searchLength do
5:       currentSolution ← randomNeighbour
6:     end for
7:     currentSolution ← LocalRandomSearch(currentSolution)
8:     if currentSolution.fitness < bestSolution.fitness then
9:       bestSolution ← currentSolution
10:    else
11:      currentSolution ← bestSolution
12:      iterations ← iterations + 1
13:    end if
14:    if iterations >= maxIterations then
15:      iterations ← 0
16:      searchLength ← (searchLength + initialSearchLength) mod maxSearchLength
17:    end if
18:  end while
19: end procedure
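Algorithm 4 can be sketched as below. This is an illustrative Python sketch (the thesis code is C#): the time budget is replaced by a fixed round count, `LocalRandomSearch` is taken here to be a first-improvement random walk, and all parameter values are placeholders for the example.

```python
import random

def iterated_local_search(initial, fitness, neighbour,
                          initial_search_length=2, max_search_length=20,
                          max_iterations=5, local_steps=50, rounds=200):
    # Perturb with `search_length` random moves (the "big leap"), run a
    # random local search, keep the result only if it beats the best,
    # and enlarge the leap after repeated failures.
    def local_random_search(solution):
        for _ in range(local_steps):
            candidate = neighbour(solution)
            if fitness(candidate) < fitness(solution):
                solution = candidate      # accept only improvements
        return solution

    best = current = initial
    search_length = initial_search_length
    iterations = 0
    for _ in range(rounds):               # stands in for the time budget
        for _ in range(search_length):
            current = neighbour(current)  # perturbation
        current = local_random_search(current)
        if fitness(current) < fitness(best):
            best = current
        else:
            current = best                # restart from the best solution
            iterations += 1
        if iterations >= max_iterations:
            iterations = 0
            # modulo keeps the leap bounded, as in line 16 of Algorithm 4
            search_length = (search_length
                             + initial_search_length) % max_search_length
    return best
```

Because `best` is only ever replaced by a strictly better solution, the returned fitness can never be worse than that of the initial solution.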


Chapter 3

Results

This chapter gives the results of the benchmark done on the four different algorithms. Each algorithm was tested by applying it to fifteen different hard timetabling problems. Each problem was run ten times for 1000 seconds each time. After each run an attempt to improve the result was made with Iterated Local Search for 120 seconds. The results presented in this chapter are the averages of the ten runs of each problem; the complete results for each run can be found in Appendix A. Not all of the problems have been solved optimally by anyone, and some problems have optimal solutions with soft constraint fitness above zero. A second set of tests with a much longer run time was computed to see at which point the algorithms stop finding new solutions. These longer tests used a single timetabling problem with a run time of four hours, then improved with ILS for five minutes. For this longer test the FinlandCollege problem was chosen, since initial testing showed that while it was not the hardest problem, the algorithms still struggled with solving it completely. The results are presented in tables where the best solution found is given for each problem, together with the average of the ten runs and the average after ILS was applied to them. The best and average results take the form x/y, where x is the hard fitness and y the soft fitness. A feasible solution must have zero hard fitness, and a timetable with both zero hard and zero soft fitness is guaranteed to be optimal; an optimal solution could, however, have soft fitness above zero.

3.1 Fitness

All algorithms use the fitness functions specified for each constraint in the problems. These functions calculate the fitness of a timetable by examining how well it meets the constraints specified in the problem.
A lower fitness means a better timetable. Timetables have a hard and a soft fitness, where the hard fitness needs to be zero for the timetable to be considered feasible. Some problems have optimal solutions with hard or soft fitness above zero, but the algorithms in this thesis have no way of knowing this, so they assume the best solution has a hard and soft fitness of zero. The algorithms should therefore place a larger weight on the hard fitness, or only calculate the hard fitness before continuing on to the soft fitness. The soft fitness is a measure of the timetable's quality: the lower it is the better, even though it does not need to be zero for the

timetable to be considered feasible. It is of course up to the school or institute to make the final decision on what an acceptable timetable is. For example, if a timetable has the hard constraint that no resource clashes can occur and the soft constraint that classes have no free periods on any day, it might be acceptable that one class has one free period on Wednesday if it means that a timetable with no resource clashes can be found. The problems specify different fitness functions for each constraint, so that each constraint can have a different impact. One example is the linear fitness function, which returns the weight of the constraint as is. Another example is the quadratic fitness function, which returns the weight squared to give that constraint a higher importance.

3.2 Testing environment

Each algorithm was tested on its own virtual server running on an Intel Xeon E5-2666 v3 processor with 3.75 GB of memory. The servers were of the compute-optimised C4 type provided by Amazon Web Services[2]. The longer tests were computed on a 3.40 GHz Intel i5-4670K CPU with 16 GB of memory.

3.3 The timetable problems

Fifteen timetable problems were chosen to test the algorithms. These problems were chosen among the harder problems available from the XHSTT website[1]. A short overview of the problems can be seen in Table 3.2. This table lists the number of times, resources and events to give a sense of the difficulty of the problem.
The last column lists the best solution fitness achieved so far by various researchers.

Timetable problems - overview

Instance name | Times | Teachers | Rooms | Students | Classes | Events | Best known
VejenG | – | – | – | – | – | – | –/2718
StPaul | – | – | – | – | – | – | –/1892
FinlandCollege | – | – | – | – | – | – | –/0
FinlandHighSchool | – | – | – | – | – | – | –/0
FinlandSecondarySchool | – | – | – | – | – | – | –/77
GreeceHighSchool | – | – | – | – | – | – | –/0
GreeceThirdHighSchoolPatras | – | – | – | – | – | – | –/0
Italy Instance | – | – | – | – | – | – | –/27
Kottenpark | – | – | – | – | – | – | –/420
Kottenpark | – | – | – | – | – | – | –/784
Kottenpark | – | – | – | – | – | – | –/5095
Lewitt | – | – | – | – | – | – | –/0
Woodlands | – | – | – | – | – | – | –/0
Spanish school | – | – | – | – | – | – | –/335
WHS | – | – | – | – | – | – | –/111

Table 3.2: An overview of the timetabling problems.

What is not shown in Table 3.2 is which constraints, and how many of them, each problem has. These can be seen on the XHSTT website[1]; as an example, the FinlandCollege problem has the following constraints:

- The assign time constraint.
- The split events constraint.
- Two instances of the prefer times constraint.
- The spread events constraint.
- The avoid clashes constraint.
- Thirty-two instances of the avoid unavailable times constraint.
- Two instances of the limit idle times constraint.
- Ten instances of the limit busy times constraint.

Some of these problems have optimal solutions with soft fitness above zero; this value is called a lower bound. The lower bound values can be seen on the XHSTT website[1].

3.4 Simulated Annealing

The results for the Simulated Annealing algorithm can be seen in Table 3.4. It managed to solve one problem completely and all the hard constraints of five other problems. Among the rest, one problem came very close to a feasible solution, but the others ended far from feasibility. While the results were not very good, given how few feasible solutions were found, SA still performed the best among the tested algorithms. In some cases SA performed much worse than the other algorithms, such as on the problems Lewitt2009 and Italy Instance4. In only two minutes, ILS managed to make big improvements on some of the solutions, and at least some improvement on the rest.
SimulatedAnnealing - Average - 10 minutes

Instance name | Best fitness | Avg fitness | +ILS Avg fitness
VejenG | –/– | –/– | –/–
StPaul | –/– | –/– | –/–
FinlandCollege | 1/– | –/– | –/608.6
FinlandHighSchool | 0/– | –/– | –/127.2
FinlandSecondarySchool | 0/– | –/– | –/285.1
GreeceHighSchool1 | 0/0 | 0.6/0 | 0/0
GreeceThirdHighSchoolPatras2010 | 0/– | –/– | –/331.7
Italy Instance4 | 910/– | –/– | –/–
Kottenpark | –/– | –/– | –/–
Kottenpark | –/– | –/– | –/–
Kottenpark | –/– | –/– | –/–
Lewitt | –/– | –/– | –/444.8
Woodlands2009 | 0/– | –/– | –/216.5
Spanish school | 0/– | –/– | –/–
WHS | –/– | –/– | –/–

Table 3.4: Average results for the Simulated Annealing algorithm with a shorter run time of ten minutes.


More information

Job Shop Scheduling Problem (JSSP) Genetic Algorithms Critical Block and DG distance Neighbourhood Search

Job Shop Scheduling Problem (JSSP) Genetic Algorithms Critical Block and DG distance Neighbourhood Search A JOB-SHOP SCHEDULING PROBLEM (JSSP) USING GENETIC ALGORITHM (GA) Mahanim Omar, Adam Baharum, Yahya Abu Hasan School of Mathematical Sciences, Universiti Sains Malaysia 11800 Penang, Malaysia Tel: (+)

More information

Midterm Examination CS540-2: Introduction to Artificial Intelligence

Midterm Examination CS540-2: Introduction to Artificial Intelligence Midterm Examination CS540-2: Introduction to Artificial Intelligence March 15, 2018 LAST NAME: FIRST NAME: Problem Score Max Score 1 12 2 13 3 9 4 11 5 8 6 13 7 9 8 16 9 9 Total 100 Question 1. [12] Search

More information

Local Search (Greedy Descent): Maintain an assignment of a value to each variable. Repeat:

Local Search (Greedy Descent): Maintain an assignment of a value to each variable. Repeat: Local Search Local Search (Greedy Descent): Maintain an assignment of a value to each variable. Repeat: Select a variable to change Select a new value for that variable Until a satisfying assignment is

More information

Introduction to Optimization Using Metaheuristics. The Lecturer: Thomas Stidsen. Outline. Name: Thomas Stidsen: Nationality: Danish.

Introduction to Optimization Using Metaheuristics. The Lecturer: Thomas Stidsen. Outline. Name: Thomas Stidsen: Nationality: Danish. The Lecturer: Thomas Stidsen Name: Thomas Stidsen: tks@imm.dtu.dk Outline Nationality: Danish. General course information Languages: Danish and English. Motivation, modelling and solving Education: Ph.D.

More information

Administrative. Local Search!

Administrative. Local Search! Administrative Local Search! CS311 David Kauchak Spring 2013 Assignment 2 due Tuesday before class Written problems 2 posted Class participation http://www.youtube.com/watch? v=irhfvdphfzq&list=uucdoqrpqlqkvctckzqa

More information

TABU search and Iterated Local Search classical OR methods

TABU search and Iterated Local Search classical OR methods TABU search and Iterated Local Search classical OR methods tks@imm.dtu.dk Informatics and Mathematical Modeling Technical University of Denmark 1 Outline TSP optimization problem Tabu Search (TS) (most

More information

Predicting Diabetes using Neural Networks and Randomized Optimization

Predicting Diabetes using Neural Networks and Randomized Optimization Predicting Diabetes using Neural Networks and Randomized Optimization Kunal Sharma GTID: ksharma74 CS 4641 Machine Learning Abstract This paper analysis the following randomized optimization techniques

More information

Outline. TABU search and Iterated Local Search classical OR methods. Traveling Salesman Problem (TSP) 2-opt

Outline. TABU search and Iterated Local Search classical OR methods. Traveling Salesman Problem (TSP) 2-opt TABU search and Iterated Local Search classical OR methods Outline TSP optimization problem Tabu Search (TS) (most important) Iterated Local Search (ILS) tks@imm.dtu.dk Informatics and Mathematical Modeling

More information

Artificial Intelligence Application (Genetic Algorithm)

Artificial Intelligence Application (Genetic Algorithm) Babylon University College of Information Technology Software Department Artificial Intelligence Application (Genetic Algorithm) By Dr. Asaad Sabah Hadi 2014-2015 EVOLUTIONARY ALGORITHM The main idea about

More information

Genetic Algorithm for Dynamic Capacitated Minimum Spanning Tree

Genetic Algorithm for Dynamic Capacitated Minimum Spanning Tree 28 Genetic Algorithm for Dynamic Capacitated Minimum Spanning Tree 1 Tanu Gupta, 2 Anil Kumar 1 Research Scholar, IFTM, University, Moradabad, India. 2 Sr. Lecturer, KIMT, Moradabad, India. Abstract Many

More information

5. Computational Geometry, Benchmarks and Algorithms for Rectangular and Irregular Packing. 6. Meta-heuristic Algorithms and Rectangular Packing

5. Computational Geometry, Benchmarks and Algorithms for Rectangular and Irregular Packing. 6. Meta-heuristic Algorithms and Rectangular Packing 1. Introduction 2. Cutting and Packing Problems 3. Optimisation Techniques 4. Automated Packing Techniques 5. Computational Geometry, Benchmarks and Algorithms for Rectangular and Irregular Packing 6.

More information

Simple mechanisms for escaping from local optima:

Simple mechanisms for escaping from local optima: The methods we have seen so far are iterative improvement methods, that is, they get stuck in local optima. Simple mechanisms for escaping from local optima: I Restart: re-initialise search whenever a

More information

A Genetic Algorithm for Graph Matching using Graph Node Characteristics 1 2

A Genetic Algorithm for Graph Matching using Graph Node Characteristics 1 2 Chapter 5 A Genetic Algorithm for Graph Matching using Graph Node Characteristics 1 2 Graph Matching has attracted the exploration of applying new computing paradigms because of the large number of applications

More information

1 Lab + Hwk 5: Particle Swarm Optimization

1 Lab + Hwk 5: Particle Swarm Optimization 1 Lab + Hwk 5: Particle Swarm Optimization This laboratory requires the following equipment: C programming tools (gcc, make), already installed in GR B001 Webots simulation software Webots User Guide Webots

More information

Introduction to Optimization

Introduction to Optimization Introduction to Optimization Approximation Algorithms and Heuristics November 21, 2016 École Centrale Paris, Châtenay-Malabry, France Dimo Brockhoff Inria Saclay Ile-de-France 2 Exercise: The Knapsack

More information

Hybrid Particle Swarm-Based-Simulated Annealing Optimization Techniques

Hybrid Particle Swarm-Based-Simulated Annealing Optimization Techniques Hybrid Particle Swarm-Based-Simulated Annealing Optimization Techniques Nasser Sadati Abstract Particle Swarm Optimization (PSO) algorithms recently invented as intelligent optimizers with several highly

More information

Traffic Signal Control Based On Fuzzy Artificial Neural Networks With Particle Swarm Optimization

Traffic Signal Control Based On Fuzzy Artificial Neural Networks With Particle Swarm Optimization Traffic Signal Control Based On Fuzzy Artificial Neural Networks With Particle Swarm Optimization J.Venkatesh 1, B.Chiranjeevulu 2 1 PG Student, Dept. of ECE, Viswanadha Institute of Technology And Management,

More information

Simulated Annealing. G5BAIM: Artificial Intelligence Methods. Graham Kendall. 15 Feb 09 1

Simulated Annealing. G5BAIM: Artificial Intelligence Methods. Graham Kendall. 15 Feb 09 1 G5BAIM: Artificial Intelligence Methods Graham Kendall 15 Feb 09 1 G5BAIM Artificial Intelligence Methods Graham Kendall Simulated Annealing Simulated Annealing Motivated by the physical annealing process

More information

PARALLEL PARTICLE SWARM OPTIMIZATION IN DATA CLUSTERING

PARALLEL PARTICLE SWARM OPTIMIZATION IN DATA CLUSTERING PARALLEL PARTICLE SWARM OPTIMIZATION IN DATA CLUSTERING YASIN ORTAKCI Karabuk University, Computer Engineering Department, Karabuk, Turkey E-mail: yasinortakci@karabuk.edu.tr Abstract Particle Swarm Optimization

More information

Lecture 4. Convexity Robust cost functions Optimizing non-convex functions. 3B1B Optimization Michaelmas 2017 A. Zisserman

Lecture 4. Convexity Robust cost functions Optimizing non-convex functions. 3B1B Optimization Michaelmas 2017 A. Zisserman Lecture 4 3B1B Optimization Michaelmas 2017 A. Zisserman Convexity Robust cost functions Optimizing non-convex functions grid search branch and bound simulated annealing evolutionary optimization The Optimization

More information

Research Article Path Planning Using a Hybrid Evolutionary Algorithm Based on Tree Structure Encoding

Research Article Path Planning Using a Hybrid Evolutionary Algorithm Based on Tree Structure Encoding e Scientific World Journal, Article ID 746260, 8 pages http://dx.doi.org/10.1155/2014/746260 Research Article Path Planning Using a Hybrid Evolutionary Algorithm Based on Tree Structure Encoding Ming-Yi

More information

METAHEURISTICS. Introduction. Introduction. Nature of metaheuristics. Local improvement procedure. Example: objective function

METAHEURISTICS. Introduction. Introduction. Nature of metaheuristics. Local improvement procedure. Example: objective function Introduction METAHEURISTICS Some problems are so complicated that are not possible to solve for an optimal solution. In these problems, it is still important to find a good feasible solution close to the

More information

March 19, Heuristics for Optimization. Outline. Problem formulation. Genetic algorithms

March 19, Heuristics for Optimization. Outline. Problem formulation. Genetic algorithms Olga Galinina olga.galinina@tut.fi ELT-53656 Network Analysis and Dimensioning II Department of Electronics and Communications Engineering Tampere University of Technology, Tampere, Finland March 19, 2014

More information

Welfare Navigation Using Genetic Algorithm

Welfare Navigation Using Genetic Algorithm Welfare Navigation Using Genetic Algorithm David Erukhimovich and Yoel Zeldes Hebrew University of Jerusalem AI course final project Abstract Using standard navigation algorithms and applications (such

More information

A *69>H>N6 #DJGC6A DG C<>C::G>C<,8>:C8:H /DA 'D 2:6G, ()-"&"3 -"(' ( +-" " " % '.+ % ' -0(+$,

A *69>H>N6 #DJGC6A DG C<>C::G>C<,8>:C8:H /DA 'D 2:6G, ()-&3 -(' ( +-   % '.+ % ' -0(+$, The structure is a very important aspect in neural network design, it is not only impossible to determine an optimal structure for a given problem, it is even impossible to prove that a given structure

More information

Introduction to Design Optimization: Search Methods

Introduction to Design Optimization: Search Methods Introduction to Design Optimization: Search Methods 1-D Optimization The Search We don t know the curve. Given α, we can calculate f(α). By inspecting some points, we try to find the approximated shape

More information

Optimizing the Sailing Route for Fixed Groundfish Survey Stations

Optimizing the Sailing Route for Fixed Groundfish Survey Stations International Council for the Exploration of the Sea CM 1996/D:17 Optimizing the Sailing Route for Fixed Groundfish Survey Stations Magnus Thor Jonsson Thomas Philip Runarsson Björn Ævar Steinarsson Presented

More information

Evolutionary Algorithms: Perfecting the Art of Good Enough. Liz Sander

Evolutionary Algorithms: Perfecting the Art of Good Enough. Liz Sander Evolutionary Algorithms: Perfecting the Art of Good Enough Liz Sander Source: wikipedia.org Source: fishbase.org Source: youtube.com Sometimes, we can t find the best solution. Sometimes, we can t find

More information

Evolutionary algorithms in communications

Evolutionary algorithms in communications Telecommunications seminar Evolutionary algorithms in Communications and systems Introduction lecture II: More about EAs Timo Mantere Professor Communications and systems engineering University of Vaasa

More information

Introduction to Artificial Intelligence 2 nd semester 2016/2017. Chapter 4: Beyond Classical Search

Introduction to Artificial Intelligence 2 nd semester 2016/2017. Chapter 4: Beyond Classical Search Introduction to Artificial Intelligence 2 nd semester 2016/2017 Chapter 4: Beyond Classical Search Mohamed B. Abubaker Palestine Technical College Deir El-Balah 1 Outlines local search algorithms and optimization

More information

GRASP. Greedy Randomized Adaptive. Search Procedure

GRASP. Greedy Randomized Adaptive. Search Procedure GRASP Greedy Randomized Adaptive Search Procedure Type of problems Combinatorial optimization problem: Finite ensemble E = {1,2,... n } Subset of feasible solutions F 2 Objective function f : 2 Minimisation

More information

LECTURE 16: SWARM INTELLIGENCE 2 / PARTICLE SWARM OPTIMIZATION 2

LECTURE 16: SWARM INTELLIGENCE 2 / PARTICLE SWARM OPTIMIZATION 2 15-382 COLLECTIVE INTELLIGENCE - S18 LECTURE 16: SWARM INTELLIGENCE 2 / PARTICLE SWARM OPTIMIZATION 2 INSTRUCTOR: GIANNI A. DI CARO BACKGROUND: REYNOLDS BOIDS Reynolds created a model of coordinated animal

More information

International Journal of Digital Application & Contemporary research Website: (Volume 1, Issue 7, February 2013)

International Journal of Digital Application & Contemporary research Website:   (Volume 1, Issue 7, February 2013) Performance Analysis of GA and PSO over Economic Load Dispatch Problem Sakshi Rajpoot sakshirajpoot1988@gmail.com Dr. Sandeep Bhongade sandeepbhongade@rediffmail.com Abstract Economic Load dispatch problem

More information

Fall 09, Homework 5

Fall 09, Homework 5 5-38 Fall 09, Homework 5 Due: Wednesday, November 8th, beginning of the class You can work in a group of up to two people. This group does not need to be the same group as for the other homeworks. You

More information

The Binary Genetic Algorithm. Universidad de los Andes-CODENSA

The Binary Genetic Algorithm. Universidad de los Andes-CODENSA The Binary Genetic Algorithm Universidad de los Andes-CODENSA 1. Genetic Algorithms: Natural Selection on a Computer Figure 1 shows the analogy between biological i l evolution and a binary GA. Both start

More information

Introduction to Genetic Algorithms

Introduction to Genetic Algorithms Advanced Topics in Image Analysis and Machine Learning Introduction to Genetic Algorithms Week 3 Faculty of Information Science and Engineering Ritsumeikan University Today s class outline Genetic Algorithms

More information

Inertia Weight. v i = ωv i +φ 1 R(0,1)(p i x i )+φ 2 R(0,1)(p g x i ) The new velocity update equation:

Inertia Weight. v i = ωv i +φ 1 R(0,1)(p i x i )+φ 2 R(0,1)(p g x i ) The new velocity update equation: Convergence of PSO The velocity update equation: v i = v i +φ 1 R(0,1)(p i x i )+φ 2 R(0,1)(p g x i ) for some values of φ 1 and φ 2 the velocity grows without bound can bound velocity to range [ V max,v

More information

EE562 ARTIFICIAL INTELLIGENCE FOR ENGINEERS

EE562 ARTIFICIAL INTELLIGENCE FOR ENGINEERS EE562 ARTIFICIAL INTELLIGENCE FOR ENGINEERS Lecture 4, 4/11/2005 University of Washington, Department of Electrical Engineering Spring 2005 Instructor: Professor Jeff A. Bilmes Today: Informed search algorithms

More information

On the application of graph colouring techniques in round-robin sports scheduling

On the application of graph colouring techniques in round-robin sports scheduling On the application of graph colouring techniques in round-robin sports scheduling Rhyd Lewis and Jonathan Thompson A case study at the Welsh Rugby Union School of Mathematics, Cardiff University, LANCS

More information

Genetic Algorithms. Genetic Algorithms

Genetic Algorithms. Genetic Algorithms A biological analogy for optimization problems Bit encoding, models as strings Reproduction and mutation -> natural selection Pseudo-code for a simple genetic algorithm The goal of genetic algorithms (GA):

More information

CS 4349 Lecture October 18th, 2017

CS 4349 Lecture October 18th, 2017 CS 4349 Lecture October 18th, 2017 Main topics for #lecture include #minimum_spanning_trees. Prelude Homework 6 due today. Homework 7 due Wednesday, October 25th. Homework 7 has one normal homework problem.

More information

Kyrre Glette INF3490 Evolvable Hardware Cartesian Genetic Programming

Kyrre Glette INF3490 Evolvable Hardware Cartesian Genetic Programming Kyrre Glette kyrrehg@ifi INF3490 Evolvable Hardware Cartesian Genetic Programming Overview Introduction to Evolvable Hardware (EHW) Cartesian Genetic Programming Applications of EHW 3 Evolvable Hardware

More information

Introduction (7.1) Genetic Algorithms (GA) (7.2) Simulated Annealing (SA) (7.3) Random Search (7.4) Downhill Simplex Search (DSS) (7.

Introduction (7.1) Genetic Algorithms (GA) (7.2) Simulated Annealing (SA) (7.3) Random Search (7.4) Downhill Simplex Search (DSS) (7. Chapter 7: Derivative-Free Optimization Introduction (7.1) Genetic Algorithms (GA) (7.2) Simulated Annealing (SA) (7.3) Random Search (7.4) Downhill Simplex Search (DSS) (7.5) Jyh-Shing Roger Jang et al.,

More information

INTERACTIVE MULTI-OBJECTIVE GENETIC ALGORITHMS FOR THE BUS DRIVER SCHEDULING PROBLEM

INTERACTIVE MULTI-OBJECTIVE GENETIC ALGORITHMS FOR THE BUS DRIVER SCHEDULING PROBLEM Advanced OR and AI Methods in Transportation INTERACTIVE MULTI-OBJECTIVE GENETIC ALGORITHMS FOR THE BUS DRIVER SCHEDULING PROBLEM Jorge PINHO DE SOUSA 1, Teresa GALVÃO DIAS 1, João FALCÃO E CUNHA 1 Abstract.

More information

Genetic Algorithms and Genetic Programming Lecture 7

Genetic Algorithms and Genetic Programming Lecture 7 Genetic Algorithms and Genetic Programming Lecture 7 Gillian Hayes 13th October 2006 Lecture 7: The Building Block Hypothesis The Building Block Hypothesis Experimental evidence for the BBH The Royal Road

More information