# Pre-requisite Material for Course Heuristics and Approximation Algorithms


This document contains an overview of the basic concepts that students need in preparation for the course. In addition, some experience with Java programming and probability theory is desirable, but not a mandatory pre-requisite. At the start of the course, students will be asked to indicate their level of expertise in these topics in order to inform the delivery of the course.

## Contents

1. Basics of Complexity Theory
   - 1.1 Computability
   - 1.2 Computational Problems
   - 1.3 Computational Algorithms
   - 1.4 Intractable and Undecidable Problems
   - 1.5 Time Complexity
   - 1.6 Tackling NP-Complete Problems
2. Basics of Heuristic Search for Optimization
   - 2.1 Optimization Problems
   - 2.2 Linear Programming (LP)
   - 2.3 Integer Programming (IP)
   - 2.4 Discrete/Continuous/Combinatorial Optimization
   - 2.5 Approximation Algorithms
   - 2.6 Greedy Algorithms and Neighbourhood Search
   - 2.7 Basics of Meta-heuristics
   - 2.8 Basics of Evolutionary Algorithms

ASAP Research Group, School of Computer Science, The University of Nottingham

## 1. Basics of Complexity Theory

### 1.1 Computability

Computability focuses on answering questions such as: What class of problems can be solved using a computational algorithm? Is there an efficient computational algorithm to solve a given problem?

There have been many attempts by computer scientists to define precisely what an effective procedure or computable function is. Intuitively, a computable function is one that can be implemented on a computer, whether in a reasonable computational time or not. An idealised formulation of an effective procedure or computable function involves:

1. A language in which a set of rules can be expressed.
2. A rigorous description of a single machine which can interpret the rules in the given language and carry out the steps of a specified process.

### 1.2 Computational Problems

The classical Travelling Salesman Problem (TSP): given a set of c cities 1, 2, …, c and the distance d(i,j) (a non-negative integer) between each pair of cities i and j, find the shortest tour that visits every city exactly once and returns to the start. The TSP is one of the most intensively studied problems in mathematics and computer science.

A computational problem (e.g. the classical TSP) is generally described by:

- Its parameters (e.g. the set of cities, the distances between cities)
- A statement of the properties required of a solution (e.g. an ordering of the cities such that the travelling distance is minimised)

An instance of a computational problem gives specific values to the problem parameters, and a solution to that instance has a specific value. The size of a problem instance is defined in terms of the amount of input data required to describe its parameters. In practice, the instance size is conveniently expressed in terms of a single variable.
For example, the size of an instance of the TSP can be expressed by:

- A finite input string describing exactly all parameter values (cities and distances)
- A number intended to reflect the size of that input string (e.g. the number of cities)

An encoding scheme is a mapping between a problem instance and the finite input string that describes the instance. A problem can be described or represented using different encoding schemes.

Example. Consider a TSP instance over four cities c1, c2, c3, c4 forming a complete graph (the original figure, with edge distances including 7, 4, 6, 12 and 15, is not reproduced here). Some encoding schemes that can be used to describe such an instance are:

a) Vertex list-edge list: {c1,c2,c3,c4} {(c1,c2),(c1,c3),(c1,c4),(c2,c3),(c2,c4),(c3,c4)}

b) Neighbour list: c1-{c2,c3,c4} c2-{c3,c4} c3-{c4}

c) Adjacency matrix: a 4×4 symmetric matrix whose entry (i,j) holds the distance d(ci,cj); the numeric entries were lost in transcription.

### 1.3 Computational Algorithms

A computational algorithm is a deterministic step-by-step procedure to solve a computational problem; that is, it should produce a solution for any instance of the problem. The interest is in finding an efficient algorithm, in terms of computing resources, to solve a given problem. Practical computing time is typically the factor which determines whether an algorithm can be considered efficient or not. The time complexity of a computational algorithm refers to the largest amount of time required by the algorithm to solve a problem instance of a given size.
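To make the three encoding schemes concrete, the sketch below builds each of them in Python for a hypothetical 4-city instance. Since the original figure's assignment of distances to edges is lost, the distance values used here are illustrative only.

```python
# Three encoding schemes for the same (hypothetical) 4-city TSP instance.
# The distance values are illustrative; the original figure's assignment
# of distances to edges is not recoverable from the source.
cities = ["c1", "c2", "c3", "c4"]

# a) Vertex list + edge list, here with a distance attached to each edge
edges = {("c1", "c2"): 7, ("c1", "c3"): 4, ("c1", "c4"): 6,
         ("c2", "c3"): 12, ("c2", "c4"): 7, ("c3", "c4"): 15}

# b) Neighbour list, derived from the edge list
neighbours = {c: sorted({b for a, b in edges if a == c} |
                        {a for a, b in edges if b == c}) for c in cities}

# c) Symmetric adjacency matrix, also derived from the edge list
index = {c: i for i, c in enumerate(cities)}
matrix = [[0] * len(cities) for _ in cities]
for (a, b), dist in edges.items():
    matrix[index[a]][index[b]] = dist
    matrix[index[b]][index[a]] = dist

print(neighbours["c1"])   # in a complete graph every other city is a neighbour
print(matrix[0][1])       # d(c1, c2)
```

All three representations describe the same instance; they differ only in the length of the input string needed to write them down, which is what an encoding scheme affects.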

For a given problem, how do we assess whether a computational algorithm is efficient? From a computational perspective: polynomial time vs. exponential time complexity. The complexity of a function f(n) is O(g(n)) if f(n) ≤ c·g(n) for all n > 0 and some constant c. An algorithm has polynomial time complexity if its complexity function is O(g(n)) where g(n) is a polynomial function of the input length n.

### 1.4 Intractable and Undecidable Problems

A computational problem is considered intractable if no computationally efficient algorithm can be given to solve it, or in other words, if no polynomial time algorithm can be found to solve it. Although polynomial time algorithms are preferred, some exponential time algorithms are considered good enough to solve certain problems: the algorithm may have exponential complexity in the worst case but perform satisfactorily on average. The tractability of a given problem is independent of the encoding scheme and the computer model.

The intractability of a problem can arise in two ways:

- An exponential amount of computational time is needed to find a solution. For example, solving the TSP with algorithm A3.
- A solution is so long that it cannot be described by an expression polynomial in the length of the input. For example, in a TSP instance, asking for all tours that have length L or less.

A computational problem is considered undecidable if no algorithm at all can be given to solve it. Therefore, an undecidable problem is also intractable. Many intractable problems are decidable and can be solved in polynomial time with a non-deterministic computing model. The NP-complete class of problems includes decision problems (the answer is either YES or NO) that can be solved by a non-deterministic computing model in polynomial time. The conjecture is that NP-complete problems are intractable, but this remains very much an open question. At the least, if a problem is NP-complete, no polynomial time algorithm is known to solve it.
### 1.5 Time Complexity

We attempt to separate the practically solvable problems from those that cannot be solved in a reasonable time.

When constructing a computer algorithm it is important to assess how expensive the algorithm is in terms of storage and time. Algorithm complexity analysis tells us how much time an algorithm takes to solve a problem. The size of a problem instance is typically measured by the size of the input that describes the given instance. Algorithm complexity analysis is based on counting primitive operations such as arithmetic, logical operations, reads, writes, data retrieval, data storage, etc.

time_A(n) expresses the time complexity of algorithm A and is defined as the greatest number of primitive operations that could be performed by the algorithm given a problem instance of size n. We are usually more interested in the rate of growth of time_A than in the actual number of operations in the algorithm. Thus, to determine whether it is practical or feasible to implement A, we require a not-too-large upper bound on time_A(n).

The time complexity of algorithms is expressed using O-notation. The complexity of a function f(n) is O(g(n)) if f(n) ≤ c·g(n) for all n > 0 and some constant c. Let f and g be functions of n; then:

- f(n) = O(g(n)) means that f is of the order of g, that is, f grows at the same rate as g or slower.
- f(n) = Ω(g(n)) means that f grows at the same rate as g or faster.
- f(n) = Θ(g(n)) means that f and g grow at the same rate.

An algorithm is said to perform in polynomial time if and only if its time complexity function is O(n^k) for some k ≥ 0. That is, an algorithm A is polynomial time if time_A(n) ≤ f(n) and f is a polynomial, i.e. f has the form:

a_1·n^k + a_2·n^(k-1) + … + a_m

The class P is defined as the class of problems which can be solved by polynomial time deterministic algorithms. The notion is that problems in P can be solved quickly. The class NP is defined as the class of problems which can be solved by polynomial time non-deterministic algorithms. The notion is that problems in NP can be verified quickly.
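The gap between polynomial and exponential growth is easy to see numerically. The toy functions below stand in for operation counts time_A(n) of a polynomial-time and an exponential-time algorithm; the exponents are illustrative choices, not taken from the document.

```python
# Illustrative growth of time_A(n) for a polynomial-time algorithm (n^k)
# versus an exponential-time one (2^n), e.g. exhaustive search over subsets.
def polynomial(n, k=2):       # operation count of a hypothetical O(n^k) algorithm
    return n ** k

def exponential(n):           # operation count of a hypothetical O(2^n) algorithm
    return 2 ** n

for n in (10, 20, 40):
    print(n, polynomial(n), exponential(n))
```

Even at n = 40, the exponential count (about 10^12) already dwarfs n^2 (1600), which is why the polynomial/exponential boundary is used to separate "practically solvable" from intractable.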

Within the class NP there are easy problems (those in P) and much harder problems, called NP-complete problems. Intuitively, the NP-complete problems are the hardest problems in NP, as characterised by Cook's theorem. (The original notes illustrate this with a diagram of class P inside class NP, alongside the NP-complete class.)

### 1.6 Tackling NP-Complete Problems

There are two approaches:

1. Search techniques that, although of exponential time complexity, aim for optimality and are an improvement over exhaustive search. These include: 1) techniques that construct partial solutions, e.g. branch and bound, and 2) techniques that reduce the worst-case time complexity by making clever choices while searching the tree, e.g. specific algorithms for some NP-complete problems.
2. Search techniques that do not aim to find optimal solutions but instead aim for sub-optimal solutions, that is: 1) an approximate solution that is optimal up to a small constant factor, e.g. approximation algorithms, or 2) a good-enough solution found reasonably fast, e.g. local search heuristics.

A combinatorial optimization problem Π (maximization or minimization) consists of three parts:

1. A set of instances D
2. For each instance I, a finite set S_D(I) of candidate solutions to I
3. An evaluation function F that assigns to each instance I and each candidate solution s ∈ S_D(I) a positive rational number F(I,s), called the solution value of s.

If Π is a maximisation problem, then an optimal solution s* is one such that F(I,s*) ≥ F(I,s) for all s ∈ S_D(I); OPT(I) denotes the value of an optimal solution to the instance I.

An algorithm A is called an approximation algorithm for Π if it finds a solution s ∈ S_D(I) with value A(I). The algorithm is called an optimization algorithm if A(I) = OPT(I). If Π is NP-complete, no polynomial time optimization algorithm can be found unless P = NP. The option then is to find an approximation algorithm or heuristic, called algorithm A, that runs in reasonable time and, for each instance I of Π, produces A(I) close to OPT(I).

## 2. Basics of Heuristic Search for Optimization

### 2.1 Optimization Problems

Basic ingredients of optimization problems:

- A set of decision variables which affect the value of the objective function. In the travelling salesman problem, the variables might be of the type x_ij ∈ {0,1}, where the value of the variable indicates whether or not city j is visited after city i; in a data-fitting problem, the variables are the parameters that define the model.
- An objective function which we want to minimize or maximize. It relates the decision variables to the goal and measures goal attainment. For instance, in the travelling salesman problem we want to minimise the total distance travelled in visiting all cities in a tour; in fitting experimental data to a user-defined model, we might minimize the total deviation of the observed data from the predictions of the model.
- A set of constraints that allow the variables to take on certain values but exclude others. In some variations of the travelling salesman problem it may not be allowed to visit some cities in a certain sequence, restricting the values that certain decision variables can take in a solution to the problem instance.

The optimization problem is then defined as follows: find values for all the decision variables that minimize or maximize the objective function while satisfying the given constraints.

Formal representations of optimization problems are given by mathematical programming. Mathematical programming is used to describe problems of choosing the best element from some set of available alternatives or, in OR terms, problems of how to allocate limited resources. The mathematical formulation is:

minimize f(x)
subject to g_i(x) ≤ b_i, i = 1,…,m
          h_j(x) = c_j, j = 1,…,n

where x is a vector of decision variables and f(·), g_i(·), h_j(·) are general functions. A similar formulation holds when the objective function has to be maximized.
Placing restrictions on the types of functions under consideration, and on the values that the decision variables can take, leads to specific classes of optimization problems. A solution which satisfies all constraints is called a feasible solution.
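The general formulation above can be sketched directly in code: an instance is an objective f together with inequality constraints g_i(x) ≤ b_i and equality constraints h_j(x) = c_j, and feasibility is just a check of every constraint. The concrete objective and constraints below are illustrative assumptions, not from the document.

```python
# Minimal sketch of the general mathematical-programming form:
#   minimize f(x)  subject to  g_i(x) <= b_i  and  h_j(x) == c_j.
def is_feasible(x, ineqs, eqs, tol=1e-9):
    """ineqs: list of (g, b) pairs with g(x) <= b; eqs: list of (h, c) pairs."""
    return (all(g(x) <= b + tol for g, b in ineqs) and
            all(abs(h(x) - c) <= tol for h, c in eqs))

# An illustrative instance in two variables:
f = lambda x: x[0] ** 2 + x[1] ** 2          # objective to minimize
ineqs = [(lambda x: x[0] + x[1], 10)]        # x0 + x1 <= 10
eqs = [(lambda x: x[0] - x[1], 0)]           # x0 == x1

print(is_feasible((3, 3), ineqs, eqs))   # feasible: 3+3 <= 10 and 3-3 == 0
print(is_feasible((6, 6), ineqs, eqs))   # infeasible: 6+6 > 10
```

Restricting f, g_i and h_j to linear functions gives the LP class of the next section; additionally restricting x to integer values gives IP.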

### 2.2 Linear Programming (LP)

In an LP formulation, f(·), g_i(·), h_j(·) are linear functions of the decision variables. LP is one of the most important tools in OR. The typical LP problem involves:

- limited resources
- competing activities
- a measure of solution quality
- constraints on the use of resources
- the optimal solution

LP is a mathematical model with:

- decision variables and parameters
- linear algebraic expressions (objective function and constraints)
- planning activities

A standard LP formulation describes the problem of finding the optimal allocation of limited resources to competing activities. Note that constraints can involve inequalities (strict or not) as well as equalities.

### 2.3 Integer Programming (IP)

In many applications, the solution of an optimization problem makes sense only if the decision variables are integers, i.e. a solution x ∈ Z^n, where Z^n is the set of n-dimensional integer vectors. In mixed-integer programs, some components of x are allowed to be real. In general, integer linear problems are much more difficult to solve than regular linear programs. Integer programming problems, such as the travelling salesman problem, are often expressed in terms of binary variables.

Linear and integer programming have proved valuable for modelling many and diverse types of problems in manufacturing, routing, scheduling, assignment, and design. Industries that make use of LP and its extensions include transportation, energy, telecommunications, and manufacturing of many kinds.

### 2.4 Discrete/Continuous/Combinatorial Optimization

In discrete optimization, the variables used in the mathematical programming formulation are restricted to assume only discrete values, such as the integers. In continuous optimization, the variables used in the objective function can assume real values, e.g. values from intervals of the real line. In combinatorial optimization, decision variables are discrete, which means that the solution is a set or sequence of integers or other discrete objects; the number of possible solutions is a combinatorial number, which can be extremely large. Combinatorial optimization problems include problems on graphs, matroids and other discrete structures.

An example of combinatorial optimization is the set covering problem. Given a finite set of N items and a set of available features U, the problem is to select the smallest subset of items that together include the subset F ⊆ U of desirable features. Assume that c_ij = 1 if item i includes feature j, and c_ij = 0 otherwise. (The original notes give the mathematical programming formulation of the set covering problem and a concrete instance, which are not reproduced here.)

Why is set covering a difficult optimization problem to solve? For a set covering instance with N = 4 decision variables x_i, each branch of the corresponding binary search tree represents a potential solution, and a search algorithm needs to

explore this search space. As the number of decision variables increases, the search tree may grow exponentially.

### 2.5 Approximation Algorithms

Consider the BIN PACKING problem with a set of items U = {u_1, u_2, …, u_n}, each with a size given by a rational number s_i, and a set of bins each with capacity equal to 1. The optimization problem is to find a partition of U into disjoint subsets U_1, U_2, …, U_k such that the sum of the sizes of the items in each U_i is no more than 1 and such that k is as small as possible. That is, the goal is to pack all the items into as few bins as possible.

The First Fit (FF) algorithm is an approximation algorithm for BIN PACKING:

- given the sequence of unit-capacity bins B_1, B_2, B_3, …
- place the items into the bins one at a time: item u_i is placed in the lowest-indexed bin with enough remaining capacity for its size s_i

It has been proven that the worst-case performance is: FF(I) ≤ 1.7·OPT(I) + 2

The Best Fit (BF) algorithm is another approximation algorithm for BIN PACKING:

- given the sequence of unit-capacity bins B_1, B_2, B_3, …
- place the items into the bins one at a time: item u_i is placed in the bin whose remaining capacity is sufficient for s_i and closest to it, i.e. the fullest bin that can still accommodate the item

It has been proven that the worst-case performance is: BF(I) ≤ 1.7·OPT(I) + 2

The performance of the FF and BF approximation algorithms for BIN PACKING depends very much on the ordering of the items, in particular on whether larger or smaller items appear earlier in the sequence.
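The two placement rules can be transcribed almost directly into code. For convenience, the sketch below scales the unit bin capacity to the integer 10 (avoiding floating-point sizes); the item sizes are an illustrative instance, not from the document.

```python
# First Fit (FF) and Best Fit (BF) for BIN PACKING, with bin capacity
# scaled to the integer 10. Direct transcriptions of the two rules.
def first_fit(sizes, capacity=10):
    bins = []                                # remaining capacities
    for s in sizes:
        for i, cap in enumerate(bins):
            if s <= cap:                     # lowest-indexed bin that fits
                bins[i] = cap - s
                break
        else:
            bins.append(capacity - s)        # open a new bin
    return len(bins)

def best_fit(sizes, capacity=10):
    bins = []
    for s in sizes:
        fits = [i for i, cap in enumerate(bins) if s <= cap]
        if fits:                             # tightest-fitting bin among those that fit
            j = min(fits, key=lambda i: bins[i] - s)
            bins[j] -= s
        else:
            bins.append(capacity - s)
    return len(bins)

sizes = [5, 7, 5, 2, 4, 2, 5, 1, 6]          # total 37, so at least 4 bins are needed
print(first_fit(sizes), best_fit(sizes))     # both use 5 bins here; 4 is achievable
```

On this instance both heuristics use 5 bins even though a packing into 4 bins exists (7+2+1, 6+4, 5+5, 5+2), illustrating that they are approximation algorithms rather than optimization algorithms.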

The First Fit Decreasing (FFD) algorithm is another approximation algorithm for BIN PACKING:

- given the sequence of unit-capacity bins B_1, B_2, B_3, …
- order the items in non-increasing order of their size
- place the items into the bins one at a time: item u_i is placed in the lowest-indexed bin with enough remaining capacity for its size s_i (as in FF)

It has been proven that the worst-case performance is: FFD(I) ≤ 1.22·OPT(I)

### 2.6 Greedy Algorithms and Neighbourhood Search

Real-world optimization problems (combinatorial or continuous) are difficult to solve for several reasons:

- The size of the search space is very large (exhaustive search is impractical)
- Simplified models are needed to facilitate an answer
- It is difficult to design an accurate evaluation function
- A large number of constraints exist
- It is difficult to collect accurate data
- Human limitations when constructing solutions

Hard constraints must be satisfied for a solution to be feasible. Soft constraints are not mandatory, but are desirable conditions in good quality solutions. Given a search space S and a set of feasible solutions F ⊆ S, a search problem is to find a solution x ∈ F that optimises the evaluation function.

Two solutions x, y are said to be neighbours if they are close to each other, in the sense that the distance between their corresponding encodings or representations is within a certain limit.

Some search algorithms work with complete solutions, i.e. they try to go from the current solution to a better one; an example is neighbourhood search. Other search algorithms work with partial solutions, i.e. they construct a complete solution step by step; an example is greedy algorithms.

Example. Consider a symmetric n-city TSP over a complete graph, with solutions encoded as permutations of the integers 1 to n:

- total number of permutations: n!
- ignoring symmetric tours: n!/2
- ignoring shifted identical tours: (n-1)!/2
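FFD differs from FF only in the initial sort, so a sketch needs one extra line. As above, bin capacity is scaled to the integer 10 and the item sizes are illustrative; on this instance the sort is exactly what repairs the wasteful packing that the given item order produces.

```python
# First Fit Decreasing (FFD): sort items largest-first, then apply First Fit.
# Bin capacity is scaled to the integer 10; the instance is illustrative.
def first_fit_decreasing(sizes, capacity=10):
    bins = []                                  # remaining capacities
    for s in sorted(sizes, reverse=True):      # non-increasing order of size
        for i, cap in enumerate(bins):
            if s <= cap:                       # lowest-indexed bin that fits (as in FF)
                bins[i] = cap - s
                break
        else:
            bins.append(capacity - s)          # open a new bin
    return len(bins)

sizes = [5, 7, 5, 2, 4, 2, 5, 1, 6]            # total size 37 => at least 4 bins
print(first_fit_decreasing(sizes))             # 4 bins: 7+2+1, 6+4, 5+5, 5+2
```

Placing the large items first leaves the small items to fill the gaps, which is why FFD's worst-case ratio (1.22) is much better than FF's (1.7).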

**Evaluation function.** Let S be the set of solutions and let a solution s be a tour c_1 c_2 … c_n. The tour length is

f(s) = d(c_1,c_2) + d(c_2,c_3) + … + d(c_{n-1},c_n) + d(c_n,c_1)

and the objective is to find s' ∈ S such that f(s') ≤ f(s) for all s ∈ S, i.e. a tour with the minimum length.

**Neighbourhood solutions.** A 2-opt move interchanges two non-adjacent edges of the current tour, reversing the segment between them. For the current tour 1-2-3-4-5, this produces neighbours such as 1-3-2-4-5 and 1-2-4-3-5. A right-insert move moves a city two positions to the right: for the current tour 1-2-3-4-5, moving city 2 produces the neighbour 1-3-4-2-5.

**Greedy Algorithm 1:**

1. Select a random starting city.
2. Proceed to the nearest unvisited city.
3. Repeat step 2 until all cities are visited.
4. Return to the starting city.

**Greedy Algorithm 2:**

1. Find the shortest edge (c_i, c_j) and add cities c_i, c_j to the tour.
2. Select the next cheapest edge (c_r, c_t) (making sure no city is visited more than once).
3. Add cities c_r, c_t to the tour.
4. Repeat steps 2-3 until a tour is formed.

In combinatorial optimization problems, the neighbourhood of a solution x in the search space S can be defined as

N(x) = { y ∈ S : y = φ(x) }, where φ is a move or a sequence of moves.

A feasible solution x is a local optimum with respect to N(x) if

f(x) ≤ f(y) for all y ∈ N(x)   (assuming minimisation).

If the inequality above is strict, then x is a strict local optimum.
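The evaluation function, Greedy Algorithm 1 (nearest neighbour) and the 2-opt move can be sketched together for a small symmetric instance. The 4-city distance matrix below is an illustrative assumption, not taken from the document.

```python
# Tour evaluation, nearest-neighbour construction (Greedy Algorithm 1),
# and a 2-opt move, for a symmetric TSP given by a distance matrix d.
def tour_length(tour, d):
    # f(s) = d(c1,c2) + ... + d(c_{n-1},c_n) + d(c_n,c_1)
    return sum(d[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def nearest_neighbour(d, start=0):
    # Greedy Algorithm 1: repeatedly move to the nearest unvisited city.
    tour, unvisited = [start], set(range(len(d))) - {start}
    while unvisited:
        nxt = min(unvisited, key=lambda j: d[tour[-1]][j])
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour                      # the return edge to the start is implicit

def two_opt_move(tour, i, j):
    """Reverse tour[i..j]: interchanges two non-adjacent edges of the tour."""
    return tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]

d = [[0, 2, 9, 10],                  # illustrative symmetric distance matrix
     [2, 0, 6, 4],
     [9, 6, 0, 3],
     [10, 4, 3, 0]]
t = nearest_neighbour(d)
print(t, tour_length(t, d))          # greedy tour and its length f(s)
```

Applying `two_opt_move` to every valid (i, j) pair enumerates the 2-opt neighbourhood N(x) of a tour, which is the neighbourhood a local search would scan.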

A fitness landscape includes local optima, global optima, hills, valleys and plateaus (the original notes illustrate this with a graph, not reproduced here).

Neighbourhood search refers to exploring solutions within the neighbourhood of the current solution with the goal of finding the best solution in the vicinity, i.e. a local optimum. There is a trade-off in neighbourhood search between the size of N(x) and the search time. Note: defining appropriate neighbourhoods according to the problem domain and the search strategy is crucial in heuristic local search.

**Neighbourhood size.** The k-exchange neighbourhood is widely used in many combinatorial optimization problems; typically, the size of a k-exchange neighbourhood is O(n^k). Neighbourhood pruning considers only a subset of neighbours, selected based on the state of the current solution.

**Some strategies for escaping local optima:**

- Re-initialise the search from a different starting solution
- Accept non-improving candidate solutions to replace the current solution, in a deterministic or probabilistic manner
- Explore only a fraction of the neighbourhood (e.g. pruning)
- Use more than one neighbourhood to generate candidate solutions
- Design composite (independent or not) neighbourhoods
- Maintain a memory of already visited solutions or regions
- Modify the fitness function to change the search landscape
- Perturb local optima in a deterministic or probabilistic manner
- Improve more than one solution simultaneously
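A minimal sketch of neighbourhood search, plus the first escape strategy in the list (re-initialising from a different starting solution): best-improvement hill climbing on the integers 0..9, where the neighbourhood of x is {x-1, x+1}. The two-peak fitness function is a toy landscape invented for illustration.

```python
# Best-improvement neighbourhood search on a toy landscape with a local
# optimum at x=2 and a global optimum at x=8, plus random restarts.
import random

def fitness(x):                       # toy landscape: two peaks
    return {2: 5, 8: 9}.get(x, -abs(x - 8))

def hill_climb(x, n=10):
    while True:
        neigh = [y for y in (x - 1, x + 1) if 0 <= y < n]   # N(x) = {x-1, x+1}
        best = max(neigh, key=fitness)
        if fitness(best) <= fitness(x):
            return x                  # x is a local optimum w.r.t. N(x)
        x = best

def restarts(tries=20, n=10, seed=1):
    rng = random.Random(seed)
    return max((hill_climb(rng.randrange(n), n) for _ in range(tries)),
               key=fitness)

print(hill_climb(0))   # 2: the search is trapped in the local optimum
print(restarts())      # 8: some restart climbs to the global optimum
```

Starting from 0 the search climbs to the local optimum at 2 and stops, since both neighbours are worse; restarting from enough different points eventually lands in the basin of the global optimum at 8.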

### 2.7 Basics of Meta-heuristics

Greedy heuristics and simple local search have limitations with respect to the quality of the solutions that they can produce. A good balance between intensification and diversification is crucial in order to obtain high-quality solutions for difficult problems.

A meta-heuristic can be defined as an iterative master process that guides and modifies the operations of subordinate heuristics to efficiently produce high-quality solutions. It may manipulate a complete (or incomplete) single solution, or a collection of solutions, at each iteration. The subordinate heuristics may be high (or low) level procedures, a simple local search, or just a construction method (Voss et al. 1999). A meta-heuristic method is meant to provide a generalized approach that can be applied to different problems. Meta-heuristics do not need a formal mathematical model of the problem to solve.

An example of a basic meta-heuristic is iterated local search, an intuitive method for combining intensification and diversification. The local search and perturbation operators should be carefully designed so that they work well in combination. (The original notes include a flow diagram of iterated local search, not reproduced here.)
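The iterated local search cycle just described can be sketched as: run local search to a local optimum, perturb it, run local search again, and accept the result if it improves. The sketch below uses 2-opt local search and a random city-swap perturbation on a small symmetric TSP; the instance, the perturbation choice and the iteration budget are all illustrative assumptions.

```python
# Iterated local search (ILS) sketch: local search (intensification) plus
# perturbation (diversification), accepting only improving local optima.
import random

def tour_length(tour, d):
    return sum(d[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def local_search(tour, d):                     # best-found 2-opt descent
    improved = True
    while improved:
        improved = False
        for i in range(1, len(tour) - 1):
            for j in range(i + 1, len(tour)):
                cand = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]
                if tour_length(cand, d) < tour_length(tour, d):
                    tour, improved = cand, True
    return tour                                # a local optimum w.r.t. 2-opt

def perturb(tour, rng):                        # random swap of two cities
    i, j = sorted(rng.sample(range(1, len(tour)), 2))
    t = tour[:]
    t[i], t[j] = t[j], t[i]
    return t

def iterated_local_search(d, iters=30, seed=0):
    rng = random.Random(seed)
    best = local_search(list(range(len(d))), d)
    for _ in range(iters):
        cand = local_search(perturb(best, rng), d)
        if tour_length(cand, d) < tour_length(best, d):
            best = cand                        # accept only improvements
    return best, tour_length(best, d)

d = [[0, 2, 9, 10, 7],                         # illustrative symmetric instance
     [2, 0, 6, 4, 3],
     [9, 6, 0, 3, 5],
     [10, 4, 3, 0, 8],
     [7, 3, 5, 8, 0]]
print(iterated_local_search(d))
```

The acceptance rule here is the simplest possible (improvements only); accepting some non-improving perturbed solutions is one of the escape strategies listed earlier and yields other ILS variants.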

**Some types of meta-heuristics:**

- Single-solution vs. population-based methods
- Nature-inspired vs. non-nature-inspired methods
- Pure vs. hybrid methods
- Memory-based vs. memory-less methods
- Deterministic vs. stochastic methods
- Iterative vs. greedy methods

The following algorithms are examples of meta-heuristics: iterated local search, threshold acceptance, great deluge, simulated annealing, greedy randomised search procedures, guided local search, variable neighbourhood search, tabu search, evolutionary algorithms, particle swarm optimization, artificial immune systems, etc.

### 2.8 Basics of Evolutionary Algorithms

The rationale of evolutionary algorithms (EAs) is to maintain a population of solutions during the search. The solutions (individuals) compete between themselves and are subject to selection and reproduction operators over a number of generations. Key ideas include:

- Exploitation vs. exploration
- Having many solutions instead of only one
- The survival-of-the-fittest principle
- Passing on good solution components through recombination
- Exploring and discovering new components through self-adaptation

Solutions are modified from generation to generation by means of reproduction operators (recombination and self-adaptation).

**Typical components of an EA** (combined in the typical evolutionary algorithm cycle, illustrated by a diagram in the original notes):

- Solution representation
- Population size
- Initialisation strategy
- Evaluation function
- Reproduction operators
- Selection mechanism
- Stopping condition to detect convergence

A diverse range of evolutionary procedures can be designed for a given problem by tuning the above components. It is crucial to design components and tune parameters while considering the choices made in the various parts of the algorithm.

**Some examples of evolutionary algorithms:**

- Genetic Algorithms (GA)
- Evolution Strategies (ES)
- Genetic Programming (GP)
- Ant Colony Optimization (ACO)
- Memetic Algorithms (MA)

- Particle Swarm Optimization (PSO)
- Differential Evolution (DE)
- Estimation of Distribution Algorithms (EDA)
- Cultural Algorithms (CA)

Since this is a very active research area, new variants of EAs and other population-based meta-heuristics are often proposed in the specialised literature.

In optimization problems the goal is to find the best feasible solution. Issues when dealing with infeasible solutions include:

- Evaluating the quality of feasible and infeasible solutions
- Discriminating between feasible and infeasible solutions
- Deciding whether infeasibility should be eliminated
- Deciding whether infeasibility should be repaired
- Deciding whether infeasibility should be penalised
- Initiating the search with feasible solutions, infeasible ones, or both
- Maintaining feasibility with specialised representations, moves and operators
- Designing a decoder to translate an encoding into a feasible solution
- Focusing the search on specific areas, e.g. on the boundaries of the feasible region

A constraint handling technique is required to support effective and efficient exploration of the search space.
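The typical EA cycle and components listed in Section 2.8 can be sketched as a minimal genetic algorithm. The sketch below uses a bit-string representation with the standard OneMax toy fitness (maximise the number of 1s); the population size, tournament selection, one-point crossover, per-bit mutation rate and generation count are all illustrative parameter choices, not values from the document.

```python
# Minimal generational GA assembling the typical EA components:
# representation (bit strings), initialisation (random), evaluation (OneMax),
# selection (tournament of 2), recombination (one-point crossover),
# self-adaptation (per-bit mutation) and a fixed-generation stopping condition.
import random

def evolve(n_bits=20, pop_size=30, generations=60, seed=0):
    rng = random.Random(seed)
    fitness = lambda ind: sum(ind)                     # OneMax: count the 1s
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        def select():                                  # tournament of size 2
            a, b = rng.sample(pop, 2)
            return a if fitness(a) >= fitness(b) else b
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = select(), select()
            cut = rng.randrange(1, n_bits)             # one-point crossover
            child = p1[:cut] + p2[cut:]
            child = [b ^ (rng.random() < 1 / n_bits) for b in child]  # mutation
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

best = evolve()
print(sum(best), "ones out of", len(best))
```

Every bit string here is feasible, so no constraint handling is needed; for constrained problems, any of the strategies listed above (penalties, repair, decoders, specialised operators) would be plugged into the evaluation or reproduction steps of this same cycle.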


### Artificial Intelligence

Artificial Intelligence Information Systems and Machine Learning Lab (ISMLL) Tomáš Horváth 10 rd November, 2010 Informed Search and Exploration Example (again) Informed strategy we use a problem-specific

### Sparse Matrices Reordering using Evolutionary Algorithms: A Seeded Approach

1 Sparse Matrices Reordering using Evolutionary Algorithms: A Seeded Approach David Greiner, Gustavo Montero, Gabriel Winter Institute of Intelligent Systems and Numerical Applications in Engineering (IUSIANI)

### Dr. Mustafa Jarrar. Chapter 4 Informed Searching. Sina Institute, University of Birzeit

Lecture Notes, Advanced Artificial Intelligence (SCOM7341) Sina Institute, University of Birzeit 2 nd Semester, 2012 Advanced Artificial Intelligence (SCOM7341) Chapter 4 Informed Searching Dr. Mustafa

### Algorithm Design and Analysis

Algorithm Design and Analysis LECTURE 29 Approximation Algorithms Load Balancing Weighted Vertex Cover Reminder: Fill out SRTEs online Don t forget to click submit Sofya Raskhodnikova 12/7/2016 Approximation

### Evolutionary Non-linear Great Deluge for University Course Timetabling

Evolutionary Non-linear Great Deluge for University Course Timetabling Dario Landa-Silva and Joe Henry Obit Automated Scheduling, Optimisation and Planning Research Group School of Computer Science, The

### 11.1 Facility Location

CS787: Advanced Algorithms Scribe: Amanda Burton, Leah Kluegel Lecturer: Shuchi Chawla Topic: Facility Location ctd., Linear Programming Date: October 8, 2007 Today we conclude the discussion of local

### NP-Hardness. We start by defining types of problem, and then move on to defining the polynomial-time reductions.

CS 787: Advanced Algorithms NP-Hardness Instructor: Dieter van Melkebeek We review the concept of polynomial-time reductions, define various classes of problems including NP-complete, and show that 3-SAT

### An Evolutionary Algorithm for the Multi-objective Shortest Path Problem

An Evolutionary Algorithm for the Multi-objective Shortest Path Problem Fangguo He Huan Qi Qiong Fan Institute of Systems Engineering, Huazhong University of Science & Technology, Wuhan 430074, P. R. China

### Informed search algorithms. (Based on slides by Oren Etzioni, Stuart Russell)

Informed search algorithms (Based on slides by Oren Etzioni, Stuart Russell) The problem # Unique board configurations in search space 8-puzzle 9! = 362880 15-puzzle 16! = 20922789888000 10 13 24-puzzle

### Today s s lecture. Lecture 3: Search - 2. Problem Solving by Search. Agent vs. Conventional AI View. Victor R. Lesser. CMPSCI 683 Fall 2004

Today s s lecture Search and Agents Material at the end of last lecture Lecture 3: Search - 2 Victor R. Lesser CMPSCI 683 Fall 2004 Continuation of Simple Search The use of background knowledge to accelerate

### First-improvement vs. Best-improvement Local Optima Networks of NK Landscapes

First-improvement vs. Best-improvement Local Optima Networks of NK Landscapes Gabriela Ochoa 1, Sébastien Verel 2, and Marco Tomassini 3 1 School of Computer Science, University of Nottingham, Nottingham,

### Copyright 2007 Pearson Addison-Wesley. All rights reserved. A. Levitin Introduction to the Design & Analysis of Algorithms, 2 nd ed., Ch.

Iterative Improvement Algorithm design technique for solving optimization problems Start with a feasible solution Repeat the following step until no improvement can be found: change the current feasible

### Hardware-Software Codesign

Hardware-Software Codesign 4. System Partitioning Lothar Thiele 4-1 System Design specification system synthesis estimation SW-compilation intellectual prop. code instruction set HW-synthesis intellectual

### Model Parameter Estimation

Model Parameter Estimation Shan He School for Computational Science University of Birmingham Module 06-23836: Computational Modelling with MATLAB Outline Outline of Topics Concepts about model parameter

### COMP 355 Advanced Algorithms Approximation Algorithms: VC and TSP Chapter 11 (KT) Section (CLRS)

COMP 355 Advanced Algorithms Approximation Algorithms: VC and TSP Chapter 11 (KT) Section 35.1-35.2(CLRS) 1 Coping with NP-Completeness Brute-force search: This is usually only a viable option for small

### CT79 SOFT COMPUTING ALCCS-FEB 2014

Q.1 a. Define Union, Intersection and complement operations of Fuzzy sets. For fuzzy sets A and B Figure Fuzzy sets A & B The union of two fuzzy sets A and B is a fuzzy set C, written as C=AUB or C=A OR

### Comparison of TSP Algorithms

Comparison of TSP Algorithms Project for Models in Facilities Planning and Materials Handling December 1998 Participants: Byung-In Kim Jae-Ik Shim Min Zhang Executive Summary Our purpose in this term project

### Combinatorial Optimization - Lecture 14 - TSP EPFL

Combinatorial Optimization - Lecture 14 - TSP EPFL 2012 Plan Simple heuristics Alternative approaches Best heuristics: local search Lower bounds from LP Moats Simple Heuristics Nearest Neighbor (NN) Greedy

### ABSTRACT I. INTRODUCTION. J Kanimozhi *, R Subramanian Department of Computer Science, Pondicherry University, Puducherry, Tamil Nadu, India

ABSTRACT 2018 IJSRSET Volume 4 Issue 4 Print ISSN: 2395-1990 Online ISSN : 2394-4099 Themed Section : Engineering and Technology Travelling Salesman Problem Solved using Genetic Algorithm Combined Data

### 3 No-Wait Job Shops with Variable Processing Times

3 No-Wait Job Shops with Variable Processing Times In this chapter we assume that, on top of the classical no-wait job shop setting, we are given a set of processing times for each operation. We may select

### Variable Neighborhood Search

Variable Neighborhood Search Hansen and Mladenovic, Variable neighborhood search: Principles and applications, EJOR 43 (2001) 1 Basic notions of VNS Systematic change of the neighborhood in search Does

### Module 6 P, NP, NP-Complete Problems and Approximation Algorithms

Module 6 P, NP, NP-Complete Problems and Approximation Algorithms Dr. Natarajan Meghanathan Associate Professor of Computer Science Jackson State University Jackson, MS 39217 E-mail: natarajan.meghanathan@jsums.edu

### Origins of Operations Research: World War II

ESD.83 Historical Roots Assignment METHODOLOGICAL LINKS BETWEEN OPERATIONS RESEARCH AND STOCHASTIC OPTIMIZATION Chaiwoo Lee Jennifer Morris 11/10/2010 Origins of Operations Research: World War II Need

### Chapter 9 Graph Algorithms

Introduction graph theory useful in practice represent many real-life problems can be if not careful with data structures Chapter 9 Graph s 2 Definitions Definitions an undirected graph is a finite set

### Rollout Algorithms for Discrete Optimization: A Survey

Rollout Algorithms for Discrete Optimization: A Survey by Dimitri P. Bertsekas Massachusetts Institute of Technology Cambridge, MA 02139 dimitrib@mit.edu August 2010 Abstract This chapter discusses rollout

### Traveling Salesman Problem. Algorithms and Networks 2014/2015 Hans L. Bodlaender Johan M. M. van Rooij

Traveling Salesman Problem Algorithms and Networks 2014/2015 Hans L. Bodlaender Johan M. M. van Rooij 1 Contents TSP and its applications Heuristics and approximation algorithms Construction heuristics,

### Heuristic Optimisation Lecture Notes

Heuristic Optimisation Lecture Notes Sándor Zoltán Németh School of Mathematics, University of Birmingham Watson Building, Room 34 Email: s.nemeth@bham.ac.uk Overview Module structure Topics Assessment

### Combining Two Local Searches with Crossover: An Efficient Hybrid Algorithm for the Traveling Salesman Problem

Combining Two Local Searches with Crossover: An Efficient Hybrid Algorithm for the Traveling Salesman Problem Weichen Liu, Thomas Weise, Yuezhong Wu and Qi Qi University of Science and Technology of Chine

### Solving Traveling Salesman Problem for Large Spaces using Modified Meta- Optimization Genetic Algorithm

Solving Traveling Salesman Problem for Large Spaces using Modified Meta- Optimization Genetic Algorithm Maad M. Mijwel Computer science, college of science, Baghdad University Baghdad, Iraq maadalnaimiy@yahoo.com

### Innovative Systems Design and Engineering ISSN (Paper) ISSN (Online) Vol.5, No.1, 2014

Abstract Tool Path Optimization of Drilling Sequence in CNC Machine Using Genetic Algorithm Prof. Dr. Nabeel Kadim Abid Al-Sahib 1, Hasan Fahad Abdulrazzaq 2* 1. Thi-Qar University, Al-Jadriya, Baghdad,

### A NEW HEURISTIC ALGORITHM FOR MULTIPLE TRAVELING SALESMAN PROBLEM

TWMS J. App. Eng. Math. V.7, N.1, 2017, pp. 101-109 A NEW HEURISTIC ALGORITHM FOR MULTIPLE TRAVELING SALESMAN PROBLEM F. NURIYEVA 1, G. KIZILATES 2, Abstract. The Multiple Traveling Salesman Problem (mtsp)

### A New Selection Operator - CSM in Genetic Algorithms for Solving the TSP

A New Selection Operator - CSM in Genetic Algorithms for Solving the TSP Wael Raef Alkhayri Fahed Al duwairi High School Aljabereyah, Kuwait Suhail Sami Owais Applied Science Private University Amman,

### ARTIFICIAL INTELLIGENCE

BABEŞ-BOLYAI UNIVERSITY Faculty of Computer Science and Mathematics ARTIFICIAL INTELLIGENCE Solving search problems Informed search strategies Global and local 2 Topics A. Short introduction in Artificial

### Integer Programming ISE 418. Lecture 1. Dr. Ted Ralphs

Integer Programming ISE 418 Lecture 1 Dr. Ted Ralphs ISE 418 Lecture 1 1 Reading for This Lecture N&W Sections I.1.1-I.1.4 Wolsey Chapter 1 CCZ Chapter 2 ISE 418 Lecture 1 2 Mathematical Optimization Problems

### Chapter 9 Graph Algorithms

Chapter 9 Graph Algorithms 2 Introduction graph theory useful in practice represent many real-life problems can be if not careful with data structures 3 Definitions an undirected graph G = (V, E) is a

### Benchmark Functions for the CEC 2008 Special Session and Competition on Large Scale Global Optimization

Benchmark Functions for the CEC 2008 Special Session and Competition on Large Scale Global Optimization K. Tang 1, X. Yao 1, 2, P. N. Suganthan 3, C. MacNish 4, Y. P. Chen 5, C. M. Chen 5, Z. Yang 1 1

### Topic: Local Search: Max-Cut, Facility Location Date: 2/13/2007

CS880: Approximations Algorithms Scribe: Chi Man Liu Lecturer: Shuchi Chawla Topic: Local Search: Max-Cut, Facility Location Date: 2/3/2007 In previous lectures we saw how dynamic programming could be

### Solving Travelling Salesman Problem Using Variants of ABC Algorithm

Volume 2, No. 01, March 2013 ISSN 2278-1080 The International Journal of Computer Science & Applications (TIJCSA) RESEARCH PAPER Available Online at http://www.journalofcomputerscience.com/ Solving Travelling

### RESEARCH ARTICLE. Accelerating Ant Colony Optimization for the Traveling Salesman Problem on the GPU

The International Journal of Parallel, Emergent and Distributed Systems Vol. 00, No. 00, Month 2011, 1 21 RESEARCH ARTICLE Accelerating Ant Colony Optimization for the Traveling Salesman Problem on the

### Algorithm Analysis. Gunnar Gotshalks. AlgAnalysis 1

Algorithm Analysis AlgAnalysis 1 How Fast is an Algorithm? 1 Measure the running time» Run the program for many data types > Use System.currentTimeMillis to record the time Worst Time Average Best» Usually

### Random Search Report An objective look at random search performance for 4 problem sets

Random Search Report An objective look at random search performance for 4 problem sets Dudon Wai Georgia Institute of Technology CS 7641: Machine Learning Atlanta, GA dwai3@gatech.edu Abstract: This report

### DIT411/TIN175, Artificial Intelligence. Peter Ljunglöf. 23 January, 2018

DIT411/TIN175, Artificial Intelligence Chapters 3 4: More search algorithms CHAPTERS 3 4: MORE SEARCH ALGORITHMS DIT411/TIN175, Artificial Intelligence Peter Ljunglöf 23 January, 2018 1 TABLE OF CONTENTS

### Ant Colony Optimization: A Component-Wise Overview

Université Libre de Bruxelles Institut de Recherches Interdisciplinaires et de Développements en Intelligence Artificielle Ant Colony Optimization: A Component-Wise Overview M. López-Ibáñez, T. Stützle,

### 4/22/2014. Genetic Algorithms. Diwakar Yagyasen Department of Computer Science BBDNITM. Introduction

4/22/24 s Diwakar Yagyasen Department of Computer Science BBDNITM Visit dylycknow.weebly.com for detail 2 The basic purpose of a genetic algorithm () is to mimic Nature s evolutionary approach The algorithm

### Tabu Search for Constraint Solving and Its Applications. Jin-Kao Hao LERIA University of Angers 2 Boulevard Lavoisier Angers Cedex 01 - France

Tabu Search for Constraint Solving and Its Applications Jin-Kao Hao LERIA University of Angers 2 Boulevard Lavoisier 49045 Angers Cedex 01 - France 1. Introduction The Constraint Satisfaction Problem (CSP)

### Artificial Intelligence Application (Genetic Algorithm)

Babylon University College of Information Technology Software Department Artificial Intelligence Application (Genetic Algorithm) By Dr. Asaad Sabah Hadi 2014-2015 EVOLUTIONARY ALGORITHM The main idea about

### Dynamic programming. Trivial problems are solved first More complex solutions are composed from the simpler solutions already computed

Dynamic programming Solves a complex problem by breaking it down into subproblems Each subproblem is broken down recursively until a trivial problem is reached Computation itself is not recursive: problems

### Machine Learning for Software Engineering

Machine Learning for Software Engineering Single-State Meta-Heuristics Prof. Dr.-Ing. Norbert Siegmund Intelligent Software Systems 1 2 Recap: Goal is to Find the Optimum Challenges of general optimization

### Time Complexity Analysis of the Genetic Algorithm Clustering Method

Time Complexity Analysis of the Genetic Algorithm Clustering Method Z. M. NOPIAH, M. I. KHAIRIR, S. ABDULLAH, M. N. BAHARIN, and A. ARIFIN Department of Mechanical and Materials Engineering Universiti

### MLR Institute of Technology

Course Name : Engineering Optimization Course Code : 56021 Class : III Year Branch : Aeronautical Engineering Year : 2014-15 Course Faculty : Mr Vamsi Krishna Chowduru, Assistant Professor Course Objective

### Evolutionary Algorithms. CS Evolutionary Algorithms 1

Evolutionary Algorithms CS 478 - Evolutionary Algorithms 1 Evolutionary Computation/Algorithms Genetic Algorithms l Simulate natural evolution of structures via selection and reproduction, based on performance

### Design and Analysis of Algorithms

CSE 101, Winter 018 D/Q Greed SP s DP LP, Flow B&B, Backtrack Metaheuristics P, NP Design and Analysis of Algorithms Lecture 8: Greed Class URL: http://vlsicad.ucsd.edu/courses/cse101-w18/ Optimization

### Computational complexity

Computational complexity Heuristic Algorithms Giovanni Righini University of Milan Department of Computer Science (Crema) Definitions: problems and instances A problem is a general question expressed in

### A Parallel Architecture for the Generalized Travelling Salesman Problem: Project Proposal

A Parallel Architecture for the Generalized Travelling Salesman Problem: Project Proposal Max Scharrenbroich, maxfs at umd.edu Dr. Bruce Golden, R. H. Smith School of Business, bgolden at rhsmith.umd.edu

### Polynomial-Time Approximation Algorithms

6.854 Advanced Algorithms Lecture 20: 10/27/2006 Lecturer: David Karger Scribes: Matt Doherty, John Nham, Sergiy Sidenko, David Schultz Polynomial-Time Approximation Algorithms NP-hard problems are a vast

### LECTURE 9 Data Structures: A systematic way of organizing and accessing data. --No single data structure works well for ALL purposes.

LECTURE 9 Data Structures: A systematic way of organizing and accessing data. --No single data structure works well for ALL purposes. Input Algorithm Output An algorithm is a step-by-step procedure for

### CS 440 Theory of Algorithms /

CS 440 Theory of Algorithms / CS 468 Algorithms in Bioinformaticsi Brute Force. Design and Analysis of Algorithms - Chapter 3 3-0 Brute Force A straightforward approach usually based on problem statement

### Outline. Best-first search. Greedy best-first search A* search Heuristics Local search algorithms

Outline Best-first search Greedy best-first search A* search Heuristics Local search algorithms Hill-climbing search Beam search Simulated annealing search Genetic algorithms Constraint Satisfaction Problems

### CLASS: II YEAR / IV SEMESTER CSE CS 6402-DESIGN AND ANALYSIS OF ALGORITHM UNIT I INTRODUCTION

CLASS: II YEAR / IV SEMESTER CSE CS 6402-DESIGN AND ANALYSIS OF ALGORITHM UNIT I INTRODUCTION 1. What is performance measurement? 2. What is an algorithm? 3. How the algorithm is good? 4. What are the

### Algorithms and Experimental Study for the Traveling Salesman Problem of Second Order. Gerold Jäger

Algorithms and Experimental Study for the Traveling Salesman Problem of Second Order Gerold Jäger joint work with Paul Molitor University Halle-Wittenberg, Germany August 22, 2008 Overview 1 Introduction

### Informed Search and Exploration

Informed Search and Exploration Chapter 4 (4.1-4.3) CS 2710 1 Introduction Ch.3 searches good building blocks for learning about search But vastly inefficient eg: Can we do better? Breadth Depth Uniform

### 1 Better Approximation of the Traveling Salesman

Stanford University CS261: Optimization Handout 4 Luca Trevisan January 13, 2011 Lecture 4 In which we describe a 1.5-approximate algorithm for the Metric TSP, we introduce the Set Cover problem, observe

### VNS-based heuristic with an exponential neighborhood for the server load balancing problem

Available online at www.sciencedirect.com Electronic Notes in Discrete Mathematics 47 (2015) 53 60 www.elsevier.com/locate/endm VNS-based heuristic with an exponential neighborhood for the server load

### CSC Design and Analysis of Algorithms. Lecture 4 Brute Force, Exhaustive Search, Graph Traversal Algorithms. Brute-Force Approach

CSC 8301- Design and Analysis of Algorithms Lecture 4 Brute Force, Exhaustive Search, Graph Traversal Algorithms Brute-Force Approach Brute force is a straightforward approach to solving a problem, usually

### Parameterized Complexity - an Overview

Parameterized Complexity - an Overview 1 / 30 Parameterized Complexity - an Overview Ue Flarup 1 flarup@imada.sdu.dk 1 Department of Mathematics and Computer Science University of Southern Denmark, Odense,

### A Late Acceptance Hill-Climbing algorithm the winner of the International Optimisation Competition

The University of Nottingham, Nottingham, United Kingdom A Late Acceptance Hill-Climbing algorithm the winner of the International Optimisation Competition Yuri Bykov 16 February 2012 ASAP group research

### Simulated Annealing. G5BAIM: Artificial Intelligence Methods. Graham Kendall. 15 Feb 09 1

G5BAIM: Artificial Intelligence Methods Graham Kendall 15 Feb 09 1 G5BAIM Artificial Intelligence Methods Graham Kendall Simulated Annealing Simulated Annealing Motivated by the physical annealing process

### 16.410/413 Principles of Autonomy and Decision Making

16.410/413 Principles of Autonomy and Decision Making Lecture 17: The Simplex Method Emilio Frazzoli Aeronautics and Astronautics Massachusetts Institute of Technology November 10, 2010 Frazzoli (MIT)

### Hill Climbing. Assume a heuristic value for each assignment of values to all variables. Maintain an assignment of a value to each variable.

Hill Climbing Many search spaces are too big for systematic search. A useful method in practice for some consistency and optimization problems is hill climbing: Assume a heuristic value for each assignment

### Chapter 9 Graph Algorithms

Chapter 9 Graph Algorithms 2 Introduction graph theory useful in practice represent many real-life problems can be slow if not careful with data structures 3 Definitions an undirected graph G = (V, E)

### A Genetic Algorithm Framework

Fast, good, cheap. Pick any two. The Project Triangle 3 A Genetic Algorithm Framework In this chapter, we develop a genetic algorithm based framework to address the problem of designing optimal networks

### SPERNER S LEMMA MOOR XU

SPERNER S LEMMA MOOR XU Abstract. Is it possible to dissect a square into an odd number of triangles of equal area? This question was first answered by Paul Monsky in 970, and the solution requires elements

### Algorithms and Data Structures

Algorithm Analysis Page 1 - Algorithm Analysis Dr. Fall 2008 Algorithm Analysis Page 2 Outline Textbook Overview Analysis of Algorithm Pseudo-Code and Primitive Operations Growth Rate and Big-Oh Notation