CS 231: Algorithmic Problem Solving

CS 231: Algorithmic Problem Solving Naomi Nishimura Module 7 Date of this version: January 28, 2019 WARNING: Drafts of slides are made available prior to lecture for your convenience. After lecture, slides will be updated to reflect material taught. Check the date on this page to make sure you have the correct, updated version. WARNING: Slides do not include all class material; if you have missed a lecture, make sure to find out from a classmate what material was presented verbally or on the board. CS 231 Module 7 Compromising on speed 1 / 48

Being lucky What I said: A problem being NP-complete means that it is unlikely that there exists an algorithm that is guaranteed to run in time polynomial in the size of the input and is guaranteed to produce the correct answer. What I didn't say: The problem cannot be solved quickly on small instances. The problem cannot be solved quickly on special instances. You might be lucky if: The problem instance is small. Exponential in a small number may still be a reasonable running time. The problem instance is of a special kind. Try solving Hamiltonian Cycle on a tree. CS 231 Module 7 Compromising on speed 2 / 48

Searching in a methodical way Example: Generate all sequences of size 3 over {1,2,3,4}. Group sequences by the first number in the sequence, then the second, then the third. [Tree of sequences, grouped by first number and then second number: under First 1: 123, 124, 132, 134, 142, 143; under First 2: 213, 214, 231, 234, 241, 243; under First 3: 312, 314, 321, 324, 341, 342; under First 4: 412, 413, 421, 423, 431, 432.] CS 231 Module 7 Compromising on speed 3 / 48

Solutions and partial solutions For many problems, the solution is formed from the input, such as: an ordering of elements, an arrangement of elements, a subset of elements, or a partition of elements. A partial solution is a partial specification that corresponds to the set of all solutions consistent with the information specified. For example, 1?? is a partial solution for sequences of size 3 over {1,2,3,4}. It corresponds to all the sequences that have 1 in the first position. Note: Because the input might be a graph and the elements might be vertices, to avoid confusion we refer to nodes in the search tree. CS 231 Module 7 Compromising on speed 4 / 48

Search trees The root corresponds to the set of all the solutions. A leaf corresponds to a particular solution. The sets associated with the children of a node form a partition of the set of solutions associated with the node. [Search tree: root ???; children 1??, 2??, 3??, 4??; grandchildren 12?, 13?, 14?, 21?, 23?, 24?, 31?, 32?, 34?, 41?, 42?, 43?; leaves are the 24 complete sequences, 123 through 432.] CS 231 Module 7 Compromising on speed 5 / 48

Exploring a search tree Pro: A search tree contains all feasible solutions. Con: Constructing the tree is very time-consuming. Ideas to use: Create the tree as you search, forming only the nodes that you need. To explore a node, generate and explore the subtree rooted at each possible child. When a node has been completely explored, backtrack to continue exploration of its parent. CS 231 Module 7 Compromising on speed 6 / 48

Using recursion to explore a tree Base cases: Reach a solution Reach a partial solution that cannot be extended General case: Recursion is used to form a bigger tree from the smaller trees rooted at the children of a node. The first child is explored completely before the second child. All children of a node are explored completely before backtracking from the node to its parent. CS 234 covers a similar type of exploration of graphs, known as depth-first search. Variants: Generate all solutions (return a list of all solutions found) Generate one solution (stop when one solution is found) Generate the best solution (keep track of the best solution so far) CS 231 Module 7 Compromising on speed 7 / 48

Paradigm: Backtracking Ideas: View a partial solution as a list. Try to extend a partial solution of length k by adding an element in position k + 1. If extension is possible, process the new node next. If extension is not possible, backtrack by undoing the last decision. Recipe for backtracking: 1. Specify a partial solution. 2. Determine how the children of a node are formed. 3. Choose when to backtrack. Note: In contrast to greedy algorithms, backtracking algorithms do not eliminate solutions as they go, and decisions can be undone. CS 231 Module 7 Backtracking 8 / 48

The implicit backtracking tree Each node represents a partial solution. The root of the tree is the initial partial solution. For a node with partial solution p of length k: If there are no candidates for position k + 1 in the list, the node is a leaf. Otherwise, the children of a node represent all ways of extending the partial solution from size k to size k + 1. Note: The number of children of a node is equal to the number of candidates for the next position in the list. Tip Make sure that all possibilities are covered. CS 231 Module 7 Backtracking 9 / 48

Template to find all solutions

def all_sols(instance):
    return extend([], instance, [])

def extend(part_sol, part_inst, sols_so_far):
    if is_solution(part_sol, part_inst):
        return sols_so_far + [part_sol]
    elif part_inst is None:
        return sols_so_far
    else:
        for option in all_options(part_sol, part_inst):
            sol = form_sol(option, part_sol, part_inst)
            inst = form_inst(option, part_sol, part_inst)
            sols_so_far = extend(sol, inst, sols_so_far)
        return sols_so_far

CS 231 Module 7 Backtracking 10 / 48

Observations on the template Helper functions used by the template: is_solution determines if the partial solution is a complete solution; all_options generates all ways of forming a child of a node; form_sol extends a partial solution to a bigger partial solution; form_inst modifies an instance to fit the bigger partial solution. Coding with the template: The first function is a non-recursive wrapper function that has the instance as an input. The instance might be an object with multiple fields, or multiple parameters might be used. In the base cases, a leaf is reached and if it contains a solution, it is added to the collection of solutions so far. The function returns a list of all solutions. CS 231 Module 7 Backtracking 11 / 48

Using backtracking to generate sequences Example: Generate all sequences of size 3 over {1,2,3,4}. 1. Specify a partial solution. A sequence of length 0, 1, 2, or 3 2. Determine how the children of a node are formed. For any numbers not already in the solution, create a child by adding one number to the partial solution. 3. Choose when to backtrack. When the sequence has length three Example: The node with the partial solution 2?? has as its children 21?, 23?, and 24?. CS 231 Module 7 Backtracking 12 / 48

Forming the helper functions is_solution determines if the length of the partial solution is three; all_options generates all values not yet in the partial solution; form_sol extends a partial solution to a bigger partial solution by adding a new value; form_inst modifies an instance by removing a value from the list of choices. In example code available on the website, notice the use of copy.copy to generate a new list. CS 231 Module 7 Backtracking 13 / 48
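To make the recipe concrete, here is a sketch of how those helpers could be filled in to generate all sequences of size 3 over {1,2,3,4}. The names follow the slides (with underscores added); this is illustrative code, not the example code from the website, and it uses a list comprehension in place of copy.copy to build the smaller instance.

```python
def all_sols(instance):
    # Non-recursive wrapper: start with an empty partial solution.
    return extend([], instance, [])

def extend(part_sol, part_inst, sols_so_far):
    if is_solution(part_sol, part_inst):
        return sols_so_far + [part_sol]
    for option in all_options(part_sol, part_inst):
        sol = form_sol(option, part_sol, part_inst)
        inst = form_inst(option, part_sol, part_inst)
        sols_so_far = extend(sol, inst, sols_so_far)
    return sols_so_far

def is_solution(part_sol, part_inst):
    return len(part_sol) == 3          # backtrack at length three

def all_options(part_sol, part_inst):
    return part_inst                   # values not yet used

def form_sol(option, part_sol, part_inst):
    return part_sol + [option]         # extend the sequence

def form_inst(option, part_sol, part_inst):
    # Remove the chosen value from the remaining choices.
    return [x for x in part_inst if x != option]

sequences = all_sols([1, 2, 3, 4])     # 4 * 3 * 2 = 24 sequences
```

The tree explored is exactly the search tree shown earlier: each call to extend visits one node, and the loop over all_options visits its children in order.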

Template to find one solution

def one_sol(instance):
    return search([], instance)

def search(part_sol, part_inst):
    if is_solution(part_sol, part_inst):
        return part_sol
    elif part_inst is None:
        return None
    else:
        for option in all_options(part_sol, part_inst):
            sol = form_sol(option, part_sol, part_inst)
            inst = form_inst(option, part_sol, part_inst)
            possible_sol = search(sol, inst)
            if possible_sol is not None:
                return possible_sol
        return None

CS 231 Module 7 Backtracking 14 / 48

Observations on the template As before, use a non-recursive wrapper function. Here only one solution is returned. When a base case is reached, return a solution or None. In the general case, stop searching once a solution is found. The entire tree need not be searched. CS 231 Module 7 Backtracking 15 / 48

Using backtracking to find one sequence Example: Generate a sequence in descending order. Example code is available on the website. Is there a better way to find a sequence in descending order? CS 231 Module 7 Backtracking 16 / 48
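One way such code might look (an illustrative sketch, not the website's example code): prune any extension that breaks the descending order, and return as soon as the first complete sequence is found.

```python
def one_sol(choices):
    return search([], choices)

def search(part_sol, part_inst):
    if len(part_sol) == 3:             # a complete descending sequence
        return part_sol
    for option in part_inst:
        # Prune: only extend if the sequence stays strictly descending.
        if part_sol and option >= part_sol[-1]:
            continue
        possible_sol = search(part_sol + [option],
                              [x for x in part_inst if x != option])
        if possible_sol is not None:
            return possible_sol        # early exit: stop at one solution
    return None

# one_sol([1, 2, 3, 4]) returns [3, 2, 1]
```

(A better way, of course, is to sort the values in decreasing order and take the first three; the point of the sketch is the early exit once one solution is found.)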

Integer knapsack Input: A set of n types of objects, where object i has weight w_i and value v_i, and a weight bound W Output: A list of integers c_1, ..., c_n such that Σ_{i=1}^n c_i·w_i ≤ W and Σ_{i=1}^n c_i·v_i is maximized

Example: n = 3, W = 10

i   w_i   v_i
1   3     4
2   4     5
3   5     6

CS 231 Module 7 Backtracking 17 / 48

Using backtracking for Integer Knapsack 1. Specify a partial solution. A sequence of objects with possible repeats 2. Determine how the children of a node are formed. For any object that fits in the available space, create a child by adding one new object 3. Choose when to backtrack. When there is not enough space for any object to be added Implementation options: Check if an object fits before adding it. Alternatively, add objects that might exceed the capacity, and instead check the total weight of a partial solution when processing it. Key idea: Various choices can be made on exactly how to define a partial solution and how to process it. CS 231 Module 7 Backtracking 18 / 48

Implicit backtracking tree for Integer Knapsack [Tree diagram omitted: children add one object of each type that still fits, so the same multiset of objects can appear along several branches.] Key idea: We will come back to the idea of how to fix the problem of having some solutions appear more than once. CS 231 Module 7 Backtracking 19 / 48

Using backtracking for k-colouring A colour for a vertex is legal if none of its neighbours has been assigned that colour. First attempt: 1. Specify a partial solution. An assignment of up to k colours to a subset of the vertices 2. Determine how the children of a node are formed. For an uncoloured vertex, assign one of the k colours to it 3. Choose when to backtrack. When all vertices have been coloured Refined version: 1. Specify a partial solution. An assignment of up to k colours to a subset of the vertices such that each colour is legal 2. Determine how the children of a node are formed. For an uncoloured vertex, assign one of the k colours that is legal, if any 3. Choose when to backtrack. When there is no possible legal colour for any uncoloured vertex Key idea: Choosing the second option means we can stop early if there is a vertex that can't be coloured. CS 231 Module 7 Backtracking 20 / 48

Fine-tuning k-colouring When forming the child of a node, how do we choose which vertex to colour and which colour to assign? Key idea: If we don t pay attention to our choices, we may generate the same solution more than once. Solution: Order the vertices so that in the child of the root, the first vertex is coloured, in the grandchildren the first two vertices are coloured, and so on. Key idea: If we are very careful in our choices of ordering of vertices and colours, we might increase our chances of finding a solution sooner. CS 231 Module 7 Backtracking 21 / 48
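A minimal sketch of the refined recipe with ordered vertices (illustrative code; graphs are represented as adjacency dictionaries, and the function names are my own):

```python
def colour(graph, k):
    """Return a legal k-colouring of graph as a dict, or None."""
    vertices = sorted(graph)           # fixed order avoids duplicates
    return extend_colouring({}, vertices, graph, k)

def extend_colouring(colours, vertices, graph, k):
    if len(colours) == len(vertices):
        return colours                 # every vertex is coloured
    v = vertices[len(colours)]         # next uncoloured vertex
    for c in range(k):
        # c is legal if no coloured neighbour of v already has it.
        if all(colours.get(u) != c for u in graph[v]):
            result = extend_colouring({**colours, v: c},
                                      vertices, graph, k)
            if result is not None:
                return result
    return None                        # no legal colour: backtrack

# A 4-cycle is 2-colourable; a triangle is not.
square = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [0, 2]}
```

Colouring the vertices in a fixed order means each partial colouring appears at exactly one node of the tree.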

Using backtracking for optimal colouring Goal: Find a colouring using the smallest number of colours. Note: Search time can be improved by avoiding solutions that differ only by the colour given to a set of vertices. Observations: The number of children of a node can be as big as the number of colours used so far plus 1 (a new colour). The algorithm can't stop when a colouring is found, as a better one might exist. Backtracking only occurs when all vertices have been coloured (since colouring a vertex is always possible using a new colour). Key idea: If we've already found a k-colouring, stop searching on a branch in which a (k + 1)st colour is needed. CS 231 Module 7 Backtracking 22 / 48

Template to keep track of the best so far

def find_best(part_sol, part_inst, best_so_far):
    if is_solution(part_sol, part_inst):
        if best_so_far is None:
            return part_sol
        elif sol_value(part_sol) > sol_value(best_so_far):
            return part_sol
        else:
            return best_so_far
    else:
        best = best_so_far
        for option in all_options(part_sol, part_inst):
            sol = form_sol(option, part_sol, part_inst)
            inst = form_inst(option, part_sol, part_inst)
            best = find_best(sol, inst, best)
        return best

CS 231 Module 7 Backtracking 23 / 48

Observations on the template A non-recursive wrapper function can be used here too. Helper function sol_value computes the value of a solution. There is no early exit when a solution is found. CS 231 Module 7 Backtracking 24 / 48

Ideas for refining backtracking Ideas seen: Pruning: Stop search as soon as it is clear that the partial solution cannot lead to a solution. Exploiting symmetry: Avoid revisiting identical partial solutions. Comparing to best-so-far: Stop search when a partial solution cannot lead to a solution better than the best found so far. A new idea: For a maximization (minimization) problem, use a bounding function to determine the maximum (minimum) value possible in the subset associated with a partial solution. If the value of the function at a node is smaller (bigger) than the best value calculated so far, stop search from that node and backtrack. CS 231 Module 7 Backtracking 25 / 48

Branch-and-bound Ideas: At each node, compute a bound on the value of any children. If this bound is worse than the best solution we have so far, don't search farther from this node. Recipe for branch-and-bound: 1. Determine what to store at each node. 2. Decide how to generate the children of a node. 3. Specify what global information should be stored and updated. 4. Choose a bounding function. CS 231 Module 7 Branch-and-bound 26 / 48

Template for branch-and-bound Add the following extra case:

    elif bound(part_sol, part_inst) < sol_value(best_so_far):
        return best_so_far

A non-recursive wrapper function can be used here too. Helper function bound computes a bound on the value obtainable from a partial solution and instance. In comparing bounds, < is used for a maximization problem and > is used for a minimization problem. CS 231 Module 7 Branch-and-bound 27 / 48

Exploiting symmetry for knapsack To avoid revisiting identical partial solutions, place an order on the items and never add an item that comes earlier in the ordering than the item just added. 1. Specify a partial solution. A sequence of objects ordered by object number, with possible repeats 2. Determine how the children of a node are formed. For any objects with numbers at least that of the last item in the sequence, create a child by adding one in the order of item number. 3. Choose when to backtrack. When there is not enough space for the object being added CS 231 Module 7 Branch-and-bound 28 / 48

The refined implicit tree [Tree diagram omitted: with the items ordered, each combination of objects now appears exactly once.] CS 231 Module 7 Branch-and-bound 29 / 48

Choosing an order for items To increase the chances of finding the best solution early, order items so that v_i/w_i decreases as i increases. Example: n = 3, W = 10

i   w_i   v_i   v_i/w_i
1   3     4     4/3
2   4     5     5/4
3   5     6     6/5

CS 231 Module 7 Branch-and-bound 30 / 48

Using branch-and-bound for Knapsack 1. Determine what to store at each node. A partial solution, its weight, and its value 2. Decide how to generate the children of a node. Each child is formed by adding a different object, if possible 3. Specify what global information should be stored and updated. Keep track of the best complete solution so far 4. Choose a bounding function. Take the value of the solution plus v_k/w_k times the remaining weight, where the last item added was of type k Note: If only items of types 1 through k have been taken at a given node, then we know that no child of the node can do better than the value of the bounding function. CS 231 Module 7 Branch-and-bound 31 / 48
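Combining the ordering, the best-so-far tracking, and this bounding function gives a compact branch-and-bound sketch for integer knapsack (illustrative code; items are assumed to be given in decreasing order of v_i/w_i):

```python
def knapsack(weights, values, W):
    """Best total value using any number of copies of each item type."""
    best = [0]                          # best complete value so far

    def search(i, weight_left, value):
        if value > best[0]:
            best[0] = value
        if i == len(weights):
            return
        # Bound: fill all remaining capacity at item i's ratio.
        bound = value + weight_left * values[i] / weights[i]
        if bound <= best[0]:
            return                      # prune: no child can do better
        if weights[i] <= weight_left:   # take another copy of item i
            search(i, weight_left - weights[i], value + values[i])
        search(i + 1, weight_left, value)   # move on to item i + 1

    search(0, W, 0)
    return best[0]

# Slides' example: W = 10, items (w, v) = (3, 4), (4, 5), (5, 6)
```

On the slides' example the best value is 13 (two objects of type 1 and one of type 2, with total weight 10).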

Branch-and-bound tree for Knapsack [Tree diagram: each node shows the value of a partial solution together with its bound, e.g. values 4, 5, 6 with bounds 13 1/3, 12 1/2, and 12 near the root; once a solution of value 13 is found, branches whose bound does not exceed 13 are pruned.] CS 231 Module 7 Branch-and-bound 32 / 48

Using a search tree for Independent Set Backtracking ideas: 1. Specify a partial solution. A set of vertices that form an independent set 2. Determine how the children of a node are formed. Add another vertex that has no edges to any of the vertices in the set so far 3. Choose when to backtrack. When no more vertices can be added Refinements: Can we prune useless branches? Yes, we did so by only considering independent sets instead of arbitrary sets of vertices. Is each solution explored only once? Not as currently defined. Can we stop early if a better solution was already found? Not without extra calculations. CS 231 Module 7 Branch-and-bound 33 / 48

Using a search tree for Vertex Cover Backtracking ideas: 1. Specify a partial solution. A subgraph of the original graph and a set of vertices no longer in the graph that cover all edges that have been deleted 2. Determine how the children of a node are formed. For a vertex v in the subgraph, create one child in which v is added to the vertex cover (and the graph is formed by removing v and all incident edges) and create one child in which all neighbours of v are added to the vertex cover (and the graph is formed by removing v, all its neighbours, and all incident edges) 3. Choose when to backtrack. When all edges have been covered CS 231 Module 7 Branch-and-bound 34 / 48

Refining Vertex Cover Refinements: Can we prune useless branches? Not as currently defined. Is each solution explored only once? Yes, as we always partition the remaining options. Can we stop early if a better solution was already found? Yes, we can keep track of the best solution so far and stop if the partial solution is bigger than the best solution so far. CS 231 Module 7 Branch-and-bound 35 / 48

Partition Input: A group of n objects, where object i has weight w_i Output: A partition of the objects into groups A and B with minimum difference in the sum of the weights of each part CS 231 Module 7 Branch-and-bound 36 / 48

Using branch-and-bound for Partition Recipe for branch-and-bound: 1. Determine what to store at each node. An assignment of a subset of the objects to groups A and B 2. Decide how to generate the children of a node. Each child is formed by adding an unassigned object to either A or B 3. Specify what global information should be stored and updated. Keep track of the difference between sums of weights in the best complete solution so far. 4. Choose a bounding function. Suppose the sum of the weights of all the unassigned objects (x) is less than the difference between the sums of the weights in the two parts (y). Then we use y − x as our value. Otherwise, we use 0 as our value. Note: No bigger change in the difference is possible than by putting all the remaining items in the less-full part of the partition. CS 231 Module 7 Branch-and-bound 37 / 48
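A sketch of this recipe (illustrative code; the bounding function is a lower bound on the final difference, and a branch is pruned when the bound is at least the best difference found so far):

```python
def partition(weights):
    """Smallest achievable difference between the two groups' sums."""
    best = [sum(weights)]               # difference if all go in group A

    def search(i, a, b):
        if i == len(weights):
            best[0] = min(best[0], abs(a - b))
            return
        x = sum(weights[i:])            # total weight still unassigned
        y = abs(a - b)                  # current difference
        bound = y - x if x < y else 0   # difference cannot drop below this
        if bound >= best[0]:
            return                      # prune: cannot beat best so far
        w = weights[i]
        search(i + 1, a + w, b)         # put object i in group A
        search(i + 1, a, b + w)         # put object i in group B

    search(0, 0, 0)
    return best[0]
```

Because Partition is a minimization problem, the pruning comparison is ≥, the reverse of the knapsack case.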

Exploiting nice properties Idea: Develop fast algorithms under nice conditions, characterized as one or more parameters of the problem or the instance. Examples: Vertex cover when the size of the vertex cover is 1. The parameter is the size of the vertex cover. Vertex cover when the maximum degree of any vertex in G is 1. The parameter is the maximum degree of any vertex in G. Note: The parameter should not limit the size of the instance. Not allowed: Vertex cover when the size of G is constant. The parameter is the number of vertices in G. CS 231 Module 7 Fixed-parameter algorithms 38 / 48

Finding parameterized algorithms Can we find an algorithm with running time that depends on both the size of G and the parameter? Aim for the part dependent on the size of G to be polynomial in the size of G. Allow the part dependent on the parameter to be exponential (or worse), as this part will be small when the parameter is small. A fixed-parameter tractable (FPT) algorithm runs in time O(g(k) · n^O(1)), where k is a parameter and g is any function. Caution: The function g must be a function of k only, and not a function of n. CS 231 Module 7 Fixed-parameter algorithms 39 / 48

Using branching to find a fixed-parameter algorithm Branching is one of the common paradigms used for fixed-parameter algorithms. Idea: Use the parameter to bound the size of the search tree explored in the algorithm. To obtain a fixed-parameter tractable algorithm: Ensure that exploring the bounded tree is sufficient. Show that if no solution is found in the tree, no solution exists. Ensure that the cost of creating and exploring the tree is in O(g(k) · n^O(1)), where k is a parameter and g is any function. CS 231 Module 7 Fixed-parameter algorithms 40 / 48

Calculating the cost of creating and exploring the tree Suppose we have a search tree such that: The height of the tree is at most h. The number of children of each node is at most c. The cost of processing a node is at most p. The total number of nodes in the tree will be O(c^h), as discussed in the math session, and hence the total cost will be in O(c^h · p). One way to obtain total time in O(g(k) · n^O(1)): Ensure h ∈ O(k). Ensure c ∈ O(1). Ensure p ∈ O(f(k) · n^O(1)). Total cost is in O(c^k · f(k) · n^O(1)), or O(g(k) · n^O(1)), for a function g that is only a function of k, and not of n. CS 231 Module 7 Fixed-parameter algorithms 41 / 48

FPT algorithm for Vertex Cover Form a search tree that stores a partial vertex cover and a graph at each node, with the root storing the empty set and G. Generating children from a node storing (P, H): Choose an edge {u,v} that has not yet been covered. Create two children as follows: P ∪ {u} and the graph formed from H by removing u and all edges incident on u; P ∪ {v} and the graph formed from H by removing v and all edges incident on v. Number of nodes in the tree: O(2^k), since each node has two children and there are only k levels (a vertex is added to the vertex cover at each level). Cost of processing a node: polynomial in the number of vertices in the graph. Total cost of O(2^k · n^O(1)) is in FPT. CS 231 Module 7 Fixed-parameter algorithms 42 / 48
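The branching step can be sketched as follows (illustrative code; the graph is given as a list of edge pairs, and removing a vertex is done by filtering out its incident edges):

```python
def vertex_cover(edges, k):
    """Return a vertex cover of size at most k, or None if none exists."""
    if not edges:
        return set()                    # all edges covered
    if k == 0:
        return None                     # an edge remains but no budget
    u, v = edges[0]                     # branch on an uncovered edge
    for w in (u, v):                    # any cover contains u or v
        remaining = [e for e in edges if w not in e]
        cover = vertex_cover(remaining, k - 1)
        if cover is not None:
            return cover | {w}
    return None

# Two children per node and at most k levels: O(2^k) nodes in all.
```

If the search returns None, no cover of size at most k exists, because every cover must include an endpoint of the edge branched on at each node.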

Randomized algorithms All the algorithms we have seen so far are deterministic, not randomized. Characteristics of a randomized algorithm: The algorithm can flip coins to decide the course of action. The amount of randomness used is viewed as another resource, like time and space. The same algorithm on the same input may lead to different running times. The same algorithm on the same input may lead to different outputs. Depending on the type of uncertainty we allow, we sacrifice either the guarantee on worst-case running time (considered in this module) or the guarantee on correctness (considered in the next module). CS 231 Module 7 Randomized algorithms 43 / 48

Compromising on speed A Las Vegas algorithm is guaranteed to give the correct answer in expected polynomial time. The term average case is used for deterministic algorithms, based on a probability distribution on the instances. A probability distribution is usually based on assumptions about distributions on instances. The term expected case is used for randomized algorithms, based on the average over possible executions on a single instance. There is no dependence on assumptions about distributions on instances. CS 231 Module 7 Randomized algorithms 44 / 48

Using randomness to sidestep worst case Key idea: Because the exact steps of the algorithm are not fixed, you can be lucky and avoid worst-case behaviour. Example: In message-passing on a network of computers, if all computers are sending messages to other computers, the same link might be needed for many of the messages. With high probability, sending messages to random locations is much faster. Algorithm: Send messages to random locations and then from the random locations to the desired destinations. CS 231 Module 7 Randomized algorithms 45 / 48

Using randomness for sorting and selection Use randomness to find a pivot. Sorting: Compare each element to the pivot. Recursively sort elements smaller than the pivot. Recursively sort elements greater than the pivot. Selection: Compare each element to the pivot. Recursively select from the correct piece. CS 231 Module 7 Randomized algorithms 46 / 48
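Selection with a random pivot (quickselect) can be sketched as follows; this illustrative version partitions into three pieces and recurses only on the piece containing the answer.

```python
import random

def quickselect(items, k):
    """Return the k-th smallest element of items (k = 0 is the minimum)."""
    pivot = random.choice(items)        # uniformly random pivot
    smaller = [x for x in items if x < pivot]
    equal = [x for x in items if x == pivot]
    larger = [x for x in items if x > pivot]
    if k < len(smaller):
        return quickselect(smaller, k)  # answer is below the pivot
    if k < len(smaller) + len(equal):
        return pivot                    # answer is the pivot itself
    return quickselect(larger, k - len(smaller) - len(equal))
```

Each level costs linear time, and a good pivot shrinks the relevant piece by a constant factor, which is what gives the expected Θ(n) bound discussed on the next slide.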

Finding a good pivot Various options: Use whatever pivot is found. Use only a good enough pivot. Key ideas: The base case is reached after the problem size has been reduced by a constant fraction (at least 1/4 of the elements removed) Θ(log n) times. The probability of getting a good enough pivot is at least 1/2. Expected running times are Θ(n log n) for sorting and Θ(n) for selection. CS 231 Module 7 Randomized algorithms 47 / 48

Module summary Topics covered: Search spaces Search trees Backtracking The implicit backtracking tree Ideas for refining backtracking Branch-and-bound Fixed-parameter algorithms Randomized algorithms CS 231 Module 7 Randomized algorithms 48 / 48