Greedy algorithms


Page 1

How to design algorithms

Input. Goal.

We can:
1. Try to develop a completely new algorithm.
2. Modify some existing algorithm.

If we try the second possibility we can:
1. Modify a specific algorithm.
2. Try an algorithm from a general class of algorithms.

Examples of algorithms to use in case 1 could be:
Graph algorithms
Flow algorithms
Linear programming (the simplex method)

Page 2

We will describe three classes of algorithms that can be used in case 2:
Greedy algorithms
Divide & Conquer algorithms
Dynamic Programming algorithms

But first, an example of how we can modify an existing algorithm. Remember DFS:

    Set R = ∅
    For all v ∈ V
        Set vis(v) = 0
    End for
    DFS(s)

    DFS(u):
        Set vis(u) = 1
        Add u to R
        For each v such that v is adjacent to u
            If vis(v) = 0
                DFS(v)
            End if
        End for

The algorithm can be used both for directed and undirected graphs.
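The DFS pseudocode above can be sketched in Python; this is a minimal illustration (the graph representation as an adjacency-list dict is an assumption, not part of the notes), where the visited set plays the role of both vis and R:

```python
def dfs_reachable(adj, s):
    """Return the set R of vertices reachable from s, mirroring the
    pseudocode above: vis(v) = 0/1 becomes membership in `visited`."""
    visited = set()

    def dfs(u):
        visited.add(u)                 # Set vis(u) = 1; add u to R
        for v in adj.get(u, []):       # for each v adjacent to u
            if v not in visited:       # if vis(v) = 0
                dfs(v)

    dfs(s)
    return visited

# Works for directed or undirected graphs given as adjacency lists.
adj = {0: [1, 2], 1: [3], 2: [], 3: []}
print(sorted(dfs_reachable(adj, 0)))  # [0, 1, 2, 3]
```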

Page 3

DIRECTED CYCLE
Input: A directed graph G
Goal: Does G contain a directed cycle?

Modified DFS:

    For all v ∈ V
        Set vis(v) = 0
    End for
    While there is a v such that vis(v) = 0
        DFS_Mod(v)
    End while
    Return "No"

    DFS_Mod(u):
        Set vis(u) = 0.5
        For each v such that v is adjacent to u
            If vis(v) = 0.5
                Return "Yes"
            Else if vis(v) = 0
                DFS_Mod(v)
            End if
        End for
        Set vis(u) = 1

We can prove that this algorithm stops and correctly returns "Yes" or "No".
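A possible Python rendering of this three-state DFS (the dict-based graph representation is an assumption made for the sketch):

```python
def has_directed_cycle(adj):
    """Three-state DFS from the notes: 0 = unvisited, 0.5 = on the
    current DFS path, 1 = fully explored.  A directed cycle exists
    exactly when we reach a vertex whose state is still 0.5."""
    state = {v: 0 for v in adj}

    def dfs(u):
        state[u] = 0.5
        for v in adj.get(u, []):
            if state[v] == 0.5:
                return True            # back edge: directed cycle found
            if state[v] == 0 and dfs(v):
                return True
        state[u] = 1
        return False

    return any(state[v] == 0 and dfs(v) for v in adj)

print(has_directed_cycle({0: [1], 1: [2], 2: [0]}))  # True
print(has_directed_cycle({0: [1], 1: [2], 2: []}))   # False
```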

Page 4

Greedy algorithms

In this lecture we will describe a general template for finding algorithms. It works in a surprisingly large number of cases: the method of greedy algorithms.

We will study a special type of problem in which we want to make choices. Let's say that we make choices c[1], c[2], c[3], ... At each step in the algorithm we have a set of possible choices. When the algorithm ends we want to have a selection c[1], c[2], ..., c[k] that in some sense is correct. We assume that to each selection there is an associated cost A, and that our goal is to find a correct selection with as small a cost as possible. Furthermore, we assume that once the choice c[1] is made, the remaining situation is a problem of the same type. (This notion seems hard to define in a precise way.) Then there is a chance that a so-called greedy algorithm will work.

A greedy algorithm is an algorithm that makes its choices following a very simple (greedy) strategy. What this means depends on the situation. Usually there are two sorts of greedy strategies:
1. We can make the choice c[1] so that the cost (locally) increases as little as possible.
2. We can make the choice c[1] so that the remaining problem is as "good" as possible.
(The first case is what we mean by a greedy algorithm in the strictest sense. But lots of interesting problems are covered by the second, vaguer case.)

Page 5

The greedy algorithm runs like this: Assume that we have made choices c[1], c[2], ..., c[m]. If this selection is correct we stop. Otherwise, make the next choice following your greedy strategy.

The general idea with a greedy algorithm is that you don't have to spend a long time making your choices. You don't have to look ahead and consider the consequences. Greedy algorithms usually have low time-complexity.

The philosophy behind greedy algorithms: You don't have to look ahead when making choices.

Ex: We have the numbers 10, 5 and 1. We are given an integer N. We want to write N as a sum of the numbers 10, 5, 1. (We can use a number more than once.) That is, we want to find numbers a, b, c such that N = a·10 + b·5 + c. Furthermore, we want to use as few terms as possible; that is, a + b + c should be as small as possible.

The solution is obvious: As long as N is greater than 9 we subtract 10 to get a new number N and repeat. When N is smaller than 10 we subtract 5 if possible. Then we subtract 1 until we reach 0. So, for instance, N = 37 gives a = 3, b = 1, c = 2. Obviously, we cannot do better than this. This is a greedy algorithm.
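The repeated-subtraction strategy above could be sketched like this (the function name and the use of divmod are choices made for the sketch, not from the notes; the ValueError branch anticipates the failure cases discussed on the next page):

```python
def greedy_sum(n, denominations=(10, 5, 1)):
    """Greedy representation of n: repeatedly subtract the largest
    denomination that still fits.  Returns {denomination: count}."""
    counts = {}
    for d in sorted(denominations, reverse=True):
        counts[d], n = divmod(n, d)   # take as many d's as possible
    if n != 0:
        # Happens e.g. for n = 7 with denominations (6, 5, 2).
        raise ValueError("greedy strategy got stuck")
    return counts

print(greedy_sum(37))  # {10: 3, 5: 1, 1: 2}
```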

Page 6

When greedy algorithms fail

A greedy algorithm can fail for two reasons:
1. It can fail to give us an optimal solution.
2. It can fail to give us a correct solution.

Ex: We take the same problem as before, but instead of 10, 5, 1 we use the numbers 6, 5, 1. If we have N = 10, the greedy algorithm gives us 10 = 6 + 1 + 1 + 1 + 1. But the best solution is 10 = 5 + 5.

Ex: The same problem but with the numbers 6, 5, 2. If we take N = 7, the greedy algorithm subtracts 6 from 7 and leaves us with 1. Then the algorithm fails to reach the sum 7. The correct solution is 7 = 5 + 2.

But when do greedy algorithms work? We study some examples.

Ex: We want to drive along a road. We represent the road as a coordinate axis. We start at x = 0 and want to go to a city at x[n]. Along the road there are other cities x[1], x[2], ..., x[n-1]. A full gas tank contains gas for A kilometers. We can fill the tank in the cities but nowhere else. We want to reach x[n] while filling the tank as few times as possible. How do we do that?

Page 7

We might think that we should use some complicated strategy, but that is not so. In fact, a greedy algorithm works: If we are at x[i] and have enough gas left to reach x[i+1], we do not fill gas. Otherwise, we get a full tank at x[i]. If it is at all possible to get to x[n], this algorithm will take us there, filling gas as few times as possible. The time-complexity is O(n).

    Set L = ∅
    Set i = 0
    Set T = A
    While TRUE do
        While i < n and x[i+1] - x[i] ≤ T do
            Set T = T - (x[i+1] - x[i])
            Set i = i + 1
        End while
        If i = n then Halt
        If x[i+1] - x[i] > A then Return "Impossible"
        Set T = A
        Put i at the end of L
    End while

We will look at some more examples.

Page 8

Activity planning

Let us assume that we have n activities a_1, a_2, ..., a_n with corresponding time intervals [s_i, f_i). No intervals are allowed to overlap each other. (The intervals are half-open. Observe that [2, 4) and [4, 5) do not overlap.) How do we choose a maximal number of activities that do not overlap each other?

Greedy algorithm for activity planning

It turns out that we should choose activities by finishing times. This algorithm chooses a set A of activities.

    Sort the activities such that f_1 ≤ f_2 ≤ ... ≤ f_n.
    (1) A ← {a_1}
    (2) i ← 1
    (3) for j ← 2 to n
    (4)     if s_j ≥ f_i
    (5)         A ← A ∪ {a_j}
    (6)         i ← j
    (7) return A

The time-complexity is O(n) after sorting, so O(n log n) in total.
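The algorithm above can be sketched in Python; the activities are assumed to be given as (start, finish) pairs, an input format chosen for the sketch:

```python
def select_activities(intervals):
    """Greedy interval scheduling: sort by finishing time, then take
    each activity whose start is at or after the last chosen finish."""
    chosen = []
    last_finish = float("-inf")
    for s, f in sorted(intervals, key=lambda sf: sf[1]):
        if s >= last_finish:     # half-open intervals: [2,4) and [4,5) fit
            chosen.append((s, f))
            last_finish = f
    return chosen

acts = [(1, 4), (3, 5), (0, 6), (5, 7), (3, 9), (5, 9), (6, 10)]
print(select_activities(acts))  # [(1, 4), (5, 7)]
```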

Page 9

Jobs with deadlines

Let us assume that we have n jobs which must be done by one person. It takes time t_i to do job i. We also know that job i must be finished by time d_i at the latest. We want to plan start and finish times s(i), f(i) for the jobs such that:

f(i) = s(i) + t_i for all i.
No two intervals [s(i), f(i)], [s(j), f(j)] overlap each other.
f(i) ≤ d_i for all i.

Page 10

A first problem is to decide if this is possible, and what the plan then looks like. If the planning is impossible, it must be because f(i) > d_i for some i. We can then try to minimize the "failure". There are several ways of measuring the failure. A natural idea is the following: Set L = max_i (f(i) - d_i), the maximum lateness. Then try to get L as small as possible.

Page 11

The algorithm is really simple. Sort the jobs so that d_1 ≤ d_2 ≤ ... ≤ d_n. We assume that d_i > 0 for all i and that the first job starts at time 0.

    (1) s(1) ← 0, f(1) ← t_1
    (2) for i ← 2 to n
    (3)     s(i) ← f(i-1), f(i) ← s(i) + t_i
    (4) return s, f
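This earliest-deadline-first rule could be sketched as follows; the (t_i, d_i) pair representation and the returned maximum lateness L are choices made for the illustration:

```python
def schedule_by_deadline(jobs):
    """jobs = list of (t_i, d_i).  Run the jobs back to back from
    time 0 in increasing deadline order.  Returns the schedule as
    (s(i), f(i)) pairs together with the maximum lateness L."""
    schedule = []
    finish = 0
    L = float("-inf")
    for t, d in sorted(jobs, key=lambda jd: jd[1]):
        start, finish = finish, finish + t
        schedule.append((start, finish))
        L = max(L, finish - d)       # lateness f(i) - d_i (may be negative)
    return schedule, L

sched, L = schedule_by_deadline([(1, 2), (2, 4), (3, 6)])
print(sched, L)  # [(0, 1), (1, 3), (3, 6)] 0
```

A plan with L ≤ 0 meets every deadline, so the same function also answers the feasibility question from Page 9.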

Page 12

The Minimal Spanning Tree Problem

If G is a connected graph, then a spanning tree is a tree that contains all nodes in G.

Obs: If |V| = n and T ⊆ G is a tree, then T is spanning ⟺ T has n - 1 edges.

(Figure: a graph with edge weights.)

A minimal spanning tree (MST) is a spanning tree T such that the total weight w(T) = Σ_{e ∈ T} w(e) is minimal.

Page 13

The MST problem:
Input: A weighted connected graph G
Goal: An MST in G

(Figure: a weighted graph and one of its MSTs.)

Page 14

Kruskal's algorithm, a first form:

    Sort the edges such that w(e_1) ≤ w(e_2) ≤ ...
    Set A = ∅
    For each e_i in the sorted order
        If A ∪ {e_i} does not contain any cycle
            Set A = A ∪ {e_i}
        End if
    End for
    Return A

How do we decide the complexity? How do we know if a set of edges contains a cycle or not? We have to describe the algorithm in more detail.

Data structures for identifying cycles:
MakeSet(v) creates the set {v}. Complexity: O(1)
FindSet(v) finds the set containing v. Complexity: O(log |V|)
MakeUnion(u, v) forms the union of the sets containing u and v. Complexity: O(log |V|)

Page 15

    Kruskal(V, E, w)
    (1) A ← ∅
    (2) foreach v ∈ V
    (3)     MakeSet(v)
    (4) Sort E in increasing weight order
    (5) foreach (u, v) ∈ E (in the sorted order)
    (6)     if FindSet(u) ≠ FindSet(v)
    (7)         A ← A ∪ {(u, v)}
    (8)         MakeUnion(u, v)
    (9) return A

Complexity: O(|E| log |E|) (due to the sorting); all the FindSet and MakeUnion calls take O(|E| log |V|) time.
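A compact Python sketch of Kruskal's algorithm; the union-find here uses a plain parent array with path compression, which is one possible implementation of the MakeSet/FindSet/MakeUnion operations above (vertex numbering 0..n-1 and the (w, u, v) edge format are assumptions):

```python
def kruskal(n, edges):
    """Kruskal's algorithm with a simple union-find.
    n = number of vertices 0..n-1, edges = list of (w, u, v).
    Returns the MST edge set A as (u, v, w) triples."""
    parent = list(range(n))           # MakeSet(v) for every v

    def find(u):                      # FindSet with path compression
        while parent[u] != u:
            parent[u] = parent[parent[u]]
            u = parent[u]
        return u

    A = []
    for w, u, v in sorted(edges):     # increasing weight order
        ru, rv = find(u), find(v)
        if ru != rv:                  # different components: no cycle
            A.append((u, v, w))
            parent[ru] = rv           # MakeUnion(u, v)
    return A

edges = [(1, 0, 1), (2, 1, 2), (3, 0, 2), (4, 2, 3)]
print(kruskal(4, edges))  # [(0, 1, 1), (1, 2, 2), (2, 3, 4)]
```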

Page 16

Another, similar algorithm is Prim's algorithm:

    Prim(V, E, w, s)
    (1) key[v] ← ∞ for each v ∈ V
    (2) key[s] ← 0
    (3) Q ← MakeHeap(V, key)
    (4) π[s] ← Null
    (5) while Q ≠ ∅
    (6)     u ← HeapExtractMin(Q)
    (7)     foreach neighbor v of u
    (8)         if v ∈ Q and w(u, v) < key[v]
    (9)             π[v] ← u
    (10)            key[v] ← w(u, v)
    (11)            Reorder the heap at v

It can be shown that the complexity is O(|E| log |V|).
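A Python sketch of Prim's algorithm; since Python's heapq has no DecreaseKey (line 11 above), this variant pushes duplicate heap entries and skips stale ones, which keeps the same O(|E| log |V|) bound (the adjacency-dict format and weight-only return value are choices made for the sketch):

```python
import heapq

def prim_weight(adj, s=0):
    """Prim's algorithm with a binary heap.  adj[u] = list of (v, w)
    pairs.  Duplicate (key, vertex) entries replace DecreaseKey;
    a popped vertex already in the tree is a stale entry."""
    in_tree = set()
    total = 0
    heap = [(0, s)]                   # key[s] = 0
    while heap:
        key, u = heapq.heappop(heap)  # HeapExtractMin
        if u in in_tree:
            continue                  # stale entry, skip
        in_tree.add(u)
        total += key
        for v, w in adj[u]:
            if v not in in_tree:      # cheaper edge to v may exist later
                heapq.heappush(heap, (w, v))
    return total

adj = {0: [(1, 1), (2, 3)], 1: [(0, 1), (2, 2)],
       2: [(0, 3), (1, 2), (3, 4)], 3: [(2, 4)]}
print(prim_weight(adj))  # 7
```

On this graph the result (total weight 7) agrees with what Kruskal's algorithm finds, as it must, since both compute an MST.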