CMSC 351 - Introduction to Algorithms, Spring 2012, Lecture 16


CMSC 351 - Introduction to Algorithms, Spring 2012, Lecture 16
Instructor: MohammadTaghi Hajiaghayi
Scribe: Rajesh Chitnis

1 Introduction

In this lecture we will look at Greedy Algorithms and Dynamic Programming.

2 Preliminaries

So far we have seen divide-and-conquer algorithms, which recursively break a problem down into two or more subproblems of the same (or almost the same) type until these become simple enough to be solved directly. The solutions to the subproblems are then combined to give a solution to the original problem. Quicksort and Mergesort are algorithms of this type. We usually prove the correctness of such algorithms by induction and obtain their running times from the Master Theorem.

Backtracking (branch-and-bound): This is another approach for finding all (or some) solutions to a computational problem. Such algorithms incrementally build candidate solutions recursively, and abandon a partial candidate (backtracking) as soon as it is determined that it cannot be completed to a valid or optimal solution. The running time of these algorithms is often exponential (e.g., 2^n), and there are several techniques, especially in AI, to improve their running times.

Greedy algorithms: Unlike backtracking algorithms, which try every possible choice, a greedy algorithm makes a locally optimal choice at each step in the hope of finding a global optimum. In other words, a greedy algorithm never reconsiders its choices. This is why, for many problems, greedy algorithms fail to produce a solution at all, or produce a suboptimal one. Say we have 25-cent, 10-cent, and 4-cent coins and want to make change for 41 cents: greedy takes 25, 10, and 4, is left with 2 cents it cannot cover, and fails, while a backtracking algorithm finds 25 and four 4-cent coins. However, greedy algorithms are often very fast, unlike backtracking algorithms. Dijkstra's algorithm for shortest paths (which we will see later) is a greedy algorithm; greedy coloring is another application.
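The coin-change example can be sketched in code. The following Python is an illustration only (the function names are ours, not the lecture's); it shows greedy getting stuck on 41 cents while backtracking finds 25 plus four 4-cent coins.

```python
def greedy_change(coins, amount):
    """Greedy: repeatedly take the largest coin that still fits."""
    taken = []
    for c in sorted(coins, reverse=True):
        while amount >= c:
            amount -= c
            taken.append(c)
    return taken if amount == 0 else None  # None: greedy got stuck

def backtrack_change(coins, amount):
    """Backtracking: try each coin, recurse, abandon dead ends."""
    if amount == 0:
        return []
    for c in coins:
        if c <= amount:
            rest = backtrack_change(coins, amount - c)
            if rest is not None:
                return [c] + rest
    return None  # no combination of coins sums to amount

print(greedy_change([25, 10, 4], 41))     # greedy fails on this instance
print(backtrack_change([25, 10, 4], 41))  # backtracking succeeds
```

Note that the backtracking version may take exponential time in the worst case, exactly as described above.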

Dynamic Programming: This is a method for solving problems by breaking them down into simpler subproblems. The main idea is as follows: in general, to solve a problem we solve different parts of it (subproblems) and then combine the subproblem solutions into an overall solution. Often, many of these subproblems are really the same. This is where dynamic programming saves time compared to backtracking: unlike a backtracking algorithm, it seeks to solve each subproblem only once, reducing the total amount of computation. The proof of correctness for dynamic programming is often by induction.

3 A Special Knapsack Problem: Subset Sum

Suppose we are given a knapsack and we want to pack it fully, if that is possible. More precisely, we have the following problem: given an integer k and n items of different sizes, where the i-th item has integer size s_i, find a subset of the items whose sizes sum to exactly k, or determine that no such subset exists.

3.1 Greedy Algorithm

Always use the first (largest) item that still fits. This algorithm fails: take k = 13, n = 4, and sizes 6, 5, 4, 3. Greedy packs only 6 and 5, but we can pack 6, 4, and 3.

3.2 Backtracking Algorithm

In this case we do a brute-force (exhaustive) search.

Algorithm 1 BF(n, k, sol)
  if n = 0 and k = 0 then return true
  if n = 0 and k > 0 then return false
  if k < 0 then return false
  return BF(n-1, k, sol) OR BF(n-1, k - s_n, sol ∪ {s_n})

We make the initial call BF(n, k, ∅) to get the answer. Since we try both cases at each stage, the worst-case running time is Ω(2^n).
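A direct Python transcription of the brute-force search might look as follows (a sketch, not the handout's code; we pass the list of sizes explicitly and only return a yes/no answer, omitting the sol accumulator):

```python
def bf(sizes, n, k):
    """True iff some subset of sizes[0..n-1] sums to exactly k."""
    if n == 0 and k == 0:
        return True
    if n == 0 and k > 0:
        return False
    if k < 0:
        return False
    # Either leave item n out, or pack it and look for the remainder.
    return bf(sizes, n - 1, k) or bf(sizes, n - 1, k - sizes[n - 1])

sizes = [6, 5, 4, 3]
print(bf(sizes, len(sizes), 13))  # True: 6 + 4 + 3
print(bf(sizes, len(sizes), 16))  # False: no subset sums to 16
```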

3.3 Dynamic Programming

As in the backtracking algorithm, let DP(n, k) be true if and only if we can construct k using numbers from s_1, s_2, ..., s_n. The recursion for DP is exactly the same as for BF. However, we can improve the running time a lot with the observation that the total number of distinct subproblems is small: there are n possibilities for the first parameter and k possibilities for the second, so overall we have only nk different subproblems. Thus, if we store all known results in an n × k array, we compute each subproblem only once. If we are interested in finding the actual subset, we can add to each entry a flag (sol) that indicates whether the corresponding item was selected at that step; tracing sol back from the (n, k) entry recovers the subset.

Initially all entries flag[·, ·] are set to -1 (unknown); an entry is 1 if the answer is yes and 0 if it is no.

Algorithm 2 DP(n, k)
  if k < 0 then return 0
  if flag[n, k] ≠ -1 then return flag[n, k]
  if n = 0 then
    flag[n, k] = 1 if k = 0, and 0 otherwise
  else
    flag[n, k] = 0
    if DP(n-1, k) = 1 then flag[n, k] = 1; sol[n, k] = 0
    if DP(n-1, k - s_n) = 1 then flag[n, k] = 1; sol[n, k] = 1
  return flag[n, k]

4 Longest Common Subsequence (LCS)

A subsequence of a sequence is obtained by deleting some of its elements without changing the order of the remaining elements. For example, ADF is a subsequence of ABCDEF. The problem is to find the LCS of two given sequences (strings).
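For concreteness, the O(nm) dynamic program for LCS (whose recurrence is worked out next) can be sketched as a bottom-up table in Python; this is our illustration, not the lecture's code:

```python
def lcs_length(a, b):
    """Length of a longest common subsequence of strings a and b, in O(nm)."""
    n, m = len(a), len(b)
    # L[i][j] = length of an LCS of a[:i] and b[:j]
    L = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            if a[i - 1] == b[j - 1]:
                L[i][j] = L[i - 1][j - 1] + 1
            else:
                L[i][j] = max(L[i][j - 1], L[i - 1][j])
    return L[n][m]

print(lcs_length("ADF", "ABCDEF"))  # 3, since ADF itself is a subsequence
```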

Let the two sequences be a_1, a_2, ..., a_n and b_1, b_2, ..., b_m, and let LCS(i, j) be the length of an LCS of a_1, ..., a_i and b_1, ..., b_j. Then

  LCS(i, j) = 0                                if i = 0 or j = 0
  LCS(i, j) = LCS(i-1, j-1) + 1                if a_i = b_j
  LCS(i, j) = max(LCS(i, j-1), LCS(i-1, j))    if a_i ≠ b_j

Again, if we are not careful we get a brute-force backtracking algorithm with running time O(2^min{n,m}); with dynamic programming the running time is O(nm).

5 Independent Set in Trees

Given a tree, we want to find a largest independent set in it. Without loss of generality assume the tree is rooted at some vertex r, and let T_i be the subtree rooted at node i. For every vertex v, let C(v) denote the set of children of v. Then

  IS(i) = 1                                                               if i is a leaf
  IS(i) = max( Σ_{j ∈ C(i)} IS(j),  1 + Σ_{j ∈ C(i)} Σ_{k ∈ C(j)} IS(k) )  otherwise

(either we exclude i and take the best solutions for its children, or we include i, skip its children, and take the best solutions for its grandchildren). With dynamic programming the running time is O(n).

6 Bin Packing

We want to pack different-sized objects into fixed-size bins using as few bins as possible. Formally: let x_1, x_2, ..., x_n be real numbers, each between 0 and 1. Partition the numbers into as few subsets as possible such that the sum of the numbers in each subset is at most 1.

6.1 Greedy Algorithm

Put x_1 in the first bin, and then, for each i, put x_i into the first bin that has room for it, starting a new bin if no used bin has room. This algorithm is called First-Fit.

Theorem 1 First-Fit uses at most 2·OPT + 1 bins, where OPT is the minimum possible number of bins.

Proof: First-Fit can never leave two bins less than half full; otherwise the items in the later such bin could have been placed in the earlier one. Thus the number of bins used is at most twice the sum of the sizes of all items (rounded up), and the theorem follows since the best solution cannot use fewer bins than the sum of all the sizes. Writing b for the number of bins First-Fit uses and matching the bins in pairs (each pair holds total size more than 1), we get

  (b - 1)/2 ≤ ⌊b/2⌋ ≤ Σ_{i=1}^{n} x_i ≤ OPT,  and hence  b ≤ 2·OPT + 1.
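First-Fit itself is only a few lines of code. Here is a Python sketch (our own illustration, with a made-up instance):

```python
def first_fit(items):
    """First-Fit: put each item into the first bin with room, else open a new bin."""
    bins = []  # each bin is the list of numbers it holds
    for x in items:
        for b in bins:
            if sum(b) + x <= 1:
                b.append(x)
                break
        else:  # no existing bin had room
            bins.append([x])
    return bins

bins = first_fit([0.5, 0.7, 0.5, 0.2, 0.4, 0.2, 0.5, 0.1, 0.6])
print(len(bins))  # number of bins First-Fit opened
```

On this instance the items sum to 3.7, so any solution needs at least 4 bins (and 4 suffice, e.g. {0.7, 0.2, 0.1}, {0.5, 0.5}, {0.6, 0.4}, {0.5, 0.2}); First-Fit opens 5, within the 2·OPT + 1 = 9 guarantee of Theorem 1.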

There are algorithms for bin packing that give almost-optimal solutions, and there are also backtracking and dynamic programming algorithms for it, but they are more involved.

References

[1] Udi Manber. Introduction to Algorithms: A Creative Approach. Addison-Wesley, 1989.