CS 461, Lecture 8: Longest Common Subsequence and Greedy Algorithms
Jared Saia, University of New Mexico

Today's Outline

- Longest Common Subsequence
- Intro to Greedy Algorithms

Subsequence Definition

Assume we are given a sequence X = x_1, x_2, ..., x_m, and let Z = z_1, z_2, ..., z_k. Then Z is a subsequence of X if there exists a strictly increasing sequence i_1, i_2, ..., i_k of indices such that for all j = 1, 2, ..., k, x_{i_j} = z_j.

Example

Let X = A, B, C, B, A, B, D, C and Z = A, C, A, B, C. Then Z is a subsequence of X.
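The definition above can be checked with a single left-to-right scan; here is a minimal Python sketch (the function name is ours, not from the lecture):

```python
def is_subsequence(z, x):
    """Return True if z is a subsequence of x, i.e. z can be obtained
    from x by deleting elements without reordering the rest."""
    j = 0  # index of the next element of z still to be matched
    for xi in x:
        if j < len(z) and xi == z[j]:
            j += 1
    return j == len(z)

# The lecture's example: Z = A,C,A,B,C is a subsequence of X = A,B,C,B,A,B,D,C
print(is_subsequence("ACABC", "ABCBABDC"))  # True
```

Greedily matching each z_j against the earliest possible position in x is safe here: if any valid index sequence exists, the earliest-match one does too.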

Common Subsequence

Given two sequences X and Y, we say that Z is a common subsequence of X and Y if Z is a subsequence of X and Z is a subsequence of Y.

Example: let X = A, B, D, C, B, A, B, C and Y = A, D, B, C, D, B, A, B. Then Z = A, B, B, A, B is a common subsequence. Z is not a longest common subsequence (LCS) of X and Y, though, since the common subsequence Z' = A, B, C, B, A, B is longer. Q: Is Z' a longest common subsequence?

LCS Problem

We are given two sequences X = x_1, x_2, ..., x_m and Y = y_1, y_2, ..., y_n. Goal: find a maximum-length common subsequence of X and Y.

Brute Force

The brute-force approach is to enumerate all possible subsequences of X, check whether each one is a subsequence of Y, and keep track of the longest subsequence common to both X and Y. This is slow. Q: How many subsequences of X are there?

Terminology

Given a sequence X = x_1, x_2, ..., x_m, for i = 0, 1, ..., m, let X_i be the i-th prefix of X, i.e. X_i = x_1, x_2, ..., x_i. Example: if X = A, B, D, C, then X_0 is the empty sequence and X_3 = A, B, D.
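To answer the question above: each of the m elements of X is either kept or dropped, so X has 2^m subsequences, which is why enumeration is infeasible for large m. A brute-force sketch for small inputs (helper names are ours):

```python
from itertools import combinations

def lcs_brute_force(x, y):
    """Try every subsequence of x, longest first, and return the first one
    that is also a subsequence of y. Exponential: 2^len(x) candidates."""
    def is_subseq(z, s):
        it = iter(s)
        return all(c in it for c in z)  # consume s left to right
    for k in range(len(x), -1, -1):
        for idx in combinations(range(len(x)), k):
            z = [x[i] for i in idx]
            if is_subseq(z, y):
                return z
    return []

# The lecture's example pair; an LCS has length 6.
print(len(lcs_brute_force("ABDCBABC", "ADBCDBAB")))  # 6
```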

Optimal Substructure

Lemma 1: Let X = x_1, x_2, ..., x_m and Y = y_1, y_2, ..., y_n be sequences, and let Z = z_1, z_2, ..., z_k be any LCS of X and Y. Then:

- If x_m = y_n, then z_k = x_m = y_n and Z_{k-1} is an LCS of X_{m-1} and Y_{n-1}
- If x_m ≠ y_n, then z_k ≠ x_m implies that Z is an LCS of X_{m-1} and Y
- If x_m ≠ y_n, then z_k ≠ y_n implies that Z is an LCS of X and Y_{n-1}

In-Class Exercise

Prove each of the three statements in Lemma 1. Hint: use proof by contradiction.

Recursive Solution

Let X = x_1, x_2, ..., x_m and Y = y_1, y_2, ..., y_n be arbitrary sequences. Based on Lemma 1, there are two main possibilities for an LCS of X and Y:

- If x_m = y_n, LCS(X, Y) is LCS(X_{m-1}, Y_{n-1}) with x_m = y_n appended
- Otherwise, LCS(X, Y) is either LCS(X_{m-1}, Y) or LCS(X, Y_{n-1}), whichever is longer

Let c(i, j) be the length of an LCS of the sequences X_i and Y_j. Note that c(i, j) = 0 if i or j is 0. Thus we have:

- c(i, j) = 0 if i = 0 or j = 0
- c(i, j) = c(i-1, j-1) + 1 if i, j > 0 and x_i = y_j
- c(i, j) = max(c(i, j-1), c(i-1, j)) if i, j > 0 and x_i ≠ y_j
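The recurrence translates directly into a recursive function; memoizing it avoids the exponential blow-up of the naive recursion. A sketch (names are our own choice):

```python
from functools import lru_cache

def lcs_length(x, y):
    """Length of an LCS of x and y, computed top-down from the recurrence
    c(i,j) over prefixes X_i and Y_j."""
    @lru_cache(maxsize=None)  # memoize: each (i, j) is solved once
    def c(i, j):
        if i == 0 or j == 0:
            return 0
        if x[i - 1] == y[j - 1]:          # x_i = y_j case
            return c(i - 1, j - 1) + 1
        return max(c(i, j - 1), c(i - 1, j))
    return c(len(x), len(y))

# The lecture's example pair
print(lcs_length("ABDCBABC", "ADBCDBAB"))  # 6
```

With memoization there are only (m+1)(n+1) distinct subproblems, each solved in constant time, so this runs in O(mn) rather than exponential time.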

DP Solution

This recurrence is already enough to write a recursive function; however, the naive recursive function takes exponential time. Instead, we can use dynamic programming and solve from the bottom up. Code for doing this is on pp. 353 and 355 of the text; it uses the same idea we've seen before of filling in entries in a table from the bottom up.

Example

Consider X = A, B, D, C, B, A, B, C and Y = A, D, B, C, D, B, A, B. The table below is constructed by the DP algorithm for computing the LCS of X and Y (rows indexed by Y, columns by X). In the original slides, bold numbers mark one possible path giving an LCS, and arrows keep track of where each maximum is obtained from.

       A  B  D  C  B  A  B  C
    0  0  0  0  0  0  0  0  0
 A  0  1  1  1  1  1  1  1  1
 D  0  1  1  2  2  2  2  2  2
 B  0  1  2  2  2  3  3  3  3
 C  0  1  2  2  3  3  3  3  4
 D  0  1  2  3  3  3  3  3  4
 B  0  1  2  3  3  4  4  4  4
 A  0  1  2  3  3  4  5  5  5
 B  0  1  2  3  3  4  5  6  6

Reconstruction

To find an LCS, first find a path of arrows leading from the bottom-right corner to the top-left corner. In this path, the target of each diagonal arrow gives a character to include in the LCS. In our example, A, B, C, B, A, B is the LCS we get by following the arrows along one such path from the bottom right to the top left.
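Filling the table bottom-up and then walking back from the bottom-right corner gives both the LCS length and one LCS, mirroring the reconstruction just described. A sketch (variable names are ours, not the textbook's):

```python
def lcs(x, y):
    """Bottom-up DP: fill the (m+1) x (n+1) table c, then trace one LCS
    back from c[m][n] by re-deriving which case produced each entry."""
    m, n = len(x), len(y)
    c = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if x[i - 1] == y[j - 1]:
                c[i][j] = c[i - 1][j - 1] + 1
            else:
                c[i][j] = max(c[i][j - 1], c[i - 1][j])
    # Reconstruction: each diagonal step contributes one character.
    out = []
    i, j = m, n
    while i > 0 and j > 0:
        if x[i - 1] == y[j - 1]:
            out.append(x[i - 1])
            i, j = i - 1, j - 1
        elif c[i - 1][j] >= c[i][j - 1]:
            i -= 1
        else:
            j -= 1
    return "".join(reversed(out))

# One LCS of length 6 of the lecture's example pair; ties in the trace-back
# may pick a different (but equally long) LCS than the slides' A,B,C,B,A,B.
print(lcs("ABDCBABC", "ADBCDBAB"))
```

Building the table is O(mn) time and space; the trace-back adds only O(m + n).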

Take Away

We've seen four different DP-type algorithms. In each case, we did the following: (1) found a recurrence for the solution, and (2) built solutions to the recurrence from the bottom up. You should be prepared to do this on your own now!

Greedy Algorithms

"Greed is good." - Michael Douglas in Wall Street

A greedy algorithm always makes the choice that looks best at the moment. Greedy algorithms do not always lead to optimal solutions, but for many problems they do. In the next week or two, we will see several problems for which greedy algorithms produce optimal solutions, including: activity selection, fractional knapsack, and making change. When we study graph theory, we will also see that greedy algorithms work well for computing shortest paths and finding minimum spanning trees.

Activity Selection

You are given a list of programs to run on a single processor. Each program has a start time and a finish time. However, the processor can only run one program at any given time, and there is no preemption (i.e., once a program is running, it must be completed).

Another Motivating Problem

Suppose you are at a film fest, all movies look equally good, and you want to see as many complete movies as possible. This is exactly the same as the activity selection problem.

Example

Imagine you are given the following set of start and stop times for activities:

[figure: timeline of activity start/finish intervals]

Ideas

There are many ways to optimally schedule these activities. Brute force: examine every possible subset of the activities and find the largest subset of non-overlapping activities. Q: If there are n activities, how many subsets are there? The book also gives a DP solution to the problem.

Greedy Activity Selector

1. Sort the activities by their finish times
2. Schedule the first activity in this list
3. Now go through the rest of the sorted list in order, scheduling each activity whose start time is after (or the same as) the finish time of the last scheduled activity

(Note: code for this algorithm is in Section 16.1.)

Greedy Algorithm

Sorting the activities by their finish times:

[figure: the same activities, ordered by finish time]
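The three steps of the greedy activity selector can be sketched directly; the (start, finish) pairs below are made up for illustration, not taken from the lecture's figure:

```python
def select_activities(activities):
    """Greedy activity selection: sort by finish time, then repeatedly take
    the next activity whose start is at or after the last chosen finish."""
    chosen = []
    last_finish = float("-inf")
    for start, finish in sorted(activities, key=lambda a: a[1]):
        if start >= last_finish:
            chosen.append((start, finish))
            last_finish = finish
    return chosen

# Hypothetical activities as (start, finish) pairs
acts = [(1, 4), (3, 5), (0, 6), (5, 7), (3, 9), (5, 9), (6, 10), (8, 11)]
print(select_activities(acts))  # [(1, 4), (5, 7), (8, 11)]
```

Picking the compatible activity that finishes earliest always leaves as much room as possible for the remaining activities, which is the intuition behind the optimality argument previewed on the next slide.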

Greedy Scheduling of Activities

[figure: the schedule produced by the greedy algorithm]

Optimality

The big question here is: does the greedy algorithm give us an optimal solution? Surprisingly, the answer turns out to be yes.

Analysis

Let n be the total number of activities. The algorithm first sorts the activities by finish time, taking O(n log n). Then the algorithm visits each activity exactly once, doing a constant amount of work each time; this takes O(n). Thus the total time is O(n log n).

Todo

- Finish Chapter 15
- Start Chapter 16