Divide-and-Conquer. Break up a problem of size n into two equal parts of size n/2, solve each part recursively, and combine the solutions to the sub-problems into an overall solution.


Chapter 5. Divide and Conquer. Slides by Kevin Wayne. Copyright 2005 Pearson-Addison Wesley. All rights reserved.

Divide-and-Conquer
Divide-and-conquer. Break up problem into several parts. Solve each part recursively. Combine solutions to sub-problems into overall solution.
Most common usage. Break up problem of size n into two equal parts of size n/2. Solve two parts recursively. Combine two solutions into overall solution in linear time.
Consequence. Brute force: n^2. Divide-and-conquer: n log n.
"Divide et impera. Veni, vidi, vici." - Julius Caesar

5.1 Mergesort

Sorting
Sorting. Given n elements, rearrange in ascending order.
Obvious sorting applications. List files in a directory. Organize an MP3 library. List names in a phone book. Display Google PageRank results.
Problems become easier once sorted. Find the median. Find the closest pair. Binary search in a database. Identify statistical outliers. Find duplicates in a mailing list.
Non-obvious sorting applications. Data compression. Computer graphics. Interval scheduling. Computational biology. Minimum spanning tree. Supply chain management. Simulate a system of particles. Book recommendations on Amazon. Load balancing on a parallel computer. ...

Mergesort
Mergesort. Divide array into two halves. Recursively sort each half. Merge two halves to make sorted whole. [John von Neumann, 1945]
Example: A L G O R I T H M S
  divide:  A L G O R | I T H M S     O(1)
  sort:    A G L O R | H I M S T     2T(n/2)
  merge:   A G H I L M O R S T       O(n)

Merging
Merging. Combine two pre-sorted lists into a sorted whole.
How to merge efficiently? Linear number of comparisons. Use temporary array.
Challenge for the bored. In-place merge [Kronrod, 1969], using only a constant amount of extra storage.
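The slides give mergesort only in outline; as a concrete reference, here is a minimal Python sketch of the divide / sort / merge steps (function names are mine, not from the slides):

import sys  # only to keep the sketch self-contained; not otherwise needed

def merge(left, right):
    """Combine two pre-sorted lists into one sorted list (linear number of comparisons)."""
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])      # one list is exhausted; copy the remainder of the other
    merged.extend(right[j:])
    return merged

def mergesort(a):
    """Divide the array into halves, recursively sort each, merge the results."""
    if len(a) <= 1:              # base case: already sorted
        return a
    mid = len(a) // 2
    return merge(mergesort(a[:mid]), mergesort(a[mid:]))

print(mergesort(list("ALGORITHMS")))  # ['A', 'G', 'H', 'I', 'L', 'M', 'O', 'R', 'S', 'T']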

A Useful Recurrence Relation
Def. T(n) = number of comparisons to mergesort an input of size n.
Mergesort recurrence: T(n) <= 0 if n = 1, and T(n) <= T(ceil(n/2)) + T(floor(n/2)) + n (for merging) otherwise.
Solution. T(n) = O(n log_2 n).
Assorted proofs. We describe several ways to prove this recurrence. Initially we assume n is a power of 2 and replace <= with =.

Proof by Recursion Tree
[Recursion tree for T(n) = 2T(n/2) + n: level k has 2^k subproblems of size n/2^k, so each level costs 2^k (n/2^k) = n; there are log_2 n levels, giving T(n) = n log_2 n.]
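Written out in LaTeX, the level-by-level sum from the recursion tree (under the power-of-2 assumption) is:

T(n) \;=\; \sum_{k=0}^{\log_2 n - 1} 2^k \cdot \frac{n}{2^k} \;=\; \sum_{k=0}^{\log_2 n - 1} n \;=\; n \log_2 n .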

Proof by Telescoping
Claim. If T(n) satisfies this recurrence, then T(n) = n log_2 n. (Assumes n is a power of 2.)
Pf. For n > 1:
  T(n)/n = [2 T(n/2) + n] / n
         = T(n/2)/(n/2) + 1
         = T(n/4)/(n/4) + 1 + 1
         = ...
         = T(n/n)/(n/n) + 1 + 1 + ... + 1   (log_2 n ones)
         = log_2 n.

Proof by Induction
Claim. If T(n) satisfies this recurrence, then T(n) = n log_2 n. (Assumes n is a power of 2.)
Pf. (by induction on n)
Base case: n = 1.
Inductive hypothesis: T(n) = n log_2 n.
Goal: show that T(2n) = 2n log_2 (2n). Indeed, T(2n) = 2 T(n) + 2n = 2n log_2 n + 2n = 2n (log_2 (2n) - 1) + 2n = 2n log_2 (2n).

Analysis of Mergesort Recurrence
Claim. If T(n) satisfies the recurrence T(n) <= T(ceil(n/2)) + T(floor(n/2)) + n with T(1) = 0, then T(n) <= n ceil(lg n).
Pf. (by induction on n)
Base case: n = 1. T(1) = 0 = 1 * ceil(lg 1).
Define n_1 = ceil(n/2), n_2 = floor(n/2).
Induction step: Let n >= 2 and assume the claim is true for 1, 2, ..., n-1. Then
  T(n) <= T(n_1) + T(n_2) + n
       <= n_1 ceil(lg n_1) + n_2 ceil(lg n_1) + n
       =  n ceil(lg n_1) + n
       <= n (ceil(lg n) - 1) + n
       =  n ceil(lg n),
using ceil(lg n_2) <= ceil(lg n_1) and ceil(lg n_1) = ceil(lg ceil(n/2)) <= ceil(lg n) - 1.

5.4 Closest Pair of Points

Closest Pair of Points
Closest pair. Given n points in the plane, find a pair with smallest Euclidean distance between them.
Fundamental geometric primitive. Graphics, computer vision, geographic information systems, molecular modeling, air traffic control. Special case of nearest neighbor, Euclidean MST, Voronoi. (Fast closest pair inspired fast algorithms for these problems.)
Brute force. Check all pairs of points p and q with Theta(n^2) comparisons.
1-D version. O(n log n) easy if points are on a line.
Assumption. No two points have the same x coordinate. (To make the presentation cleaner.)

Closest Pair of Points: First Attempt
Divide. Sub-divide region into 4 quadrants.

Closest Pair of Points: First Attempt
Divide. Sub-divide region into 4 quadrants.
Obstacle. Impossible to ensure n/4 points in each piece.

Closest Pair of Points
Algorithm. Divide: draw vertical line L so that roughly n/2 points are on each side.

Closest Pair of Points
Algorithm.
Divide: draw vertical line L so that roughly n/2 points are on each side.
Conquer: find closest pair in each side recursively.

Closest Pair of Points
Algorithm.
Divide: draw vertical line L so that roughly n/2 points are on each side.
Conquer: find closest pair in each side recursively.
Combine: find closest pair with one point in each side. (Seems like Theta(n^2).)
Return best of the 3 solutions.
[Figure: the closest pair on the left side has distance 12, on the right side 21; the overall closest pair straddles L.]

Closest Pair of Points
Find closest pair with one point in each side, assuming that distance < delta. [Figure: delta = min(12, 21).]

Closest Pair of Points
Find closest pair with one point in each side, assuming that distance < delta.
Observation: only need to consider points within delta of line L. [Figure: delta = min(12, 21); strip of width 2*delta around L.]

Closest Pair of Points
Find closest pair with one point in each side, assuming that distance < delta.
Observation: only need to consider points within delta of line L.
Sort points in 2*delta-strip by their y coordinate.

Closest Pair of Points
Find closest pair with one point in each side, assuming that distance < delta.
Observation: only need to consider points within delta of line L.
Sort points in 2*delta-strip by their y coordinate.
Only check distances of those within 11 positions in the sorted list!
[Figure: points in the strip numbered 1-7 by y coordinate; delta = min(12, 21).]

Closest Pair of Points
Def. Let s_i be the point in the 2*delta-strip with the i-th smallest y-coordinate.
Claim. If |i - j| >= 12, then the distance between s_i and s_j is at least delta.
Pf. No two points lie in the same (delta/2)-by-(delta/2) box. Two points at least 2 rows apart have distance >= 2(delta/2) = delta.
Fact. Still true if we replace 12 with 7.
Scan points in y-order and compare the distance between each point and its next 11 neighbors. If any of these distances is less than delta, update delta.

Closest Pair Algorithm

Closest-Pair(p_1, ..., p_n) {
   Compute separation line L such that half the points      O(n log n)
   are on one side and half on the other side.

   delta_1 = Closest-Pair(left half)                         2T(n/2)
   delta_2 = Closest-Pair(right half)
   delta   = min(delta_1, delta_2)

   Delete all points further than delta from line L.         O(n)
   Sort remaining points by y-coordinate.                    O(n log n)

   Scan points in y-order and compare the distance between   O(n)
   each point and its next 11 neighbors. If any of these
   distances is less than delta, update delta.

   return delta.
}
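A compact Python sketch of the algorithm above (names and the small-case cutoff are mine; it assumes at least two points with distinct x coordinates, as the slides do, and it re-sorts the strip in each call, so it runs in O(n log^2 n) rather than O(n log n) -- see the analysis slide that follows):

import math

def closest_pair(points):
    """Return the smallest pairwise distance among >= 2 points given as (x, y) tuples."""
    px = sorted(points)                       # sort once by x coordinate

    def dist(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])

    def rec(px):
        n = len(px)
        if n <= 3:                            # small case: brute force
            return min(dist(px[i], px[j]) for i in range(n) for j in range(i + 1, n))
        mid = n // 2
        x_mid = px[mid][0]                    # vertical separation line L
        delta = min(rec(px[:mid]), rec(px[mid:]))   # conquer: best of each side
        # Combine: only points within delta of L matter, scanned in y order.
        strip = sorted((p for p in px if abs(p[0] - x_mid) < delta), key=lambda p: p[1])
        for i, p in enumerate(strip):
            for q in strip[i + 1:i + 12]:     # next 11 neighbors suffice (claim above)
                if q[1] - p[1] >= delta:      # rest of the strip is too far away vertically
                    break
                delta = min(delta, dist(p, q))
        return delta

    return rec(px)

pts = [(2, 3), (13, 30), (40, 50), (5, 1), (12, 10), (3, 4)]
print(closest_pair(pts))                      # 1.4142... -- the pair (2, 3), (3, 4)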

Closest Pair of Points: Analysis
Running time. T(n) <= 2T(n/2) + O(n log n), which gives T(n) = O(n log^2 n).
Q. Can we achieve O(n log n)?
A. Yes. Don't sort the points in the strip from scratch each time. Each recursion returns two lists: all points sorted by y coordinate, and all points sorted by x coordinate. Sort by merging two pre-sorted lists.

5.5 Integer Multiplication

Integer Arithmetic
Multiply. Given two n-digit integers a and b, compute a x b. Brute force solution: Theta(n^2) bit operations.
Add. Given two n-digit integers a and b, compute a + b. O(n) bit operations.

Divide-and-Conquer Multiplication: Warmup
To multiply two n-digit integers (assume n is a power of 2):
Multiply four (n/2)-digit integers.
Add two (n/2)-digit integers, and shift to obtain the result.
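The decomposition behind the warmup was an equation that did not survive transcription; a standard way to write it in LaTeX (for n-bit integers, with x_1, x_0 the high and low halves of x, and likewise for y) is:

x = 2^{n/2}\,x_1 + x_0, \qquad y = 2^{n/2}\,y_1 + y_0,
\qquad
xy = 2^{n}\,x_1 y_1 + 2^{n/2}\,(x_1 y_0 + x_0 y_1) + x_0 y_0 ,

so the warmup recurrence is T(n) = 4\,T(n/2) + \Theta(n), which solves to T(n) = \Theta(n^2).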

Proof by Telescoping
Claim. T(n) = 4T(n/2) + Cn solves to T(n) = Theta(n^2). (Assumes n is a power of 2 and, for the last step, T(1) = 0.)
Pf. For n > 1:
  T(n)/n = 4T(n/2)/n + C
         = 2 T(n/2)/(n/2) + C
         = 2 [ 2 T(n/4)/(n/4) + C ] + C
         = 4 T(n/4)/(n/4) + 2C + C
         = 4 [ 2 T(n/8)/(n/8) + C ] + 2C + C
         = 8 T(n/8)/(n/8) + 4C + 2C + C
         = ...
         = n T(1)/(1) + (n/2)C + (n/4)C + ... + 4C + 2C + C
         = C (n/2 + n/4 + ... + 2 + 1)
         = C (n - 1),
so T(n) = Cn(n - 1) = Theta(n^2).

Karatsuba Multiplication
To multiply two n-digit integers:
Add two (n/2)-digit integers.
Multiply three (n/2)-digit integers.
Add, subtract, and shift (n/2)-digit integers to obtain the result.
Theorem. [Karatsuba-Ofman, 1962] Can multiply two n-digit integers in O(n^1.585) bit operations.
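As a concrete illustration, here is a short Python sketch of the three-multiplication recursion (the base-case threshold, bit-level split, and names are mine, not from the slides; Python's built-in * is used below the threshold):

def karatsuba(x, y):
    """Multiply non-negative integers using 3 recursive half-size multiplications."""
    if x < 16 or y < 16:                 # small inputs: fall back to built-in multiply
        return x * y
    m = max(x.bit_length(), y.bit_length()) // 2
    x1, x0 = x >> m, x & ((1 << m) - 1)  # split x into high and low m-bit halves
    y1, y0 = y >> m, y & ((1 << m) - 1)
    a = karatsuba(x1, y1)                # product of high parts
    b = karatsuba(x0, y0)                # product of low parts
    c = karatsuba(x1 + x0, y1 + y0)      # (x1+x0)(y1+y0) = a + b + cross terms
    return (a << (2 * m)) + ((c - a - b) << m) + b

print(karatsuba(123456789, 987654321) == 123456789 * 987654321)  # True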

Karatsuba: Recursion Tree
[Recursion tree for T(n) = 3T(n/2) + n: level k has 3^k subproblems of size n/2^k, costing 3^k (n/2^k); there are 3^(log_2 n) = n^(log_2 3) leaves. Summing the geometric series gives T(n) = O(n^(log_2 3)) = O(n^1.585).]

Matrix Multiplication

Matrix Multiplication
Matrix multiplication. Given two n-by-n matrices A and B, compute C = AB.
Brute force. Theta(n^3) arithmetic operations.
Fundamental question. Can we improve upon brute force?

Matrix Multiplication: Warmup
Divide-and-conquer.
Divide: partition A and B into (n/2)-by-(n/2) blocks.
Conquer: multiply 8 pairs of (n/2)-by-(n/2) blocks recursively.
Combine: add the appropriate products using 4 matrix additions.
This gives T(n) = 8T(n/2) + Theta(n^2), i.e. T(n) = Theta(n^3) -- no better than brute force.

Matrix Multiplication: Key Idea
Key idea. Multiply 2-by-2 block matrices with only 7 multiplications (and 18 = 10 + 8 additions or subtractions).

Fast Matrix Multiplication
Fast matrix multiplication. (Strassen, 1969)
Divide: partition A and B into (n/2)-by-(n/2) blocks.
Compute: 14 (n/2)-by-(n/2) matrices via 10 matrix additions.
Conquer: multiply 7 (n/2)-by-(n/2) matrices recursively.
Combine: 7 products into 4 terms using 18 matrix additions.
Analysis. Assume n is a power of 2 and let T(n) = # arithmetic operations. Then T(n) = 7T(n/2) + Theta(n^2), so T(n) = Theta(n^(log_2 7)) = O(n^2.81).
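The block identities on the "Key Idea" slide did not survive transcription; one standard way to write Strassen's seven products and four output blocks (my labeling; the slide may order them differently) is:

\begin{aligned}
P_1 &= (A_{11} + A_{22})(B_{11} + B_{22}), & P_2 &= (A_{21} + A_{22})\,B_{11}, \\
P_3 &= A_{11}\,(B_{12} - B_{22}),          & P_4 &= A_{22}\,(B_{21} - B_{11}), \\
P_5 &= (A_{11} + A_{12})\,B_{22},          & P_6 &= (A_{21} - A_{11})(B_{11} + B_{12}), \\
P_7 &= (A_{12} - A_{22})(B_{21} + B_{22}), &     &
\end{aligned}
\qquad
\begin{aligned}
C_{11} &= P_1 + P_4 - P_5 + P_7, \\
C_{12} &= P_3 + P_5, \\
C_{21} &= P_2 + P_4, \\
C_{22} &= P_1 - P_2 + P_3 + P_6.
\end{aligned}

Counting block additions and subtractions: 10 to form the P's and 8 to form the C's, matching the 18 = 10 + 8 above.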

Fast Matrix Multiplication in Practice
Implementation issues. Sparsity. Caching effects. Numerical stability. Odd matrix dimensions. Crossover to the classical algorithm around n = 128.
Common misperception: "Strassen is only a theoretical curiosity." The Advanced Computation Group at Apple Computer reports an 8x speedup on the G4 Velocity Engine when n ~ 2,500. The range of instances where it's useful is a subject of controversy.
Remark. Can "Strassenize" Ax = b, determinant, eigenvalues, and other matrix ops.

Fast Matrix Multiplication in Theory
Q. Multiply two 2-by-2 matrices with only 7 scalar multiplications?
A. Yes. [Strassen, 1969] Theta(n^(log_2 7)) = O(n^2.81).
Q. Multiply two 2-by-2 matrices with only 6 scalar multiplications?
A. Impossible. [Hopcroft and Kerr, 1971] Theta(n^(log_2 6)) = O(n^2.59).
Q. Two 3-by-3 matrices with only 21 scalar multiplications?
A. Also impossible. Theta(n^(log_3 21)) = O(n^2.77).
Q. Two 70-by-70 matrices with only 143,640 scalar multiplications?
A. Yes. [Pan, 1980] Theta(n^(log_70 143640)) = O(n^2.80).
Decimal wars. December 1979: O(n^2.521813). January 1980: O(n^2.521801).

Fast Matrix Multiplication in Theory
Best known. O(n^2.376). [Coppersmith-Winograd, 1987]
Conjecture. O(n^(2+eps)) for any eps > 0.
Caveat. Theoretical improvements to Strassen are progressively less practical.

CLRS 4.3 Master Theorem

Master Theorem from CLRS 4.3
[The slide shows the statement of the master theorem; a restatement follows below.]

Proof by Recursion Tree
[Recursion tree for T(n) = a T(n/b) + f(n): the root costs f(n), level k has a^k nodes each costing f(n/b^k), the tree has log_b n levels, and there are a^(log_b n) = n^(log_b a) leaves; summing over levels gives T(n) = sum over k of a^k f(n/b^k).]
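For reference, the statement as it appears in CLRS (a >= 1, b > 1, eps > 0 a constant; the third case also needs the regularity condition shown):

T(n) = a\,T(n/b) + f(n)
\quad\Longrightarrow\quad
T(n) =
\begin{cases}
\Theta\bigl(n^{\log_b a}\bigr) & \text{if } f(n) = O\bigl(n^{\log_b a - \varepsilon}\bigr),\\
\Theta\bigl(n^{\log_b a}\log n\bigr) & \text{if } f(n) = \Theta\bigl(n^{\log_b a}\bigr),\\
\Theta\bigl(f(n)\bigr) & \text{if } f(n) = \Omega\bigl(n^{\log_b a + \varepsilon}\bigr)
\text{ and } a\,f(n/b) \le c\,f(n) \text{ for some } c < 1.
\end{cases}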


Extra Slides

5.3 Counting Inversions

Counting Inversions
A music site tries to match your song preferences with others. You rank n songs. The music site consults its database to find people with similar tastes.
Similarity metric: number of inversions between two rankings.
My rank: 1, 2, ..., n. Your rank: a_1, a_2, ..., a_n. Songs i and j are inverted if i < j but a_i > a_j.

  Songs:  A  B  C  D  E
  Me:     1  2  3  4  5
  You:    1  3  4  2  5

Inversions: 3-2, 4-2.
Brute force: check all Theta(n^2) pairs i and j.

Applications
Applications. Voting theory. Collaborative filtering. Measuring the "sortedness" of an array. Sensitivity analysis of Google's ranking function. Rank aggregation for meta-searching on the Web. Nonparametric statistics (e.g., Kendall's Tau distance).
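A tiny Python sketch of the Theta(n^2) brute force, just to pin down the definition (the list below encodes the slide's example ranking):

def count_inversions_brute(a):
    """Count pairs (i, j) with i < j but a[i] > a[j]."""
    n = len(a)
    return sum(1 for i in range(n) for j in range(i + 1, n) if a[i] > a[j])

# "You" rank the songs A..E as 1, 3, 4, 2, 5 relative to "Me" (1, 2, 3, 4, 5):
print(count_inversions_brute([1, 3, 4, 2, 5]))   # 2  (the inversions 3-2 and 4-2)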

Counting Inversions: Divide-and-Conquer
Divide-and-conquer.
  1 5 4 8 10 2 6 9 12 11 3 7

Counting Inversions: Divide-and-Conquer
Divide-and-conquer.
Divide: separate the list into two pieces. Divide: O(1).
  1 5 4 8 10 2 | 6 9 12 11 3 7

Counting Inversions: Divide-and-Conquer
Divide-and-conquer.
Divide: separate the list into two pieces. Divide: O(1).
Conquer: recursively count inversions in each half. Conquer: 2T(n/2).
  1 5 4 8 10 2 | 6 9 12 11 3 7
  5 blue-blue inversions: 5-4, 5-2, 4-2, 8-2, 10-2
  8 green-green inversions: 6-3, 9-3, 9-7, 12-3, 12-7, 12-11, 11-3, 11-7

Counting Inversions: Divide-and-Conquer
Divide-and-conquer.
Divide: separate the list into two pieces. Divide: O(1).
Conquer: recursively count inversions in each half. Conquer: 2T(n/2).
Combine: count inversions where a_i and a_j are in different halves, and return the sum of the three quantities. Combine: ???
  9 blue-green inversions: 5-3, 4-3, 8-6, 8-3, 8-7, 10-6, 10-9, 10-3, 10-7
  Total = 5 + 8 + 9 = 22.

Counting Inversions: Combine
Combine: count blue-green inversions.
Assume each half is sorted.
Count inversions where a_i and a_j are in different halves.
Merge the two sorted halves into a sorted whole (to maintain the sorted invariant).
  3 7 10 14 18 19  |  2 11 16 17 23 25
  13 blue-green inversions: 6 + 3 + 2 + 2 + 0 + 0.     Count: O(n)
  Merged: 2 3 7 10 11 14 16 17 18 19 23 25.            Merge: O(n)

Counting Inversions: Implementation
Pre-condition. [Merge-and-Count] A and B are sorted.
Post-condition. [Sort-and-Count] L is sorted.

Sort-and-Count(L) {
   if list L has one element
      return 0 and the list L

   Divide the list into two halves A and B
   (r_A, A)  = Sort-and-Count(A)
   (r_B, B)  = Sort-and-Count(B)
   (r_AB, L) = Merge-and-Count(A, B)

   return r = r_A + r_B + r_AB and the sorted list L
}
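A runnable Python sketch of Sort-and-Count / Merge-and-Count (names follow the pseudocode above; the example list is the reconstructed one from the earlier slides):

def sort_and_count(L):
    """Return (number of inversions in L, sorted copy of L)."""
    if len(L) <= 1:
        return 0, L[:]
    mid = len(L) // 2
    r_a, A = sort_and_count(L[:mid])
    r_b, B = sort_and_count(L[mid:])
    r_ab, merged = merge_and_count(A, B)
    return r_a + r_b + r_ab, merged

def merge_and_count(A, B):
    """Merge sorted A and B; count pairs (a, b) with a in A, b in B, a > b."""
    merged, inversions, i, j = [], 0, 0, 0
    while i < len(A) and j < len(B):
        if A[i] <= B[j]:
            merged.append(A[i]); i += 1
        else:
            merged.append(B[j]); j += 1
            inversions += len(A) - i   # every remaining element of A is inverted with B[j]
    merged.extend(A[i:]); merged.extend(B[j:])
    return inversions, merged

count, _ = sort_and_count([1, 5, 4, 8, 10, 2, 6, 9, 12, 11, 3, 7])
print(count)   # 22, matching 5 + 8 + 9 from the slides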