CSC 8301 Design and Analysis of Algorithms


CSC 8301 - Design and Analysis of Algorithms
Lecture 6: Divide and Conquer Algorithm Design Technique

Divide-and-Conquer

The most well-known algorithm design strategy:
1. Divide a problem instance into two or more smaller instances (ideally of about the same size)
2. Solve the smaller instances (usually recursively)
3. Obtain a solution to the original instance by combining the solutions to the smaller instances
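These three steps can be expressed as a generic recursive skeleton. The sketch below is purely illustrative (it is not from the slides); is_small, solve_directly, divide, and combine stand for hypothetical problem-specific routines supplied by the caller.

    # Generic divide-and-conquer skeleton (illustrative sketch only).
    # 'is_small', 'solve_directly', 'divide', and 'combine' are hypothetical
    # problem-specific callables, not names used in the lecture.
    def divide_and_conquer(instance, is_small, solve_directly, divide, combine):
        if is_small(instance):                      # base case: solve directly
            return solve_directly(instance)
        subinstances = divide(instance)             # step 1: divide
        subsolutions = [divide_and_conquer(s, is_small, solve_directly, divide, combine)
                        for s in subinstances]      # step 2: solve recursively
        return combine(subsolutions)                # step 3: combine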

Divide-and-Conquer Technique (cont.)

[Diagram: a problem of size n is divided into subproblem 1 of size n/2 and subproblem 2 of size n/2; a solution to each subproblem is found and the two are combined into a solution to the original problem.]

General Divide-and-Conquer Recurrence

T(n) = aT(n/b) + f(n), where f(n) ∈ Θ(n^d), d ≥ 0

Master Theorem:
  If a < b^d, T(n) ∈ Θ(n^d)
  If a = b^d, T(n) ∈ Θ(n^d log n)
  If a > b^d, T(n) ∈ Θ(n^(log_b a))
Note: the same results hold with O instead of Θ.

Examples:
  T(n) = 4T(n/2) + n      T(n) ∈ ?
  T(n) = 4T(n/2) + n^2    T(n) ∈ ?
  T(n) = 4T(n/2) + n^3    T(n) ∈ ?
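As a worked check (the answers are left as questions on the slide), apply the Master Theorem with a = 4 and b = 2:

  T(n) = 4T(n/2) + n:    d = 1, a = 4 > b^d = 2, so T(n) ∈ Θ(n^(log_2 4)) = Θ(n^2)
  T(n) = 4T(n/2) + n^2:  d = 2, a = 4 = b^d = 4, so T(n) ∈ Θ(n^2 log n)
  T(n) = 4T(n/2) + n^3:  d = 3, a = 4 < b^d = 8, so T(n) ∈ Θ(n^3)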

Recursion Tree (use for intuition)

T(n) = aT(n/b) + f(n). Visualize this as a recursion tree with branching factor a: the total time depends on how fast f(n) grows compared with the number of leaves (d compared with log_b a).

Divide-and-Conquer Examples
  Sorting: mergesort and quicksort
  Binary tree traversals
  Binary search (?)
  Multiplication of large integers
  Matrix multiplication: Strassen's algorithm
  Closest-pair and convex-hull algorithms

Mergesort

Split array A[0..n-1] into two about equal halves and make copies of each half in arrays B and C.
Sort arrays B and C recursively.
Merge the sorted arrays B and C into array A as follows:
  Repeat the following until no elements remain in one of the arrays:
    compare the first elements in the remaining unprocessed portions of the arrays;
    copy the smaller of the two into A, while incrementing the index indicating the unprocessed portion of that array.
  Once all elements in one of the arrays are processed, copy the remaining unprocessed elements from the other array into A.

Merging of Two Sorted Arrays
[Figure]

[Mergesort example (figures): the array 3 8 2 9 1 7 4 5 is split into halves 3 8 2 9 and 1 7 4 5, which are sorted recursively into 2 3 8 9 and 1 4 5 7; merging the two sorted halves yields 1 2 3 4 5 7 8 9.]
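A minimal Python sketch of the mergesort procedure described above (an illustrative rendering, not code from the slides); it copies the two halves into auxiliary lists B and C and merges them back into A, following the slide's description.

    # Mergesort sketch following the slide's description (illustrative only).
    def mergesort(A):
        n = len(A)
        if n <= 1:
            return
        B = A[:n // 2]          # copy of the first half
        C = A[n // 2:]          # copy of the second half
        mergesort(B)            # sort the halves recursively
        mergesort(C)
        merge(B, C, A)          # merge the sorted halves back into A

    def merge(B, C, A):
        i = j = k = 0
        # repeatedly copy the smaller front element of B or C into A
        while i < len(B) and j < len(C):
            if B[i] <= C[j]:
                A[k] = B[i]; i += 1
            else:
                A[k] = C[j]; j += 1
            k += 1
        # copy whatever remains in the array that is not yet exhausted
        A[k:] = B[i:] if i < len(B) else C[j:]

    # Example from the slides:
    # data = [3, 8, 2, 9, 1, 7, 4, 5]
    # mergesort(data)   # data becomes [1, 2, 3, 4, 5, 7, 8, 9]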

Analysis of Mergesort

Recurrence for the number of comparisons in the worst case:
  C(n) = 2 C(n/2) + C_merge(n) = 2 C(n/2) + (n - 1),  C(1) = 0
All cases have the same efficiency: Θ(n log n)
Space requirement: Θ(n) (not in-place)
Can be implemented without recursion (bottom-up)

Quicksort

Select a pivot (partitioning element); here, the first element.
Rearrange the list so that all the elements in the first s positions are smaller than or equal to the pivot and all the elements in the remaining n-s positions are larger than or equal to the pivot (see next slide for an algorithm).
[Diagram: A[i] ≤ p in the first s positions, then p, then A[i] ≥ p in the remaining positions.]
Exchange the pivot with the last element in the first (i.e., ≤) subarray; the pivot is now in its final position.
Sort the two subarrays recursively.

Two-Way (Hoare's) Partitioning Algorithm
[Algorithm figure]

Quicksort Example
[Figure: quicksort applied to the array 5 3 1 9 8 2 4 7]
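The partitioning pseudocode itself is not preserved in this transcription. Below is a minimal Python sketch of quicksort with two-way (Hoare-style) partitioning that uses the first element as the pivot, as described above; the names hoare_partition and quicksort are mine, and the details are an illustrative reconstruction rather than the slide's exact algorithm.

    def hoare_partition(A, lo, hi):
        # Partition A[lo..hi] around the pivot A[lo]; return the pivot's final index.
        p = A[lo]
        i, j = lo, hi + 1
        while True:
            i += 1
            while i <= hi and A[i] < p:   # scan right for an element >= pivot
                i += 1
            j -= 1
            while A[j] > p:               # scan left for an element <= pivot
                j -= 1
            if i >= j:                    # the scans have crossed
                break
            A[i], A[j] = A[j], A[i]
        A[lo], A[j] = A[j], A[lo]         # put the pivot into its final position
        return j

    def quicksort(A, lo=0, hi=None):
        if hi is None:
            hi = len(A) - 1
        if lo < hi:
            s = hoare_partition(A, lo, hi)
            quicksort(A, lo, s - 1)       # sort the left subarray
            quicksort(A, s + 1, hi)       # sort the right subarray

    # Example from the slides:
    # data = [5, 3, 1, 9, 8, 2, 4, 7]
    # quicksort(data)    # data becomes [1, 2, 3, 4, 5, 7, 8, 9]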

Analysis of Quicksort

Best-case time efficiency (split in the middle): Θ(n log n)
Worst-case time efficiency (sorted array!): Θ(n^2)
Average-case time efficiency (random arrays): Θ(n log n)
Space efficiency: not in-place; Θ(log n) extra space with a careful implementation
Not stable
Improvements:
  better pivot selection: median-of-three partitioning
  switch to insertion sort on small subfiles, or just stop the recursive calls when unsorted subarrays become small (say, fewer than 10 elements) and finish sorting with insertion sort
These yield about a 20% improvement. (A code sketch of these improvements appears after the best-, worst-, and average-case figures below.)
Considered the method of choice for sorting random files of nontrivial sizes.

Quicksort: Best Case
[Figure]

Quicksort: Worst Case
[Figure]

Quicksort: Average Case
[Figure]
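A brief sketch of the two practical improvements mentioned above, median-of-three pivot selection and an insertion-sort cutoff for small subarrays. This is an illustrative variant built on the hoare_partition sketch given earlier; the cutoff value of 10 follows the slide's suggestion, and all function names are mine.

    CUTOFF = 10   # per the slide: subarrays smaller than ~10 elements are handled by insertion sort

    def median_of_three(A, lo, hi):
        # Move the median of A[lo], A[mid], A[hi] into A[lo] so it becomes the pivot.
        mid = (lo + hi) // 2
        trio = sorted([(A[lo], lo), (A[mid], mid), (A[hi], hi)])
        _, m = trio[1]
        A[lo], A[m] = A[m], A[lo]

    def insertion_sort(A, lo, hi):
        # Straight insertion sort on the subarray A[lo..hi].
        for i in range(lo + 1, hi + 1):
            v, j = A[i], i - 1
            while j >= lo and A[j] > v:
                A[j + 1] = A[j]
                j -= 1
            A[j + 1] = v

    def quicksort_improved(A, lo=0, hi=None):
        if hi is None:
            hi = len(A) - 1
        if hi - lo + 1 < CUTOFF:
            insertion_sort(A, lo, hi)      # finish small subarrays with insertion sort
            return
        median_of_three(A, lo, hi)         # better pivot selection
        s = hoare_partition(A, lo, hi)     # partition around the median-of-three pivot
        quicksort_improved(A, lo, s - 1)
        quicksort_improved(A, s + 1, hi)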

Binary Tree Algorithms

A binary tree is a divide-and-conquer-ready structure!

Ex. 1: Classic traversals (preorder, inorder, postorder)

Algorithm Inorder(T)
  if T is not empty
    Inorder(T_left)
    print(root of T)
    Inorder(T_right)

Formula for the number of external nodes: x = n + 1
Efficiency: Θ(n)

Binary Tree Algorithms (cont.)

Ex. 2: Computing the height of a binary tree with subtrees T_L and T_R:
  h(T) = max{h(T_L), h(T_R)} + 1 if T is not empty, and h(empty tree) = -1
Efficiency: Θ(n)
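A small Python sketch of both examples (illustrative only); the Node class with value/left/right fields is an assumed representation, not one defined on the slides.

    # Assumed binary-tree representation (not from the slides).
    class Node:
        def __init__(self, value, left=None, right=None):
            self.value, self.left, self.right = value, left, right

    def inorder(t):
        # Ex. 1: inorder traversal (left subtree, root, right subtree).
        if t is not None:
            inorder(t.left)
            print(t.value)
            inorder(t.right)

    def height(t):
        # Ex. 2: h(T) = max{h(T_L), h(T_R)} + 1, with h(empty tree) = -1.
        if t is None:
            return -1
        return max(height(t.left), height(t.right)) + 1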

Multiplication of Large Integers

Consider the problem of multiplying two (large) n-digit integers represented by arrays of their digits, such as:
  A = 12345678901357986429
  B = 87654321284820912836

The grade-school algorithm multiplies the digits a_1 a_2 ... a_n of A by the digits b_1 b_2 ... b_n of B, producing for each digit of B a row of one-digit products (d_i0) d_i1 d_i2 ... d_in (with carries), and adds up the shifted rows.
Efficiency: n^2 one-digit multiplications

First Divide-and-Conquer Algorithm

A small example: A * B where A = 2135 and B = 4014
  A = (21 * 10^2 + 35), B = (40 * 10^2 + 14)
So, A * B = (21 * 10^2 + 35) * (40 * 10^2 + 14)
          = 21 * 40 * 10^4 + (21 * 14 + 35 * 40) * 10^2 + 35 * 14

In general, if A = A1 A2 and B = B1 B2 (where A and B are n-digit numbers and A1, A2, B1, B2 are n/2-digit numbers),
  A * B = A1 * B1 * 10^n + (A1 * B2 + A2 * B1) * 10^(n/2) + A2 * B2

Recurrence for the number of one-digit multiplications M(n):
  M(n) = 4 M(n/2), M(1) = 1
Solution: M(n) = n^2
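A minimal Python sketch of this first divide-and-conquer algorithm (my illustration, not code from the slides); it performs the four recursive multiplications A1*B1, A1*B2, A2*B1, and A2*B2 and assumes, for simplicity, that the digit count n used for splitting is a power of 2.

    # First divide-and-conquer multiplication: four recursive multiplications.
    # Illustrative sketch; assumes n (the digit count used for splitting) is a power of 2.
    def dc_multiply(a, b, n):
        if n == 1:
            return a * b                      # one-digit multiplication: M(1) = 1
        half = n // 2
        a1, a2 = divmod(a, 10 ** half)        # A = A1 * 10^(n/2) + A2
        b1, b2 = divmod(b, 10 ** half)        # B = B1 * 10^(n/2) + B2
        p1 = dc_multiply(a1, b1, half)        # A1 * B1
        p2 = dc_multiply(a1, b2, half)        # A1 * B2
        p3 = dc_multiply(a2, b1, half)        # A2 * B1
        p4 = dc_multiply(a2, b2, half)        # A2 * B2
        return p1 * 10 ** n + (p2 + p3) * 10 ** half + p4

    # Example from the slides: dc_multiply(2135, 4014, 4) == 2135 * 4014 == 8569890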

Second Divide-and-Conquer Algorithm

A * B = A1 * B1 * 10^n + (A1 * B2 + A2 * B1) * 10^(n/2) + A2 * B2

The idea is to decrease the number of multiplications from 4 to 3:
  (A1 + A2) * (B1 + B2) = A1 * B1 + (A1 * B2 + A2 * B1) + A2 * B2,
i.e., (A1 * B2 + A2 * B1) = (A1 + A2) * (B1 + B2) - A1 * B1 - A2 * B2,
which requires only 3 multiplications at the expense of (4 - 1) extra additions/subtractions.

Recurrence for the number of multiplications M(n):
  M(n) = 3 M(n/2), M(1) = 1
Solution: M(n) = 3^(log_2 n) = n^(log_2 3) ≈ n^1.585

Strassen's Matrix Multiplication

Strassen observed [1969] that the product of two matrices

  | C00 C01 |   | A00 A01 |   | B00 B01 |
  |         | = |         | * |         |
  | C10 C11 |   | A10 A11 |   | B10 B11 |

can be computed as
  C00 = M1 + M4 - M5 + M7    C01 = M3 + M5
  C10 = M2 + M4              C11 = M1 + M3 - M2 + M6

Formulas for Strassen's Algorithm

  M1 = (A00 + A11) * (B00 + B11)
  M2 = (A10 + A11) * B00
  M3 = A00 * (B01 - B11)
  M4 = A11 * (B10 - B00)
  M5 = (A00 + A01) * B11
  M6 = (A10 - A00) * (B00 + B01)
  M7 = (A01 - A11) * (B10 + B11)

Analysis of Strassen's Algorithm

If n is not a power of 2, matrices can be padded with zeros.
Number of multiplications: M(n) = 7 M(n/2), M(1) = 1
Solution: M(n) = 7^(log_2 n) = n^(log_2 7) ≈ n^2.807, vs. n^3 for the brute-force algorithm.
Algorithms with better asymptotic efficiency are known, but they are even more complex.
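A compact Python sketch of Strassen's algorithm built directly from the seven formulas above (an illustrative reconstruction, not the slides' code); it assumes square n-by-n matrices with n a power of 2, relying on the zero-padding remark for other sizes. The helper names split, add, sub, and assemble are mine.

    # Strassen's matrix multiplication sketch; assumes n-by-n matrices, n a power of 2.
    def strassen(A, B):
        n = len(A)
        if n == 1:
            return [[A[0][0] * B[0][0]]]
        h = n // 2
        # Split each matrix into four h-by-h blocks.
        A00, A01, A10, A11 = split(A, h)
        B00, B01, B10, B11 = split(B, h)
        # The seven products from the slide.
        M1 = strassen(add(A00, A11), add(B00, B11))
        M2 = strassen(add(A10, A11), B00)
        M3 = strassen(A00, sub(B01, B11))
        M4 = strassen(A11, sub(B10, B00))
        M5 = strassen(add(A00, A01), B11)
        M6 = strassen(sub(A10, A00), add(B00, B01))
        M7 = strassen(sub(A01, A11), add(B10, B11))
        C00 = add(sub(add(M1, M4), M5), M7)      # M1 + M4 - M5 + M7
        C01 = add(M3, M5)                        # M3 + M5
        C10 = add(M2, M4)                        # M2 + M4
        C11 = add(sub(add(M1, M3), M2), M6)      # M1 + M3 - M2 + M6
        return assemble(C00, C01, C10, C11)

    def split(M, h):
        top, bottom = M[:h], M[h:]
        return ([row[:h] for row in top], [row[h:] for row in top],
                [row[:h] for row in bottom], [row[h:] for row in bottom])

    def add(X, Y):
        return [[x + y for x, y in zip(rx, ry)] for rx, ry in zip(X, Y)]

    def sub(X, Y):
        return [[x - y for x, y in zip(rx, ry)] for rx, ry in zip(X, Y)]

    def assemble(C00, C01, C10, C11):
        top = [r0 + r1 for r0, r1 in zip(C00, C01)]
        bottom = [r0 + r1 for r0, r1 in zip(C10, C11)]
        return top + bottom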

Closest-Pair Problem by Divide-and-Conquer

Step 1: Divide the given points into two subsets P_l and P_r by a vertical line x = m so that half the points lie to the left of or on the line and half the points lie to the right of or on the line.
[Figure: the dividing line x = m, the closest-pair distances d_l and d_r found on each side, and a 2d-by-d rectangle inside the strip, where d = min{d_l, d_r}.]

Closest Pair by Divide-and-Conquer (cont.)

Step 2: Find recursively the closest pairs for the left and right subsets.
Step 3: Set d = min{d_l, d_r}. We can limit our attention to the points in the symmetric vertical strip S of width 2d as candidates for a closer pair. (The points are stored and processed in increasing order of their y coordinates.)
Step 4: Scan the points in the vertical strip S from the lowest up. For every point p(x, y) in the strip, inspect the points in the strip that may be closer to p than d. There can be no more than 5 such points following p on the strip list!
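A Python sketch of this divide-and-conquer scheme (an illustrative reconstruction, not the slides' code). For brevity it re-sorts each strip by y instead of maintaining a presorted y-list, and it simply checks every following strip point within vertical distance d of p; with the presorted list the algorithm achieves the O(n log n) bound analyzed on the next slide.

    from math import dist, inf

    def closest_pair(points):
        # points: list of (x, y) tuples; returns the smallest pairwise distance.
        pts = sorted(points)                      # sort once by x-coordinate
        return _closest(pts)

    def _closest(pts):
        n = len(pts)
        if n <= 3:                                # small instance: brute force
            return min((dist(pts[i], pts[j])
                        for i in range(n) for j in range(i + 1, n)), default=inf)
        mid = n // 2
        m = pts[mid][0]                           # vertical dividing line x = m
        d_l = _closest(pts[:mid])                 # closest pair on the left
        d_r = _closest(pts[mid:])                 # closest pair on the right
        d = min(d_l, d_r)
        # Points in the vertical strip of width 2d around x = m, in increasing y order.
        strip = sorted((p for p in pts if abs(p[0] - m) < d), key=lambda p: p[1])
        for i, p in enumerate(strip):
            for q in strip[i + 1:]:
                if q[1] - p[1] >= d:              # no later strip point can be closer
                    break
                d = min(d, dist(p, q))
        return d

    # Example: closest_pair([(0, 0), (3, 4), (1, 1), (5, 5)]) returns sqrt(2)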

Efficiency of the Closest-Pair Algorithm

Recurrence for the running time of the algorithm:
  T(n) = 2 T(n/2) + M(n), where M(n) ∈ O(n)
By the Master Theorem (with a = 2, b = 2, d = 1):
  T(n) ∈ O(n log n)

Homework

Reading: Chapter 5, Appendix B (pp. 487-491)
Exercises:
  5.1: 2, 6, 9
  5.2: 1, 7, 8, 9
  5.3: 2, 3, 8, 11
  5.4: 2, 3
  5.5: 2
Next: Transform and Conquer (Ch. 6)