CIS 121 Data Structures and Algorithms with Java, Spring 2018
Code Snippets and Recurrences
Monday, January 29/Tuesday, January 30


Learning Goals

- Practice solving recurrences and proving asymptotic bounds
- Strengthen intuition for the divide and conquer paradigm
- Review Quicksort

Code Snippets

We can apply our knowledge of Big-Oh and summations to find the running time of a snippet of code. Besides recursion, nested iteration is where a program's efficiency is most often bottlenecked. Treat each loop as a summation and use summation identities to simplify from there. Try starting from the innermost loop, whose bounds are fixed once the outer indices are chosen, and work outwards.

Code Snippet Problems

Problem 1(a). Provide a running time analysis of the following loop. That is, find both Big-Oh and Big-Omega bounds:

    for (int i = 0; i < n; i++)
        for (int j = i; j <= n; j++)
            for (int k = i; k <= j; k++)
                sum++;

Problem 1(b). Provide a running time analysis of the following loop. That is, find both Big-Oh and Big-Omega bounds:

    for (int i = 2; i < n; i = i * i)
        for (int j = 1; j < Math.sqrt(i); j = j + j)
            System.out.println();

Divide and Conquer: Quicksort

Using the divide and conquer paradigm, we can solve a problem by dividing it into smaller subproblems and later combining their solutions. While Mergesort spends most of its time combining the subproblems' solutions after they are computed, we can also perform this additional work before solving the subproblems. In an instance of Quicksort, we must first decide on a pivot; this could be the element at any location of the input array. The function Partition is then invoked. Partition places the pivot in the location it should occupy in the sorted output, with all elements at most the pivot to the left of the pivot and all elements greater than the pivot to its right. Then we recurse on both parts. The pseudocode for Quicksort is as follows.

    QSort(A[lo..hi])
        if hi <= lo then
            return
        else
            pIndex = floor((lo + hi) / 2)    (this could have been any location)
            loc = Partition(A, lo, hi, pIndex)
            QSort(A[lo..loc-1])
            QSort(A[loc+1..hi])

One possible implementation of the function Partition is as follows.

    Partition(A, lo, hi, pIndex)
        pivot = A[pIndex]
        swap(A, pIndex, hi)
        left = lo
        right = hi - 1
        while left <= right do
            if A[left] <= pivot then
                left = left + 1
            else
                swap(A, left, right)
                right = right - 1
        swap(A, left, hi)
        return left

Ideally, we would like the recurrence describing Quicksort to look like

    T(n) = 1                 if n = 1
    T(n) = 2T(n/2) + cn      if n >= 2

However, certain inputs can cause sub-optimal running time. For example, an instance of the quicksort algorithm in which the pivot is always the first element of the input array performs poorly on an array whose elements are in descending order. The worst-case running time of the algorithm is given by

    T(n) = 1                 if n = 1
    T(n) = T(n-1) + cn       if n >= 2

Hence the worst-case running time of QSort is Θ(n²).

Recurrences

As you have seen in class, recurrences are equations that help us describe the running time of a recursive algorithm. You have thus far seen two different ways of solving recurrences:

- Iteration. In this method, we expand T(n) fully by substitution and solve for T(n) directly.
- Recursion trees. In this method, we draw the recursive calls to T(n) in a tree format and count the amount of work done at each level of the tree.

In this lab, we will focus on the method of iteration. Let's first go through some examples before running through problems.
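The QSort and Partition pseudocode above can be translated into Java fairly directly. The sketch below is one possible rendering (the class and method names are our own, not part of the lab handout); it uses the same two-pointer partition scheme, with the pivot chosen at floor((lo + hi) / 2).

```java
import java.util.Arrays;

// A direct Java rendering of the QSort/Partition pseudocode above.
// Class and helper names are illustrative, not prescribed by the lab.
public class QuickSortSketch {

    public static void qSort(int[] a, int lo, int hi) {
        if (hi <= lo) return;
        int pIndex = (lo + hi) / 2;            // could be any index in [lo, hi]
        int loc = partition(a, lo, hi, pIndex);
        qSort(a, lo, loc - 1);
        qSort(a, loc + 1, hi);
    }

    // Places the pivot in its final sorted position and returns that position:
    // elements <= pivot end up to its left, elements > pivot to its right.
    static int partition(int[] a, int lo, int hi, int pIndex) {
        int pivot = a[pIndex];
        swap(a, pIndex, hi);                   // park the pivot at the end
        int left = lo, right = hi - 1;
        while (left <= right) {
            if (a[left] <= pivot) {
                left++;
            } else {
                swap(a, left, right);
                right--;
            }
        }
        swap(a, left, hi);                     // move the pivot into place
        return left;
    }

    static void swap(int[] a, int i, int j) {
        int t = a[i]; a[i] = a[j]; a[j] = t;
    }

    public static void main(String[] args) {
        int[] a = {5, 2, 9, 1, 5, 6};
        qSort(a, 0, a.length - 1);
        System.out.println(Arrays.toString(a));  // [1, 2, 5, 5, 6, 9]
    }
}
```

Note that, just as in the pseudocode, choosing the pivot by index alone does not protect against the worst case described above; adversarial inputs can still force the Θ(n²) behavior.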

Example: Method of Iteration

Let's examine the following recurrence:

    T(n) = T(n-1) + n    if n >= 1
    T(n) = 0             otherwise

Using the method of iteration, we expand T(n) as follows:

    T(n) = T(n-1) + n
         = [T(n-2) + (n-1)] + n
         = [[T(n-3) + (n-2)] + (n-1)] + n
         ...
         = sum_{i=1}^{n} i
         = n(n+1)/2
         = Θ(n²)

Example: Method of Recursion Trees

Let's examine the recurrence of merge sort. For those unfamiliar with it, the algorithm works by taking an unsorted array, sorting the left and right halves of the array recursively, and then merging the two sorted halves together to end up with the final sorted list. Let T(n) represent the time the algorithm takes on an input of size n. Since the two halves are sorted recursively by the same algorithm, but with inputs that are each half the size of the original, each half takes time T(n/2). The merging takes linear time. So we can write

    T(n) = 2T(n/2) + cn

for some constant c. The levels of this recursion tree are analyzed below.

Note that at the very top level (i.e., the end of the algorithm), it costs cn to merge the two sorted halves of the array. But to get there, we needed to solve the two subproblems of size n/2, each costing cn/2 to solve. Therefore, across that level, the total cost is cn/2 + cn/2 = cn. We can continue this all the way down to the very bottom of the tree, whose leaves are single elements, and find that every level ends up costing cn. The height of the tree is lg n, so the total cost is cn lg n. To see why the height is lg n, observe that subproblem sizes decrease by a factor of 2 each time we go down one level, and we stop when we reach singleton elements. The subproblem size for a node at depth i is n/2^i. Thus, the subproblem size hits 1 when n/2^i = 1 or, equivalently, when i = lg n. You can read more examples in CLRS 4.4.

Problems

You may assume in the following problems that n is a power of 2 or of 3, as appropriate. You may also find the following identities helpful, where C(j, m) denotes the binomial coefficient "j choose m":

    Geometric series:  sum_{i=0}^{k} q^i = (q^{k+1} - 1)/(q - 1) = (1 - q^{k+1})/(1 - q)

    Sum of binomial coefficients over the upper index:  sum_{j=0}^{n} C(j, m) = C(n+1, m+1)

Problem 1. Solve the following recurrences.

1. T(n) = T(n-1) + C(n, 2) for n >= 1, and T(n) = 0 otherwise.
2. T(n) = 2T(n/2) + n for n > 1, and T(n) = 1 otherwise.
3. T(n) = 2T(n/3) + n for n > 1, and T(n) = 1 otherwise.
4. T(n) = 2T(n-1) + n for n > 1, and T(n) = 1 otherwise.

Problem 2

The CIS 160 and CIS 121 TAs are having another social gathering. For some reason or another, the 160 TAs particularly enjoy playing the Tower of Hanoi and insist that the 121 TAs compete with them to see who can solve the game the fastest. Of course, the clever 121 TAs are very proficient with their running time analysis skills, and know that an optimal strategy is detailed by the following code:

    HanoiTower(h, from, to, mid)
        if h >= 1 then
            HanoiTower(h-1, from, mid, to)
            MoveDisk(from, to)            (an O(1)-time operation)
            HanoiTower(h-1, mid, to, from)

Problem. What is the running time of the above algorithm? Prove it inductively.
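To sanity-check your analysis empirically, the procedure is easy to instrument. The sketch below is a Java translation with illustrative names of our choosing; instead of moving physical disks, moveDisk simply counts the O(1)-time operations, so you can compare the counts for small h against the bound you prove.

```java
// Instrumented Java sketch of the HanoiTower procedure above.
// moveDisk() counts the O(1)-time operations rather than moving disks,
// so the totals can be compared against a running time analysis.
public class HanoiCount {
    static long moves = 0;

    static void moveDisk(int from, int to) {
        moves++;                          // the O(1)-time operation
    }

    static void hanoiTower(int h, int from, int to, int mid) {
        if (h >= 1) {
            hanoiTower(h - 1, from, mid, to);
            moveDisk(from, to);
            hanoiTower(h - 1, mid, to, from);
        }
    }

    // Returns the number of disk moves performed for a tower of height h.
    static long countMoves(int h) {
        moves = 0;
        hanoiTower(h, 1, 3, 2);           // pegs labeled 1, 2, 3
        return moves;
    }

    public static void main(String[] args) {
        for (int h = 1; h <= 10; h++)
            System.out.println("h = " + h + ": " + countMoves(h) + " moves");
    }
}
```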