Question 7.11 Show how heapsort processes the input:


Question 7.11 Show how heapsort processes the input: 142, 543, 123, 65, 453, 879, 572, 434, 111, 242, 811, 102.

Solution.

Step 1. Build the heap.

1.1 Place all the data into a complete binary tree in the order given. In level order (the array representation) the tree is:

142, 543, 123, 65, 453, 879, 572, 434, 111, 242, 811, 102

Fig. 1. After Step 1.1.

1.2 Now turn it into a max-heap (i.e. the root of every subtree is the largest element of that subtree). In level order you should get:

879, 811, 572, 434, 543, 123, 142, 65, 111, 242, 453, 102

Fig. 2. After Step 1.2.
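The build step can be checked mechanically. Below is a minimal Python sketch (my own illustration; the function names percolate_down and build_max_heap are not from the text): it applies the usual bottom-up construction to the input and prints the level-order contents, which should match Fig. 2.

    # Minimal sketch (0-indexed array): bottom-up construction of a max-heap.
    def percolate_down(a, i, n):
        # Sift a[i] down while a child (at 2*i+1 or 2*i+2) is larger.
        while 2 * i + 1 < n:
            child = 2 * i + 1
            if child + 1 < n and a[child + 1] > a[child]:
                child += 1                      # take the larger child
            if a[child] <= a[i]:
                break
            a[i], a[child] = a[child], a[i]
            i = child

    def build_max_heap(a):
        # Percolate down every non-leaf node, starting from the last one.
        for i in range(len(a) // 2 - 1, -1, -1):
            percolate_down(a, i, len(a))

    data = [142, 543, 123, 65, 453, 879, 572, 434, 111, 242, 811, 102]
    build_max_heap(data)
    print(data)  # expected: [879, 811, 572, 434, 543, 123, 142, 65, 111, 242, 453, 102]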

Step 2. Remove the maximum element from the heap 12 times and store it in an array in reverse order (i.e. place the first element removed in position 12). The heap-order property must be restored after each deletion.

After the first deletion the heap, in level order, should look like this:

811, 543, 572, 434, 453, 123, 142, 65, 111, 242, 102   (879 has been removed and occupies the last array slot)

Fig. 3. After the first deletion.

After the fifth deletion the heap should look like this:

434, 242, 142, 111, 65, 123, 102   (the removed elements 453, 543, 572, 811, 879 fill the last five array slots)

Fig. 4. After the fifth deletion.

Continue this process until no elements remain in the heap. Note. The use of an extra array to store the result can be avoided by placing each deleted element in the slot vacated at the end of the heap and marking it as no longer being in the heap.
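Step 2 can be checked in the same way. The sketch below (again an illustrative Python fragment, not the textbook's code) starts from the heap of Fig. 2 and repeatedly swaps the maximum into the slot vacated at the end of the shrinking heap, exactly the in-place variant described in the note above; after the loop the array is sorted.

    # Minimal sketch: repeated deleteMax, storing each maximum at the end of the array.
    heap = [879, 811, 572, 434, 543, 123, 142, 65, 111, 242, 453, 102]  # heap of Fig. 2
    for end in range(len(heap) - 1, 0, -1):
        heap[0], heap[end] = heap[end], heap[0]   # move the current maximum to the end
        i = 0
        while 2 * i + 1 < end:                    # restore heap order within heap[0:end]
            child = 2 * i + 1
            if child + 1 < end and heap[child + 1] > heap[child]:
                child += 1
            if heap[child] <= heap[i]:
                break
            heap[i], heap[child] = heap[child], heap[i]
            i = child
    print(heap)  # the array is now sorted in increasing order

After the first pass of this loop the live part of the array, heap[0:11], matches Fig. 3.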

Question 7.12 What is the running time of heapsort for presorted input?

Solution. O(N log N). Heapsort does not take advantage of presorted input: building the heap still takes O(N) and each of the N deleteMax operations still takes O(log N), whatever the initial order.

Question 7.19 Sort 3, 1, 4, 1, 5, 9, 2, 6, 5, 3, 5 using median-of-three partitioning and a cut-off of 3.

Solution. Median-of-three refers to the strategy used to choose the pivot element: the pivot is the median of the first, middle, and last elements. The cut-off is the subarray size at or below which insertion sort is used in place of quicksort; the reason is that insertion sort beats quicksort when the number of elements to be sorted is small.

The first partitioning uses the 1st, 6th, and 11th elements as the first, middle, and last:

3 1 4 1 5 9 2 6 5 3 5   (initial set)
3 1 4 1 5 5 2 6 5 3 9   (the first, middle, and last elements are sorted)
3 1 4 1 5 3 2 6 5 5 9   (the median, the pivot 5, is "hidden" in the second-to-last position)
3 1 4 1 5 3 2 5 5 6 9   (the remaining elements are partitioned and the pivot is swapped back into place)

Fig. 5. Following the initial partitioning.

The RHS partition now contains no more than the cut-off of 3 elements, so an insertion sort subroutine is called to sort it. Quicksort is applied recursively to the elements on the LHS of the pivot.

The second partitioning works on the LHS subarray, using the 1st, 4th, and 8th elements as the first, middle, and last:

3 1 4 1 5 3 2 5   (initial set)
1 1 4 3 5 3 2 5   (the first, middle, and last elements are sorted)
1 1 4 2 5 3 3 5   (the median, 3, is "hidden" in the second-to-last position)
1 1 3 2 3 4 5 5   (the remaining elements are partitioned and the pivot is swapped back into place)

Fig. 6. Following the second partitioning.

The number of elements to the RHS of the pivot is 3, so they are sorted with a call to an insertion sort subroutine.

The third partitioning works on the remaining LHS subarray, using the 1st, 2nd, and 4th elements as the first, middle, and last:

1 1 3 2   (initial set)
1 1 3 2   (the first, middle, and last elements are sorted)
1 3 1 2   (the median, 1, is "hidden" in the second-to-last position)
1 1 3 2   (the remaining elements are partitioned and the pivot is swapped back into place)

Fig. 7. Following the third partitioning.

Now both the LHS and the RHS of the pivot contain no more than the cut-off of 3 elements, so each is sorted using an insertion sort subroutine.
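The whole procedure can be written compactly. The following Python sketch is an illustrative implementation of quicksort with median-of-three pivot selection and an insertion-sort cut-off of 3 (names such as median_of_three and the exact cut-off test are my own choices, not necessarily the textbook's code), included only to make the mechanics above concrete.

    CUTOFF = 3

    def insertion_sort(a, left, right):
        # Insertion sort on the inclusive subarray a[left..right].
        for i in range(left + 1, right + 1):
            tmp, j = a[i], i
            while j > left and a[j - 1] > tmp:
                a[j] = a[j - 1]
                j -= 1
            a[j] = tmp

    def median_of_three(a, left, right):
        # Sort a[left], a[center], a[right]; hide the median at a[right-1]; return it.
        center = (left + right) // 2
        if a[center] < a[left]:
            a[left], a[center] = a[center], a[left]
        if a[right] < a[left]:
            a[left], a[right] = a[right], a[left]
        if a[right] < a[center]:
            a[center], a[right] = a[right], a[center]
        a[center], a[right - 1] = a[right - 1], a[center]
        return a[right - 1]

    def quicksort(a, left=0, right=None):
        if right is None:
            right = len(a) - 1
        if right - left + 1 <= CUTOFF:
            insertion_sort(a, left, right)       # small subarray: switch to insertion sort
            return
        pivot = median_of_three(a, left, right)
        i, j = left, right - 1
        while True:                              # partition a[left+1 .. right-2]
            i += 1
            while a[i] < pivot:
                i += 1
            j -= 1
            while a[j] > pivot:
                j -= 1
            if i >= j:
                break
            a[i], a[j] = a[j], a[i]
        a[i], a[right - 1] = a[right - 1], a[i]  # swap the pivot back into place
        quicksort(a, left, i - 1)
        quicksort(a, i + 1, right)

    data = [3, 1, 4, 1, 5, 9, 2, 6, 5, 3, 5]
    quicksort(data)
    print(data)  # expected: [1, 1, 2, 3, 3, 4, 5, 5, 5, 6, 9]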

Question 7.20 Using the quicksort implementation of this chapter, determine the running time of quicksort for: (a) sorted input; (b) reverse-ordered input; (c) random input.

Solution.
(a) O(N log N). This is because the size of the problem is essentially halved at each step, at a cost that is linear in the current size.
(b) O(N log N), for the same reasons as in (a).
(c) This part requires an average-case analysis of quicksort, which is done in the notes. The answer is O(N log N).

Question 7.32 Suppose you are given a sorted list of n elements followed by f(n) randomly ordered elements. How would you sort the entire list if
1. f(n) = O(1)
2. f(n) = O(log n)
3. f(n) = O(√n)
4. How large can f(n) be for the entire list still to be sortable in O(n) time?

Solution. Looking at each case:

1. If there are just a handful of unsorted elements at the end (a constant number, say k), then we could simply run k iterations of the inner loop of insertion sort, for a running time of O(kn) = O(n). Alternatively, we could sort those k elements separately and then perform a merge: a constant number of elements can be sorted in a constant amount of time, and the merge takes n + k writes into a new array plus another n + k to copy back to the original, which is again O(n).

2. If there are O(log n) extras then the insertion-sort approach requires O(n log n) time. If instead we sort them independently and then merge the two pieces, we spend O(log n log log n) sorting the extra elements and then, as in the previous part, n + log n steps to merge into a new array and n + log n to copy back, which is O(n) in all.

3. If there are O(√n) extras then we can sort them in O(√n log √n) = O(√n log n), since log √n = (1/2) log n = O(log n). Note that this sorting step is still o(n), i.e. it grows more slowly than n. Merging can be done in two passes of n + √n steps each (merge into a new array, then copy back), which is still O(n).

4. f(n) can be as large as O(n / log n). Sorting the extras then takes
O( (n/log n) log(n/log n) ) = O( (n/log n)(log n - log log n) ) = O( n - n log log n / log n ) = O(n),
and the merge and copy-back again take O(n), so the entire list can still be sorted in O(n) time.
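As a concrete illustration of the sort-the-extras-then-merge strategy analysed in cases 1 to 4, here is a small Python sketch (my own illustration; the function name sort_with_sorted_prefix and the parameter split are hypothetical, not from the text). The first split elements are assumed already sorted; the tail is sorted separately and the two runs are merged in linear time.

    # Minimal sketch: exploit a long sorted prefix by sorting only the short tail,
    # then merging the two sorted runs and copying the result back.
    def sort_with_sorted_prefix(a, split):
        head = a[:split]              # already sorted
        tail = sorted(a[split:])      # the f(n) extras; any O(f(n) log f(n)) sort will do
        out, i, j = [], 0, 0
        while i < len(head) and j < len(tail):
            if head[i] <= tail[j]:
                out.append(head[i])
                i += 1
            else:
                out.append(tail[j])
                j += 1
        out.extend(head[i:])
        out.extend(tail[j:])
        a[:] = out                    # copy back to the original array, as in the analysis

    xs = [1, 3, 5, 7, 9, 11, 4, 2, 10]   # sorted prefix of length 6, then 3 extras
    sort_with_sorted_prefix(xs, 6)
    print(xs)  # [1, 2, 3, 4, 5, 7, 9, 10, 11]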

Question 7.32 Prove that any algorithm that finds an element x in a sorted list of n elements requires Ω(log n) comparisons.

Solution. This can be argued with the "20 questions" style of argument that we used for the lower bound on comparison-based sorting. Since some input could require us to report any of the n different array positions, there must be a separate outcome (leaf node) corresponding to each of these n possibilities. For a binary tree (only 2-way comparisons are allowed) to have n leaves, its depth must be at least ⌈log n⌉ = Ω(log n), so Ω(log n) comparisons are required in the worst case.

Question 7.34 Using Stirling's formula, N! ≈ (N/e)^N √(2πN), give a precise estimate of log(N!).

Solution.

log(N!) ≈ log( (N/e)^N √(2πN) )
        = log( (N/e)^N ) + log( √(2πN) )
        = N log(N/e) + (1/2) log(2πN)
        = N log N - N log e + (1/2) log(2πN)
        ≈ N log N - N log e.
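The estimate can be checked numerically. A short Python snippet (illustrative only; it uses the standard-library function math.lgamma, which returns the natural log of the gamma function, so lgamma(N+1) = ln(N!)):

    import math

    # Compare log2(N!) with the estimate N*log2(N) - N*log2(e) + (1/2)*log2(2*pi*N).
    for N in (10, 100, 1000):
        exact = math.lgamma(N + 1) / math.log(2)   # log2(N!) computed from ln(N!)
        estimate = (N * math.log2(N) - N * math.log2(math.e)
                    + 0.5 * math.log2(2 * math.pi * N))
        print(N, round(exact, 3), round(estimate, 3))

The last term, (1/2) log(2πN), is the low-order part that the solution discards in its final approximation.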

Question 2. Row (column) sorting an array of numbers means taking each row (column) and sorting the numbers in that row (column). Given an m x n array of numbers (m rows of n numbers), if the array is row-sorted and then column-sorted, is the array still row-sorted?

Solution. Yes. The proof is by contradiction. Suppose that after the column sort there is a pair a_{i,j}, a_{i,j+1} with a_{i,j} > a_{i,j+1} (such a pair exists if and only if the matrix is not row-sorted). Because the matrix is now column-sorted, every element of the set {a_{i,j}, a_{i+1,j}, ..., a_{m,j}} is greater than every element of the set {a_{1,j+1}, a_{2,j+1}, ..., a_{i,j+1}}.

Now, |{a_{i,j}, a_{i+1,j}, ..., a_{m,j}}| = m - i + 1, so at most i - 1 elements of column j are not greater than every element of {a_{1,j+1}, a_{2,j+1}, ..., a_{i,j+1}}. But |{a_{1,j+1}, a_{2,j+1}, ..., a_{i,j+1}}| = i. Therefore, no matter how the elements of column j are permuted among the rows, at least one element of {a_{1,j+1}, a_{2,j+1}, ..., a_{i,j+1}} must share a row with an element of column j that is greater than it. Column sorting only permutes elements within their columns, so this must already have been the case before the column sort, i.e. the matrix could not have been row-sorted beforehand. This contradiction completes the proof.
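The claim is also easy to sanity-check empirically. A short Python sketch (illustrative only, not part of the solution): generate a random matrix, row-sort it, column-sort it, and verify that every row is still sorted.

    import random

    # Empirical check: row-sorting and then column-sorting leaves every row sorted.
    m, n = 6, 8
    a = [[random.randint(0, 99) for _ in range(n)] for _ in range(m)]

    for row in a:                        # row sort
        row.sort()
    for j in range(n):                   # column sort
        col = sorted(a[i][j] for i in range(m))
        for i in range(m):
            a[i][j] = col[i]

    assert all(row == sorted(row) for row in a)
    print("rows remain sorted after column sorting")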