Module 3: Sorting and Randomized Algorithms


CS 240 - Data Structures and Data Management
Reza Dorrigiv, Daniel Roche
School of Computer Science, University of Waterloo
Winter 2010

Selection vs. Sorting

We have already seen some algorithms for the selection problem: given an array A of n numbers, find the kth smallest number. (Note: we always count from zero, so 0 ≤ k < n.)

The best heap-based algorithm had time cost Θ(n + k log n). For median selection, k = ⌊n/2⌋, giving cost Θ(n log n). This is the same cost as our best sorting algorithms.

Question: Can we do selection in linear time? The quick-select algorithm answers this question in the affirmative.

Observation: Finding the element at a given position is tough, but finding the position of a given element is simple.

Crucial Subroutines

quick-select and the related algorithm quick-sort rely on two subroutines:

choose-pivot(A): choose an index i such that A[i] will make a good pivot (hopefully near the middle of the order).

partition(A, p): using the pivot A[p], rearrange A so that all items less than or equal to the pivot come first, followed by the pivot, followed by all items greater than the pivot.

Selecting a pivot

Ideally, we would select a median as the pivot. But this is the problem we're trying to solve!

First idea: always select the first element in the array.

choose-pivot1(A)
1. return 0

We will consider more sophisticated ideas later on.

Partition

partition(A, p)
A: array of size n, p: integer such that 0 ≤ p < n
1. swap(A[0], A[p])
2. i ← 1, j ← n − 1
3. loop
4.     while i < n and A[i] ≤ A[0] do i ← i + 1
5.     while j ≥ 1 and A[j] > A[0] do j ← j − 1
6.     if i ≥ j then exit the loop
7.     swap(A[i], A[j])
8. swap(A[0], A[j])
9. return j

Idea: keep swapping the outermost wrongly-positioned pairs.
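
For concreteness, here is a minimal Python sketch of partition (the function name and list-based style are mine, not from the slides). It follows the same scan-scan-swap structure as the pseudocode above.

def partition(A, p):
    """Rearrange A around the pivot A[p]; return the pivot's final index."""
    n = len(A)
    A[0], A[p] = A[p], A[0]              # move the pivot to the front
    i, j = 1, n - 1
    while True:
        while i < n and A[i] <= A[0]:    # scan right over items <= pivot
            i += 1
        while j >= 1 and A[j] > A[0]:    # scan left over items > pivot
            j -= 1
        if i >= j:                       # scans have crossed: done
            break
        A[i], A[j] = A[j], A[i]          # swap the outermost wrong pair
    A[0], A[j] = A[j], A[0]              # put the pivot in its final place
    return j

For example, partition([3, 1, 4, 1, 5], 2) rearranges the list to [1, 1, 3, 4, 5] and returns 3, the final index of the pivot 4.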

QuickSelect

quick-select1(A, k)
A: array of size n, k: integer such that 0 ≤ k < n
1. p ← choose-pivot1(A)
2. i ← partition(A, p)
3. if i = k then
4.     return A[i]
5. else if i > k then
6.     return quick-select1(A[0, 1, ..., i − 1], k)
7. else if i < k then
8.     return quick-select1(A[i + 1, i + 2, ..., n − 1], k − i − 1)
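
A Python sketch of quick-select1, reusing the partition function from the previous sketch. Slicing copies the subarrays for brevity; the slide's index-range version avoids that extra space.

def quick_select1(A, k):
    """Return the rank-k item of A (0-based), pivoting on the first element."""
    p = 0                                # choose-pivot1: always index 0
    i = partition(A, p)
    if i == k:
        return A[i]
    elif i > k:
        return quick_select1(A[:i], k)   # the kth item lies left of the pivot
    else:
        return quick_select1(A[i + 1:], k - i - 1)

For example, quick_select1([30, 10, 40, 20], 2) returns 30, the item of rank 2.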

Analysis of quick-select1

Worst-case analysis: the recursive call could always have size n − 1. The recurrence is

T(n) = T(n − 1) + cn for n ≥ 2, with T(1) = d.

Solution: T(n) = cn + c(n − 1) + c(n − 2) + ... + c·2 + d ∈ Θ(n²).

Best-case analysis: the first chosen pivot could be the kth element. Then there are no recursive calls, and the total cost is Θ(n).

Average-case analysis of quick-select1

Assume all n! permutations are equally likely. The average cost is the sum of the costs over all permutations, divided by n!.

Define T(n, k) as the average cost of selecting the kth item from a size-n array:

T(n, k) = cn + (1/n) [ Σ_{i=0}^{k−1} T(n − i − 1, k − i) + Σ_{i=k+1}^{n−1} T(i, k) ]

We could analyze this recurrence directly, or be a little lazier and still get the same asymptotic result. For simplicity, define T(n) = max_{0 ≤ k < n} T(n, k).

Average-case analysis of quick-select1

The cost is determined by i, the position of the pivot A[0]. For more than half of the n! permutations, n/4 ≤ i < 3n/4. In this case, the recursive call has length at most ⌈3n/4⌉, for any k. The average cost then satisfies

T(n) ≤ cn + (1/2) ( T(n) + T(⌈3n/4⌉) ) for n ≥ 2, with T(1) = d.

Rearranging gives:

T(n) ≤ 2cn + T(⌈3n/4⌉) ≤ 2cn + 2c(3n/4) + 2c(9n/16) + ... ≤ d + 2cn Σ_{i=0}^{∞} (3/4)^i ∈ O(n)

Since T(n) must be Ω(n) (why?), T(n) ∈ Θ(n).

Randomized algorithms

A randomized algorithm is one which relies on some random numbers in addition to the input. The cost will depend on the input and the random numbers used.

Generating random numbers: computers can't generate randomness. Instead, some external source is used (e.g. clock, mouse, gamma rays, ...). This is expensive, so we use a pseudo-random number generator (PRNG), a deterministic program that uses a true-random initial value or seed. This is much faster and often indistinguishable from truly random.

Goal: to shift the probability distribution from what we can't control (the input) to what we can control (the random numbers). There should be no more bad instances, just unlucky numbers.

Expected running time

Define T(I, R) as the running time of a randomized algorithm A on a particular input I with the sequence of random numbers R.

The expected running time T_A^(exp)(I) of algorithm A on input I is the expected value of T(I, R):

T_A^(exp)(I) = E[T(I, R)] = Σ_R T(I, R) · Pr[R]

The worst-case expected running time is then

T_A^(exp)(n) = max_{size(I)=n} T_A^(exp)(I).

For many randomized algorithms, the worst-, best-, and average-case expected times are the same (why?).

Randomized QuickSelect

random(n) returns an integer chosen uniformly from {0, 1, 2, ..., n − 1}.

First idea: randomly permute the input first using shuffle:

shuffle(A)
A: array of size n
1. for i ← 0 to n − 2 do
2.     swap(A[i], A[i + random(n − i)])

The expected cost becomes the same as the average cost, which is Θ(n).
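
A Python rendering of shuffle (the Fisher-Yates method), mapping the slide's random(n) to random.randrange(n). In practice Python's random.shuffle does the same job; this is only to mirror the pseudocode.

import random

def shuffle(A):
    """Fisher-Yates shuffle: every permutation of A is equally likely."""
    n = len(A)
    for i in range(n - 1):
        j = i + random.randrange(n - i)  # uniform over {i, ..., n - 1}
        A[i], A[j] = A[j], A[i]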

Randomized QuickSelect

Second idea: change the pivot selection.

choose-pivot2(A)
1. return random(n)

quick-select2(A, k)
1. p ← choose-pivot2(A)
2. ...

With probability at least 1/2, the random pivot has position n/4 ≤ i < 3n/4, so the analysis is just like that for the average case. The expected cost is again Θ(n).
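
A sketch of quick-select2 in Python, again reusing the partition function from the earlier sketch; it differs from quick_select1 only in the pivot choice.

import random

def quick_select2(A, k):
    """quick-select with a uniformly random pivot; expected Θ(n) time."""
    p = random.randrange(len(A))         # choose-pivot2
    i = partition(A, p)
    if i == k:
        return A[i]
    elif i > k:
        return quick_select2(A[:i], k)
    else:
        return quick_select2(A[i + 1:], k - i - 1)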

Worst-case linear time

Blum, Floyd, Pratt, Rivest, and Tarjan invented the medians-of-five algorithm in 1973 for pivot selection:

choose-pivot3(A)
A: array of size n
1. m ← ⌈n/5⌉ − 1
2. for i ← 0 to m do
3.     j ← index of the median of A[5i, ..., 5i + 4]
4.     swap(A[i], A[j])
5. return quick-select3(A[0, ..., m], ⌊m/2⌋)

quick-select3(A, k)
1. p ← choose-pivot3(A)
2. ...

This mutually recursive algorithm can be shown to be Θ(n) in the worst case, but the proof is a little beyond the scope of this course.
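
A compact Python sketch of the mutual recursion, under simplifying assumptions of my own: the group medians are collected into a fresh list rather than swapped to the front, and arrays of at most five elements are sorted directly. The partition function is the one from the earlier sketch.

def quick_select3(A, k):
    """Worst-case linear selection (sketch)."""
    if len(A) <= 5:                      # small arrays: sort directly
        return sorted(A)[k]
    p = choose_pivot3(A)
    i = partition(A, p)
    if i == k:
        return A[i]
    elif i > k:
        return quick_select3(A[:i], k)
    else:
        return quick_select3(A[i + 1:], k - i - 1)

def choose_pivot3(A):
    """Medians-of-five: index of a pivot guaranteed to be near the middle."""
    groups = [sorted(A[i:i + 5]) for i in range(0, len(A), 5)]
    medians = [g[len(g) // 2] for g in groups]
    mom = quick_select3(medians, len(medians) // 2)  # median of the medians
    return A.index(mom)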

QuickSort

QuickSelect is based on a sorting method developed by Hoare in 1960:

quick-sort1(A)
A: array of size n
1. if n ≤ 1 then return
2. p ← choose-pivot1(A)
3. i ← partition(A, p)
4. quick-sort1(A[0, 1, ..., i − 1])
5. quick-sort1(A[i + 1, ..., n − 1])

Worst case: T^(worst)(n) = T^(worst)(n − 1) + Θ(n). This is the same recurrence as for quick-select1, so T^(worst)(n) ∈ Θ(n²).

Best case: T^(best)(n) = T^(best)(⌊(n − 1)/2⌋) + T^(best)(⌈(n − 1)/2⌉) + Θ(n). This is similar to merge-sort, so T^(best)(n) ∈ Θ(n log n).
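
A Python sketch of quick-sort1 using the earlier partition function. It is written functionally (returning a new list) for brevity; the slide's version sorts in place on index ranges.

def quick_sort1(A):
    """Return a sorted copy of A, always pivoting on the first element."""
    if len(A) <= 1:
        return A
    i = partition(A, 0)                  # choose-pivot1: first element
    return quick_sort1(A[:i]) + [A[i]] + quick_sort1(A[i + 1:])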

Average-case analysis of quick-sort1

Of all n! permutations, (n − 1)! have the pivot A[0] at a given position i. The average cost over all permutations is given by:

T(n) = (1/n) Σ_{i=0}^{n−1} ( T(i) + T(n − i − 1) ) + Θ(n), for n ≥ 2.

It is possible to solve this recurrence directly. Instead, notice that the cost at each level of the recursion tree is O(n), so let's consider the height (or depth) of the recursion on average.

Average depth of recursion for quick-sort1

Define H(n) as the average recursion depth for size-n inputs. Then

H(n) = 1 + (1/n) Σ_{i=0}^{n−1} max( H(i), H(n − i − 1) ) for n ≥ 2, with H(n) = 0 for n ≤ 1.

Let i be the position of the pivot A[0]. Again, n/4 ≤ i < 3n/4 for more than half of all permutations, and then the larger recursive call has length at most ⌈3n/4⌉. This bound determines the recursion depth at least half the time. Therefore

H(n) ≤ 1 + (1/4) ( 3H(n) + H(⌈3n/4⌉) ) for n ≥ 2,

which simplifies to H(n) ≤ 4 + H(⌈3n/4⌉). So H(n) ∈ O(log n).

The average cost is O(n · H(n)) ⊆ O(n log n). Since the best case is Θ(n log n), the average must be Θ(n log n).

More notes on QuickSort

We can randomize by using choose-pivot2, giving Θ(n log n) expected time for quick-sort2.

We can use choose-pivot3 (along with quick-select3) to get quick-sort3 with Θ(n log n) worst-case time.

We can use tail recursion to save space on one of the recursive calls. By making sure the other call is always the smaller one, the auxiliary space is Θ(log n) in the worst case, even for quick-sort1.

QuickSort is often the most efficient sorting algorithm in practice.

Lower bounds for sorting

We have seen many sorting algorithms:

Sort            Running time   Analysis
Selection Sort  Θ(n²)          worst-case
Insertion Sort  Θ(n²)          worst-case
Merge Sort      Θ(n log n)     worst-case
Heap Sort       Θ(n log n)     worst-case
quick-sort1     Θ(n log n)     average-case
quick-sort2     Θ(n log n)     expected
quick-sort3     Θ(n log n)     worst-case

Question: Can one do better than Θ(n log n)?

Answer: Yes and no! It depends on what we allow. No: comparison-based sorting has an Ω(n log n) lower bound. Yes: non-comparison-based sorting can achieve O(n).

The Comparison Model

In the comparison model, data can only be accessed in two ways: comparing two elements, and moving elements around (e.g. copying, swapping). This makes very few assumptions about the kind of things we are sorting. We count the number of the above operations. All sorting algorithms seen so far are in the comparison model.

Lower bound for sorting in the comparison model

Theorem. Any correct comparison-based sorting algorithm requires at least Ω(n log n) comparison operations.

Proof. A correct algorithm takes different actions (moves, swaps, etc.) for each of the n! possible permutations. The choice of actions is determined only by comparisons.

The algorithm can be viewed as a decision tree: each internal node is a comparison, and each leaf is a set of actions. Every permutation must correspond to some leaf.

The worst-case number of comparisons is the length of the longest path to a leaf. Since the tree has at least n! leaves, its height is at least lg n!. Therefore the worst-case number of comparisons is Ω(n log n).
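
The final step uses the standard bound lg n! ∈ Ω(n log n), which the slide leaves implicit. One way to see it (a derivation added here, not on the slide): keep only the largest half of the terms in the sum.

\lg n! \;=\; \sum_{i=1}^{n} \lg i \;\ge\; \sum_{i=\lceil n/2 \rceil}^{n} \lg i \;\ge\; \frac{n}{2} \lg \frac{n}{2} \;\in\; \Omega(n \log n)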

Counting Sort

Requirement: each A[i] satisfies 0 ≤ A[i] < k, where k is given.

counting-sort(A, k)
A: array of size n, k: positive integer
1. C ← array of size k, filled with zeros
2. for i ← 0 to n − 1 do
3.     increment C[A[i]]
4. for i ← 1 to k − 1 do
5.     C[i] ← C[i] + C[i − 1]
6. B ← copy(A)
7. for i ← n − 1 down to 0 do
8.     decrement C[B[i]]
9.     A[C[B[i]]] ← B[i]

Time cost: Θ(n + k), which is Θ(n) if k ∈ O(n).
Auxiliary space: Θ(n + k); can be made Θ(k).
counting-sort is stable: equal items stay in their original order.
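
A direct Python translation of the pseudocode (list-based style mine):

def counting_sort(A, k):
    """Stably sort A in place, assuming 0 <= A[i] < k for every i."""
    C = [0] * k
    for x in A:                          # count the occurrences of each key
        C[x] += 1
    for i in range(1, k):                # prefix sums: C[x] = # of items <= x
        C[i] += C[i - 1]
    B = A[:]                             # B <- copy(A)
    for x in reversed(B):                # right-to-left scan keeps the sort stable
        C[x] -= 1
        A[C[x]] = x

For example, counting_sort([2, 0, 2, 1], 3) rearranges the list to [0, 1, 2, 2].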

Radix Sort

Requirement: each A[i] is a string of d digits x_{d−1} x_{d−2} ... x_0, where each digit satisfies 0 ≤ x_i < k. Example: integers between 0 and k^d − 1.

radix-sort(A, d, k)
A: array of size n, d: positive integer, k: positive integer
1. for i ← 0 to d − 1 do
2.     call counting-sort(A, k) with each x_i as the key

Time cost: Θ(d(n + k))
Auxiliary space: Θ(n + k)
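
A self-contained Python sketch for the integer case, with a digit-keyed variant of counting sort (my own helper, since the slide's counting-sort is reused with each digit as the key):

def counting_sort_by_digit(A, k, i):
    """Stable counting sort of A keyed on base-k digit i (0 = least significant)."""
    digit = lambda x: (x // k ** i) % k
    C = [0] * k
    for x in A:
        C[digit(x)] += 1
    for d in range(1, k):
        C[d] += C[d - 1]
    out = [0] * len(A)
    for x in reversed(A):                # right-to-left preserves stability
        C[digit(x)] -= 1
        out[C[digit(x)]] = x
    return out

def radix_sort(A, d, k):
    """LSD radix sort: sorts d-digit base-k integers, i.e. 0 <= A[i] < k**d."""
    for i in range(d):                   # least significant digit first
        A[:] = counting_sort_by_digit(A, k, i)

Sorting least-significant digit first works precisely because each pass is stable: ties on the current digit preserve the order established by all previous passes.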

Summary of sorting

Randomized algorithms can eliminate bad cases.
Best-case, worst-case, average-case, and expected-case can all differ.
Sorting is an important and very well-studied problem. It can be done in Θ(n log n) time; faster is not possible for general input.
HeapSort is the only fast algorithm we have seen with O(1) auxiliary space.
QuickSort is often the fastest in practice.
MergeSort is also Θ(n log n); selection and insertion sorts are Θ(n²).
CountingSort and RadixSort can achieve o(n log n) if the input is special.