CSE373: Data Structure & Algorithms Lecture 18: Comparison Sorting. Dan Grossman Fall 2013


Introduction to Sorting

Stacks, queues, priority queues, and dictionaries all focus on providing one element at a time, but often we want all the items in some order. Humans can sort, but computers can sort fast. It is very common to need data sorted somehow:
- An alphabetical list of people
- A list of countries ordered by population
- Search-engine results ordered by relevance

Algorithms have different asymptotic and constant-factor trade-offs. There is no single best sort for all scenarios, so knowing one way to sort just isn't enough.

More Reasons to Sort

A general technique in computing: preprocess data to make subsequent operations faster. Example: sort the data so that you can
- Find the k-th largest element in constant time for any k
- Perform binary search to find elements in logarithmic time

Whether the performance of the preprocessing matters depends on how often the data will change (and how much it will change), and on how much data there is.
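For instance, a hypothetical snippet using java.util.Arrays (class name and data values are made up):

    import java.util.Arrays;

    public class PreprocessDemo {
        public static void main(String[] args) {
            int[] data = {31, 5, 72, 14, 5, 90};
            Arrays.sort(data);                        // preprocess once: O(n log n)
            int k = 2;                                // k-th largest in O(1) for any k (1-based)
            int kthLargest = data[data.length - k];   // 72
            int idx = Arrays.binarySearch(data, 14);  // O(log n); non-negative iff found
            System.out.println(kthLargest + " " + (idx >= 0));
        }
    }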

The Main Problem, Stated Carefully

For now, assume we have n comparable elements in an array and we want to rearrange them into increasing order.

Input:
- An array A of data records
- A key value in each data record
- A comparison function (consistent and total)

Effect: reorganize the elements of A such that for any i and j, if i < j then A[i] <= A[j]. (Also, A must end up with exactly the same data it started with.) We could also sort in reverse order, of course. An algorithm that does this using only the comparison function to order elements is a comparison sort.
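A small illustration of "a key value in each data record" plus a comparison function, using a hypothetical Record class (names and fields are made up):

    import java.util.Arrays;
    import java.util.Comparator;

    class Record {
        final String name;   // rest of the data record
        final int key;       // the key value used for ordering
        Record(String name, int key) { this.name = name; this.key = key; }
    }

    public class KeyedSortDemo {
        public static void main(String[] args) {
            Record[] a = {new Record("b", 3), new Record("a", 1), new Record("c", 2)};
            // The comparison function: consistent (same answer every time)
            // and total (any two keys are comparable).
            Arrays.sort(a, Comparator.comparingInt((Record r) -> r.key));
            for (Record r : a) System.out.println(r.name + " " + r.key);
        }
    }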

Variations on the Basic Problem

1. Maybe the elements are in a linked list (we could convert to an array and back in linear time, but some algorithms needn't do so)
2. Maybe ties need to be resolved by original array position. Sorts that do this naturally are called stable sorts; others could tag each item with its original position and adjust comparisons accordingly (non-trivial constant factors)
3. Maybe we must not use more than O(1) auxiliary space. Sorts meeting this requirement are called in-place sorts
4. Maybe we can do more with elements than just compare them; this sometimes leads to faster algorithms
5. Maybe we have too much data to fit in memory: use an external sorting algorithm

Sorting: The Big Picture

There is a surprising amount of neat stuff to say about sorting:
- Simple algorithms, O(n²): insertion sort, selection sort, shell sort
- Fancier algorithms, O(n log n): heap sort, merge sort, quick sort (average case)
- Comparison lower bound: Ω(n log n)
- Specialized algorithms, O(n): bucket sort, radix sort
- Handling huge data sets: external sorting

Insertion Sort

Idea: at step k, put the k-th element in the correct position among the first k elements. An alternate way of saying this:
- Sort the first two elements
- Now insert the 3rd element in order
- Now insert the 4th element in order
- ...

Loop invariant: when the loop index is i, the first i elements are sorted.

Time?
- Best case: O(n) (start sorted)
- Worst case: O(n²) (start reverse-sorted)
- Average case: O(n²) (see text)

Selection Sort

Idea: at step k, find the smallest element among the not-yet-sorted elements and put it at position k. An alternate way of saying this:
- Find the smallest element, put it 1st
- Find the next smallest element, put it 2nd
- Find the next smallest element, put it 3rd
- ...

Loop invariant: when the loop index is i, the first i elements are the i smallest elements, in sorted order.

Time? Best case, worst case, and average case are all O(n²): always T(1) = 1 and T(n) = n + T(n-1).
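A minimal selection sort sketch in Java matching the invariant above (the method name is illustrative); the n + T(n-1) cost shows up as the inner scan over the unsorted suffix:

    static void selectionSort(int[] arr) {
        for (int i = 0; i < arr.length - 1; i++) {
            int min = i;                          // index of smallest element in arr[i..]
            for (int j = i + 1; j < arr.length; j++)
                if (arr[j] < arr[min]) min = j;   // scan costs n-i regardless of input
            int tmp = arr[i]; arr[i] = arr[min]; arr[min] = tmp;  // put it at position i
        }
    }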

Mystery

This is one implementation of which sorting algorithm (for ints)?

    void mystery(int[] arr) {
        for (int i = 1; i < arr.length; i++) {
            int tmp = arr[i];
            int j;
            for (j = i; j > 0 && tmp < arr[j-1]; j--)
                arr[j] = arr[j-1];
            arr[j] = tmp;
        }
    }

Note: as with heaps, "moving the hole" is faster than unnecessary swapping (a constant-factor issue).

Insertion Sort vs. Selection Sort

They are different algorithms that solve the same problem and have the same worst-case and average-case asymptotic complexity. Insertion sort has better best-case complexity, so it is preferable when the input is already mostly sorted. Other algorithms are more efficient for non-small arrays that are not almost sorted, but insertion sort may do well on small arrays.

Aside: We Will Not Cover Bubble Sort

- It is not, in my opinion, what a normal person would think of first
- It doesn't have good asymptotic complexity: O(n²)
- It's not particularly efficient with respect to constant factors
- Basically, almost everything bubble sort is good at, some other algorithm is at least as good at
- Perhaps people teach it just because someone taught it to them?

Fun, short, optional read: "Bubble Sort: An Archaeological Algorithmic Analysis", Owen Astrachan, SIGCSE 2003. http://www.cs.duke.edu/~ola/bubble/bubble.pdf

The Big Picture

A surprising amount of juicy computer science, over 2-3 lectures:
- Simple algorithms, O(n²): insertion sort, selection sort, shell sort
- Fancier algorithms, O(n log n): heap sort, merge sort, quick sort (average case)
- Comparison lower bound: Ω(n log n)
- Specialized algorithms, O(n): bucket sort, radix sort
- Handling huge data sets: external sorting

Heap Sort

Sorting with a heap is easy: insert each arr[i], or better yet use buildheap, then

    for (i = 0; i < arr.length; i++)
        arr[i] = deleteMin();

Worst-case running time: O(n log n). We have both the array to sort and the heap, so this is not an in-place sort; there is a trick to make it in-place (next slide).
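The pseudocode above maps almost directly onto java.util.PriorityQueue, which is a binary min-heap (the method name is illustrative; note this version is not in place):

    import java.util.PriorityQueue;

    static void heapSort(int[] arr) {
        PriorityQueue<Integer> heap = new PriorityQueue<>();
        for (int x : arr) heap.add(x);   // n inserts; a collection constructor does O(n) buildheap
        for (int i = 0; i < arr.length; i++)
            arr[i] = heap.remove();      // deleteMin: O(log n) each, O(n log n) total
    }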

In-Place Heap Sort

Treat the initial array as a heap (via buildheap). When you delete the i-th element, put it at arr[n-i]: that array location isn't needed for the heap anymore!

    4 7 5 9 8 6 10 | 3 2 1
       heap part     sorted part

After arr[n-i] = deleteMin():

    5 7 6 9 8 10 | 4 3 2 1
      heap part    sorted part

But this sorts in reverse order; how would you fix that?
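The usual fix is a max-heap: each deleteMax lands at the end of the shrinking heap region, so the result comes out ascending. A minimal in-place sketch, assuming 0-based heap indexing (this is not the slide's exact code; method names are illustrative):

    static void inPlaceHeapSort(int[] arr) {
        int n = arr.length;
        for (int i = n / 2 - 1; i >= 0; i--)    // buildheap: O(n)
            percolateDown(arr, i, n);
        for (int end = n - 1; end > 0; end--) {
            int max = arr[0];                   // deleteMax
            arr[0] = arr[end];                  // move last heap element into the root hole
            percolateDown(arr, 0, end);         // heap part is now arr[0..end-1]
            arr[end] = max;                     // sorted part grows from the right
        }
    }

    static void percolateDown(int[] arr, int hole, int size) {
        int tmp = arr[hole];
        while (2 * hole + 1 < size) {
            int child = 2 * hole + 1;           // left child (0-based indexing)
            if (child + 1 < size && arr[child + 1] > arr[child])
                child++;                        // pick the larger child
            if (arr[child] <= tmp) break;
            arr[hole] = arr[child];             // move the hole down (no full swap)
            hole = child;
        }
        arr[hole] = tmp;
    }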

AVL Sort

We can also use a balanced tree to sort:
- Insert each element: total time O(n log n)
- Repeatedly deleteMin: total time O(n log n)
- Better: an in-order traversal is O(n), but the total is still O(n log n)

But this cannot be made in-place, and it has worse constant factors than heap sort:
- Both are O(n log n) in the worst, best, and average case
- Neither parallelizes well
- So heap sort is better
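For illustration only, Java's TreeMap (a red-black tree rather than an AVL tree, but with the same O(log n) bounds) makes the idea concrete; counting duplicates stands in for storing repeated keys. A hedged sketch, also not in place:

    import java.util.Map;
    import java.util.TreeMap;

    static void treeSort(int[] arr) {
        TreeMap<Integer, Integer> counts = new TreeMap<>();   // balanced BST
        for (int x : arr) counts.merge(x, 1, Integer::sum);   // n inserts: O(n log n)
        int i = 0;
        for (Map.Entry<Integer, Integer> e : counts.entrySet())  // in-order traversal: O(n)
            for (int c = 0; c < e.getValue(); c++)
                arr[i++] = e.getKey();
    }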

Hash Sort???

Don't even think about trying to sort with a hash table! A hash table has no useful ordering: finding the min item is O(n), so this would just be a slower, more complicated selection sort.

Divide and Conquer

A very important technique in algorithm design:
1. Divide the problem into smaller parts
2. Independently solve the simpler parts (think recursion, or potential parallelism)
3. Combine the solutions of the parts to produce the overall solution

(The name "divide and conquer" is rather clever.)

Divide-and-Conquer Sorting

Two great sorting methods are fundamentally divide-and-conquer:

1. Mergesort:
   - Sort the left half of the elements (recursively)
   - Sort the right half of the elements (recursively)
   - Merge the two sorted halves into a sorted whole

2. Quicksort:
   - Pick a pivot element
   - Divide the elements into less-than-pivot and greater-than-pivot
   - Sort the two divisions (recursively on each)
   - The answer is: sorted-less-than, then the pivot, then sorted-greater-than

Mergesort

Example array: 8 2 9 4 5 3 1 6

To sort the array from position lo to position hi:
- If the range is 1 element long, it is already sorted! (base case)
- Else:
  - Sort from lo to (hi+lo)/2
  - Sort from (hi+lo)/2 to hi
  - Merge the two halves together

Merging takes two sorted parts and sorts everything together: O(n), but it requires auxiliary space. (A sketch of the recursive driver follows; the merge routine itself is sketched after the example below.)
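A minimal recursive driver under these conventions (hi exclusive, one shared auxiliary array; names are illustrative):

    // Sorts arr[lo..hi) using aux as scratch space.
    static void mergeSort(int[] arr, int[] aux, int lo, int hi) {
        if (hi - lo <= 1) return;        // base case: 0 or 1 element is already sorted
        int mid = (lo + hi) / 2;
        mergeSort(arr, aux, lo, mid);    // sort left half
        mergeSort(arr, aux, mid, hi);    // sort right half
        merge(arr, aux, lo, mid, hi);    // O(n) merge using auxiliary space
    }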

Example, Focus on Merging

Start with:       8 2 9 4 5 3 1 6
After recursion:  2 4 8 9 | 1 3 5 6   (not magic: each half was sorted recursively)

Merge using three "fingers" and one more array: one finger in each sorted half, one in the auxiliary array. At each step, copy the smaller of the two fingered elements into the auxiliary array and advance that finger. The auxiliary array fills in order:

    1
    1 2
    1 2 3
    1 2 3 4
    1 2 3 4 5
    1 2 3 4 5 6
    1 2 3 4 5 6 8
    1 2 3 4 5 6 8 9

After the merge, copy back to the original array: 1 2 3 4 5 6 8 9
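The three-finger merge itself, as a sketch matching the walkthrough above (taking from the left half on ties keeps the sort stable):

    static void merge(int[] arr, int[] aux, int lo, int mid, int hi) {
        int i = lo, j = mid;                  // two fingers into the sorted halves
        for (int k = lo; k < hi; k++)         // third finger into the auxiliary array
            if (j >= hi || (i < mid && arr[i] <= arr[j]))
                aux[k] = arr[i++];            // take from the left half
            else
                aux[k] = arr[j++];            // take from the right half
        for (int k = lo; k < hi; k++)         // copy back to the original array
            arr[k] = aux[k];
    }

Usage: for int[] a = {8, 2, 9, 4, 5, 3, 1, 6}, calling mergeSort(a, new int[a.length], 0, a.length) yields 1 2 3 4 5 6 8 9.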

Example, Showing Recursion

    Divide:     8 2 9 4 5 3 1 6
    Divide:     8 2 9 4   |   5 3 1 6
    Divide:     8 2 | 9 4 | 5 3 | 1 6
    1 element:  8 | 2 | 9 | 4 | 5 | 3 | 1 | 6
    Merge:      2 8 | 4 9 | 3 5 | 1 6
    Merge:      2 4 8 9   |   1 3 5 6
    Merge:      1 2 3 4 5 6 8 9

Some Details: Saving a Little Time

What if the final steps of our merge looked like this?

    Main array:       2 4 5 6 | 1 3 8 9
    Auxiliary array:  1 2 3 4 5 6

It is wasteful to copy the remaining elements to the auxiliary array just to copy them back.

Some Details: Saving a Little Time

If the left side finishes first, just stop the merge and copy back the auxiliary array. If the right side finishes first, copy its remaining "dregs" directly into the right end of the main array, then copy back.

Some Details: Saving Space and Copying

- Simplest / worst: use a new auxiliary array of size (hi-lo) for every merge
- Better: use a new auxiliary array of size n for every merging stage
- Better still: reuse the same auxiliary array of size n for every merging stage
- Best (but a little tricky): don't copy back; at the 2nd, 4th, 6th, ... merging stages, use the original array as the auxiliary array and vice-versa. This needs one extra copy at the end if the number of stages is odd

Swapping Original / Auxiliary Array ("Best")

First recurse down to lists of size 1; as we return from the recursion, swap between the arrays:

    Merge by 1
    Merge by 2
    Merge by 4
    Merge by 8
    Merge by 16
    Copy if needed

(Arguably easier to code up without recursion at all; a sketch follows.)
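One way to make this concrete is the non-recursive, bottom-up version: merge runs of width 1, 2, 4, ... and swap the roles of the two arrays each pass. A sketch (assumptions: the three-finger merge is inlined, and a single final copy handles an odd number of stages):

    static void bottomUpMergeSort(int[] arr) {
        int n = arr.length;
        int[] src = arr, dst = new int[n];
        for (int width = 1; width < n; width *= 2) {
            for (int lo = 0; lo < n; lo += 2 * width) {
                int mid = Math.min(lo + width, n), hi = Math.min(lo + 2 * width, n);
                int i = lo, j = mid;          // merge src[lo..mid) and src[mid..hi) into dst
                for (int k = lo; k < hi; k++)
                    dst[k] = (j >= hi || (i < mid && src[i] <= src[j])) ? src[i++] : src[j++];
            }
            int[] t = src; src = dst; dst = t;             // swap original/auxiliary roles
        }
        if (src != arr) System.arraycopy(src, 0, arr, 0, n);  // one copy if stages were odd
    }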

Linked Lists and Big Data

We defined sorting over an array, but sometimes you want to sort a linked list. One approach:
- Convert to an array: O(n)
- Sort: O(n log n)
- Convert back to a list: O(n)

Or: merge sort works very nicely on linked lists directly. Heapsort and quicksort do not; insertion sort and selection sort do, but they are slower. Merge sort is also the sort of choice for external sorting: linear merges minimize disk accesses, and can leverage multiple disks to get streaming accesses.
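In the linked-list case, the merge step needs no auxiliary array at all: just relink nodes. A sketch with a hypothetical minimal Node class:

    static class Node {
        int val; Node next;
        Node(int val, Node next) { this.val = val; this.next = next; }
    }

    // Merge two sorted lists by splicing nodes; no extra array needed.
    static Node mergeLists(Node a, Node b) {
        Node dummy = new Node(0, null), tail = dummy;
        while (a != null && b != null) {
            if (a.val <= b.val) { tail.next = a; a = a.next; }  // <= keeps the merge stable
            else                { tail.next = b; b = b.next; }
            tail = tail.next;
        }
        tail.next = (a != null) ? a : b;   // splice in whatever remains
        return dummy.next;
    }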

Analysis

Having defined an algorithm and argued it is correct, we should analyze its running time and space. To sort n elements, we:
- Return immediately if n = 1
- Else do 2 subproblems of size n/2 and then an O(n) merge

Recurrence relation:

    T(1) = c1
    T(n) = 2 T(n/2) + c2 n

One of the Recurrence Classics

For simplicity, let the constants be 1; this has no effect on the asymptotic answer.

    T(1) = 1
    T(n) = 2 T(n/2) + n
         = 2 (2 T(n/4) + n/2) + n  = 4 T(n/4) + 2n
         = 4 (2 T(n/8) + n/4) + 2n = 8 T(n/8) + 3n
         ...
         = 2^k T(n/2^k) + kn

So the total is 2^k T(n/2^k) + kn, where n/2^k = 1, i.e., k = log n. That is:

    2^(log n) T(1) + n log n = n + n log n = O(n log n)

Or More Intuitively...

This recurrence is so common that you just know it is O(n log n). Merge sort is also relatively easy to intuit (best, worst, and average case): the recursion tree has height log n, and at each level we do a total amount of merging equal to n.

Quicksort

- Also uses divide-and-conquer: recursively chop the array into two pieces
- Instead of doing all the work as we merge together, we do all the work as we recursively split into halves
- Unlike merge sort, it does not need auxiliary space
- O(n log n) on average, but O(n²) in the worst case
- Faster than merge sort in practice? Often believed so. It does fewer copies and more comparisons, so it depends on the relative cost of these two operations!

Quicksort Overview

1. Pick a pivot element
2. Partition all the data into:
   A. the elements less than the pivot
   B. the pivot
   C. the elements greater than the pivot
3. Recursively sort A and C
4. The answer is, "as simple as A, B, C"

(Alas, there are some details lurking in this algorithm.)

Think in Terms of Sets

    S:  13 81 92 43 65 31 57 26 75 0        select pivot value

    Partition:
    S1: 13 0 43 31 65 26 57                 S2: 92 75 81

    Quicksort(S1) and Quicksort(S2):
    S1: 0 13 26 31 43 57 65                 S2: 75 81 92

    S:  0 13 26 31 43 57 65 75 81 92        Presto! S is sorted

[Weiss]

Example, Showing Recursion

    Divide (pivot 5):      8 2 9 4 5 3 1 6  ->  2 4 3 1 | 5 | 8 9 6
    Divide (pivots 3, 8):  2 1 | 3 | 4           6 | 8 | 9
    Divide to 1 element:   1 | 2
    Conquer:               1 2
    Conquer:               1 2 3 4               6 8 9
    Conquer:               1 2 3 4 5 6 8 9

Details

We have not yet explained:
- How to pick the pivot element. Any choice is correct: the data will end up sorted. But as the analysis will show, we want the two partitions to be about equal in size
- How to implement partitioning: in linear time, in place

Pivots

Best pivot? The median: the problem is halved each time.

    8 2 9 4 5 3 1 6  ->  2 4 3 1 | 5 | 8 9 6

Worst pivot? The greatest or least element: the subproblem has size n-1 each time, giving O(n²).

    8 2 9 4 5 3 1 6  ->  1 | 8 2 9 4 5 3 6

Potential Pivot Rules

While sorting arr from lo (inclusive) to hi (exclusive):
- Pick arr[lo] or arr[hi-1]: fast, but the worst case occurs with mostly-sorted input
- Pick a random element in the range: does as well as any technique, but (pseudo)random number generation can be slow; still probably the most elegant approach
- Median of 3, e.g., of arr[lo], arr[hi-1], and arr[(hi+lo)/2]: a common heuristic that tends to work well

Partitioning

Conceptually simple, but the hardest part to code up correctly. After picking the pivot, we need to partition in linear time, in place. One approach (there are slightly fancier ones):

1. Swap the pivot with arr[lo]
2. Use two fingers i and j, starting at lo+1 and hi-1
3. while (i < j)
       if (arr[j] > pivot) j--;
       else if (arr[i] < pivot) i++;
       else swap arr[i] with arr[j];
4. Swap the pivot with arr[i]*

*Skip step 4 if the pivot ends up being the least element.

Example

Step one: pick the pivot as the median of 3, with lo = 0, hi = 10 (so the median of arr[0] = 8, arr[5] = 3, and arr[9] = 6 is 6):

    index: 0 1 2 3 4 5 6 7 8 9
    array: 8 1 4 9 0 3 5 2 7 6

Step two: move the pivot (6) to the lo position:

    index: 0 1 2 3 4 5 6 7 8 9
    array: 6 1 4 9 0 3 5 2 7 8

Example

Now partition in place. (Often there is more than one swap during a partition; this is a short example.)

    Start:         6 1 4 9 0 3 5 2 7 8
    Move fingers:  6 1 4 9 0 3 5 2 7 8    (i stops at the 9, j stops at the 2)
    Swap:          6 1 4 2 0 3 5 9 7 8
    Move fingers:  6 1 4 2 0 3 5 9 7 8    (fingers meet at the 5)
    Move pivot:    5 1 4 2 0 3 6 9 7 8
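A sketch combining the pieces: median-of-3 pivot choice, the two-finger in-place partition, and recursion on the two sides. This is not the slide's verbatim code: the finger loop advances both fingers after each swap so that runs of keys equal to the pivot cannot stall it, and the helper names are illustrative. On the slide's example array it reproduces the states shown above.

    // Sorts arr[lo..hi) in place. Initial call: quicksort(arr, 0, arr.length);
    static void quicksort(int[] arr, int lo, int hi) {
        if (hi - lo <= 1) return;                 // 0 or 1 element: already sorted
        int p = partition(arr, lo, hi);           // pivot's final index
        quicksort(arr, lo, p);                    // elements <= pivot
        quicksort(arr, p + 1, hi);                // elements >= pivot
    }

    static int partition(int[] arr, int lo, int hi) {
        int mid = (lo + hi) / 2;                  // median of 3: order lo, mid, hi-1 ...
        if (arr[mid] < arr[lo])     swap(arr, mid, lo);
        if (arr[hi - 1] < arr[lo])  swap(arr, hi - 1, lo);
        if (arr[hi - 1] < arr[mid]) swap(arr, hi - 1, mid);
        swap(arr, lo, mid);                       // ... then park the median at arr[lo]
        int pivot = arr[lo];
        int i = lo + 1, j = hi - 1;               // two fingers
        while (true) {
            while (i <= j && arr[i] < pivot) i++; // skip elements already on the left
            while (i <= j && arr[j] > pivot) j--; // skip elements already on the right
            if (i >= j) break;
            swap(arr, i++, j--);                  // out-of-place pair: swap, advance both
        }
        swap(arr, lo, j);                         // move pivot to its final position j
        return j;
    }

    static void swap(int[] arr, int a, int b) {
        int t = arr[a]; arr[a] = arr[b]; arr[b] = t;
    }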

Analysis

Best case: the pivot is always the median.

    T(0) = T(1) = 1
    T(n) = 2 T(n/2) + n       (linear-time partition)

This is the same recurrence as mergesort: O(n log n).

Worst case: the pivot is always the smallest or largest element.

    T(0) = T(1) = 1
    T(n) = T(n-1) + n

Basically the same recurrence as selection sort: O(n²).

Average case (e.g., with a random pivot): O(n log n). You are not responsible for the proof (it is in the text).

Cutoffs

For small n, all that recursion tends to cost more than doing a quadratic sort; remember, asymptotic complexity is for large n. A common engineering technique: switch algorithms below a cutoff. A reasonable rule of thumb: use insertion sort for n < 10.

Notes:
- Could also use a cutoff for merge sort
- Cutoffs are also the norm with parallel algorithms: switch to a sequential algorithm below the cutoff
- None of this affects asymptotic complexity

Cutoff Skeleton

    void quicksort(int[] arr, int lo, int hi) {
        if (hi - lo < CUTOFF)
            insertionsort(arr, lo, hi);
        else {
            // ... pick pivot, partition, and recurse as before ...
        }
    }

Notice how this cuts out the vast majority of the recursive calls: think of the recursive calls to quicksort as a tree; the cutoff trims out the bottom layers of the tree.
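The insertionsort(arr, lo, hi) helper the skeleton assumes might look like this: it is the "mystery" code from earlier, generalized to a subrange (a sketch; the name just matches the skeleton's call):

    static void insertionsort(int[] arr, int lo, int hi) {
        for (int i = lo + 1; i < hi; i++) {       // sorts the small range arr[lo..hi)
            int tmp = arr[i], j;
            for (j = i; j > lo && tmp < arr[j - 1]; j--)
                arr[j] = arr[j - 1];              // shift larger elements right ("move the hole")
            arr[j] = tmp;
        }
    }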