Mergesort overview: input, sort left half, sort right half, merge results


Algorithms, Fourth Edition. ROBERT SEDGEWICK | KEVIN WAYNE. http://algs4.cs.princeton.edu

2.2 MERGESORT: mergesort, bottom-up mergesort, sorting complexity, divide-and-conquer.

Two classic sorting algorithms: mergesort and quicksort. Critical components in the world's computational infrastructure. Full scientific understanding of their properties has enabled us to develop them into practical system sorts. Quicksort was honored as one of the top 10 algorithms of the 20th century in science and engineering.
Mergesort. [this lecture]
Quicksort. [next lecture]

Mergesort

Basic plan. Divide the array into two halves. Recursively sort each half. Merge the two halves.

Mergesort overview:
input            M E R G E S O R T E X A M P L E
sort left half   E E G M O R R S T E X A M P L E
sort right half  E E G M O R R S A E E L M P T X
merge results    A E E E E G L M M O P R R S T X
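
The basic plan translates almost directly into code. Below is a minimal, self-contained sketch of the plan (an illustration only, not the lecture's Merge class, which follows later and reuses a single auxiliary array instead of allocating temporaries in every call):

import java.util.Arrays;

// Minimal sketch of the basic plan: divide, recursively sort each half, merge.
public class MergesortSketch {
    public static void sort(char[] a) {
        if (a.length <= 1) return;
        char[] left  = Arrays.copyOfRange(a, 0, a.length / 2);     // divide
        char[] right = Arrays.copyOfRange(a, a.length / 2, a.length);
        sort(left);                                                 // recursively sort each half
        sort(right);
        int i = 0, j = 0;                                           // merge the two sorted halves
        for (int k = 0; k < a.length; k++) {
            if      (i >= left.length)   a[k] = right[j++];
            else if (j >= right.length)  a[k] = left[i++];
            else if (right[j] < left[i]) a[k] = right[j++];
            else                         a[k] = left[i++];
        }
    }

    public static void main(String[] args) {
        char[] a = "MERGESORTEXAMPLE".toCharArray();
        sort(a);
        System.out.println(new String(a));   // AEEEEGLMMOPRRSTX
    }
}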

Abstract in-place merge demo

Goal. Given two sorted subarrays a[lo] to a[mid] and a[mid+1] to a[hi], replace them with the sorted subarray a[lo] to a[hi].

Example: a[lo..mid] = E E G M R and a[mid+1..hi] = A C E R T (each sorted); after the merge, a[lo..hi] = A C E E E G M R R T.

Mergesort: Transylvanian-Saxon folk dance. http://www.youtube.com/watch?v=xaq3g_nvoo

Merging: Java implementation

private static void merge(Comparable[] a, Comparable[] aux, int lo, int mid, int hi)
{
    for (int k = lo; k <= hi; k++)   // copy
        aux[k] = a[k];

    int i = lo, j = mid+1;           // merge
    for (int k = lo; k <= hi; k++)
    {
        if      (i > mid)              a[k] = aux[j++];
        else if (j > hi)               a[k] = aux[i++];
        else if (less(aux[j], aux[i])) a[k] = aux[j++];
        else                           a[k] = aux[i++];
    }
}
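
For reference, here is the merge() fragment reassembled into a form that compiles on its own. The less() helper is shown only by name on the slide; the version below assumes the usual convention of delegating to compareTo(), so treat it as a sketch rather than the book's Merge.java:

public class MergeDemo {
    // is v < w ?  (the helper the slides call less(); assumed to use compareTo)
    private static boolean less(Comparable v, Comparable w) {
        return v.compareTo(w) < 0;
    }

    // Precondition: a[lo..mid] and a[mid+1..hi] are each sorted.
    private static void merge(Comparable[] a, Comparable[] aux, int lo, int mid, int hi) {
        for (int k = lo; k <= hi; k++)      // copy a[lo..hi] to aux[lo..hi]
            aux[k] = a[k];

        int i = lo, j = mid + 1;            // merge back into a[lo..hi]
        for (int k = lo; k <= hi; k++) {
            if      (i > mid)              a[k] = aux[j++];   // left half exhausted
            else if (j > hi)               a[k] = aux[i++];   // right half exhausted
            else if (less(aux[j], aux[i])) a[k] = aux[j++];   // right key strictly smaller
            else                           a[k] = aux[i++];   // ties go left: stability
        }
    }

    public static void main(String[] args) {
        Comparable[] a = { "E", "E", "G", "M", "R", "A", "C", "E", "R", "T" };
        merge(a, new Comparable[a.length], 0, 4, a.length - 1);
        for (Comparable s : a) System.out.print(s + " ");     // A C E E E G M R R T
        System.out.println();
    }
}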

Mergesort quiz 1. How many calls to less() does merge() make in the worst case to merge two subarrays of length N/2 into an array of length N? Assume N is even.
A. N/2    B. N/2 + 1    C. N - 1    D. N    E. I don't know.

Mergesort: Java implementation

public class Merge
{
    private static void merge(...)
    {  /* as before */  }

    private static void sort(Comparable[] a, Comparable[] aux, int lo, int hi)
    {
        if (hi <= lo) return;
        int mid = lo + (hi - lo) / 2;
        sort(a, aux, lo, mid);
        sort(a, aux, mid+1, hi);
        merge(a, aux, lo, mid, hi);
    }

    public static void sort(Comparable[] a)
    {
        Comparable[] aux = new Comparable[a.length];
        sort(a, aux, 0, a.length - 1);
    }
}

Mergesort: trace

Trace of merge results for top-down mergesort (a[0..15]):

                          M E R G E S O R T E X A M P L E
merge(a, aux, 0, 0, 1)    E M R G E S O R T E X A M P L E
merge(a, aux, 2, 2, 3)    E M G R E S O R T E X A M P L E
merge(a, aux, 0, 1, 3)    E G M R E S O R T E X A M P L E
merge(a, aux, 4, 4, 5)    E G M R E S O R T E X A M P L E
merge(a, aux, 6, 6, 7)    E G M R E S O R T E X A M P L E
merge(a, aux, 4, 5, 7)    E G M R E O R S T E X A M P L E
merge(a, aux, 0, 3, 7)    E E G M O R R S T E X A M P L E
merge(a, aux, 8, 8, 9)    E E G M O R R S E T X A M P L E
merge(a, aux, 10, 10, 11) E E G M O R R S E T A X M P L E
merge(a, aux, 8, 9, 11)   E E G M O R R S A E T X M P L E
merge(a, aux, 12, 12, 13) E E G M O R R S A E T X M P L E
merge(a, aux, 14, 14, 15) E E G M O R R S A E T X M P E L
merge(a, aux, 12, 13, 15) E E G M O R R S A E T X E L M P
merge(a, aux, 8, 11, 15)  E E G M O R R S A E E L M P T X
merge(a, aux, 0, 7, 15)   A E E E E G L M M O P R R S T X

Mergesort quiz 2. Which of the following subarray lengths will occur when running mergesort on an array of length 12?
A. 1, 2, 3, 4, 6, 8, 12    B. 1, 2, 3, 6, 12    C. 1, 2, 4, 8, 12    D. 1, 3, 6, 9, 12    E. I don't know.
(Diagram: subarray sizes, as the result after each recursive call: 12; 6 6; 3 3 3 3; 2 1 2 1 2 1 2 1; 1 1 1 1 1 1 1 1.)
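
The sort() and merge() fragments combined into one compilable class, with merge() filled in from the earlier slide; this is a sketch in the spirit of algs4's Merge.java rather than a verbatim copy:

public class Merge {
    private static boolean less(Comparable v, Comparable w) {
        return v.compareTo(w) < 0;
    }

    private static void merge(Comparable[] a, Comparable[] aux, int lo, int mid, int hi) {
        for (int k = lo; k <= hi; k++) aux[k] = a[k];
        int i = lo, j = mid + 1;
        for (int k = lo; k <= hi; k++) {
            if      (i > mid)              a[k] = aux[j++];
            else if (j > hi)               a[k] = aux[i++];
            else if (less(aux[j], aux[i])) a[k] = aux[j++];
            else                           a[k] = aux[i++];
        }
    }

    private static void sort(Comparable[] a, Comparable[] aux, int lo, int hi) {
        if (hi <= lo) return;                 // a subarray of length 0 or 1 is already sorted
        int mid = lo + (hi - lo) / 2;
        sort(a, aux, lo, mid);                // sort left half
        sort(a, aux, mid + 1, hi);            // sort right half
        merge(a, aux, lo, mid, hi);           // merge results
    }

    public static void sort(Comparable[] a) {
        Comparable[] aux = new Comparable[a.length];   // allocate aux once
        sort(a, aux, 0, a.length - 1);
    }

    public static void main(String[] args) {
        String[] a = "M E R G E S O R T E X A M P L E".split(" ");
        sort(a);
        System.out.println(String.join(" ", a));   // A E E E E G L M M O P R R S T X
    }
}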

Mergesort: animation

50 random items; 50 reverse-sorted items. http://www.sorting-algorithms.com/merge-sort
(Legend: algorithm position, in order, current subarray, not in order.)

Mergesort: empirical analysis

Running time estimates: a laptop executes 10^8 compares/second; a supercomputer executes 10^12 compares/second.

            insertion sort (N^2)                 mergesort (N log N)
computer    thousand  million    billion         thousand  million   billion
home        instant   2.8 hours  317 years       instant   1 second  18 min
super       instant   1 second   1 week          instant   instant   instant

Bottom line. Good algorithms are better than supercomputers.

Mergesort analysis: number of compares

Proposition. Mergesort uses at most N lg N compares to sort an array of length N.

Pf sketch. The number of compares C(N) to mergesort an array of length N satisfies the recurrence
    C(N) <= C(⌈N/2⌉) + C(⌊N/2⌋) + N - 1   for N > 1, with C(1) = 0.
              left half    right half    merge
We solve the simpler recurrence, assuming N is a power of 2 (the result holds for all N; the analysis is cleaner in this case):
    D(N) = 2 D(N/2) + N   for N > 1, with D(1) = 0.
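
A quick way to see the "at most N lg N compares" bound empirically is to count the calls to less() and check them against N lg N. The counter and the test harness below are illustrative additions, not lecture code:

public class CompareCount {
    private static long compares = 0;

    private static boolean less(Comparable v, Comparable w) {
        compares++;                               // count each call to less()
        return v.compareTo(w) < 0;
    }

    private static void merge(Comparable[] a, Comparable[] aux, int lo, int mid, int hi) {
        for (int k = lo; k <= hi; k++) aux[k] = a[k];
        int i = lo, j = mid + 1;
        for (int k = lo; k <= hi; k++) {
            if      (i > mid)              a[k] = aux[j++];
            else if (j > hi)               a[k] = aux[i++];
            else if (less(aux[j], aux[i])) a[k] = aux[j++];
            else                           a[k] = aux[i++];
        }
    }

    private static void sort(Comparable[] a, Comparable[] aux, int lo, int hi) {
        if (hi <= lo) return;
        int mid = lo + (hi - lo) / 2;
        sort(a, aux, lo, mid);
        sort(a, aux, mid + 1, hi);
        merge(a, aux, lo, mid, hi);
    }

    public static void main(String[] args) {
        int n = 1 << 16;                          // N = 65536, a power of 2
        Integer[] a = new Integer[n];
        java.util.Random rnd = new java.util.Random(42);
        for (int i = 0; i < n; i++) a[i] = rnd.nextInt();
        sort(a, new Comparable[n], 0, n - 1);
        double bound = n * (Math.log(n) / Math.log(2));
        System.out.println("compares = " + compares + ", N lg N = " + (long) bound);
    }
}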

Divide-and-conquer recurrence

Proposition. If D(N) satisfies D(N) = 2 D(N/2) + N for N > 1, with D(1) = 0, then D(N) = N lg N.

Pf by picture. [assuming N is a power of 2] The recursion tree has lg N levels; level k has 2^k subproblems of size N/2^k, and each level contributes 2^k (N/2^k) = N, so T(N) = N lg N.

Key point. Any algorithm with the following structure takes N log N time:

public static void f(int N)
{
    if (N == 0) return;
    f(N/2);      // solve two problems
    f(N/2);      // of half the size
    linear(N);   // do a linear amount of work
}

Notable examples. FFT, hidden-line removal, Kendall-tau distance, ...

Mergesort analysis: number of array accesses

Proposition. Mergesort uses at most 6 N lg N array accesses to sort an array of length N.

Pf sketch. The number of array accesses A(N) satisfies the recurrence
    A(N) <= A(⌈N/2⌉) + A(⌊N/2⌋) + 6N   for N > 1, with A(1) = 0.

Mergesort analysis: memory

Proposition. Mergesort uses extra space proportional to N.
Pf. The array aux[] needs to be of length N for the last merge.
Example: merging the two sorted subarrays A C D G H I M N U V and B E F J O P Q R S T produces the merged result A B C D E F G H I J M N O P Q R S T U V.

Def. A sorting algorithm is in-place if it uses at most c log N extra memory.
Ex. Insertion sort, selection sort, shellsort.

Challenge 1 (not hard). Use an aux[] array of length ~ ½ N instead of N.
Challenge 2 (very hard). In-place merge. [Kronrod 1969]

Mergesort quiz 3. Is our implementation of mergesort stable?
A. Yes.
B. No, but it can be modified to be stable.
C. No, mergesort is inherently unstable.
D. I don't remember what stability means.
E. I don't know.

(A sorting algorithm is stable if it preserves the relative order of equal keys. Example of an unstable result: input C A1 B A2 A3; sorted A3 A1 A2 B C, which is not stable.)
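
For completeness, the recurrence D(N) = 2 D(N/2) + N can also be solved algebraically by a standard telescoping argument (written here in LaTeX; it assumes N is a power of 2, as in the slide, and is not text from the lecture):

% Solving D(N) = 2 D(N/2) + N with D(1) = 0, for N a power of 2.
\begin{aligned}
\frac{D(N)}{N} &= \frac{2\,D(N/2)}{N} + 1
                = \frac{D(N/2)}{N/2} + 1 \\
               &= \frac{D(N/4)}{N/4} + 1 + 1
                = \cdots
                = \frac{D(1)}{1} + \underbrace{1 + \cdots + 1}_{\lg N}
                = \lg N,
\end{aligned}
\qquad \text{so } D(N) = N \lg N.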

Stability: mergesort

Proposition. Mergesort is stable.

public class Merge
{
    private static void merge(...)
    {  /* as before */  }

    private static void sort(Comparable[] a, Comparable[] aux, int lo, int hi)
    {
        if (hi <= lo) return;
        int mid = lo + (hi - lo) / 2;
        sort(a, aux, lo, mid);
        sort(a, aux, mid+1, hi);
        merge(a, aux, lo, mid, hi);
    }

    public static void sort(Comparable[] a)
    {  /* as before */  }
}

Pf. Suffices to verify that the merge operation is stable.

Proposition. The merge operation is stable.

private static void merge(...)
{
    for (int k = lo; k <= hi; k++)
        aux[k] = a[k];

    int i = lo, j = mid+1;
    for (int k = lo; k <= hi; k++)
    {
        if      (i > mid)              a[k] = aux[j++];
        else if (j > hi)               a[k] = aux[i++];
        else if (less(aux[j], aux[i])) a[k] = aux[j++];
        else                           a[k] = aux[i++];
    }
}

     0  1  2  3  4  5  6  7  8  9  10
     A1 A2 A3 B  D  A4 A5 C  E  F  G

Pf. Takes from the left subarray if the keys are equal.

Mergesort: practical improvements

Use insertion sort for small subarrays. Mergesort has too much overhead for tiny subarrays. Cutoff to insertion sort for about 10 items.

private static void sort(Comparable[] a, Comparable[] aux, int lo, int hi)
{
    if (hi <= lo + CUTOFF - 1)
    {
        Insertion.sort(a, lo, hi);
        return;
    }
    int mid = lo + (hi - lo) / 2;
    sort (a, aux, lo, mid);
    sort (a, aux, mid+1, hi);
    merge(a, aux, lo, mid, hi);
}

Mergesort with cutoff to insertion sort: visualization. (Figure: first subarray, second subarray, first merge, first half sorted, second half sorted, result.)
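
A compilable sketch of the cutoff improvement follows. Insertion.sort(a, lo, hi) is named but not shown on the slide, so a minimal version is inlined here under the assumption that it sorts the subarray a[lo..hi] inclusive, as in the algs4 library:

public class MergeWithCutoff {
    private static final int CUTOFF = 10;   // the slide suggests a cutoff of about 10 items

    private static boolean less(Comparable v, Comparable w) {
        return v.compareTo(w) < 0;
    }

    private static void insertionSort(Comparable[] a, int lo, int hi) {
        for (int i = lo + 1; i <= hi; i++)
            for (int j = i; j > lo && less(a[j], a[j-1]); j--) {
                Comparable t = a[j]; a[j] = a[j-1]; a[j-1] = t;   // exchange
            }
    }

    private static void merge(Comparable[] a, Comparable[] aux, int lo, int mid, int hi) {
        for (int k = lo; k <= hi; k++) aux[k] = a[k];
        int i = lo, j = mid + 1;
        for (int k = lo; k <= hi; k++) {
            if      (i > mid)              a[k] = aux[j++];
            else if (j > hi)               a[k] = aux[i++];
            else if (less(aux[j], aux[i])) a[k] = aux[j++];
            else                           a[k] = aux[i++];
        }
    }

    private static void sort(Comparable[] a, Comparable[] aux, int lo, int hi) {
        if (hi <= lo + CUTOFF - 1) {         // small subarray: switch to insertion sort
            insertionSort(a, lo, hi);
            return;
        }
        int mid = lo + (hi - lo) / 2;
        sort(a, aux, lo, mid);
        sort(a, aux, mid + 1, hi);
        merge(a, aux, lo, mid, hi);
    }

    public static void sort(Comparable[] a) {
        sort(a, new Comparable[a.length], 0, a.length - 1);
    }

    public static void main(String[] args) {
        Integer[] a = { 9, 3, 7, 1, 8, 2, 6, 0, 5, 4, 11, 10 };
        sort(a);
        System.out.println(java.util.Arrays.toString(a));
    }
}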

Mergesort: practical improvements

Stop if already sorted. Is the largest item in the first half ≤ the smallest item in the second half? Helps for partially-ordered arrays.

Example (no merge needed):  A B C D E F G H I J  |  M N O P Q R S T U V

private static void sort(Comparable[] a, Comparable[] aux, int lo, int hi)
{
    if (hi <= lo) return;
    int mid = lo + (hi - lo) / 2;
    sort (a, aux, lo, mid);
    sort (a, aux, mid+1, hi);
    if (!less(a[mid+1], a[mid])) return;
    merge(a, aux, lo, mid, hi);
}

Eliminate the copy to the auxiliary array. Save time (but not space) by switching the roles of the input and auxiliary array in each recursive call.

private static void merge(Comparable[] a, Comparable[] aux, int lo, int mid, int hi)
{
    // merge from a[] to aux[]
    int i = lo, j = mid+1;
    for (int k = lo; k <= hi; k++)
    {
        if      (i > mid)          aux[k] = a[j++];
        else if (j > hi)           aux[k] = a[i++];
        else if (less(a[j], a[i])) aux[k] = a[j++];
        else                       aux[k] = a[i++];
    }
}

private static void sort(Comparable[] a, Comparable[] aux, int lo, int hi)
{
    if (hi <= lo) return;
    int mid = lo + (hi - lo) / 2;
    sort (aux, a, lo, mid);      // switch roles of aux[] and a[]
    sort (aux, a, mid+1, hi);
    merge(a, aux, lo, mid, hi);  // assumes aux[] is initialized to a[] once, before the recursive calls
}

Java 6 system sort. Basic algorithm for sorting objects = mergesort.
Cutoff to insertion sort = 7.
Stop-if-already-sorted test.
Eliminate-the-copy-to-the-auxiliary-array trick.
Arrays.sort(a)
http://hg.openjdk.java.net/jdk6/jdk6/jdk/file/tip/src/share/classes/java/util/arrays.java
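
The two improvements above can be combined into one compilable sketch. The public wrapper (cloning a[] into aux[] and making the first call with the roles switched) is not shown on the slide, so its exact form here is an assumption consistent with the slide's note that aux[] is initialized to a[] once; note also that with roles switched, skipping the merge still requires copying the already-sorted range into the destination array:

public class MergeX {
    private static boolean less(Comparable v, Comparable w) {
        return v.compareTo(w) < 0;
    }

    // Merge from src[] into dst[]; src[lo..mid] and src[mid+1..hi] are sorted.
    private static void merge(Comparable[] src, Comparable[] dst, int lo, int mid, int hi) {
        int i = lo, j = mid + 1;
        for (int k = lo; k <= hi; k++) {
            if      (i > mid)              dst[k] = src[j++];
            else if (j > hi)               dst[k] = src[i++];
            else if (less(src[j], src[i])) dst[k] = src[j++];
            else                           dst[k] = src[i++];
        }
    }

    // Leaves dst[lo..hi] sorted; src[lo..hi] and dst[lo..hi] start with the same contents.
    private static void sort(Comparable[] src, Comparable[] dst, int lo, int hi) {
        if (hi <= lo) return;
        int mid = lo + (hi - lo) / 2;
        sort(dst, src, lo, mid);                     // switch roles of src[] and dst[]
        sort(dst, src, mid + 1, hi);
        if (!less(src[mid + 1], src[mid])) {         // stop if already in order:
            System.arraycopy(src, lo, dst, lo, hi - lo + 1);   // copy, no merge needed
            return;
        }
        merge(src, dst, lo, mid, hi);                // merge from src[] into dst[]
    }

    public static void sort(Comparable[] a) {
        Comparable[] aux = a.clone();                // initialize aux[] to a[] once
        sort(aux, a, 0, a.length - 1);               // result ends up in a[]
    }

    public static void main(String[] args) {
        Integer[] a = { 5, 1, 4, 2, 8, 7, 3, 6, 0, 9 };
        sort(a);
        System.out.println(java.util.Arrays.toString(a));
    }
}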

Bottom-up mergesort

Basic plan. Pass through the array, merging subarrays of size 1. Repeat for subarrays of size 2, 4, 8, ...

Trace of merge results for bottom-up mergesort (a[0..15]):

                            M E R G E S O R T E X A M P L E
sz = 1
  merge(a, aux, 0, 0, 1)    E M R G E S O R T E X A M P L E
  merge(a, aux, 2, 2, 3)    E M G R E S O R T E X A M P L E
  merge(a, aux, 4, 4, 5)    E M G R E S O R T E X A M P L E
  merge(a, aux, 6, 6, 7)    E M G R E S O R T E X A M P L E
  merge(a, aux, 8, 8, 9)    E M G R E S O R E T X A M P L E
  merge(a, aux, 10, 10, 11) E M G R E S O R E T A X M P L E
  merge(a, aux, 12, 12, 13) E M G R E S O R E T A X M P L E
  merge(a, aux, 14, 14, 15) E M G R E S O R E T A X M P E L
sz = 2
  merge(a, aux, 0, 1, 3)    E G M R E S O R E T A X M P E L
  merge(a, aux, 4, 5, 7)    E G M R E O R S E T A X M P E L
  merge(a, aux, 8, 9, 11)   E G M R E O R S A E T X M P E L
  merge(a, aux, 12, 13, 15) E G M R E O R S A E T X E L M P
sz = 4
  merge(a, aux, 0, 3, 7)    E E G M O R R S A E T X E L M P
  merge(a, aux, 8, 11, 15)  E E G M O R R S A E E L M P T X
sz = 8
  merge(a, aux, 0, 7, 15)   A E E E E G L M M O P R R S T X

Bottom-up mergesort: Java implementation

public class MergeBU
{
    private static void merge(...)
    {  /* as before */  }

    public static void sort(Comparable[] a)
    {
        int N = a.length;
        Comparable[] aux = new Comparable[N];
        for (int sz = 1; sz < N; sz = sz+sz)
            for (int lo = 0; lo < N-sz; lo += sz+sz)
                merge(a, aux, lo, lo+sz-1, Math.min(lo+sz+sz-1, N-1));
    }
}

Bottom line. Simple and non-recursive version of mergesort.

Mergesort: visualizations. (Figure: top-down mergesort (cutoff = 12) vs. bottom-up mergesort (cutoff = 12).)

Mergesort quiz 4. Which is faster in practice: top-down mergesort or bottom-up mergesort?
A. Top-down (recursive) mergesort.
B. Bottom-up (nonrecursive) mergesort.
C. About the same.
D. I don't know.
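
A compilable sketch of MergeBU, with a print statement added after each pass so the trace above can be reproduced (the printing and helper are our additions, not slide code):

public class MergeBU {
    private static boolean less(Comparable v, Comparable w) {
        return v.compareTo(w) < 0;
    }

    private static void merge(Comparable[] a, Comparable[] aux, int lo, int mid, int hi) {
        for (int k = lo; k <= hi; k++) aux[k] = a[k];
        int i = lo, j = mid + 1;
        for (int k = lo; k <= hi; k++) {
            if      (i > mid)              a[k] = aux[j++];
            else if (j > hi)               a[k] = aux[i++];
            else if (less(aux[j], aux[i])) a[k] = aux[j++];
            else                           a[k] = aux[i++];
        }
    }

    public static void sort(Comparable[] a) {
        int N = a.length;
        Comparable[] aux = new Comparable[N];
        for (int sz = 1; sz < N; sz = sz + sz) {             // subarray size: 1, 2, 4, 8, ...
            for (int lo = 0; lo < N - sz; lo += sz + sz)     // merge a[lo..lo+sz-1] with a[lo+sz..lo+2sz-1]
                merge(a, aux, lo, lo + sz - 1, Math.min(lo + sz + sz - 1, N - 1));
            System.out.println("after sz = " + sz + ": " + String.join(" ", toStrings(a)));
        }
    }

    private static String[] toStrings(Comparable[] a) {
        String[] s = new String[a.length];
        for (int i = 0; i < a.length; i++) s[i] = a[i].toString();
        return s;
    }

    public static void main(String[] args) {
        String[] a = "M E R G E S O R T E X A M P L E".split(" ");
        sort(a);
    }
}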

Natural mergesort

Idea. Exploit pre-existing order by identifying naturally-occurring runs.

input           1 5 10 16 3 4 23 9 13 2 7 8 12 14
first run       1 5 10 16
second run      3 4 23
merge two runs  1 3 4 5 10 16 23 9 13 2 7 8 12 14

Tradeoff. Fewer passes vs. extra compares per pass to identify runs.

Timsort

Natural mergesort. Use binary insertion sort to make initial runs (if needed). A few more clever optimizations.

"Intro
-----
This describes an adaptive, stable, natural mergesort, modestly called timsort (hey, I earned it <wink>). It has supernatural performance on many kinds of partially ordered arrays (less than lg(n!) comparisons needed, and as few as N-1), yet as fast as Python's previous highly tuned samplesort hybrid on random arrays. In a nutshell, the main routine marches over the array once, left to right, alternately identifying the next run, then merging it into the previous runs 'intelligently'. Everything else is complication for speed, and some hard-won measure of memory efficiency. ..."  (Tim Peters)

Consequence. Linear time on many arrays with pre-existing order.
Now widely used. Python, Java 7, GNU Octave, Android, ...
http://hg.openjdk.java.net/jdk7/jdk7/jdk/file/tip/src/share/classes/java/util/arrays.java

Commercial break. https://www.youtube.com/watch?v=tshdbsynvo
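
To make the run-merging idea concrete, here is an illustrative sketch of natural mergesort (this is not Timsort, just the basic idea from the slide): repeatedly find two adjacent ascending runs and merge them, until a single run covers the array.

public class NaturalMerge {
    private static boolean less(Comparable v, Comparable w) {
        return v.compareTo(w) < 0;
    }

    private static void merge(Comparable[] a, Comparable[] aux, int lo, int mid, int hi) {
        for (int k = lo; k <= hi; k++) aux[k] = a[k];
        int i = lo, j = mid + 1;
        for (int k = lo; k <= hi; k++) {
            if      (i > mid)              a[k] = aux[j++];
            else if (j > hi)               a[k] = aux[i++];
            else if (less(aux[j], aux[i])) a[k] = aux[j++];
            else                           a[k] = aux[i++];
        }
    }

    // Index of the last element of the ascending run starting at lo.
    private static int runEnd(Comparable[] a, int lo) {
        int i = lo;
        while (i + 1 < a.length && !less(a[i + 1], a[i])) i++;
        return i;
    }

    public static void sort(Comparable[] a) {
        if (a.length <= 1) return;
        Comparable[] aux = new Comparable[a.length];
        while (true) {
            int lo = 0;
            int mid = runEnd(a, lo);
            if (mid == a.length - 1) return;        // one run covers the array: sorted
            while (mid < a.length - 1) {            // merge pairs of adjacent runs, left to right
                int hi = runEnd(a, mid + 1);
                merge(a, aux, lo, mid, hi);
                lo = hi + 1;
                if (lo > a.length - 1) break;
                mid = runEnd(a, lo);
            }
        }
    }

    public static void main(String[] args) {
        Integer[] a = { 1, 5, 10, 16, 3, 4, 23, 9, 13, 2, 7, 8, 12, 14 };
        sort(a);
        System.out.println(java.util.Arrays.toString(a));
    }
}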

Complexity of sorting

Computational complexity. Framework to study efficiency of algorithms for solving a particular problem X.
Model of computation. Allowable operations.
Cost model. Operation counts.
Upper bound. Cost guarantee provided by some algorithm for X.
Lower bound. Proven limit on cost guarantee of all algorithms for X.
Optimal algorithm. Algorithm with best possible cost guarantee for X (lower bound ~ upper bound).

For sorting:
model of computation:  decision tree (can access information only through compares, e.g., the Java Comparable framework)
cost model:            # compares
upper bound:           ~ N lg N, from mergesort
lower bound:           ?
optimal algorithm:     ?

Decision tree (for 3 distinct keys a, b, and c). Each internal node is a compare (a < b ?, b < c ?, a < c ?), with code between compares (e.g., a sequence of exchanges). Each leaf corresponds to one (and only one) ordering, and there is (at least) one leaf for each possible ordering: a b c, a c b, c a b, b a c, b c a, c b a. The height of the tree = worst-case number of compares.

Compare-based lower bound for sorting

Proposition. Any compare-based sorting algorithm must use at least lg(N!) ~ N lg N compares in the worst case.

Pf. Assume the array consists of N distinct values a_1 through a_N.
The worst case is dictated by the height h of the decision tree.
A binary tree of height h has at most 2^h leaves.
There are N! different orderings, so there are at least N! leaves.
Therefore 2^h >= # leaves >= N!, so h >= lg(N!) ~ N lg N   [Stirling's formula].
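
For reference, a short derivation of the lg(N!) ~ N lg N step, written in LaTeX (a standard bound, not text from the slide):

% Why lg(N!) is on the order of N lg N:
\lg(N!) \;=\; \sum_{i=1}^{N} \lg i \;\le\; N \lg N,
\qquad
\lg(N!) \;\ge\; \sum_{i=N/2}^{N} \lg i \;\ge\; \frac{N}{2}\,\lg\frac{N}{2},
\qquad\text{hence}\qquad
\lg(N!) \;=\; \Theta(N \log N).
% Stirling's formula gives the sharper statement \lg(N!) = N \lg N - N \lg e + O(\log N) \sim N \lg N.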

Complexity results in context

Model of computation. Allowable operations.
Cost model. Operation count(s).
Upper bound. Cost guarantee provided by some algorithm for X.
Lower bound. Proven limit on cost guarantee of all algorithms for X.
Optimal algorithm. Algorithm with best possible cost guarantee for X.

For sorting:
model of computation:  decision tree
cost model:            # compares
upper bound:           ~ N lg N
lower bound:           ~ N lg N
optimal algorithm:     mergesort

Compares? Mergesort is optimal with respect to the number of compares.
Space? Mergesort is not optimal with respect to space usage.

Lessons. Use theory as a guide. First goal of algorithm design: optimal algorithms.
Ex. Design a sorting algorithm that guarantees ~ ½ N lg N compares?
Ex. Design a sorting algorithm that is both time- and space-optimal?

Complexity results in context (continued)

The lower bound may not hold if the algorithm can take advantage of:
The initial order of the input. Ex: insertion sort requires only a linear number of compares on partially-sorted arrays.
The distribution of key values. Ex: 3-way quicksort requires only a linear number of compares on arrays with a constant number of distinct keys. [stay tuned]
The representation of the keys. Ex: radix sorts require no key compares (they access the data via character/digit compares).

Sorting summary

            inplace?  stable?  best        average   worst       remarks
selection   yes       no       ½ N²        ½ N²      ½ N²        N exchanges
insertion   yes       yes      N           ¼ N²      ½ N²        use for small N or partially ordered
shell       yes       no       N log₃ N    ?         c N^(3/2)   tight code; subquadratic
merge       no        yes      ½ N lg N    N lg N    N lg N      N log N guarantee; stable
timsort     no        yes      N           N lg N    N lg N      improves mergesort when pre-existing order
?           yes       yes      N           N lg N    N lg N      holy sorting grail

INTERVIEW QUESTION: SHUFFLE A LINKED LIST

Problem. Given a singly-linked list, rearrange its nodes uniformly at random (all N! permutations equally likely).
Assumption. Access to a perfect random-number generator.

Version 1. Linear time, linear extra space.
Version 2. Linearithmic time, logarithmic or constant extra space.

input     first -> 2 -> 3 -> 4 -> 5 -> 6 -> 7 -> null
shuffled  first -> 5 -> 6 -> 2 -> 7 -> 3 -> 4 -> null
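
One possible approach to Version 1 (a sketch, not the lecture's intended answer, which it leaves as an exercise): copy the node references into an array, apply a Knuth (Fisher-Yates) shuffle, and relink. The Node class below is a hypothetical minimal singly-linked list node.

import java.util.Random;

public class ShuffleLinkedList {
    private static class Node {
        int item;
        Node next;
        Node(int item) { this.item = item; }
    }

    // Linear time, linear extra space: array of node references + Knuth shuffle + relink.
    private static Node shuffle(Node first, Random random) {
        int n = 0;                                       // count the nodes
        for (Node x = first; x != null; x = x.next) n++;
        if (n <= 1) return first;

        Node[] nodes = new Node[n];                      // copy node references into an array
        int i = 0;
        for (Node x = first; x != null; x = x.next) nodes[i++] = x;

        for (int k = n - 1; k > 0; k--) {                // Knuth shuffle: all n! permutations equally likely
            int r = random.nextInt(k + 1);
            Node t = nodes[k]; nodes[k] = nodes[r]; nodes[r] = t;
        }

        for (int k = 0; k < n - 1; k++)                  // relink in the shuffled order
            nodes[k].next = nodes[k + 1];
        nodes[n - 1].next = null;
        return nodes[0];
    }

    public static void main(String[] args) {
        Node first = new Node(2);
        Node x = first;
        for (int v = 3; v <= 7; v++) { x.next = new Node(v); x = x.next; }
        first = shuffle(first, new Random());
        for (Node p = first; p != null; p = p.next) System.out.print(p.item + " -> ");
        System.out.println("null");
    }
}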