Lecture 19: Sorting (Goodrich, Tamassia)


Lecture 19: Sorting (2004 Goodrich, Tamassia)
[Title figure: merge-sort tree sorting 7 2 9 4 into 2 4 7 9.]

Outline
Review of three simple sorting algorithms:
1. selection-sort (from a previous course)
2. insertion-sort (from a previous course)
3. heap-sort (mentioned earlier in this course)
Divide and conquer is an algorithm design paradigm that applies not just to sorting.
Today's sorting algorithms use divide and conquer:
- merge-sort
- quick-sort (you should have seen quick-sort in a previous course)

General Comments on Sorting
Given a sequence S of elements to sort:
- For the most general implementation, use a comparator C; the same code can then be reused for non-decreasing and non-increasing order.
- In-place vs. not-in-place sorting:
  - in-place sorting needs only a constant amount of additional space; this means the sorting is done in the original sequence S itself;
  - not-in-place sorting needs more than a constant amount of additional space, for example another sequence of size S.size();
  - use an in-place algorithm if space is an issue.
- Recursive vs. non-recursive:
  - non-recursive and recursive algorithms may have the same asymptotic (big-O) complexity, but in practice non-recursive code is usually faster and should be preferred;
  - recursive code is easier to write and understand, with practice.

Heap Sort
Given an array A of size n, indexed from 0 to n - 1, to sort in non-decreasing order:

  H = new empty min-heap
  for i = 0 to n - 1
      H.insert(A[i])
  for i = 0 to n - 1
      A[i] = H.deleteMin()

For non-increasing order, use a max-heap.
Complexity: O(n log n).
Needs O(n) additional space for the heap data structure; heap-sort can be implemented in-place by reusing the input array A for the heap data structure.
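The heap-sort loop above can be sketched in Python using the standard heapq module as the min-heap (the function name heap_sort is ours):

```python
import heapq

def heap_sort(A):
    """Sort A in non-decreasing order using an auxiliary min-heap (O(n) extra space)."""
    H = []
    for x in A:
        heapq.heappush(H, x)         # n inserts, O(log n) each
    for i in range(len(A)):
        A[i] = heapq.heappop(H)      # n deleteMins, O(log n) each
```

This is the not-in-place variant: the heap H is a separate list of size n.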

Selection Sort
Given an array A of size n, indexes ranging from 0 to n - 1, iterate n times: find the i-th smallest element in the array A and place it in the i-th location of A. After the i-th iteration, the elements in range 0 .. i - 1 of A are in their correct positions.

Example (each row is A after one iteration):
  3 4 5 1 9 5 2
  1 4 5 3 9 5 2
  1 2 5 3 9 5 4
  1 2 3 5 9 5 4
  1 2 3 4 9 5 5
  1 2 3 4 5 9 5
  1 2 3 4 5 5 9
  1 2 3 4 5 5 9

Selection Sort
Algorithm SelectSort(A, n)
  Input: array A and its size n
  Output: elements of A sorted in non-decreasing order

  for i = 0 to n - 2
      // first find the i-th smallest element
      minIndex = i
      for j = i + 1 to n - 1
          if A[j] < A[minIndex] then
              minIndex = j
      // now swap the smallest remaining element with the i-th element
      temp = A[minIndex]
      A[minIndex] = A[i]
      A[i] = temp
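The pseudocode translates directly into Python (the name selection_sort is ours; the swap uses Python's tuple assignment):

```python
def selection_sort(A):
    """Sort A in non-decreasing order; O(n^2) regardless of input order."""
    n = len(A)
    for i in range(n - 1):
        min_index = i                      # index of smallest element seen so far
        for j in range(i + 1, n):
            if A[j] < A[min_index]:
                min_index = j
        A[i], A[min_index] = A[min_index], A[i]   # swap into position i
```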

Selection Sort Complexity
The outer for loop is performed n - 1 times. For a fixed i, the inner for loop is performed n - 1 - i times, so the total number of inner iterations is

  sum_{i=0}^{n-2} (n - 1 - i) = sum_{i=1}^{n-1} i = n(n - 1)/2 = O(n^2)

All other operations inside the for loops take a constant amount of time. Thus the total running time is O(n^2).
The running time is independent of the contents of array A; that is, it is the best, worst, and average case running time.

Insertion Sort
After iteration i, array A is sorted in the range 0 .. i, but the elements in range 0 .. i are not necessarily in their final correct positions after iteration i.

Example (intermediate states, including the shifts while inserting the last element):
  3 4 5 1 9 5 2
  3 4 1 5 9 5 2
  3 1 4 5 9 5 2
  1 3 4 5 9 5 2
  1 3 4 5 5 9 2
  1 3 4 5 5 2 9
  1 3 4 5 2 5 9
  1 3 4 2 5 5 9
  1 3 2 4 5 5 9
  1 2 3 4 5 5 9

Insertion Sort
Algorithm InsertSort(A, n)
  Output: elements of A sorted in non-decreasing order

  for i = 1 to n - 1
      // first find the correct position for element A[i] so that
      // subarray A[0 .. i] stays sorted
      x = A[i]
      j = i - 1
      while j >= 0 and A[j] > x do
          A[j + 1] = A[j]
          j = j - 1
      // the correct position for x is j + 1, put x in that position
      A[j + 1] = x
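A direct Python rendering of this pseudocode (the name insertion_sort is ours):

```python
def insertion_sort(A):
    """Sort A in non-decreasing order; O(n) best case, O(n^2) worst case."""
    for i in range(1, len(A)):
        x = A[i]
        j = i - 1
        while j >= 0 and A[j] > x:   # shift larger elements one slot right
            A[j + 1] = A[j]
            j -= 1
        A[j + 1] = x                 # x belongs at position j + 1
```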

Insertion Sort Complexity
The for loop is performed n - 1 times. For each fixed i, the while loop is performed
- i times in the worst case,
- 0 times in the best case, when the subarray A[0 .. i] is already sorted.
All other statements take a constant amount of time.
In the worst case the total number of while iterations is

  sum_{i=1}^{n-1} i = n(n - 1)/2

So in the best case insertion sort is O(n), and in the worst case it is O(n^2).

Divide-and-Conquer Approach
Divide-and-conquer is a general algorithm design paradigm. Suppose we need to solve some problem on a sequence S (example: sorting). Suppose solving the problem on S directly is hard, but if we have solutions on subsequences S1 and S2 of S, then combining these solutions into a solution on S is easy.
- Divide: divide the input data S into two disjoint subsets S1 and S2.
- Recur: solve the subproblems associated with S1 and S2.
- Conquer: combine the solutions for S1 and S2 into a solution for S.
The base cases for the recursion are subproblems of size 0 or 1; the base case is usually trivial to solve.

Divide-and-Conquer Example
Sequence S to sort: 7 3 5 9 1 2 4 8 1
- Divide: split S into subsequences S1 = 7 3 5 9 and S2 = 1 2 4 8 1.
- Recur: sort the subsequences: S1 = 3 5 7 9, S2 = 1 1 2 4 8.
- Conquer: merge sorted S1 and S2 into sorted S = 1 1 2 3 4 5 7 8 9.
- Base case: sorting a sequence of size 1 is trivial.

Merge-Sort
Merge-sort is a sorting algorithm based on the divide-and-conquer paradigm.
- Like heap-sort, it has O(n log n) running time.
- Unlike heap-sort, it does not use an auxiliary priority queue.
- It accesses data sequentially, so it is suitable for sorting data on a disk.

Merge-Sort
Merge-sort on an input sequence S with n elements has three steps:
- Divide: partition S into two sequences S1 and S2 of about n/2 elements each.
- Recur: recursively sort S1 and S2.
- Conquer: merge S1 and S2 into a single sorted sequence.

Algorithm mergeSort(S, C)
  Input: sequence S with n elements, comparator C
  Output: sequence S sorted according to C

  if S.size() > 1
      (S1, S2) <- partition(S, n/2)
      mergeSort(S1, C)
      mergeSort(S2, C)
      S <- merge(S1, S2)

Merging Two Sorted Sequences
Conquer step: merge two sorted sequences S1 and S2 into a sorted sequence S. Assume a linked-list implementation of a sequence.

Example: S1 = 3 4 5 7, S2 = 1 1 2 8 9. Repeatedly remove the smaller front element and append it to S:
  S = 1                  S1 = 3 4 5 7, S2 = 1 2 8 9
  S = 1 1                S1 = 3 4 5 7, S2 = 2 8 9
  S = 1 1 2              S1 = 3 4 5 7, S2 = 8 9
  S = 1 1 2 3            S1 = 4 5 7,   S2 = 8 9
  S = 1 1 2 3 4          S1 = 5 7,     S2 = 8 9
  S = 1 1 2 3 4 5        S1 = 7,       S2 = 8 9
  S = 1 1 2 3 4 5 7      S1 = (empty), S2 = 8 9
  S = 1 1 2 3 4 5 7 8 9

Merging Two Sorted Sequences
Merging two sorted sequences, each with n/2 elements and implemented with a doubly linked list, is O(n); an array implementation is also O(n).

Algorithm merge(A, B)
  Input: sequences A and B with n/2 elements each
  Output: sorted sequence of A ∪ B

  S <- empty sequence
  while not A.isEmpty() and not B.isEmpty()
      if A.first().element() < B.first().element()
          S.insertLast(A.removeFirst())
      else
          S.insertLast(B.removeFirst())
  while not A.isEmpty()
      S.insertLast(A.removeFirst())
  while not B.isEmpty()
      S.insertLast(B.removeFirst())
  return S
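The merge step and the full merge-sort can be sketched in Python with lists in place of linked sequences (the names merge and merge_sort are ours; this version returns a new list rather than sorting in place):

```python
def merge(A, B):
    """Merge two sorted lists into one sorted list in O(len(A) + len(B)) time."""
    S = []
    i = j = 0
    while i < len(A) and j < len(B):
        if A[i] < B[j]:
            S.append(A[i]); i += 1
        else:
            S.append(B[j]); j += 1
    S.extend(A[i:])                  # at most one of these is non-empty
    S.extend(B[j:])
    return S

def merge_sort(S):
    """Divide, recur, conquer; base case is a sequence of size 0 or 1."""
    if len(S) <= 1:
        return S
    mid = len(S) // 2
    return merge(merge_sort(S[:mid]), merge_sort(S[mid:]))
```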

Merge-Sort Tree
An execution of merge-sort is depicted by a binary tree:
- each node represents a recursive call of merge-sort and stores the unsorted sequence before the execution and its partition, and the sorted sequence at the end of the execution;
- the root is the initial call;
- the leaves are calls on subsequences of size 0 or 1.

Example tree for 7 2 9 4:
  7 2 9 4 -> 2 4 7 9
  7 2 -> 2 7          9 4 -> 4 9
  7 -> 7    2 -> 2    9 -> 9    4 -> 4

Execution Example
Merge-sort on the sequence 7 2 9 4 3 8 6 1 (the repeated tree figures are condensed to their stages):
- Partition: 7 2 9 4 | 3 8 6 1.
- Recursive call on the left half, partition: 7 2 | 9 4.
- Recursive call, partition: 7 | 2; recursive calls reach the base cases 7 -> 7 and 2 -> 2.
- Merge: 7 2 -> 2 7.
- Recursive call on 9 4, base cases, merge: 9 4 -> 4 9.
- Merge: 2 7 and 4 9 -> 2 4 7 9.
- Recursive calls and merges on the right half: 3 8 -> 3 8, 6 1 -> 1 6, merged -> 1 3 6 8.
- Final merge: 2 4 7 9 and 1 3 6 8 -> 1 2 3 4 6 7 8 9.

Non-Recursive Merge Sort
A recursive implementation is less efficient (by a constant factor) than a non-recursive implementation, and merge-sort can be implemented non-recursively: at iteration i, break the sequence into groups of size 2^(i-1) (groups of 1, then 2, then 4, ...), and merge each pair of neighbouring groups.

  input:  7 2 9 4 3 8 6 1
  i = 1:  2 7 | 4 9 | 3 8 | 1 6
  i = 2:  2 4 7 9 | 1 3 6 8
  i = 3:  1 2 3 4 6 7 8 9
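The bottom-up scheme can be sketched in Python (the name merge_sort_bottom_up is ours; runs are merged via list slices, so this sketch still uses O(n) scratch space per merge):

```python
def merge_sort_bottom_up(A):
    """Non-recursive merge-sort: merge neighbouring runs of width 1, 2, 4, ..."""
    n = len(A)
    width = 1                                # run size 2^(i-1) at iteration i
    while width < n:
        for lo in range(0, n, 2 * width):
            left = A[lo:lo + width]
            right = A[lo + width:lo + 2 * width]
            merged = []
            i = j = 0
            while i < len(left) and j < len(right):
                if left[i] <= right[j]:
                    merged.append(left[i]); i += 1
                else:
                    merged.append(right[j]); j += 1
            merged += left[i:] + right[j:]
            A[lo:lo + 2 * width] = merged    # write the merged run back
        width *= 2
```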

Analysis of Merge-Sort
Recurrence equations (kn covers the partition and merge steps of mergeSort(S, C)):
  T(1) = c
  T(n) = kn + 2T(n/2)

Solve the recurrence by unwrapping:
  T(n) = kn + 2T(n/2)
       = kn + 2[kn/2 + 2T(n/4)] = kn + kn + 4T(n/4) = 2kn + 4T(n/4)
       = 2kn + 4[kn/4 + 2T(n/8)] = 3kn + 8T(n/8)
       = ...
       = i·kn + 2^i T(n/2^i)
Unwrapping stops when n/2^i = 1, i.e. when i = log n. Thus
  T(n) = (log n)·kn + 2^(log n) T(1) = kn log n + cn
so the running time is O(n log n).
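The unwrapping of the merge-sort recurrence can be written compactly in standard notation (assuming n is a power of 2, so the unwrapping stops exactly at i = log2 n):

```latex
\begin{aligned}
T(n) &= kn + 2\,T(n/2) \\
     &= 2kn + 4\,T(n/4) \\
     &= \cdots = i\,kn + 2^{i}\,T(n/2^{i}) \\
T(n) &= kn\log_{2} n + 2^{\log_{2} n}\,T(1) = kn\log_{2} n + cn = O(n \log n)
\end{aligned}
```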

Part 2: Quick-Sort

Quick-Sort
Quick-sort is a randomized sorting algorithm based on the divide-and-conquer paradigm:
- Divide: pick a random element x (called the pivot) and partition S into
  - L: elements less than x,
  - E: elements equal to x,
  - G: elements greater than x.
- Recur: sort L and G.
- Conquer: join L, E and G.

Partition
To partition, iterate: remove each element y from S and insert y into L, E or G, depending on the result of the comparison with the pivot x. Each insertion and removal is at the beginning or end of a sequence, and hence takes O(1) time. Thus, the partition step is O(n).

Algorithm partition(S, p)
  Input: sequence S, position p of the pivot
  Output: subsequences L, E, G of the elements of S less than, equal to, or greater than the pivot, respectively

  L, E, G <- empty sequences
  x <- S.elementAtPosition(p)
  while not S.isEmpty()
      y <- S.remove(S.first())
      if y < x
          L.insertLast(y)
      else if y = x
          E.insertLast(y)
      else // y > x
          G.insertLast(y)
  return L, E, G
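The L / E / G scheme can be sketched in Python with list comprehensions standing in for the sequence scan (the name quick_sort is ours; this not-in-place version returns a new list):

```python
import random

def quick_sort(S):
    """Randomized quick-sort using the three-way L / E / G partition."""
    if len(S) <= 1:
        return S
    x = random.choice(S)              # random pivot
    L = [y for y in S if y < x]       # elements less than the pivot
    E = [y for y in S if y == x]      # elements equal to the pivot
    G = [y for y in S if y > x]       # elements greater than the pivot
    return quick_sort(L) + E + quick_sort(G)
```

Keeping E separate means duplicates of the pivot are never passed to a recursive call, which avoids worst-case behaviour on sequences with many equal elements.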

Quick-Sort Tree
An execution of quick-sort is depicted by a binary tree:
- each node represents a recursive call of quick-sort and stores the unsorted sequence before the execution and its pivot, and the sorted sequence at the end of the execution;
- the root is the initial call;
- the leaves are calls on subsequences of size 0 or 1.

Example tree for 7 4 9 6 2 (pivot 6 at the root):
  7 4 9 6 2 -> 2 4 6 7 9
  4 2 -> 2 4          7 9 -> 7 9
  2 -> 2              9 -> 9

Execution Example
Quick-sort on the sequence 7 2 9 4 3 7 6 1 (the repeated tree figures are condensed to their stages):
- Pivot selection: pivot 6; partition into L = 2 4 3 1, E = 6, G = 7 9 7.
- Recursive call on L = 2 4 3 1: pivot selection, partition, recursive calls down to the base cases, and joins yield 1 2 3 4.
- Recursive call on G = 7 9 7: pivot selection, partition, base cases, and joins yield 7 7 9.
- Join, join: 1 2 3 4, 6 and 7 7 9 give the sorted sequence 1 2 3 4 6 7 7 9.

Worst-case Running Time
The worst case happens when the pivot is the unique minimum or maximum element: one of L and G then has size n - 1 and the other has size 0. At recursion depth i the call works on a sequence of size n - i, so the running time is proportional to the sum
  n + (n - 1) + ... + 2 + 1
and the worst-case time is O(n^2).

Expected Running Time
Consider a recursive call on a sequence of size s:
- good call: the sizes of L and G are each less than (3/4)s;
- bad call: one of L and G has size greater than (3/4)s.
For example, on 7 2 9 4 3 8 6 1, pivot 6 gives the good call L = 2 4 3 1, G = 7 9 8, while pivot 1 gives the bad call with L empty and G = 7 2 9 4 3 8 6.
A call is good with probability 1/2, since half of the elements (the middle half of the sorted order 1 2 3 4 6 7 8 9) are good pivots; the smallest and largest quarters are bad pivots.

Expected Running Time (cont.)
- Each good call reduces the sequence size to at most 3/4 of its parent's size.
- For a node of recursion depth i, we expect i/2 of its ancestors to be good calls, so the size of the input sequence for the current call is at most (3/4)^(i/2) n.
- Solving (3/4)^(i/2) n = 1 for i: for a node of depth i = 2 log_{4/3} n, the expected input size is one, so the expected height of the quick-sort tree is O(log n).
- The amount of work done at the nodes of the same depth is O(n) (at most n elements over all nodes at that depth).
- Thus, the expected running time of quick-sort is O(n log n).

In-Place Quick-Sort
Quick-sort can be implemented in-place. In the partition step, rearrange the elements of the input sequence such that
- the elements less than the pivot have rank less than j,
- the elements equal to the pivot have rank between j and k,
- the elements greater than the pivot have rank greater than k.
The recursive calls then consider the elements with rank less than j and the elements with rank greater than k.

Algorithm inPlaceQuickSort(S, l, r)
  Input: sequence S, ranks l and r
  Output: sequence S with the elements of rank between l and r rearranged in increasing order

  if l >= r return  // base case
  i <- random integer between l and r
  x <- S.elemAtRank(i)
  (j, k) <- inPlacePartition(x, l, r)
  inPlaceQuickSort(S, l, j - 1)
  inPlaceQuickSort(S, k + 1, r)

In-Place Partitioning (x, l, r)
First partition S (between ranks l and r) into L (elements < x) and EG (elements >= x). Repeat until j and k cross:
- scan j to the right until an element >= x is found,
- scan k to the left until an element < x is found,
- swap the elements at ranks j and k.

Example (x = 6):
  3 2 5 1 0 [7] 3 5 9 [2] 7 9 8 9 6 6 9   j stops at 7, k stops at 2
  3 2 5 1 0 2 3 5 9 7 7 9 8 9 6 6 9       after the swap; j and k then cross between 5 and 9

In-Place Partitioning (x, l, r), continued
Next partition EG into E (elements = x) and G (elements > x). Repeat until j and k cross:
- scan j to the right until an element > x is found,
- scan k to the left until an element = x is found,
- swap the elements at indices j and k.

Example (x = 6, continuing from above):
  3 2 5 1 0 2 3 5 9 7 7 9 8 9 6 6 9   j stops at the first 9, k stops at the rightmost 6
  3 2 5 1 0 2 3 5 6 7 7 9 8 9 6 9 9   after the first swap
  3 2 5 1 0 2 3 5 6 6 7 9 8 9 7 9 9   after the second swap (7 with the remaining 6); j and k cross
E is the run of 6s at ranks j .. k; G is everything to its right.
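An in-place quick-sort can be sketched in Python as follows. Note this is a simplified sketch: it uses a single-pass Lomuto-style partition (elements < pivot vs. >= pivot, with the pivot placed at a single rank j) rather than the two-pass L / E / G scan-and-swap scheme above, and the names are ours:

```python
import random

def in_place_quick_sort(S, l, r):
    """Sort S[l..r] in place with a randomized Lomuto-style partition.

    Simplification vs. the slides: a single partition pass, no separate E range.
    """
    if l >= r:
        return                            # base case: 0 or 1 elements
    p = random.randint(l, r)              # random pivot rank
    S[p], S[r] = S[r], S[p]               # move the pivot out of the way
    x = S[r]
    j = l                                 # S[l..j-1] holds elements < x
    for k in range(l, r):
        if S[k] < x:
            S[j], S[k] = S[k], S[j]
            j += 1
    S[j], S[r] = S[r], S[j]               # pivot into its final rank j
    in_place_quick_sort(S, l, j - 1)
    in_place_quick_sort(S, j + 1, r)
```

Only O(log n) expected extra space is used (the recursion stack); no auxiliary sequences are allocated.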

Summary of Sorting Algorithms

  Algorithm       Time                 Notes
  selection-sort  O(n^2)               in-place; slow (good for small inputs)
  insertion-sort  O(n^2)               in-place; slow (good for small inputs)
  quick-sort      O(n log n) expected  in-place, randomized; fastest in practice (good for large inputs)
  heap-sort       O(n log n)           in-place; fast (good for large inputs)
  merge-sort      O(n log n)           sequential data access; fast (good for huge inputs when data must be stored on a disk)