QUICKSORT

TABLE OF CONTENTS

1. What Is Quicksort?
2. Advantages of Quicksort
3. Disadvantages of Quicksort
4. Partition Algorithm
5. Quicksort Algorithm (including correct trace of Figure 7.12)
6. Improved Partitioning
7. Small Partitions

What Is Quicksort?

Quicksort is a recursive sorting routine that works by partitioning the array so that items with smaller keys are separated from those with larger keys, and then recursively applying itself to the two groups.

Advantages of Quicksort

Its average-case time complexity to sort an array of n elements is O(n lg n). On average it runs very fast, even faster than Merge Sort. It requires no additional memory beyond the recursion stack; the array is sorted in place.

Disadvantages of Quicksort

Its running time can differ depending on the contents of the array. Quicksort's running time degrades if it is given an array that is almost sorted (or almost reverse sorted). Its worst-case running time, O(n^2) to sort an array of n elements, happens when given a sorted array. It is not stable.

Partition Algorithm (pages 325-333)

Most of the work of Quicksort is done in the Partition algorithm, so we study it first. There is a Partition Workshop applet available to help you visualize this algorithm.

Partition considers the array elements a[first] through a[last]. Given a pivot value, it partitions the elements in the array so that all items with keys less than the pivot are moved to lower positions in the array and all items with keys greater than the pivot are moved to higher positions. For example, given the array:

90 100 20 60 80 110 120 40 10 30 50 70

the Partition algorithm yields:

50 30 20 60 10 40 70 110 80 100 90 120

where every item up to and including the 70 is <= 70 and every item after it is > 70.

In the Partition Workshop applet, the pivot value is arbitrary. The Quicksort algorithm, however, uses a[last] as the pivot value.

To accomplish the partitioning, Partition uses two marker variables, left and right, initialized to the position just below first and to last, respectively. The initial setup is:

90 100 20 60 80 110 120 40 10 30 50 70

(left sits just below the 90; right sits at the 70, the pivot.)

The algorithm increments left until the item at that position is greater than or equal to the pivot value:

90 100 20 60 80 110 120 40 10 30 50 70

(left stops at the 90.)

It decrements right until the item at that position is less than or equal to the pivot value:

90 100 20 60 80 110 120 40 10 30 50 70

(right stops at the 50.)

Then it swaps the values at those two positions:

50 100 20 60 80 110 120 40 10 30 90 70

Again, increment left until a[left] >= pivot:

50 100 20 60 80 110 120 40 10 30 90 70

(left stops at the 100.)

Decrement right until a[right] <= pivot:

50 100 20 60 80 110 120 40 10 30 90 70

(right stops at the 30.)

And swap:

50 30 20 60 80 110 120 40 10 100 90 70

Continuing, increment left:

50 30 20 60 80 110 120 40 10 100 90 70

(left stops at the 80.)

Decrement right:

50 30 20 60 80 110 120 40 10 100 90 70

(right stops at the 10.)

And swap:

50 30 20 60 10 110 120 40 80 100 90 70

Continuing through the next swap (the 110 and the 40 change places):

50 30 20 60 10 40 120 110 80 100 90 70

Eventually this stops when the two markers cross paths:

50 30 20 60 10 40 120 110 80 100 90 70

(left has stopped at the 120 and right at the 40, so they have crossed.)

The Partition algorithm finishes by swapping the pivot at position last with the item at position left:

50 30 20 60 10 40 70 110 80 100 90 120

The algorithm:

public int partition( int first, int last )
{
    left = first - 1;
    right = last;
    pivot = a[last];

    while ( left < right )
    {
        increment left until a[left] >= pivot;
        decrement right until a[right] <= pivot;
        if ( left < right )
            swap a[left] and a[right];
    }
    swap a[left] and a[last];
    return left;
}
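For concreteness, here is a compilable Java sketch of the Partition algorithm just described, assuming the array is an int[] instance field named a. The swap helper and the bound check in the right-hand scan are additions for safety and readability; they are not part of the notes' listing.

    // Compilable sketch of Partition, assuming an int[] field named a.
    public int partition(int first, int last) {
        int pivot = a[last];        // the rightmost item is the pivot
        int left = first - 1;       // left marker starts just below first
        int right = last;           // right marker starts at the pivot position

        while (left < right) {
            while (a[++left] < pivot)                       // advance left until
                ;                                           //   a[left] >= pivot
            while (right > first && a[--right] > pivot)     // retreat right until
                ;                                           //   a[right] <= pivot
            if (left < right)
                swap(left, right);  // put the out-of-place pair in order
        }
        swap(left, last);           // move the pivot into its final position
        return left;                // report where the pivot ended up
    }

    private void swap(int i, int j) {
        int temp = a[i];
        a[i] = a[j];
        a[j] = temp;
    }

Calling partition(0, 11) on the example array 90 100 20 60 80 110 120 40 10 30 50 70 rearranges it into the partitioned array shown above and returns 6.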

Quicksort Algorithm

Quicksort starts by partitioning the array. For the example above, the result of the first partition is:

50 30 20 60 10 40 70 110 80 100 90 120

(Everything up to and including the 70 is <= 70; everything after it is > 70.)

You can see that the pivot value, 70, holds the same position it would if the array were sorted. All items to the left of it are smaller and all items to the right of it are larger. Thus, in terms of sorting, the 70 is in the right spot and need not be considered further. Quicksort can continue by simply applying itself recursively to the two segments of the array above and below the pivot. Here's the algorithm; notice that the Partition algorithm returns the final position of the pivot (6 in the above example).

public void recquicksort( int first, int last )
{
    if ( last - first + 1 <= SEGMENT_SIZE )
        return;
    else
    {
        int pivot = partition( first, last );
        recquicksort( first, pivot - 1 );
        recquicksort( pivot + 1, last );
    }
}

SEGMENT_SIZE is a named constant. The computation last - first + 1 yields the number of elements in the array segment from first to last. The if statement stops the recursion if the segment size is no larger than the named constant. The purpose of this is explained later; for now, just consider SEGMENT_SIZE to be equal to 1, since no sorting is necessary for one or fewer items.
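Here is the same recursive driver as a compilable sketch, assuming the int[] field a and the partition method sketched in the previous section; SEGMENT_SIZE is 1, as in the discussion above. The usage comment shows the example data from these notes.

    // Recursive driver, assuming the int[] field a and the partition sketch above.
    private static final int SEGMENT_SIZE = 1;   // one (or zero) items need no sorting

    public void recquicksort(int first, int last) {
        if (last - first + 1 <= SEGMENT_SIZE)
            return;                              // segment is already small enough
        int pivot = partition(first, last);      // pivot lands in its final position
        recquicksort(first, pivot - 1);          // sort the items below the pivot
        recquicksort(pivot + 1, last);           // sort the items above the pivot
    }

    // Typical use with the example data from these notes:
    //   a = new int[] { 90, 100, 20, 60, 80, 110, 120, 40, 10, 30, 50, 70 };
    //   recquicksort(0, a.length - 1);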

We will continue tracing the Quicksort on the data given above. This is the same data as shown in Figure 7.12, but that figure is incorrect. Following is the correct trace for that data.

After the first partition (pivot 70 placed at position 6):

50 30 20 60 10 40 70 110 80 100 90 120

After partitioning the left segment, positions 0 through 5 (pivot 40 placed at position 3):

10 30 20 40 50 60 70 110 80 100 90 120

The remaining calls proceed in the same way; the recursion tree below lists every recursive call.


Following is the recursion tree showing the execution of the recquicksort algorithm for the above example. The three values listed at each node are those of first, last, and the position where the call to partition leaves the pivot value. Nodes with three values call the partition method, whereas the leaf nodes (which list only first and last) don't.

0, 11, 6
├── 0, 5, 3
│   ├── 0, 2, 1
│   │   ├── 0, 0
│   │   └── 2, 2
│   └── 4, 5, 4
│       ├── 4, 3
│       └── 5, 5
└── 7, 11, 11
    ├── 7, 10, 8
    │   ├── 7, 7
    │   └── 9, 10, 10
    │       ├── 9, 9
    │       └── 11, 10
    └── 12, 11
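A recursion tree of this form can be generated by instrumenting recquicksort to print, for each call, first, last and (when the call partitions) the pivot position returned. This is a minimal sketch assuming the partition method and SEGMENT_SIZE from the earlier sketches; the depth parameter and the indentation are illustrative additions.

    // Instrumented recquicksort: prints first, last and, for partitioning calls,
    // the pivot position -- the values shown at each node of the recursion tree.
    public void tracedquicksort(int first, int last, int depth) {
        String indent = "  ".repeat(depth);                    // indent by recursion depth
        if (last - first + 1 <= SEGMENT_SIZE) {
            System.out.println(indent + first + ", " + last);  // leaf: no partition call
            return;
        }
        int pivot = partition(first, last);
        System.out.println(indent + first + ", " + last + ", " + pivot);
        tracedquicksort(first, pivot - 1, depth + 1);
        tracedquicksort(pivot + 1, last, depth + 1);
    }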

Improved Partitioning (pages 344-350)

To avoid the worst-case scenario, several authors have suggested different ways to select the pivot value.

1. An easy fix is to choose the pivot value randomly instead of always choosing the item at a[last]. You can accomplish this by replacing the statement:

    double pivot = a[last];

with:

    int pos = first + (int)( Math.random() * (last - first + 1) );
    double pivot = a[pos];

Weiss[1] considers this effort to be pointless. The computation of the random number adds a nontrivial amount of run time to the algorithm and doesn't guarantee that the worst case is avoided. Through sheer bad luck, the random choices of pivot values may be exactly those that lead to the worst-case behavior. Furthermore, it doesn't reduce the running time of the rest of the algorithm.

2. A technique that does work, at the expense of some additional run time, is the median-of-three technique covered on pages 345 through 350. The pivot is chosen to be the median of the first, last and middle elements of the array. In terms of the algorithm, choose the median of a[first], a[last] and a[(first + last) / 2]. For example, the median for the following array is 90, the median of the first element (90), the middle element (110) and the last element (70):

90 100 20 60 80 110 120 40 10 30 50 70

The implementation of the median-of-three partitioning is on page 348. The algorithm chooses the pivot by putting a[first], a[(first + last) / 2] and a[last] into ascending order:

70 100 20 60 80 90 120 40 10 30 50 110

(Now a[first] = 70, the middle element is the median 90, and a[last] = 110.)

[1] Weiss, Mark Allen, Data Structures and Algorithm Analysis in Java, page 242.

This means that a[(first + last) / 2] is the median and, hence, the pivot. Furthermore, a[first] and a[last] have already been correctly partitioned; i.e., a[first] <= pivot and a[last] >= pivot. Thus, the Partition algorithm can commence by swapping the pivot value with the item immediately to the left of the right end and starting its left and right markers at first and last - 1:

70 100 20 60 80 50 120 40 10 30 90 110

(The pivot, 90, now sits at position last - 1.)

The Partition now proceeds as before, yielding:

70 30 20 60 80 50 10 40 120 100 90 110

70 30 20 60 80 50 10 40 90 100 120 110

(The pivot, 90, ends up in its final position.)

According to Weiss, median-of-three partitioning not only eliminates the worst-case behavior of Quicksort, it also reduces its running time by about 5%.[2]

[2] Ibid., page 243.
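Here is a compilable sketch of the median-of-three setup described above, again assuming the int[] field a and the swap helper from the earlier partition sketch. It follows the description in these notes (order the three candidate elements, then park the median next to the right end) rather than reproducing the textbook's code; the method name medianOfThree is an illustrative choice.

    // Order a[first], a[mid], a[last], then hide the median (the pivot) at last - 1
    // so that the normal scan can start its markers at first and last - 1.
    public int medianOfThree(int first, int last) {
        int mid = (first + last) / 2;
        if (a[mid]  < a[first]) swap(mid,  first);   // ensure a[first] <= a[mid]
        if (a[last] < a[first]) swap(last, first);   // ensure a[first] <= a[last]
        if (a[last] < a[mid])   swap(last, mid);     // ensure a[mid]  <= a[last]
        swap(mid, last - 1);                         // move the pivot to last - 1
        return a[last - 1];                          // the pivot value
    }

On the example array this orders the three candidates 90, 110 and 70 into 70, 90, 110 and then leaves the pivot 90 at position last - 1, matching the arrays shown above.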

Small Partitions (pages 350-354)

When using median-of-three partitioning, the Quicksort recursion will not work for partitions smaller than three elements, so we must substitute a different sorting method, such as Straight Selection or Linear Insertion sort. This can be implemented fairly simply in one of two ways:

1. Change SEGMENT_SIZE to some value larger than 1. Then recquicksort's initial if statement:

    if ( last - first + 1 <= SEGMENT_SIZE ) return;

will stop at small segments, leaving the entire array almost, but not quite, sorted. Once Quicksort is finished, you can run the entire array through the Linear Insertion sort to complete the job. Since the array is almost sorted, Linear Insertion will run in approximately O(n) time.

2. Change SEGMENT_SIZE to some value larger than 1 and change recquicksort's initial if statement to:

    if ( last - first + 1 <= SEGMENT_SIZE ) othersort( first, last );

where othersort is, say, Linear Insertion sort, which works well on small segments. (A sketch of this option appears at the end of this section.)

Sedgewick says that either of these techniques, when used with median-of-three partitioning, results in about a 20% reduction in the running time of Quicksort. Furthermore, this improvement is about the same for any cutoff value (SEGMENT_SIZE, Sedgewick's M) from 5 to 25.[3]

[3] Sedgewick, Robert, Algorithms, Addison-Wesley Publishing Company, 1988, page 124.
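The following is a minimal sketch of option 2, assuming the same int[] field a and the partition method sketched earlier: recquicksort hands any segment at or below the cutoff to a simple Linear Insertion sort. The cutoff value 10 and the method name insertionSort are illustrative choices; any cutoff in Sedgewick's 5-to-25 range behaves about the same.

    // Option 2: small segments go to an insertion sort instead of recursing further.
    private static final int SEGMENT_SIZE = 10;

    public void recquicksort(int first, int last) {
        if (last - first + 1 <= SEGMENT_SIZE) {
            insertionSort(first, last);          // "othersort" for small segments
            return;
        }
        int pivot = partition(first, last);
        recquicksort(first, pivot - 1);
        recquicksort(pivot + 1, last);
    }

    // Linear Insertion sort of the segment a[first..last].
    private void insertionSort(int first, int last) {
        for (int i = first + 1; i <= last; i++) {
            int item = a[i];
            int j = i;
            while (j > first && a[j - 1] > item) {
                a[j] = a[j - 1];                 // shift larger items one slot up
                j--;
            }
            a[j] = item;                         // drop item into its place
        }
    }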