Sorting. Weiss chapter 8.1–8.3, 8.6

Sorting: 5 3 9 2 8 7 3 2 1 4 → 1 2 2 3 3 4 5 7 8 9. There are very many different sorting algorithms (bubblesort, insertion sort, selection sort, quicksort, heapsort, mergesort, shell sort, counting sort, radix sort, ...)

Application of sorting. Are two words anagrams of each other? Sort both words, and check if they come out the same! meteor → eemort, remote → eemort
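As an illustration (my addition, not from the slides; the class and method names are my own), this anagram check could look like this in Java:

import java.util.Arrays;

class Anagram {
    // Two words are anagrams exactly when their sorted characters are equal.
    static boolean isAnagram(String s, String t) {
        char[] a = s.toCharArray();
        char[] b = t.toCharArray();
        Arrays.sort(a);
        Arrays.sort(b);
        return Arrays.equals(a, b);
    }

    public static void main(String[] args) {
        // true: both words sort to "eemort"
        System.out.println(isAnagram("meteor", "remote"));
    }
}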

Bubblesort. Check for adjacent elements that are in the wrong order, and swap them, starting from one end of the array and working towards the other.

Bubblesort. Compare a[0] and a[1]: 5 3 9 2 8 → 3 5 9 2 8

Bubblesort. Compare a[1] and a[2]: 3 5 9 2 8 → 3 5 9 2 8 (no swap)

Bubblesort. Compare a[2] and a[3]: 3 5 9 2 8 → 3 5 2 9 8

Bubblesort. Compare a[3] and a[4]: 3 5 2 9 8 → 3 5 2 8 9

Bubblesort. Back to the beginning! Compare a[0] and a[1]: 3 5 2 8 9 → 3 5 2 8 9 (no swap)

Bubblesort. Compare a[1] and a[2]: 3 5 2 8 9 → 3 2 5 8 9

Bubblesort. Compare a[2] and a[3]: 3 2 5 8 9 → 3 2 5 8 9 (no swap)

Bubblesort. Compare a[3] and a[4]: 3 2 5 8 9 → 3 2 5 8 9 (no swap)

Bubblesort. Back to the beginning! Compare a[0] and a[1]: 3 2 5 8 9 → 2 3 5 8 9

Bubblesort. What order to do the swaps in? Start at the beginning of the array and loop upwards until you reach the top, then go round again. How do we know when to stop? When the array is sorted, that is, when the last loop didn't swap any elements.

Bubblesort.

void sort(int[] a) {
    boolean swapped;
    do {
        swapped = false;
        for (int i = 0; i < a.length - 1; i++) {
            if (a[i] > a[i+1]) {
                // Swap a[i] and a[i+1]
                int x = a[i];
                a[i] = a[i+1];
                a[i+1] = x;
                swapped = true;
            }
        }
    } while (swapped);
}

This for-loop is O(n), but how many times does the do-loop execute?
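A quick usage sketch (my addition, not from the slides), assuming the sort method above is made static and placed in the same class as this main method:

class BubbleSortDemo {
    public static void main(String[] args) {
        int[] a = {5, 3, 9, 2, 8};
        sort(a);  // the bubblesort method above
        // Prints [2, 3, 5, 8, 9]
        System.out.println(java.util.Arrays.toString(a));
    }
}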

Performance of bubblesort. After one loop, the biggest element in the array has bubbled up to the top (hence the name bubblesort): look at what happens to 9 in our example. So the do-loop executes at most n times. Total complexity: O(n²) (we assume that comparisons and swaps take O(1) time).

Bubble sort is bad! What if the array is in reverse order? 9 8 5 3 2. After one loop, only the 9 is in the right place: 8 5 3 2 9. It is very inefficient.

Insertion sort. Imagine someone deals you cards. You pick up each one in turn and put it into the right place in your hand. This is the idea of insertion sort.

Insertion sort Sorting 5 3 9 2 8 : Start by picking up the 5: 5

Insertion sort Sorting 5 3 9 2 8 : Then insert the 3 into the right place: 3 5

Insertion sort Sorting 5 3 9 2 8 : Then the 9: 3 5 9

Insertion sort Sorting 5 3 9 2 8 : Then the 2: 2 3 5 9

Insertion sort Sorting 5 3 9 2 8 : Finally the 8: 2 3 5 8 9

Complexity of insertion sort. Insertion sort does n insertions for an array of size n. Does this mean it is O(n)? No! An insertion is not constant time: to insert into a sorted array, you must move all the elements above the insertion point up one, which is O(n). Thus the total is O(n²).

Insertion sort in Haskell.

insert x [] = [x]
insert x (y:ys)
  | x < y     = x : y : ys
  | otherwise = y : insert x ys

sort [] = []
sort (x:xs) = insert x (sort xs)
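A quick check (my addition), assuming the definitions above have been loaded into GHCi:

ghci> sort [5,3,9,2,8]
[2,3,5,8,9]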

In-place insertion sort. A sorting algorithm is in-place if it does not need to create any temporary arrays. Let's make an in-place insertion sort! Basic idea: loop through the array, and insert each element into the part which is already sorted.

In-place insertion sort. 5 3 9 2 8. The first element of the array is already sorted on its own: 5 | 3 9 2 8 (the part before the bar, shown in white on the slide, is sorted).

In-place insertion sort. 5 | 3 9 2 8. Insert the 3 into the correct place: 3 5 | 9 2 8

In-place insertion sort. 3 5 | 9 2 8. Insert the 9 into the correct place: 3 5 9 | 2 8

In-place insertion sort. 3 5 9 | 2 8. Insert the 2 into the correct place: 2 3 5 9 | 8

In-place insertion sort. 2 3 5 9 | 8. Insert the 8 into the correct place: 2 3 5 8 9

In-place insertion. To insert an item, make space by moving everything greater than it upwards. For example, to insert 4 into 2 3 5 9: first move the 9 up, then the 5, and finally put the 4 into the gap, giving 2 3 4 5 9.

In-place insertion. The notation a[0..n) means the elements at indices 0, 1, ..., n-1.

// Assuming that a[0..n) is sorted,
// insert a[n] into the right place
// so that a[0..n] is sorted
void insert(int[] a, int n) {
    int x = a[n];
    int i = n;
    // Move everything greater than x up one step
    while (i > 0 && a[i-1] > x) {
        a[i] = a[i-1];
        i--;
    }
    a[i] = x;
}

In-place insertion sort.

void sort(int[] array) {
    for (int i = 1; i < array.length; i++)
        insert(array, i);
}

An aside: we have the invariant that array[0..i) is sorted. An invariant is something that holds whenever the loop starts. Initially, i = 1 and array[0..1) is sorted. When array[0..i) is sorted, the loop body makes array[0..i+1) sorted, establishing the invariant for the next iteration. When the loop finishes, i = array.length = n, so array[0..n) is sorted: the whole array!
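To make the invariant explicit, the same loop can be annotated like this (my annotation, not from the slides):

void sort(int[] array) {
    // array[0..1) is trivially sorted, so the invariant holds before the first iteration
    for (int i = 1; i < array.length; i++) {
        // Invariant: array[0..i) is sorted here
        insert(array, i);
        // Now array[0..i+1) is sorted: the invariant for the next iteration
    }
    // The loop exits with i = array.length, so the whole array is sorted
}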

A negative result. Bubblesort and insertion sort are both based on swapping adjacent elements. No sorting algorithm that works like this can be better than O(n²)! See section 8.3 for details.

Selection sort. Find the smallest element of the array, and delete it. Find the smallest remaining element, and delete it. And so on. Finding the smallest element is O(n), so the total complexity is O(n²).

Selection sort Sorting 5 3 9 2 8 : The smallest element is 2: 2 We also delete 2 from the input array.

Selection sort Sorting 5 3 9 8 : Now the smallest element is 3: 2 3 We delete 3 from the input array.

Selection sort Sorting 5 9 8 : Now the smallest element is 5: 2 3 5 We delete 5 from the input array. (...and so on)
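The delete-based version just described might be sketched in Java as follows (my own code, not from the slides; the method name selectionSort and the use of ArrayList are my choices):

import java.util.ArrayList;
import java.util.List;

class SelectionSortDemo {
    // Repeatedly find the smallest remaining element, remove it from the
    // input, and append it to the output: O(n) per step, O(n²) in total.
    static List<Integer> selectionSort(List<Integer> input) {
        List<Integer> in = new ArrayList<>(input);   // copy, so the caller's list survives
        List<Integer> out = new ArrayList<>();
        while (!in.isEmpty()) {
            int minIndex = 0;
            for (int i = 1; i < in.size(); i++)
                if (in.get(i) < in.get(minIndex)) minIndex = i;
            out.add(in.remove(minIndex));
        }
        return out;
    }

    public static void main(String[] args) {
        // Prints [2, 3, 5, 8, 9]
        System.out.println(selectionSort(List.of(5, 3, 9, 2, 8)));
    }
}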

In-place selection sort Instead of deleting the smallest element, swap it with the first element! The next time round, ignore the first element of the array: we know it's the smallest one. Instead, find the smallest element of the rest of the array, and swap it with the second element.

In-place selection sort Sorting 5 3 9 2 8 : The smallest element is 2: 2 3 9 5 8

In-place selection sort 2 3 9 5 8 The smallest element in the rest of the array is 3: 2 3 9 5 8

In-place selection sort 2 3 9 5 8 The smallest element in the rest of the array is 5: 2 3 5 9 8

In-place selection sort 2 3 5 9 8 The smallest element in the rest of the array is 8: 2 3 5 8 9

In-place selection sort.

void sort(int[] a) {
    for (int i = 0; i < a.length - 1; i++) {
        // Find the index of the smallest element in a[i..n)
        int min = i;
        for (int j = i + 1; j < a.length; j++)
            if (a[j] < a[min]) min = j;
        // Swap it into position i
        int x = a[min];
        a[min] = a[i];
        a[i] = x;
    }
}

Divide and conquer algorithms and quicksort

Divide and conquer. A very general name for a type of recursive algorithm. You have a problem to solve: split that problem into smaller subproblems, recursively solve those subproblems, then combine the solutions for the subproblems to solve the whole problem.

To solve a problem: 1. Split the problem into subproblems. 2. Recursively solve the subproblems. 3. Combine the solutions.

Quicksort. Pick an element from the array, called the pivot. Partition the array: first come all the elements smaller than the pivot, then the pivot, then all the elements greater than the pivot. Recursively quicksort the two partitions.

Quicksort. 5 3 9 2 8 7 3 2 1 4. Say the pivot is 5. Partition the array into all elements less than 5, then 5, then all elements greater than 5: 3 3 2 2 1 4 | 5 | 9 8 7 (less than the pivot | pivot | greater than the pivot).

Quicksort. Now recursively quicksort the two partitions! 3 3 2 2 1 4 | 5 | 9 8 7 → quicksort each partition → 1 2 2 3 3 4 5 7 8 9

Pseudocode.

// call as sort(a, 0, a.length-1);
void sort(int[] a, int low, int high) {
    if (low >= high) return;
    // assume that partition returns the
    // index where the pivot now is
    int pivot = partition(a, low, high);
    sort(a, low, pivot-1);
    sort(a, pivot+1, high);
}

Usual optimisation: switch to insertion sort when the input array is small.
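How partitioning is done is covered in the next lecture; for completeness, here is one common scheme (Lomuto partitioning) as a sketch of my own, not necessarily the method used in the course. It uses a[high] as the pivot and returns the pivot's final index, matching the assumption in the pseudocode above:

int partition(int[] a, int low, int high) {
    int pivot = a[high];           // use the last element as the pivot
    int i = low;                   // a[low..i) holds the elements < pivot seen so far
    for (int j = low; j < high; j++) {
        if (a[j] < pivot) {
            int tmp = a[i]; a[i] = a[j]; a[j] = tmp;
            i++;
        }
    }
    // Put the pivot between the two partitions and return its index
    int tmp = a[i]; a[i] = a[high]; a[high] = tmp;
    return i;
}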

Haskell code.

sort [] = []
sort (x:xs) = sort (filter (< x) xs) ++ [x] ++ sort (filter (>= x) xs)

Split: filter. Combine: ++.

Complexity of quicksort. In the best case, partitioning splits an array of size n into two halves of size n/2.

Complexity of quicksort. The recursive calls then split these arrays into four arrays of size n/4, then eight of size n/8, and so on.

At each level of this recursion tree the partitioning work adds up to O(n), and there are log n levels, so the total time is O(n log n)!

Complexity of quicksort. But that's the best case! In the worst case, everything is greater than the pivot (say). The recursive call then has size n-1, which in turn recurses with size n-2, etc. The total time spent in partitioning is n + (n-1) + (n-2) + ... + 1 = n(n+1)/2 = O(n²).

Worst cases: a sorted array, and a reverse-sorted array. Try these out!

Complexity of quicksort. Quicksort works well when the pivot splits the array into roughly equal parts. Median-of-three: pick the first, middle and last elements of the array, and use the median of those three as the pivot. Pick the pivot at random: gives O(n log n) expected (probabilistic) complexity. Introsort: detect when we get into the O(n²) case and switch to a different algorithm (e.g. heapsort).
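A sketch of the random-pivot idea (my own code, not from the slides; it assumes it lives in the same class as the partition method shown earlier, which uses a[high] as its pivot): swap a randomly chosen element into the pivot position before partitioning.

java.util.Random rng = new java.util.Random();

void randomizedSort(int[] a, int low, int high) {
    if (low >= high) return;
    // Swap a randomly chosen element into the pivot position
    int p = low + rng.nextInt(high - low + 1);
    int tmp = a[p]; a[p] = a[high]; a[high] = tmp;
    int pivot = partition(a, low, high);
    randomizedSort(a, low, pivot - 1);
    randomizedSort(a, pivot + 1, high);
}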

Summary of quicksort. A divide-and-conquer algorithm: choose a pivot, partition the array into two, and recursively sort both partitions. O(n log n) if both partitions have about equal size, O(n²) if one is much bigger than the other. One solution: choose the pivot at random (others in the book). Very fast in practice.

Next lecture. How to perform partitioning. More sorting algorithms. Is O(n log n) the limit of sorting? How to find the complexity of recursive programs.