
Sorting

Next to storing and retrieving data, sorting is one of the more common algorithmic tasks, with many different ways to perform it. Whenever we perform a web search or view statistics at some website, the presented data has most likely been sorted in some way. In this lecture and in the following lectures we will examine several different ways of sorting. The following are some reasons for investigating several of the different algorithms (as opposed to just one or two, or only the "best" algorithm).

- There exist very simple, easily understood algorithms which, although they behave poorly on large data sets, perform well for small amounts of data, or when the range of the data is sufficiently small.

- There exist sorting algorithms which have been shown to be more efficient in practice.

- There are still other algorithms which work better in specific situations; for example, when the data is mostly sorted, or when unsorted data needs to be merged into a sorted list (for example, adding names to a phonebook).

O(n) Sorting Algorithms: Counting Sort and Radix Sort

Counting Sort. Counting Sort is primarily used on data that is sorted by integer values which fall into a relatively small range (compared to the amount of random-access memory available on a computer). Without loss of generality, we can assume the range of integer values is [0 : m], for some m ≥ 0. Now, given array a[0 : n−1], the idea is to define an array of lists l[0 : m], scan a, and, for i = 0, 1, ..., n−1, store element a[i] in list l[v(a[i])], where v is the function that computes an array element's sorting value. The sorted list can then be obtained by scanning the lists of l one by one in increasing order, and placing the encountered objects in a final list. Both steps require Θ(n) steps.

Counting Sort is most commonly used on an array a of integers. One reason for this is that objects of a general type are often sorted on very large integers, which makes Counting Sort infeasible. For example, if an array of Employees is sorted based on social security number, then these values range in the hundreds of millions. When using an array of integers, l can be replaced by an integer array f, where f[i] represents the frequency of the elements of a that are equal to i.

Example 1. Perform Counting Sort on the elements 9, 3, 3, 10, 5, 10, 3, 4, 9, 10, 1, 3, 5, 2, 4, 9, 9.
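The frequency-array variant described above can be sketched as follows in Python (the function and variable names are ours, not from the lecture):

```python
def counting_sort(a, m):
    """Sort a list of integers in the range [0, m] by tallying frequencies."""
    f = [0] * (m + 1)           # f[i] = number of elements of a equal to i
    for x in a:
        f[x] += 1
    out = []
    for i in range(m + 1):      # scan the values in increasing order
        out.extend([i] * f[i])  # emit each value as many times as it occurred
    return out

# The data of Example 1:
print(counting_sort([9, 3, 3, 10, 5, 10, 3, 4, 9, 10, 1, 3, 5, 2, 4, 9, 9], 10))
# [1, 2, 3, 3, 3, 3, 4, 4, 5, 5, 9, 9, 9, 9, 10, 10, 10]
```

Both loops run in Θ(n + m) time, which is Θ(n) when m is small relative to n.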

Radix Sort. Radix Sort can be applied to an array of integers for which each integer is represented by k bits, and the time needed to access a single bit is O(1). The algorithm works in k stages. At the beginning of stage i, 1 ≤ i ≤ k, it is assumed that the integers are stored in some array a. The elements are then scanned one by one, with elements having i-th least significant bit equal to j (j = 0, 1) being placed in array b_j (keeping the same order as in a). The stage ends by rewriting a as the elements of b_0 followed by the elements of b_1. Assuming that k is held constant, the complexity of Radix Sort is Θ(kn) = Θ(n).

Example 2. Perform Radix Sort on the elements 9, 13, 10, 5, 10, 3, 4, 9, 1, 5, 2.
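The k-stage procedure above can be sketched directly (our own hypothetical helper; each stage splits on one bit, keeping the order stable):

```python
def radix_sort(a, k):
    """Sort nonnegative integers of at most k bits, one bit per stage."""
    for i in range(k):                            # stage i examines bit i (0 = least significant)
        b0 = [x for x in a if (x >> i) & 1 == 0]  # elements whose i-th bit is 0
        b1 = [x for x in a if (x >> i) & 1 == 1]  # elements whose i-th bit is 1
        a = b0 + b1                               # stable: relative order is preserved
    return a

# The data of Example 2 (all values fit in k = 4 bits):
print(radix_sort([9, 13, 10, 5, 10, 3, 4, 9, 1, 5, 2], 4))
# [1, 2, 3, 4, 5, 5, 9, 9, 10, 10, 13]
```

Stability of each stage is what makes the final order correct: after stage i, the array is sorted on the i+1 least significant bits.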

Insertion Sort. Insertion Sort represents a doubly iterative way of sorting data in an array.

Step 1: the first item of the array, by itself, forms a sorted list.

Step i + 1: assume the first i elements of the array are sorted. Move item i + 1 to the left into its appropriate location, so that the first i + 1 items are sorted.

Example 3. Use Insertion Sort on the array 43, 6, 72, 50, 44, 36, 21, 32, 47.

Code for Insertion Sort Applied to an Array of Integers:

//Sort in place the elements a[left:right].
void insertion_sort(int a[], int left, int right)
{
    int i, j, tmp;

    //Attempt to move element i to the left.
    for(i = left+1; i <= right; i++)
    {
        //Move left until finding the proper location of a[i].
        for(j = i; j > left; j--)
        {
            if(a[j] < a[j-1])
            {
                tmp = a[j-1];
                a[j-1] = a[j];
                a[j] = tmp;
            }
            else
                break; //found the proper location for a[i]
        }
    }
}

Expected Running Time of an Algorithm

When analyzing the running time of an algorithm, sometimes we are interested in its expected running time, or average running time. We denote this by the function T_ave(n), and it represents the running time averaged over all possible inputs of size n. A very useful tool for analyzing an algorithm's average running time is the notion of a random variable. We will usually denote random variables by capital letters, such as X, Y, Z, I, T, etc. A random variable X has a domain of values that X can assume at any given time. For example, if we let X denote the value showing face up after rolling a six-sided die, then the domain of X is dom(X) = {1, 2, 3, 4, 5, 6}. In this case we call X a numerical random variable, because its domain consists of a set of numbers. Finally, each domain value x is assigned a probability p(x), which indicates the likelihood that X will assume the value x. We require that Σ_{x ∈ dom(X)} p(x) = 1. We call these probabilities the probability distribution of X.

Example 4. Let random variable X be observed as the value of the two-bit binary number whose most significant bit is obtained by tossing a coin that has a probability of 1/4 of landing heads, and whose least significant bit is obtained by tossing a fair coin. Provide the domain of X and its associated probability distribution.

An important statistic of a numerical random variable X is its average value, or expectation. This is denoted by E[X], and is defined by E[X] = Σ_{x ∈ dom(X)} x·p(x). Hence, E[X] is just the sum of the domain values of X weighted by their probabilities.
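A quick numerical sketch of Example 4 (we assume heads produces a 1 bit, which the example leaves implicit):

```python
from fractions import Fraction

# Bit probabilities: P(msb = 1) = 1/4, P(lsb = 1) = 1/2.
p_msb = {1: Fraction(1, 4), 0: Fraction(3, 4)}
p_lsb = {1: Fraction(1, 2), 0: Fraction(1, 2)}

# Distribution of X = 2*msb + lsb, built over all bit combinations.
dist = {}
for b1, q1 in p_msb.items():
    for b0, q0 in p_lsb.items():
        x = 2 * b1 + b0
        dist[x] = dist.get(x, Fraction(0)) + q1 * q0

assert sum(dist.values()) == 1      # the probabilities form a distribution
print(sorted(dist.items()))
```

The domain works out to {0, 1, 2, 3}, with p(0) = p(1) = 3/8 and p(2) = p(3) = 1/8.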

Example 5. Compute E[X] for the random variable of Example 4.

Notice that we can obtain new random variables by combining existing ones. For example, if X and Y are numerical random variables, then Z_1 = X + Y, Z_2 = X − Y, Z_3 = XY, and Z_4 = X/Y are also random variables. Moreover, for sums of random variables, there is a very convenient way of computing the expectation. The proof of the following result can be found in most probability textbooks.

Theorem 1. If X and Y are numerical random variables, then E[X + Y] = E[X] + E[Y]. In general, if X_1, X_2, ..., X_n are numerical random variables, then E[X_1 + X_2 + ... + X_n] = E[X_1] + E[X_2] + ... + E[X_n].

Example 6. Let X be the outcome of rolling six-sided die 1, while Y is the result of rolling six-sided die 2. Verify that E[X + Y] = E[X] + E[Y].
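Example 6 can be verified by brute force over all 36 equally likely outcomes (a quick sketch):

```python
from fractions import Fraction
from itertools import product

faces = range(1, 7)
p = Fraction(1, 6)                  # fair die: each face has probability 1/6

E_X = sum(p * x for x in faces)     # 7/2
E_Y = sum(p * y for y in faces)     # 7/2

# E[X + Y] computed directly from the joint distribution of the two dice.
E_sum = sum(Fraction(1, 36) * (x + y) for x, y in product(faces, faces))

assert E_sum == E_X + E_Y == 7      # linearity of expectation, concretely
```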

A key insight into the problem of sorting is to realize that every array of n objects (which have some ordering property) represents a permutation of the integers 1, 2, ..., n. An n-permutation is simply an ordered arrangement of the first n positive integers. In other words, if σ is an n-permutation, then we write σ = (σ(1) σ(2) ... σ(n)), where σ(i) and σ(j) are positive integers less than n + 1, and are distinct if and only if i ≠ j.

Example 7. What is the permutation associated with the unsorted array of Example 3?

An inversion of a permutation σ is a pair (i, j) such that i < j and σ(j) < σ(i). For example, the permutation ( ) has six inversions.
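Inversions can be enumerated directly from the definition (a hypothetical helper of our own):

```python
def inversions(perm):
    """Return the set of inversions (i, j): i < j but perm maps j below i.
    Positions are 1-based, matching the sigma(i) notation."""
    n = len(perm)
    return {(i + 1, j + 1)
            for i in range(n)
            for j in range(i + 1, n)
            if perm[j] < perm[i]}

# A sorted permutation has no inversions; a fully reversed 4-permutation
# has 4*3/2 = 6 of them.
assert inversions([1, 2, 3, 4]) == set()
assert len(inversions([4, 3, 2, 1])) == 6
```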

Observation. Every successful comparison of Insertion Sort (one that results in a swap) reduces the number of inversions by 1. Thus, the number of steps (comparisons) needed for Insertion Sort is directly proportional to the number of inversions possessed by the array.

Lemma 1. The average number of inversions possessed by a random permutation is n(n−1)/4.

Proof. For each i and j, with 1 ≤ i < j ≤ n, let I_ij be an indicator random variable that equals 1 iff, when an n-permutation is constructed at random, index i occurs to the right of index j. Then E[I_ij] = pr(I_ij = 1) = 0.5. Also, the number of inversions of the random permutation equals Σ_{1 ≤ i < j ≤ n} I_ij. Therefore, the expected number of inversions is

E[Σ_{1 ≤ i < j ≤ n} I_ij] = Σ_{1 ≤ i < j ≤ n} E[I_ij] = Σ_{1 ≤ i < j ≤ n} 1/2 = (1/2)·(n choose 2) = n(n−1)/4.

Theorem 2. The average-case and worst-case running times of Insertion Sort are O(n^2).

Proof. The worst case occurs when the array is sorted in reverse order. In this case the array has the maximum number of inversions, which is Θ(n^2). Moreover, Lemma 1 established that, even in the average case (where we assume every permutation is equally likely), the number of inversions is n(n−1)/4 = Θ(n^2).
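Lemma 1 can also be checked empirically; the following simulation sketch (trial count and tolerance are our own choices) averages the inversion count over random permutations:

```python
import random

def count_inversions(a):
    """Number of pairs i < j with a[j] < a[i], straight from the definition."""
    return sum(1 for i in range(len(a))
                 for j in range(i + 1, len(a))
                 if a[j] < a[i])

# For n = 8, Lemma 1 predicts an average of n(n-1)/4 = 14 inversions.
random.seed(1)
n, trials = 8, 20000
total = 0
for _ in range(trials):
    p = list(range(1, n + 1))
    random.shuffle(p)
    total += count_inversions(p)
avg = total / trials
print(avg)      # close to 14
```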

Introduction to Divide-and-Conquer Algorithms

There exist many problems that can be solved using a divide-and-conquer algorithm. A divide-and-conquer algorithm A follows these general guidelines.

Divide. Algorithm A divides the original problem into one or more subproblems of a smaller size.

Conquer. Each subproblem is solved by making a recursive call to A.

Combine. Finally, A combines the subproblem solutions into a final solution for the original problem.

Some problems that can be solved using a divide-and-conquer algorithm:

Binary Search: locating an element in a sorted array.

Quicksort and Mergesort: sorting an array.

Order Statistics: finding the k-th least or greatest element of an array.

Geometric Algorithms: finding the convex hull of a set of points; finding the two points in a plane that are closest.

Matrix Operations: matrix inversion, Fast Fourier Transform, matrix multiplication.

Maximum Subsequence Sum: finding the maximum sum of any subsequence in a sequence of integers.

A divide-and-conquer recurrence relation is used to define the running time T(n) of a divide-and-conquer algorithm. These relations take the general form T(n) = aT(n/b) + f(n), where a represents the number of subproblems, n/b gives the size of each subproblem, and f(n) provides the number of steps needed to divide the input and combine the solutions. Example 9 provides one approach for solving these recurrence relations.
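As a small illustration of the divide/conquer/combine pattern, here is Binary Search written as a recursion (its running time satisfies T(n) = T(n/2) + O(1)); the function is our own sketch:

```python
def binary_search(a, x, low, high):
    """Divide and conquer: locate x in the sorted slice a[low..high].
    Returns the index of x, or -1 if x is not present."""
    if low > high:
        return -1                                  # empty subarray: x not found
    mid = (low + high) // 2                        # divide at the midpoint
    if a[mid] == x:
        return mid
    if x < a[mid]:                                 # conquer: only one half can contain x
        return binary_search(a, x, low, mid - 1)
    return binary_search(a, x, mid + 1, high)

a = [2, 3, 5, 7, 11, 13]
print(binary_search(a, 11, 0, len(a) - 1))         # index 4
```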

Mergesort is an algorithm that sorts an array of elements by splitting it into two halves, a_left[] and a_right[], recursively sorting both halves (again using Mergesort), and then merging the two sorted halves into one sorted array.

Example 8. Demonstrate Mergesort using the array 5, 8, 6, 2, 7, 1, 0, 9, 3, 4, 6.
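The split/sort/merge scheme can be sketched as follows (our own minimal version, returning a new list rather than sorting in place):

```python
def merge(left, right):
    """Merge two sorted lists into one sorted list."""
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    return out + left[i:] + right[j:]   # one side may still have leftovers

def mergesort(a):
    if len(a) <= 1:                     # base case: already sorted
        return a
    mid = len(a) // 2                   # split into a_left and a_right
    return merge(mergesort(a[:mid]), mergesort(a[mid:]))

# The data of Example 8:
print(mergesort([5, 8, 6, 2, 7, 1, 0, 9, 3, 4, 6]))
# [0, 1, 2, 3, 4, 5, 6, 6, 7, 8, 9]
```

Merging two sorted halves of total length n takes Θ(n) steps, which is what makes the recurrence of Example 9 work out.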

Example 9. Provide a divide-and-conquer recurrence relation for Mergesort. Make a recursion tree for the relation and use it to analyze the growth of T(n).

Quicksort

Quicksort is considered in practice to be the most efficient sorting algorithm for arrays of data stored in local memory. Quicksort is a divide-and-conquer algorithm which works in the following manner. Let a[] be the array to be sorted.

1. If a[] is an array with 5 or fewer elements, then sort the array using Insertion Sort.

2. Find an array element M (called the pivot) which is a good candidate for splitting a[] into two subarrays, a_left[] and a_right[], such that x ≤ M for every x ∈ a_left[] and x ≥ M for every x ∈ a_right[].

3. Swap the elements of a[] so that the elements x ≤ M move to the left side of a[] to form a_left[], and the elements x ≥ M move to the right side of a[] to form a_right[].

4. a[] is now of the form a_0, a_1, ..., a_{i−1}, a_i = M, a_{i+1}, a_{i+2}, ..., a_{n−1}, where a_j ≤ M for every j ≤ i−1, and a_j ≥ M for every j ≥ i.

5. Let a_left = a[0 : (i−1)] and a_right = a[(i+1) : (n−1)]. After both a_left and a_right have been sorted using Quicksort, the entire array a[] will be sorted.

Note that in the future we refer to the algorithm that partitions a[] into a_left[] and a_right[] (relative to some pivot M) as the Partition algorithm.

Finding an element to split the array. A median for an array a[] is an element M ∈ a[] which splits the array into two equal pieces, where piece 1 (respectively, piece 2) consists of elements all of which are less than or equal to (respectively, greater than or equal to) M. Although finding the median M of a[] would satisfy step 2 of Quicksort, in practice finding the median of the entire array seems too costly. On the other hand, a compromise between speed and accuracy that seems to work in practice is the median-of-three rule, which takes the median of the three elements a[0], a[n−1], and a[⌊(n−1)/2⌋].

Swapping elements of a[]. Once a pivot M has been selected, it can be swapped with the last element of the array (of course, with the median-of-three strategy, it might already be the last element).
The remaining elements a[0] through a[n−2] can then be swapped using two markers, left and right, which respectively begin on the left and right sides of the array. Both markers move toward the center of the array. A marker stops when it encounters an element which should be on the other side of the array. For example, marker left will stop when it encounters an element x for which x ≥ M. When both markers have stopped, the elements at those positions are swapped, unless the markers have crossed one another, in which case the process terminates.
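A compact sketch of Quicksort with the median-of-three pivot rule follows. For brevity it uses a single left-to-right partition scan rather than the two-marker scheme described above, and it omits the small-array Insertion Sort cutoff; both are our simplifications.

```python
def median_of_three(a, lo, hi):
    """Index of the median of a[lo], a[mid], a[hi], used as the pivot."""
    mid = (lo + hi) // 2
    trio = sorted([(a[lo], lo), (a[mid], mid), (a[hi], hi)])
    return trio[1][1]

def quicksort(a, lo=0, hi=None):
    if hi is None:
        hi = len(a) - 1
    if lo >= hi:
        return
    p = median_of_three(a, lo, hi)
    a[p], a[hi] = a[hi], a[p]           # park the pivot at the end
    pivot = a[hi]
    i = lo                              # elements < pivot accumulate in a[lo:i]
    for j in range(lo, hi):
        if a[j] < pivot:
            a[i], a[j] = a[j], a[i]
            i += 1
    a[i], a[hi] = a[hi], a[i]           # pivot lands at its final index i
    quicksort(a, lo, i - 1)             # recursively sort a_left
    quicksort(a, i + 1, hi)             # recursively sort a_right

# The data of Example 10:
data = [5, 8, 6, 2, 7, 1, 0, 9, 3, 4, 6]
quicksort(data)
print(data)                             # [0, 1, 2, 3, 4, 5, 6, 6, 7, 8, 9]
```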

Example 10. Demonstrate the Quicksort algorithm using the array 5, 8, 6, 2, 7, 1, 0, 9, 3, 4, 6.

Complexity of Quicksort. The time complexity (i.e. the number of steps T(n) for an array of n comparables) of Quicksort depends on how the pivot is chosen. Later in this lecture we demonstrate how to find an exact median in O(n) steps. Using this approach, Quicksort has a worst-case complexity of O(n log n). On the other hand, if the pivot is chosen randomly, then it can be shown that Quicksort has O(n log n) average-case complexity, but O(n^2) worst-case complexity. In practice, the median-of-three approach gives empirically faster running times than both the exact and random approaches.

Example 11. Verify that, if an exact median for an array of n comparables can be found in O(n) steps, then Quicksort has a worst-case complexity of O(n log n).

Example 12. Show that the average size of a_left[] is (n−1)/2 when the input to Quicksort is n distinct elements, and the pivot M is randomly chosen from one of the elements.
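Example 12 can be checked exactly: if the pivot is equally likely to be the k-th smallest of n distinct elements, then a_left receives k − 1 elements, so averaging over k gives (n−1)/2. A one-function sketch (ours):

```python
from fractions import Fraction

def avg_left_size(n):
    """Average size of a_left when the pivot is uniform over n distinct
    elements: the k-th smallest element leaves k - 1 elements on the left."""
    return sum(Fraction(k - 1, n) for k in range(1, n + 1))

assert avg_left_size(9) == Fraction(9 - 1, 2)   # (n-1)/2 for n = 9
```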

General Solution to the Divide-and-Conquer Recurrence Relation

Theorem 3. Let n equal a positive power of b. Then the asymptotic growth of the general solution to the recurrence relation T(n) = aT(n/b) + f(n) is given by

T(n) = Θ(n^(log_b a)) + Σ_{j=0}^{(log_b n) − 1} a^j f(n/b^j).
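As a numeric sanity check of Theorem 3, we can compare the exact recurrence for the Mergesort-style case a = 2, b = 2, f(n) = n against the theorem's closed form. The base case T(1) = 1 is our assumption, and we take the Θ term as exactly n^(log_b a)·T(1):

```python
import math
from functools import lru_cache

a, b = 2, 2
def f(n):
    return n                             # Mergesort-style cost: T(n) = 2T(n/2) + n

@lru_cache(maxsize=None)
def T(n):
    """The exact recurrence, with assumed base case T(1) = 1."""
    return 1 if n == 1 else a * T(n // b) + f(n)

def theorem3(n):
    """Theorem 3's closed form, for n a power of b."""
    k = int(round(math.log(n, b)))       # log_b n
    return n ** (math.log(a) / math.log(b)) + sum(a**j * f(n // b**j) for j in range(k))

n = 2 ** 10
print(T(n), theorem3(n))                 # both equal n + n*log2(n) = 11264
```

For this recurrence the two expressions agree exactly, and both grow as Θ(n log n).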

Example 13. Use Theorem 3 to determine the running time of a divide-and-conquer algorithm that requires three recursive calls (each with input size n/2) and 4n^2 steps for dividing the input and using the three solutions to obtain the final solution to the original problem.

Other sorting algorithms:

Shellsort: similar to Insertion Sort, but the sorting is done iteratively on subarrays of the original array; the elements of a subarray are equidistant from one another, with that distance converging to one on the final pass.

Bubblesort: similar to Insertion Sort, but the swapping of elements begins at the end of the array, rather than at the front.

Exercises.

1. If Counting Sort is used to sort an array of integers that fall within the interval [−5000, 10000], then how large an auxiliary array should one use? Explain.

2. What is the running time of Insertion Sort if all elements are equal? Explain.

3. Sort 13, 1, 4, 5, 8, 7, 9, 11, 3 using Insertion Sort.

4. Sort 13, 1, 4, 5, 8, 7, 9, 11, 3 using Radix Sort.

5. Suppose we generate a three-bit binary number in the following manner. The least significant bit is generated by tossing a fair coin (e.g. heads yields a 1 bit, while tails yields a 0 bit). The middle bit is obtained by tossing a coin that has probability 0.60 of landing heads (again, heads yields a 1 bit). Finally, the leading bit is obtained by multiplying the first two bits. Let X denote the decimal value of the generated number. State the domain of X, and provide a probability distribution for the domain. Verify that your distribution adds to one.

6. Compute E[X] for the random variable X of the previous problem.

7. Let X be a random bit that has a probability of 0.5 of equaling 1, and Y a random bit that has probability 0.75 of equaling 1. Then let Z = XY. Finally, let W = X + Y + Z. Compute E[W] in two different ways: i) by determining the domain and probability distribution for W, and ii) by using Theorem 1.

8. Consider the linear-search algorithm which scans through an array a to determine if a contains an element x. Assuming a has size n, we say that the algorithm requires i steps if x is located at index i, i.e. a[i] = x, for i = 0, 1, ..., n−1. Furthermore, we say the algorithm requires n steps if x is not found in a. Suppose that we know the following statistics about linear search: 60% of all searches fail to locate the element x in a. Moreover, when x is found in a, it is equally likely to be in any of the array locations.
Letting S denote the number of steps needed for a linear search over an array of size n, use the above facts to determine i) the domain of S, ii) a probability distribution for the domain of S, and iii) E[S].

9. An n-permutation is an ordering of the numbers 0, 1, 2, ..., n−1, in which each number occurs exactly once. For example, 4, 3, 1, 2, 0 is a 5-permutation. Assume there is a function called rand_int(i,j) which, on inputs i ≤ j, returns a randomly chosen integer from the set {i, i+1, ..., j}. Now consider the following algorithm which, on input n, generates a random n-permutation within an array a[0], ..., a[n−1]. To assign a[i] it calls rand_int(0,n-1) until rand_int returns a value that was not assigned to a[0], a[1], ..., a[i−1]. The code for this algorithm is provided as follows.

int a[n];

//Initialize a[]
for(i=0; i < n; i++)
    a[i] = UNDEFINED;

for(i=0; i < n; i++)

{
    while(a[i] == UNDEFINED)
    {
        m = rand_int(0,n-1);

        //Check if m has already been used
        for(j=0; j < i; j++)
            if(a[j] == m)
                break;

        if(i == j) //m has not previously been used
            a[i] = m;
    }
}

What is the expected running time of this algorithm? Explain and show work.

10. Repeat the previous problem, but now assume that, rather than checking each of a[0], ..., a[i−1] to see if the current random value m has already been used, an array called used is provided, so that m has been used iff used[m] evaluates to true. In other words, the used array is initialized so that each of its values is set to false, and then used[m] is set to true when m is first returned by rand_int(0,n-1). Rewrite the code of the previous problem to adapt it to this new algorithm, and analyze its expected running time.

11. This problem provides an even better approach to generating a random permutation. The algorithm starts by assigning a[i] the value i, for i = 0, 1, ..., n−1. It then iterates n times so that, on iteration i, i = 0, 1, ..., n−1, it swaps a[i] with a[k], where k is randomly chosen from the set {i, i+1, ..., n−1}. Prove that this algorithm yields a random n-permutation written in a[]. What is its expected running time?

12. Sort 13, 1, 4, 5, 8, 7, 9, 11, 3 using Mergesort. Assume the base case is an array of size 2 or less.

13. Perform the partitioning algorithm on 13, 1, 4, 5, 8, 7, 9, 11, 3, using the median-of-three heuristic to find the pivot.

14. Why is the asymptotic running time of Mergesort independent of the initial ordering of the array elements? Is the same true for Quicksort if the selected pivot is an exact median? What if the selected pivot is obtained using the median-of-three heuristic?

15. Suppose we swap elements a[i] and a[i + k] which were originally out of order (inverted). Under what conditions will only one inversion be removed? Under what conditions will 2k − 1 inversions be removed? Argue that 2k − 1 is the greatest number of inversions that can ever be removed by such a swap.

16.
What is the worst-case running time of Quicksort if the pivot is chosen as the first element in the array? Explain.

17. List all the inversions that occur in the array 13, 1, 4, 5, 8, 7, 9, 11, 3.

18. Use the formula

T(n) = Σ_{j=0}^{(log_b n) − 1} a^j f(n/b^j) + Θ(n^(log_b a))

to determine the asymptotic growth of T(n) for each of the following.

T(n) = T(n/2) + 1

T(n) = 3T(n/3) + n

Exercise Hints and Answers.

1. Since there are 15,001 possible values for an array element, an auxiliary array of size 15,001 is needed.

2. Linear, since it requires zero swaps.

3. 1, 3, 4, 5, 7, 8, 9, 11, 3 becomes 1, 3, 3, 4, 5, 7, 8, 9, 11, 13; that is, the sorted order is 1, 3, 4, 5, 7, 8, 9, 11, 13.

4. After Round 1, the numbers should be ordered as 00100, 01000, 01101, 00001, 00101, 00111, 01001, 01011, 00011. After Round 2: 00100, 01000, 01101, 00001, 00101, 01001, 00111, 01011, 00011. Continue with Rounds 3, 4, and 5, using the 3rd, 4th, and 5th least significant bits of the numbers.

5. Domain of X = {0, 1, 2, 7}. p(0) = (0.5)(0.4) = 0.2, p(1) = (0.5)(0.4) = 0.2, p(2) = (0.5)(0.6) = 0.3, p(7) = (0.5)(0.6) = 0.3.

6. E[X] = (0)(0.2) + (1)(0.2) + (2)(0.3) + (7)(0.3) = 2.9.

7. Domain of W = {0, 1, 2, 3}. p(0) = (0.5)(0.25) = 1/8, p(1) = (0.5)(0.25) + (0.5)(0.75) = 1/2, p(2) = 0, p(3) = (0.5)(0.75) = 3/8. E[W] = (0)(1/8) + (1)(1/2) + (2)(0) + (3)(3/8) = 13/8. Also, using the linearity of expectation, we have E[W] = E[X] + E[Y] + E[Z], with E[X] = 1/2, E[Y] = 3/4, and E[Z] = 3/8. Thus, E[W] = 1/2 + 3/4 + 3/8 = 13/8.

8. Domain of S = {0, 1, ..., n}. The probability distribution is p(i) = 0.4/n for i = 0, ..., n−1, and p(n) = 0.6. Then

E[S] = Σ_{i=0}^{n−1} (0.4/n)·i + (n)(0.6) = (0.2)(n−1) + (0.6)n = 0.8n − 0.2.

Therefore, on average, about 80% of the array needs to be scanned.

9. Let T be the running time of the algorithm, and let S_i denote the number of calls to rand_int() that are needed in order to generate the i-th number of the permutation. Then T = (1)S_1 + 2S_2 + ... + nS_n. This is true since each call to rand_int() when generating the i-th permutation number will require an average of Θ(i) steps within the innermost loop to check if the generated number has been used. Now, when generating the i-th permutation number using rand_int(), there is a probability of p_i = (i−1)/n that a number will be generated which is already in the permutation.
The probability of not generating a repeat number is thus 1 − (i−1)/n = (n−i+1)/n, and so the expected number of calls needed to obtain a non-repeat is E[S_i] = n/(n−i+1). Thus, by linearity of expectation, E[T] = Σ_{i=1}^{n} i·n/(n−i+1). The asymptotic growth of this sum can be obtained by evaluating n·∫_1^n x/(n−x+1) dx, which yields E[T] = Θ(n^2 log n).

10. Same analysis as the previous problem, but now we have T = S_1 + ... + S_n, since the returned value of a call to rand_int() can be checked in O(1) steps as to whether or not it is a repeat. This yields the simplified expectation E[T] = Σ_{i=1}^{n} n/(n−i+1) = n·Σ_{k=1}^{n} 1/k, i.e. n times the harmonic series. Therefore E[T] = Θ(n log n).

11. First notice that the only operations performed on a are swaps between two different array entries. Thus, if the array begins with 0, ..., n−1, then it ends with a permutation of 0, ..., n−1. Moreover, all values are equally likely to be placed at position i, for each i = 0, ..., n−1. The expected running time is Θ(n), since the algorithm performs n swaps and n calls to a random-number generator.

12. 1, 3, 4, 5, 7, 8, 9, 11, 13

13. Pivot is the median of (13, 8, 3), which equals 8. a_left = 7, 1, 4, 5, 3, a_right = 9, 11, 13. Final pivot location is at index 5.

14. Regardless of the ordering, all the merge steps must still be performed in the Mergesort algorithm, and the merging step always takes Θ(n_1 + n_2), where n_1 is the length of the left sorted array and n_2 is the length of the right sorted array. The same is true for Quicksort if the median is used as the pivot: the partitioning step always requires Θ(n) steps, where n is the size of the subarray being partitioned. If the median-of-three heuristic is used, then it is possible to obtain unfavorable pivots that lead to an O(n^2) worst-case running time. Verify this for an array of size eight.

15. Only one inversion removed: a[i] > a[i+1] < a[i+2] < ... < a[i+k−1] > a[i+k]. 2k − 1 inversions removed: a[i], ..., a[i+k] is in reversed order.

16. O(n^2). Consider what happens if the array is already sorted.

17. (13, 1), ..., (13, 3), (4, 3), (5, 3), ..., (11, 3), (8, 7).

18. For T(n) = T(n/2) + 1 (a = 1, b = 2, f(n) = 1):

T(n) = Σ_{j=0}^{(log_2 n) − 1} 1^j · 1 + Θ(n^(log_2 1)) = Θ(log n).

For T(n) = 3T(n/3) + n (a = 3, b = 3, f(n) = n):

T(n) = Σ_{j=0}^{(log_3 n) − 1} 3^j (n/3^j) + Θ(n^(log_3 3)) = Θ(n log n).
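The swap-based shuffle of Exercise 11 (the classical Fisher-Yates method) can be sketched as follows; the function name is ours:

```python
import random

def random_permutation(n):
    """Exercise 11's algorithm: a[i] starts at i, then a[i] is swapped with
    a[k] for k chosen uniformly from {i, ..., n-1}. Runs in Theta(n)."""
    a = list(range(n))
    for i in range(n):
        k = random.randint(i, n - 1)    # plays the role of rand_int(i, n-1)
        a[i], a[k] = a[k], a[i]
    return a

p = random_permutation(10)
assert sorted(p) == list(range(10))     # always a permutation of 0..n-1
```

Unlike the rejection-based algorithm of Exercise 9, no trial is ever wasted, which is why the expected running time drops to Θ(n).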


More information

Scribe: Sam Keller (2015), Seth Hildick-Smith (2016), G. Valiant (2017) Date: January 25, 2017

Scribe: Sam Keller (2015), Seth Hildick-Smith (2016), G. Valiant (2017) Date: January 25, 2017 CS 6, Lecture 5 Quicksort Scribe: Sam Keller (05), Seth Hildick-Smith (06), G. Valiant (07) Date: January 5, 07 Introduction Today we ll study another sorting algorithm. Quicksort was invented in 959 by

More information

II (Sorting and) Order Statistics

II (Sorting and) Order Statistics II (Sorting and) Order Statistics Heapsort Quicksort Sorting in Linear Time Medians and Order Statistics 8 Sorting in Linear Time The sorting algorithms introduced thus far are comparison sorts Any comparison

More information

SORTING, SETS, AND SELECTION

SORTING, SETS, AND SELECTION CHAPTER 11 SORTING, SETS, AND SELECTION ACKNOWLEDGEMENT: THESE SLIDES ARE ADAPTED FROM SLIDES PROVIDED WITH DATA STRUCTURES AND ALGORITHMS IN C++, GOODRICH, TAMASSIA AND MOUNT (WILEY 2004) AND SLIDES FROM

More information

CSC Design and Analysis of Algorithms. Lecture 6. Divide and Conquer Algorithm Design Technique. Divide-and-Conquer

CSC Design and Analysis of Algorithms. Lecture 6. Divide and Conquer Algorithm Design Technique. Divide-and-Conquer CSC 8301- Design and Analysis of Algorithms Lecture 6 Divide and Conuer Algorithm Design Techniue Divide-and-Conuer The most-well known algorithm design strategy: 1. Divide a problem instance into two

More information

Divide and Conquer CISC4080, Computer Algorithms CIS, Fordham Univ. Instructor: X. Zhang

Divide and Conquer CISC4080, Computer Algorithms CIS, Fordham Univ. Instructor: X. Zhang Divide and Conquer CISC4080, Computer Algorithms CIS, Fordham Univ. Instructor: X. Zhang Acknowledgement The set of slides have use materials from the following resources Slides for textbook by Dr. Y.

More information

COSC242 Lecture 7 Mergesort and Quicksort

COSC242 Lecture 7 Mergesort and Quicksort COSC242 Lecture 7 Mergesort and Quicksort We saw last time that the time complexity function for Mergesort is T (n) = n + n log n. It is not hard to see that T (n) = O(n log n). After all, n + n log n

More information

Divide and Conquer CISC4080, Computer Algorithms CIS, Fordham Univ. Acknowledgement. Outline

Divide and Conquer CISC4080, Computer Algorithms CIS, Fordham Univ. Acknowledgement. Outline Divide and Conquer CISC4080, Computer Algorithms CIS, Fordham Univ. Instructor: X. Zhang Acknowledgement The set of slides have use materials from the following resources Slides for textbook by Dr. Y.

More information

7. Sorting I. 7.1 Simple Sorting. Problem. Algorithm: IsSorted(A) 1 i j n. Simple Sorting

7. Sorting I. 7.1 Simple Sorting. Problem. Algorithm: IsSorted(A) 1 i j n. Simple Sorting Simple Sorting 7. Sorting I 7.1 Simple Sorting Selection Sort, Insertion Sort, Bubblesort [Ottman/Widmayer, Kap. 2.1, Cormen et al, Kap. 2.1, 2.2, Exercise 2.2-2, Problem 2-2 19 197 Problem Algorithm:

More information

The divide-and-conquer paradigm involves three steps at each level of the recursion: Divide the problem into a number of subproblems.

The divide-and-conquer paradigm involves three steps at each level of the recursion: Divide the problem into a number of subproblems. 2.3 Designing algorithms There are many ways to design algorithms. Insertion sort uses an incremental approach: having sorted the subarray A[1 j - 1], we insert the single element A[j] into its proper

More information

Introduction. e.g., the item could be an entire block of information about a student, while the search key might be only the student's name

Introduction. e.g., the item could be an entire block of information about a student, while the search key might be only the student's name Chapter 7 Sorting 2 Introduction sorting fundamental task in data management well-studied problem in computer science basic problem given an of items where each item contains a key, rearrange the items

More information

DATA STRUCTURES AND ALGORITHMS

DATA STRUCTURES AND ALGORITHMS DATA STRUCTURES AND ALGORITHMS Fast sorting algorithms Shellsort, Mergesort, Quicksort Summary of the previous lecture Why sorting is needed? Examples from everyday life What are the basic operations in

More information

Lecture 19 Sorting Goodrich, Tamassia

Lecture 19 Sorting Goodrich, Tamassia Lecture 19 Sorting 7 2 9 4 2 4 7 9 7 2 2 7 9 4 4 9 7 7 2 2 9 9 4 4 2004 Goodrich, Tamassia Outline Review 3 simple sorting algorithms: 1. selection Sort (in previous course) 2. insertion Sort (in previous

More information

Test 1 Review Questions with Solutions

Test 1 Review Questions with Solutions CS3510 Design & Analysis of Algorithms Section A Test 1 Review Questions with Solutions Instructor: Richard Peng Test 1 in class, Wednesday, Sep 13, 2017 Main Topics Asymptotic complexity: O, Ω, and Θ.

More information

Selection (deterministic & randomized): finding the median in linear time

Selection (deterministic & randomized): finding the median in linear time Lecture 4 Selection (deterministic & randomized): finding the median in linear time 4.1 Overview Given an unsorted array, how quickly can one find the median element? Can one do it more quickly than bysorting?

More information

Unit-2 Divide and conquer 2016

Unit-2 Divide and conquer 2016 2 Divide and conquer Overview, Structure of divide-and-conquer algorithms, binary search, quick sort, Strassen multiplication. 13% 05 Divide-and- conquer The Divide and Conquer Paradigm, is a method of

More information

Presentation for use with the textbook, Algorithm Design and Applications, by M. T. Goodrich and R. Tamassia, Wiley, Merge Sort & Quick Sort

Presentation for use with the textbook, Algorithm Design and Applications, by M. T. Goodrich and R. Tamassia, Wiley, Merge Sort & Quick Sort Presentation for use with the textbook, Algorithm Design and Applications, by M. T. Goodrich and R. Tamassia, Wiley, 2015 Merge Sort & Quick Sort 1 Divide-and-Conquer Divide-and conquer is a general algorithm

More information

Problem Set 4 Solutions

Problem Set 4 Solutions Design and Analysis of Algorithms March 5, 205 Massachusetts Institute of Technology 6.046J/8.40J Profs. Erik Demaine, Srini Devadas, and Nancy Lynch Problem Set 4 Solutions Problem Set 4 Solutions This

More information

Sorting Algorithms. + Analysis of the Sorting Algorithms

Sorting Algorithms. + Analysis of the Sorting Algorithms Sorting Algorithms + Analysis of the Sorting Algorithms Insertion Sort What if first k elements of array are already sorted? 4, 7, 12, 5, 19, 16 We can shift the tail of the sorted elements list down and

More information

DIVIDE AND CONQUER ALGORITHMS ANALYSIS WITH RECURRENCE EQUATIONS

DIVIDE AND CONQUER ALGORITHMS ANALYSIS WITH RECURRENCE EQUATIONS CHAPTER 11 SORTING ACKNOWLEDGEMENT: THESE SLIDES ARE ADAPTED FROM SLIDES PROVIDED WITH DATA STRUCTURES AND ALGORITHMS IN C++, GOODRICH, TAMASSIA AND MOUNT (WILEY 2004) AND SLIDES FROM NANCY M. AMATO AND

More information

11/8/2016. Chapter 7 Sorting. Introduction. Introduction. Sorting Algorithms. Sorting. Sorting

11/8/2016. Chapter 7 Sorting. Introduction. Introduction. Sorting Algorithms. Sorting. Sorting Introduction Chapter 7 Sorting sorting fundamental task in data management well-studied problem in computer science basic problem given an array of items where each item contains a key, rearrange the items

More information

Divide & Conquer Design Technique

Divide & Conquer Design Technique Divide & Conquer Design Technique Adnan YAZICI Dept. of Computer Engineering Middle East Technical Univ. Ankara - TURKEY 1 The Divide & Conquer strategy can be described in general terms as follows: A

More information

EECS 2011M: Fundamentals of Data Structures

EECS 2011M: Fundamentals of Data Structures M: Fundamentals of Data Structures Instructor: Suprakash Datta Office : LAS 3043 Course page: http://www.eecs.yorku.ca/course/2011m Also on Moodle Note: Some slides in this lecture are adopted from James

More information

/633 Introduction to Algorithms Lecturer: Michael Dinitz Topic: Sorting lower bound and Linear-time sorting Date: 9/19/17

/633 Introduction to Algorithms Lecturer: Michael Dinitz Topic: Sorting lower bound and Linear-time sorting Date: 9/19/17 601.433/633 Introduction to Algorithms Lecturer: Michael Dinitz Topic: Sorting lower bound and Linear-time sorting Date: 9/19/17 5.1 Introduction You should all know a few ways of sorting in O(n log n)

More information

COMP Data Structures

COMP Data Structures COMP 2140 - Data Structures Shahin Kamali Topic 5 - Sorting University of Manitoba Based on notes by S. Durocher. COMP 2140 - Data Structures 1 / 55 Overview Review: Insertion Sort Merge Sort Quicksort

More information

Design and Analysis of Algorithms

Design and Analysis of Algorithms Design and Analysis of Algorithms CSE 5311 Lecture 8 Sorting in Linear Time Junzhou Huang, Ph.D. Department of Computer Science and Engineering CSE5311 Design and Analysis of Algorithms 1 Sorting So Far

More information

Unit 6 Chapter 15 EXAMPLES OF COMPLEXITY CALCULATION

Unit 6 Chapter 15 EXAMPLES OF COMPLEXITY CALCULATION DESIGN AND ANALYSIS OF ALGORITHMS Unit 6 Chapter 15 EXAMPLES OF COMPLEXITY CALCULATION http://milanvachhani.blogspot.in EXAMPLES FROM THE SORTING WORLD Sorting provides a good set of examples for analyzing

More information

Divide-and-Conquer CSE 680

Divide-and-Conquer CSE 680 Divide-and-Conquer CSE 680 1 Introduction Given an instance x of a problem, the divide-and-conquer method works as follows. function DAQ(x) if x is sufficiently small or simple then solve it directly else

More information

Mergesort again. 1. Split the list into two equal parts

Mergesort again. 1. Split the list into two equal parts Quicksort Mergesort again 1. Split the list into two equal parts 5 3 9 2 8 7 3 2 1 4 5 3 9 2 8 7 3 2 1 4 Mergesort again 2. Recursively mergesort the two parts 5 3 9 2 8 7 3 2 1 4 2 3 5 8 9 1 2 3 4 7 Mergesort

More information

Algorithm Analysis and Design

Algorithm Analysis and Design Algorithm Analysis and Design Dr. Truong Tuan Anh Faculty of Computer Science and Engineering Ho Chi Minh City University of Technology VNU- Ho Chi Minh City 1 References [1] Cormen, T. H., Leiserson,

More information

CS 112 Introduction to Computing II. Wayne Snyder Computer Science Department Boston University

CS 112 Introduction to Computing II. Wayne Snyder Computer Science Department Boston University CS 112 Introduction to Computing II Wayne Snyder Department Boston University Today Recursive Sorting Methods and their Complexity: Mergesort Conclusions on sorting algorithms and complexity Next Time:

More information

LECTURE NOTES OF ALGORITHMS: DESIGN TECHNIQUES AND ANALYSIS

LECTURE NOTES OF ALGORITHMS: DESIGN TECHNIQUES AND ANALYSIS Department of Computer Science University of Babylon LECTURE NOTES OF ALGORITHMS: DESIGN TECHNIQUES AND ANALYSIS By Faculty of Science for Women( SCIW), University of Babylon, Iraq Samaher@uobabylon.edu.iq

More information

CSC 505, Spring 2005 Week 6 Lectures page 1 of 9

CSC 505, Spring 2005 Week 6 Lectures page 1 of 9 CSC 505, Spring 2005 Week 6 Lectures page 1 of 9 Objectives: learn general strategies for problems about order statistics learn how to find the median (or k-th largest) in linear average-case number of

More information

CPSC 320 Midterm 2 Thursday March 13th, 2014

CPSC 320 Midterm 2 Thursday March 13th, 2014 CPSC 320 Midterm 2 Thursday March 13th, 2014 [12] 1. Answer each question with True or False, and then justify your answer briefly. [2] (a) The Master theorem can be applied to the recurrence relation

More information

Sorting. Weiss chapter , 8.6

Sorting. Weiss chapter , 8.6 Sorting Weiss chapter 8.1 8.3, 8.6 Sorting 5 3 9 2 8 7 3 2 1 4 1 2 2 3 3 4 5 7 8 9 Very many different sorting algorithms (bubblesort, insertion sort, selection sort, quicksort, heapsort, mergesort, shell

More information

Lecture 7 Quicksort : Principles of Imperative Computation (Spring 2018) Frank Pfenning

Lecture 7 Quicksort : Principles of Imperative Computation (Spring 2018) Frank Pfenning Lecture 7 Quicksort 15-122: Principles of Imperative Computation (Spring 2018) Frank Pfenning In this lecture we consider two related algorithms for sorting that achieve a much better running time than

More information

Lecture Notes on Quicksort

Lecture Notes on Quicksort Lecture Notes on Quicksort 15-122: Principles of Imperative Computation Frank Pfenning Lecture 8 September 20, 2012 1 Introduction In this lecture we first sketch two related algorithms for sorting that

More information

Algorithms - Ch2 Sorting

Algorithms - Ch2 Sorting Algorithms - Ch2 Sorting (courtesy of Prof.Pecelli with some changes from Prof. Daniels) 1/28/2015 91.404 - Algorithms 1 Algorithm Description Algorithm Description: -Pseudocode see conventions on p. 20-22

More information

Algorithms in Systems Engineering ISE 172. Lecture 12. Dr. Ted Ralphs

Algorithms in Systems Engineering ISE 172. Lecture 12. Dr. Ted Ralphs Algorithms in Systems Engineering ISE 172 Lecture 12 Dr. Ted Ralphs ISE 172 Lecture 12 1 References for Today s Lecture Required reading Chapter 6 References CLRS Chapter 7 D.E. Knuth, The Art of Computer

More information

Lecture #2. 1 Overview. 2 Worst-Case Analysis vs. Average Case Analysis. 3 Divide-and-Conquer Design Paradigm. 4 Quicksort. 4.

Lecture #2. 1 Overview. 2 Worst-Case Analysis vs. Average Case Analysis. 3 Divide-and-Conquer Design Paradigm. 4 Quicksort. 4. COMPSCI 330: Design and Analysis of Algorithms 8/28/2014 Lecturer: Debmalya Panigrahi Lecture #2 Scribe: Yilun Zhou 1 Overview This lecture presents two sorting algorithms, quicksort and mergesort, that

More information

Chapter 5. Quicksort. Copyright Oliver Serang, 2018 University of Montana Department of Computer Science

Chapter 5. Quicksort. Copyright Oliver Serang, 2018 University of Montana Department of Computer Science Chapter 5 Quicsort Copyright Oliver Serang, 08 University of Montana Department of Computer Science Quicsort is famous because of its ability to sort in-place. I.e., it directly modifies the contents of

More information

CS 561, Lecture 1. Jared Saia University of New Mexico

CS 561, Lecture 1. Jared Saia University of New Mexico CS 561, Lecture 1 Jared Saia University of New Mexico Quicksort Based on divide and conquer strategy Worst case is Θ(n 2 ) Expected running time is Θ(n log n) An In-place sorting algorithm Almost always

More information

Advanced Algorithms and Data Structures

Advanced Algorithms and Data Structures Advanced Algorithms and Data Structures Prof. Tapio Elomaa tapio.elomaa@tut.fi Course Prerequisites A seven credit unit course Replaced OHJ-2156 Analysis of Algorithms We take things a bit further than

More information

Quicksort (Weiss chapter 8.6)

Quicksort (Weiss chapter 8.6) Quicksort (Weiss chapter 8.6) Recap of before Easter We saw a load of sorting algorithms, including mergesort To mergesort a list: Split the list into two halves Recursively mergesort the two halves Merge

More information

CSC 273 Data Structures

CSC 273 Data Structures CSC 273 Data Structures Lecture 6 - Faster Sorting Methods Merge Sort Divides an array into halves Sorts the two halves, Then merges them into one sorted array. The algorithm for merge sort is usually

More information

Algorithm Efficiency & Sorting. Algorithm efficiency Big-O notation Searching algorithms Sorting algorithms

Algorithm Efficiency & Sorting. Algorithm efficiency Big-O notation Searching algorithms Sorting algorithms Algorithm Efficiency & Sorting Algorithm efficiency Big-O notation Searching algorithms Sorting algorithms Overview Writing programs to solve problem consists of a large number of decisions how to represent

More information

Sorting and Selection

Sorting and Selection Sorting and Selection Introduction Divide and Conquer Merge-Sort Quick-Sort Radix-Sort Bucket-Sort 10-1 Introduction Assuming we have a sequence S storing a list of keyelement entries. The key of the element

More information

Outline. Computer Science 331. Three Classical Algorithms. The Sorting Problem. Classical Sorting Algorithms. Mike Jacobson. Description Analysis

Outline. Computer Science 331. Three Classical Algorithms. The Sorting Problem. Classical Sorting Algorithms. Mike Jacobson. Description Analysis Outline Computer Science 331 Classical Sorting Algorithms Mike Jacobson Department of Computer Science University of Calgary Lecture #22 1 Introduction 2 3 4 5 Comparisons Mike Jacobson (University of

More information

Deterministic and Randomized Quicksort. Andreas Klappenecker

Deterministic and Randomized Quicksort. Andreas Klappenecker Deterministic and Randomized Quicksort Andreas Klappenecker Overview Deterministic Quicksort Modify Quicksort to obtain better asymptotic bound Linear-time median algorithm Randomized Quicksort Deterministic

More information

Better sorting algorithms (Weiss chapter )

Better sorting algorithms (Weiss chapter ) Better sorting algorithms (Weiss chapter 8.5 8.6) Divide and conquer Very general name for a type of recursive algorithm You have a problem to solve. Split that problem into smaller subproblems Recursively

More information

Sorting: Given a list A with n elements possessing a total order, return a list with the same elements in non-decreasing order.

Sorting: Given a list A with n elements possessing a total order, return a list with the same elements in non-decreasing order. Sorting The sorting problem is defined as follows: Sorting: Given a list A with n elements possessing a total order, return a list with the same elements in non-decreasing order. Remember that total order

More information

UNIT-2. Problem of size n. Sub-problem 1 size n/2. Sub-problem 2 size n/2. Solution to the original problem

UNIT-2. Problem of size n. Sub-problem 1 size n/2. Sub-problem 2 size n/2. Solution to the original problem Divide-and-conquer method: Divide-and-conquer is probably the best known general algorithm design technique. The principle behind the Divide-and-conquer algorithm design technique is that it is easier

More information

Advanced Algorithms and Data Structures

Advanced Algorithms and Data Structures Advanced Algorithms and Data Structures Prof. Tapio Elomaa Course Basics A new 7 credit unit course Replaces OHJ-2156 Analysis of Algorithms We take things a bit further than OHJ-2156 We will assume familiarity

More information

CSE373: Data Structure & Algorithms Lecture 18: Comparison Sorting. Dan Grossman Fall 2013

CSE373: Data Structure & Algorithms Lecture 18: Comparison Sorting. Dan Grossman Fall 2013 CSE373: Data Structure & Algorithms Lecture 18: Comparison Sorting Dan Grossman Fall 2013 Introduction to Sorting Stacks, queues, priority queues, and dictionaries all focused on providing one element

More information

5. DIVIDE AND CONQUER I

5. DIVIDE AND CONQUER I 5. DIVIDE AND CONQUER I mergesort counting inversions closest pair of points randomized quicksort median and selection Lecture slides by Kevin Wayne Copyright 2005 Pearson-Addison Wesley Copyright 2013

More information

Lecture 15 : Review DRAFT

Lecture 15 : Review DRAFT CS/Math 240: Introduction to Discrete Mathematics 3/10/2011 Lecture 15 : Review Instructor: Dieter van Melkebeek Scribe: Dalibor Zelený DRAFT Today slectureservesasareviewofthematerialthatwillappearonyoursecondmidtermexam.

More information

Quicksort. Repeat the process recursively for the left- and rightsub-blocks.

Quicksort. Repeat the process recursively for the left- and rightsub-blocks. Quicksort As the name implies, this is the fastest known sorting algorithm in practice. It is excellent for average input but bad for the worst-case input. (you will see later). Basic idea: (another divide-and-conquer

More information

CPSC 311 Lecture Notes. Sorting and Order Statistics (Chapters 6-9)

CPSC 311 Lecture Notes. Sorting and Order Statistics (Chapters 6-9) CPSC 311 Lecture Notes Sorting and Order Statistics (Chapters 6-9) Acknowledgement: These notes are compiled by Nancy Amato at Texas A&M University. Parts of these course notes are based on notes from

More information

A 0 A 1... A i 1 A i,..., A min,..., A n 1 in their final positions the last n i elements After n 1 passes, the list is sorted.

A 0 A 1... A i 1 A i,..., A min,..., A n 1 in their final positions the last n i elements After n 1 passes, the list is sorted. CS6402 Design and Analysis of Algorithms _ Unit II 2.1 UNIT II BRUTE FORCE AND DIVIDE-AND-CONQUER 2.1 BRUTE FORCE Brute force is a straightforward approach to solving a problem, usually directly based

More information

CS 310 Advanced Data Structures and Algorithms

CS 310 Advanced Data Structures and Algorithms CS 310 Advanced Data Structures and Algorithms Sorting June 13, 2017 Tong Wang UMass Boston CS 310 June 13, 2017 1 / 42 Sorting One of the most fundamental problems in CS Input: a series of elements with

More information

We can use a max-heap to sort data.

We can use a max-heap to sort data. Sorting 7B N log N Sorts 1 Heap Sort We can use a max-heap to sort data. Convert an array to a max-heap. Remove the root from the heap and store it in its proper position in the same array. Repeat until

More information

CS 137 Part 8. Merge Sort, Quick Sort, Binary Search. November 20th, 2017

CS 137 Part 8. Merge Sort, Quick Sort, Binary Search. November 20th, 2017 CS 137 Part 8 Merge Sort, Quick Sort, Binary Search November 20th, 2017 This Week We re going to see two more complicated sorting algorithms that will be our first introduction to O(n log n) sorting algorithms.

More information

CmpSci 187: Programming with Data Structures Spring 2015

CmpSci 187: Programming with Data Structures Spring 2015 CmpSci 187: Programming with Data Structures Spring 2015 Lecture #22, More Graph Searches, Some Sorting, and Efficient Sorting Algorithms John Ridgway April 21, 2015 1 Review of Uniform-cost Search Uniform-Cost

More information

The complexity of Sorting and sorting in linear-time. Median and Order Statistics. Chapter 8 and Chapter 9

The complexity of Sorting and sorting in linear-time. Median and Order Statistics. Chapter 8 and Chapter 9 Subject 6 Spring 2017 The complexity of Sorting and sorting in linear-time Median and Order Statistics Chapter 8 and Chapter 9 Disclaimer: These abbreviated notes DO NOT substitute the textbook for this

More information

Divide and Conquer. Algorithm D-and-C(n: input size)

Divide and Conquer. Algorithm D-and-C(n: input size) Divide and Conquer Algorithm D-and-C(n: input size) if n n 0 /* small size problem*/ Solve problem without futher sub-division; else Divide into m sub-problems; Conquer the sub-problems by solving them

More information

CS303 (Spring 2008) Solutions to Assignment 7

CS303 (Spring 2008) Solutions to Assignment 7 CS303 (Spring 008) Solutions to Assignment 7 Problem 1 Here is our implementation of a class for long numbers. It includes addition, subtraction, and both ways of multiplying. The rest of the code, not

More information

SAMPLE OF THE STUDY MATERIAL PART OF CHAPTER 6. Sorting Algorithms

SAMPLE OF THE STUDY MATERIAL PART OF CHAPTER 6. Sorting Algorithms SAMPLE OF THE STUDY MATERIAL PART OF CHAPTER 6 6.0 Introduction Sorting algorithms used in computer science are often classified by: Computational complexity (worst, average and best behavior) of element

More information

Sorting. Task Description. Selection Sort. Should we worry about speed?

Sorting. Task Description. Selection Sort. Should we worry about speed? Sorting Should we worry about speed? Task Description We have an array of n values in any order We need to have the array sorted in ascending or descending order of values 2 Selection Sort Select the smallest

More information

Lecture Notes on Quicksort

Lecture Notes on Quicksort Lecture Notes on Quicksort 15-122: Principles of Imperative Computation Frank Pfenning Lecture 8 February 5, 2015 1 Introduction In this lecture we consider two related algorithms for sorting that achieve

More information

CSC Design and Analysis of Algorithms. Lecture 5. Decrease and Conquer Algorithm Design Technique. Decrease-and-Conquer

CSC Design and Analysis of Algorithms. Lecture 5. Decrease and Conquer Algorithm Design Technique. Decrease-and-Conquer CSC 8301- Design and Analysis of Algorithms Lecture 5 Decrease and Conquer Algorithm Design Technique Decrease-and-Conquer This algorithm design technique is based on exploiting a relationship between

More information

Lecture Notes 14 More sorting CSS Data Structures and Object-Oriented Programming Professor Clark F. Olson

Lecture Notes 14 More sorting CSS Data Structures and Object-Oriented Programming Professor Clark F. Olson Lecture Notes 14 More sorting CSS 501 - Data Structures and Object-Oriented Programming Professor Clark F. Olson Reading for this lecture: Carrano, Chapter 11 Merge sort Next, we will examine two recursive

More information

Jana Kosecka. Linear Time Sorting, Median, Order Statistics. Many slides here are based on E. Demaine, D. Luebke slides

Jana Kosecka. Linear Time Sorting, Median, Order Statistics. Many slides here are based on E. Demaine, D. Luebke slides Jana Kosecka Linear Time Sorting, Median, Order Statistics Many slides here are based on E. Demaine, D. Luebke slides Insertion sort: Easy to code Fast on small inputs (less than ~50 elements) Fast on

More information

Sorting. Popular algorithms: Many algorithms for sorting in parallel also exist.

Sorting. Popular algorithms: Many algorithms for sorting in parallel also exist. Sorting Popular algorithms: Selection sort* Insertion sort* Bubble sort* Quick sort* Comb-sort Shell-sort Heap sort* Merge sort* Counting-sort Radix-sort Bucket-sort Tim-sort Many algorithms for sorting

More information