LESSON 16: ALGORITHM STRATEGIES
Objective: Divide and Conquer; Backtracking. You are already familiar with greedy algorithms.

Divide and Conquer

Definition: An algorithmic technique. To solve a problem on an instance of size n, a solution is found either directly, because solving that instance is easy (typically because the instance is small), or the instance is divided into two or more smaller instances. Each of these smaller instances is recursively solved, and the solutions are combined to produce a solution for the original instance.

Note: The name "divide and conquer" comes from the fact that the problem is conquered by dividing it into several smaller problems. This technique yields elegant, simple, and quite often very efficient algorithms. Well-known examples include heapify, merge sort, quicksort, Strassen's fast matrix multiplication, the Fast Fourier Transform (FFT), and binary search. (Why is binary search included? The dividing part just picks which segment to search in, and combining the solutions is trivial: take the answer from the segment in which you searched.) Similar principles are at the heart of several important data structures such as binary search trees, multiway search trees, tries, skip lists, and multidimensional search trees (k-d trees, quadtrees).

Implementation: How many odd numbers in an array?

In addition to correctness, programming style is also important, so please write modular, well-documented code.

1. Implement the recursive, divide-and-conquer function ADD-ODDS(A,p,r) that returns the sum of the odd numbers found in the positive integer array A.
2. The main part of your program will read in data from a file whose name is given as the first argument to your executable. The first line of the input file contains the number of lines (1-99) in the rest of the file. Each subsequent line begins with the number of elements (1-99) in the array, followed by the elements (1-99).
Your main program should process each line one at a time by reading in the array, calling ADD-ODDS, and then printing the result. For example:

/* Program 1                              */
/* name: Anan Tongprasith                 */
/* course: CSE 2320 sec 501               */
/* compile: cc addodd.c                   */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int testme(int *ele, char *line);
int addodds(int min, int max, int *ele);

/* The main program reads data from the input file and puts each line's
   data into an array named element.  It then calls function addodds to
   recursively add the odd numbers from the array. */
int main(int argc, char *argv[])
{
    FILE *fp;
    char newline[100];
    int numofline, i, element[100], a = 0;

    fp = fopen(argv[1], "r");                   /* Open input file        */
    if (fgets(newline, 100, fp) != NULL) {      /* Read first line        */
        numofline = atoi(newline);              /* Num of lines = line 1  */
        for (i = numofline; i > 0; i -= 1) {    /* Do it line by line     */
            fgets(newline, 100, fp);            /* Read a raw data line   */
            a = testme(element, newline);       /* Put data into an array */
            printf("%d\n", addodds(1, a, element)); /* Addodds the array  */
        }
    }
    fclose(fp);
    return 0;
}

/* Function testme reads a data line and puts the data into an array.
   The element count stays in ele[0] and the data in ele[1..k], so main
   can call addodds(1, a, element) directly; the count is returned. */
int testme(int *ele, char *line)
{
    int i = -1, j, k = 0, linelength = strlen(line);
    char piece[100];                            /* token buffer           */

    do {                                        /* Loop for data separation */
        j = 0;
        do {                                    /* buffer = one token     */
            i++;
            piece[j++] = line[i];
        } while (line[i] != ' ' && line[i] != '\n' && line[i] != '\0');
        piece[j] = '\0';                        /* terminate the buffer   */
        ele[k] = atoi(piece);                   /* Put datum into array   */
        k = k + 1;
    } while (i < linelength && line[i] != '\0');

    return ele[0];                              /* Return # of data       */
}

/* Function addodds receives starting point, end point, and data array.
   It sums up all odd numbers in the array between the starting point
   and end point and returns the result. */
int addodds(int min, int max, int *ele)
{
    int x, y;
    if (min == max) {                           /* Base case              */
        if ((ele[min] % 2) == 1)                /* Checking for odds      */
            return ele[min];                    /* Return odd value       */
        else
            return 0;                           /* Return 0 if even       */
    } else {                                    /* Recursive case         */
        x = addodds(min, (min + max) / 2, ele);     /* 1st half           */
        y = addodds(1 + (min + max) / 2, max, ele); /* 2nd half           */
        return x + y;                           /* Return the sum         */
    }
}

Three Divide and Conquer Sorting Algorithms

Today we'll finish heapsort, and describe both mergesort and quicksort. Why do we need multiple sorting algorithms? Different methods work better in different applications. Heapsort uses close to the right number of comparisons but needs to move data around quite a bit. It can be done in a way that uses very little extra memory, so it's probably good when memory is tight and you are sorting many small items that come stored in an array. Merge sort is good for data that's too big to have in memory at once, because its pattern of storage access is very regular. It also uses even fewer comparisons than heapsort, and is especially suited for data stored as linked lists. Quicksort also uses few comparisons (somewhat more than the other two). Like heapsort it can sort in place by moving data in an array.
Heapification

Recall the idea of heapsort:

heapsort(list L)
    make heap H from L
    make empty list X
    while H nonempty
        remove smallest from H and add it to X
    return X

Remember that a heap is just a balanced binary tree in which the value at any node is smaller than the values at its children. We went over most of this last time. The total number of comparisons is n log n plus however many are needed to make H. The only missing step: how to make a heap?

To start with, we can set up a binary tree of the right size and shape, and put the objects into the tree in any old order. This is all easy and doesn't require any comparisons. Now we have to switch objects around to get them back in order.

The divide and conquer idea: find natural subproblems, solve them recursively, and combine them to get an overall solution. Here the obvious subproblems are the subtrees. If we solve them recursively, we get something that is close to being a heap, except that perhaps the root doesn't satisfy the heap property. To make the whole thing a heap, we merely have to percolate that value down to a lower level in the tree.

heapify(tree T)
    if (T is nonempty)
        heapify(left subtree)
        heapify(right subtree)
        let x = value at tree root
        while node containing x doesn't satisfy heap property
            switch values of node and its smallest child
The while loop performs two comparisons per iteration and takes at most log n iterations, so the time for this satisfies a recurrence

    T(n) <= 2 T(n/2) + 2 log n

How to Solve It? Divide and Conquer Recurrences

In general, divide and conquer is based on the following idea. The whole problem we want to solve may be too big to understand or solve at once. We break it up into smaller pieces, solve the pieces separately, and combine the separate pieces together. We analyze this in some generality: suppose we have a pieces, each of size n/b, and merging takes time f(n). (In the heapification example a=b=2 and f(n)=O(log n), but it will not always be true that a=b - sometimes the pieces will overlap.)

The easiest way to understand what's going on here is to draw a tree with nodes corresponding to subproblems (labeled with the size of the subproblem):

                 n
              /  |  \
           n/b  n/b  n/b
          / | \ / | \ / | \

For simplicity, let's assume n is a power of b, and that the recursion stops when n is 1. Notice that the size of a node depends only on its level: size(i) = n/(b^i). What is the time taken by a node at level i? time(i) = f(n/b^i). How many levels can we have before we get down to n=1? For the bottom level, n/b^i = 1, so n = b^i and i = (log n)/(log b). How many items at level i? a^i. So putting these together we have

           (log n)/(log b)
    T(n) =      sum        a^i f(n/b^i)
                i=0

This looks messy, but it's not too bad. There are only a few terms (logarithmically many) and often the sum is dominated by the terms at one end (f(n)) or the other (n^(log a/log b)). In fact, you will generally only be a logarithmic factor away from the truth if you approximate the solution by the sum of these two, O(f(n) + n^(log a/log b)).

Let's use this to analyze heapification. By plugging in the parameters a=b=2, f(n) = 2 log n, we get

    T(n) = 2 sum 2^i log(n/2^i)

Rewriting the same terms in the opposite order, this turns out to equal

                                                        infty
    T(n) = 2 sum (n/2^i) log(2^i) = 2n sum i/2^i <= 2n   sum  i/2^i = 4n
                                                         i=0

So heapification takes at most 4n comparisons, and heapsort takes at most n log n + 4n.
(There's an n log n lower bound, so we're only within O(n) of the absolute best possible.)

This was an example of a sorting algorithm where one part used divide and conquer. What about doing the whole algorithm that way?

Merge sort

According to Knuth, merge sort was one of the earliest sorting algorithms, invented by John von Neumann in 1945. Let's look at the combine step first. Suppose you have some data that's close to sorted - it forms two sorted lists. You want to merge the two sorted lists quickly rather than having to resort to a general purpose sorting algorithm. This is easy enough:

merge(L1,L2)
    list X = empty
    while (neither L1 nor L2 empty)
        compare first items of L1 & L2
        remove smaller of the two from its list
        add to end of X
    catenate remaining list to end of X
    return X

Time analysis: in the worst case both lists empty at about the same time, so everything has to be compared. Each comparison adds one item to X, so the worst case is |X| - 1 = |L1| + |L2| - 1 comparisons. One can do a little better sometimes, e.g. if L1 is smaller than most of L2.

Once we know how to combine two sorted lists, we can construct a divide and conquer sorting algorithm that simply divides the list in two, sorts the two recursively, and merges the results:

merge sort(L)
    if (length(L) < 2) return L
    else
        split L into lists L1 and L2, each of n/2 elements
        L1 = merge sort(L1)
        L2 = merge sort(L2)
        return merge(L1,L2)

This is simpler than heapsort (so easier to program) and works pretty well. How many comparisons does it use? We can use the analysis of the merge step to write down a recurrence:

    C(n) <= n-1 + 2 C(n/2)

As you saw in homework 1.31, for n a power of 2, the solution to this is n log n - n + 1. For other n, it's similar but more complicated. To prove this (at least the power-of-2 version), you can use the formula above to produce

    C(n) <= sum 2^i (n/2^i - 1) = sum (n - 2^i) = n(log n + 1) - (2n - 1) = n log n - n + 1

So the number of comparisons is even less than heapsort.

Quicksort

Quicksort, invented by Tony Hoare, follows a very similar divide and conquer idea: partition into two lists and put them back together again. It does more work on the divide side, less on the combine side. Merge sort worked no matter how you split the lists (one obvious way is to take the first n/2 and last n/2 elements; another is to take every other element). But if you could perform the splits so that everything in one list was smaller than everything in the other, this information could be used to make merging much easier: you could merge just by concatenating the lists.

How to split so everything in one list is smaller than everything in the other? E.g. for alphabetical order, you could split into A-M and N-Z. So you could use some split depending on what the data looks like, but we want a comparison sorting algorithm that works for any data. Quicksort uses a simple idea: pick one object x from the list, and split the rest into those before x and those after x.

quicksort(L)
    if (length(L) < 2) return L
    else
        pick some x in L
        L1 = { y in L : y < x }
        L2 = { y in L : y > x }
        L3 = { y in L : y = x }
        quicksort(L1)
        quicksort(L2)
        return concatenation of L1, L3, and L2

(We don't need to sort L3 because everything in it is equal.)

Quicksort analysis

The partition step of quicksort takes n-1 comparisons.
So we can write a recurrence for the total number of comparisons done by quicksort:

    C(n) = n-1 + C(a) + C(b)

where a and b are the sizes of L1 and L2, generally satisfying a+b = n-1. In the worst case, we might pick x to be the minimum element in L. Then a=0, b=n-1, and the recurrence simplifies to C(n) = n-1 + C(n-1) = O(n^2). So this seems like a very bad algorithm. Why do we call it quicksort? How can we make it less bad? Randomization!

Suppose we pick x = A[k] where k is chosen randomly. Then any value of a from 0 to n-1 is equally likely. To do average case analysis, we write out the sum over possible random choices of the probability of that choice times the time for that choice. Here the choices are the values of k, the probabilities are all 1/n, and the times can be described by formulas involving the time for the recursive calls to the algorithm. So average case analysis of a randomized algorithm gives a randomized recurrence:

           n-1
    C(n) = sum (1/n)[n-1 + C(a) + C(n-a-1)]
           a=0

To simplify the recurrence, note that if C(a) occurs in one place in the sum, the same number will occur as C(n-a-1) in another term - we can rearrange the sum to group the two together. We can also take the (n-1) parts out of the sum, since the sum of n copies of (1/n)(n-1) is just n-1.

                 n-1
    C(n) = n-1 + sum (2/n) C(a)
                 a=0

The book gives two proofs that this is O(n log n). Of these, induction is easier. One useful idea here: we want to prove f(n) is O(g(n)). The O() hides too much information; instead we need to prove f(n) <= a g(n), but we don't know what value a should take. We work it out with a left as a variable, then use the analysis to see what values of a work. We have C(1) = 0 = a (1 log 1) for all a. Suppose C(i) <= a i log i for some a and all i < n. Then

    C(n) = n-1 + sum (2/n) C(i) <= n-1 + sum (2/n) a i log i
         = n-1 + 2a/n sum(i=2 to n-1) (i log i)
        <= n-1 + 2a/n integral(i=2 to n) (i log i)
         = n-1 + 2a/n (n^2 log n / 2 - n^2/4 - 2 ln 2 + 1)
         = n-1 + a n log n - an/2 - O(1)

and this will work if n-1 <= an/2, and in particular if a=2. So we can conclude that C(n) <= 2 n log n.

Note that this is worse than either merge sort or heapsort, and requires a random number generator to avoid being really bad. But it's pretty commonly used, and can be tuned in various ways to work better. (For instance, let x be the median of three randomly chosen values rather than just one value.)

Backtracking

Definition: An algorithmic technique to find solutions by trying one of several choices. If the choice proves incorrect, computation backtracks or restarts at the point of choice and tries another choice. It is often convenient to maintain choice points and alternate choices using recursion.

Note: Conceptually, a backtracking algorithm does a depth-first search of a tree of possible (partial) solutions. Each choice is a node in the tree.

The backtracking method is based on the systematic examination of the possible solutions. During the procedure, sets of possible solutions are rejected before even being examined, so their number becomes much smaller. An important requirement which must be fulfilled is that there must be a proper hierarchy in the systematic production of solutions, so that sets of solutions which do not fulfill a certain requirement are rejected before those solutions are produced. For this reason the production and examination of the solutions follows an acyclic graph, which in this case we will consider to be a tree. The root of the tree represents the set of all the solutions. Nodes at lower levels represent ever smaller sets of solutions, based on their properties. Obviously, the leaves are individual solutions. It is easily understood that the tree (or any other graph) is produced during the examination of the solutions, so that no rejected solutions are produced.
When a node is rejected, the whole sub-tree below it is rejected, and we backtrack to the ancestor of the node so that more children can be produced and examined. Because this method is expected to produce subsets of solutions which are difficult to process, the method itself is not very popular.

The Queens Problem

We consider a grid of squares, dimensioned n x n, essentially a chessboard containing n^2 squares. A queen placed in any of the n^2 squares controls all the squares that are on its row, its column, and the 45-degree diagonals through it. The problem is how to put n queens on the chessboard so that no queen controls the square of any other queen. Obviously for n=2 there is no solution to the problem, while for n=4 a valid solution is given by the drawing below.

A possible position on the grid is given by the pair of indices (i,j), where 1 <= i,j <= n, i stands for the number of the column and j for the number of the row. So far, for the same i there are n valid values of j. For a candidate solution, though, only one queen can be on each column, that is, only one value j = V(i). Therefore the solutions are represented by the n values of the vector V = [V(1),...,V(n)]. All the solutions for which V(i) = V(j) are rejected, because two queens cannot be on the same row. Now the solutions are the permutations of the n indices, of which there are n! - still a forbiddingly big number. Out of all these solutions, the correct ones are those which satisfy the last requirement - two queens must not belong to the same diagonal - which is:

    V(j) - V(i) <> +-(i-j)  for i <> j.        (5.8-1)

A backtracking algorithm for this problem constructs the permutations [V(1),...,V(n)] of the indices 1,...,n and examines them with respect to property (5.8-1). For example, there are (n-2)! permutations of the shape [3,4,...]. These will not be produced and examined if their systematic construction has already placed them in a sub-tree with root [3,4], which will be rejected by the condition, and will thereby also reject all (n-2)!
permutations. On the contrary, the same produce-and-examine process will go even further into the examination of permutations of the shape [1,4,2,...], since so far the condition is satisfied. The next node to be inserted, that is j = V(4), must also satisfy: j-1 <> +-3, j-4 <> +-2, j-2 <> +-1. All the j indices satisfying these requirements produce permutations [1,4,2,j,...] which connect to the tree as children of [1,4,2]. Meanwhile, large sets of permutations such as [1,4,2,6,...] have already been rejected.

A typical description of this algorithm: the root of all solutions has as children the n nodes [1],...,[n], where [j] represents all the permutations starting with j (and whose number is (n-1)! for every j). Inductively, if a node includes the k indices j_1,...,j_k, we attempt to extend it with another index, giving j_1,...,j_k,j_(k+1), so that condition (5.8-1) is fulfilled. For n=4 this method produces the reduced tree of the following picture, and does not produce the 4! = 24 leaves of all the candidate solutions.
Point to Ponder

The name "divide and conquer" comes from the fact that the problem is conquered by dividing it into several smaller problems. Quicksort, invented by Tony Hoare, follows a very similar divide and conquer idea. A backtracking algorithm does a depth-first search of a tree of possible (partial) solutions; each choice is a node in the tree.

Questions

Explain the Divide & Conquer algorithm. Explain the Backtracking algorithm.

References

Fundamentals of Algorithmics, Gilles Brassard and Paul Bratley.
Data Structures and Algorithms Roberto Sebastiani roberto.sebastiani@disi.unitn.it http://www.disi.unitn.it/~rseba - Week 0 - B.S. In Applied Computer Science Free University of Bozen/Bolzano academic
More informationMPATE-GE 2618: C Programming for Music Technology. Unit 4.2
MPATE-GE 2618: C Programming for Music Technology Unit 4.2 Quiz 1 results (out of 25) Mean: 19.9, (standard deviation = 3.9) Equivalent to 79.1% (SD = 15.6) Median: 21.5 High score: 24 Low score: 13 Pointer
More informationSolutions to Exam Data structures (X and NV)
Solutions to Exam Data structures X and NV 2005102. 1. a Insert the keys 9, 6, 2,, 97, 1 into a binary search tree BST. Draw the final tree. See Figure 1. b Add NIL nodes to the tree of 1a and color it
More informationCSE 332 Spring 2013: Midterm Exam (closed book, closed notes, no calculators)
Name: Email address: Quiz Section: CSE 332 Spring 2013: Midterm Exam (closed book, closed notes, no calculators) Instructions: Read the directions for each question carefully before answering. We will
More informationCS302 Topic: Algorithm Analysis #2. Thursday, Sept. 21, 2006
CS302 Topic: Algorithm Analysis #2 Thursday, Sept. 21, 2006 Analysis of Algorithms The theoretical study of computer program performance and resource usage What s also important (besides performance/resource
More informationCS301 - Data Structures Glossary By
CS301 - Data Structures Glossary By Abstract Data Type : A set of data values and associated operations that are precisely specified independent of any particular implementation. Also known as ADT Algorithm
More informationObject-oriented programming. and data-structures CS/ENGRD 2110 SUMMER 2018
Object-oriented programming 1 and data-structures CS/ENGRD 2110 SUMMER 2018 Lecture 8: Sorting http://courses.cs.cornell.edu/cs2110/2018su Lecture 7 Recap 2 Introduced a formal notation for analysing the
More informationIntro to Algorithms. Professor Kevin Gold
Intro to Algorithms Professor Kevin Gold What is an Algorithm? An algorithm is a procedure for producing outputs from inputs. A chocolate chip cookie recipe technically qualifies. An algorithm taught in
More informationCS 171: Introduction to Computer Science II. Quicksort
CS 171: Introduction to Computer Science II Quicksort Roadmap MergeSort Recursive Algorithm (top-down) Practical Improvements Non-recursive algorithm (bottom-up) Analysis QuickSort Algorithm Analysis Practical
More informationLecture 9: Sorting Algorithms
Lecture 9: Sorting Algorithms Bo Tang @ SUSTech, Spring 2018 Sorting problem Sorting Problem Input: an array A[1..n] with n integers Output: a sorted array A (in ascending order) Problem is: sort A[1..n]
More informationCS2223: Algorithms Sorting Algorithms, Heap Sort, Linear-time sort, Median and Order Statistics
CS2223: Algorithms Sorting Algorithms, Heap Sort, Linear-time sort, Median and Order Statistics 1 Sorting 1.1 Problem Statement You are given a sequence of n numbers < a 1, a 2,..., a n >. You need to
More information(Refer Slide Time: 01.26)
Data Structures and Algorithms Dr. Naveen Garg Department of Computer Science and Engineering Indian Institute of Technology, Delhi Lecture # 22 Why Sorting? Today we are going to be looking at sorting.
More informationScribe: Sam Keller (2015), Seth Hildick-Smith (2016), G. Valiant (2017) Date: January 25, 2017
CS 6, Lecture 5 Quicksort Scribe: Sam Keller (05), Seth Hildick-Smith (06), G. Valiant (07) Date: January 5, 07 Introduction Today we ll study another sorting algorithm. Quicksort was invented in 959 by
More informationSelection (deterministic & randomized): finding the median in linear time
Lecture 4 Selection (deterministic & randomized): finding the median in linear time 4.1 Overview Given an unsorted array, how quickly can one find the median element? Can one do it more quickly than bysorting?
More informationJana Kosecka. Linear Time Sorting, Median, Order Statistics. Many slides here are based on E. Demaine, D. Luebke slides
Jana Kosecka Linear Time Sorting, Median, Order Statistics Many slides here are based on E. Demaine, D. Luebke slides Insertion sort: Easy to code Fast on small inputs (less than ~50 elements) Fast on
More informationSorting. CSE 143 Java. Insert for a Sorted List. Insertion Sort. Insertion Sort As A Card Game Operation. CSE143 Au
CSE 43 Java Sorting Reading: Ch. 3 & Sec. 7.3 Sorting Binary search is a huge speedup over sequential search But requires the list be sorted Slight Problem: How do we get a sorted list? Maintain the list
More informationNext. 1. Covered basics of a simple design technique (Divideand-conquer) 2. Next, more sorting algorithms.
Next 1. Covered basics of a simple design technique (Divideand-conquer) Ch. 2 of the text. 2. Next, more sorting algorithms. Sorting Switch from design paradigms to applications. Sorting and order statistics
More informationWe will give examples for each of the following commonly used algorithm design techniques:
Review This set of notes provides a quick review about what should have been learned in the prerequisite courses. The review is helpful to those who have come from a different background; or to those who
More informationMergeSort, Recurrences, Asymptotic Analysis Scribe: Michael P. Kim Date: April 1, 2015
CS161, Lecture 2 MergeSort, Recurrences, Asymptotic Analysis Scribe: Michael P. Kim Date: April 1, 2015 1 Introduction Today, we will introduce a fundamental algorithm design paradigm, Divide-And-Conquer,
More informationSAMPLE OF THE STUDY MATERIAL PART OF CHAPTER 6. Sorting Algorithms
SAMPLE OF THE STUDY MATERIAL PART OF CHAPTER 6 6.0 Introduction Sorting algorithms used in computer science are often classified by: Computational complexity (worst, average and best behavior) of element
More informationHere is a recursive algorithm that solves this problem, given a pointer to the root of T : MaxWtSubtree [r]
CSE 101 Final Exam Topics: Order, Recurrence Relations, Analyzing Programs, Divide-and-Conquer, Back-tracking, Dynamic Programming, Greedy Algorithms and Correctness Proofs, Data Structures (Heap, Binary
More informationSEARCHING, SORTING, AND ASYMPTOTIC COMPLEXITY. Lecture 11 CS2110 Spring 2016
1 SEARCHING, SORTING, AND ASYMPTOTIC COMPLEXITY Lecture 11 CS2110 Spring 2016 Time spent on A2 2 Histogram: [inclusive:exclusive) [0:1): 0 [1:2): 24 ***** [2:3): 84 ***************** [3:4): 123 *************************
More informationCOMP 250 Fall recurrences 2 Oct. 13, 2017
COMP 250 Fall 2017 15 - recurrences 2 Oct. 13, 2017 Here we examine the recurrences for mergesort and quicksort. Mergesort Recall the mergesort algorithm: we divide the list of things to be sorted into
More informationIntroduction to Algorithms
Lecture 1 Introduction to Algorithms 1.1 Overview The purpose of this lecture is to give a brief overview of the topic of Algorithms and the kind of thinking it involves: why we focus on the subjects that
More informationLecture #2. 1 Overview. 2 Worst-Case Analysis vs. Average Case Analysis. 3 Divide-and-Conquer Design Paradigm. 4 Quicksort. 4.
COMPSCI 330: Design and Analysis of Algorithms 8/28/2014 Lecturer: Debmalya Panigrahi Lecture #2 Scribe: Yilun Zhou 1 Overview This lecture presents two sorting algorithms, quicksort and mergesort, that
More informationPresentation for use with the textbook, Algorithm Design and Applications, by M. T. Goodrich and R. Tamassia, Wiley, Merge Sort & Quick Sort
Presentation for use with the textbook, Algorithm Design and Applications, by M. T. Goodrich and R. Tamassia, Wiley, 2015 Merge Sort & Quick Sort 1 Divide-and-Conquer Divide-and conquer is a general algorithm
More informationCS473 - Algorithms I
CS473 - Algorithms I Lecture 4 The Divide-and-Conquer Design Paradigm View in slide-show mode 1 Reminder: Merge Sort Input array A sort this half sort this half Divide Conquer merge two sorted halves Combine
More informationCS Data Structures and Algorithm Analysis
CS 483 - Data Structures and Algorithm Analysis Lecture VI: Chapter 5, part 2; Chapter 6, part 1 R. Paul Wiegand George Mason University, Department of Computer Science March 8, 2006 Outline 1 Topological
More informationLecture 5: Sorting Part A
Lecture 5: Sorting Part A Heapsort Running time O(n lg n), like merge sort Sorts in place (as insertion sort), only constant number of array elements are stored outside the input array at any time Combines
More informationComputer Science Foundation Exam
Computer Science Foundation Exam January 13, 2018 Section I A DATA STRUCTURES SOLUTIONS NO books, notes, or calculators may be used, and you must work entirely on your own. Question # Max Pts Category
More informationComputer Science Foundation Exam
Computer Science Foundation Exam December 16, 2016 Section I A DATA STRUCTURES NO books, notes, or calculators may be used, and you must work entirely on your own. SOLUTION Question # Max Pts Category
More informationHow many leaves on the decision tree? There are n! leaves, because every permutation appears at least once.
Chapter 8. Sorting in Linear Time Types of Sort Algorithms The only operation that may be used to gain order information about a sequence is comparison of pairs of elements. Quick Sort -- comparison-based
More informationLecture Notes for Chapter 2: Getting Started
Instant download and all chapters Instructor's Manual Introduction To Algorithms 2nd Edition Thomas H. Cormen, Clara Lee, Erica Lin https://testbankdata.com/download/instructors-manual-introduction-algorithms-2ndedition-thomas-h-cormen-clara-lee-erica-lin/
More informationMergeSort, Recurrences, Asymptotic Analysis Scribe: Michael P. Kim Date: September 28, 2016 Edited by Ofir Geri
CS161, Lecture 2 MergeSort, Recurrences, Asymptotic Analysis Scribe: Michael P. Kim Date: September 28, 2016 Edited by Ofir Geri 1 Introduction Today, we will introduce a fundamental algorithm design paradigm,
More informationSorting. Popular algorithms: Many algorithms for sorting in parallel also exist.
Sorting Popular algorithms: Selection sort* Insertion sort* Bubble sort* Quick sort* Comb-sort Shell-sort Heap sort* Merge sort* Counting-sort Radix-sort Bucket-sort Tim-sort Many algorithms for sorting
More informationCSE 373: Data Structures and Algorithms
CSE 373: Data Structures and Algorithms Lecture 19: Comparison Sorting Algorithms Instructor: Lilian de Greef Quarter: Summer 2017 Today Intro to sorting Comparison sorting Insertion Sort Selection Sort
More informationAPCS-AB: Java. Recursion in Java December 12, week14 1
APCS-AB: Java Recursion in Java December 12, 2005 week14 1 Check point Double Linked List - extra project grade Must turn in today MBCS - Chapter 1 Installation Exercises Analysis Questions week14 2 Scheme
More informationProblem solving paradigms
Problem solving paradigms Bjarki Ágúst Guðmundsson Tómas Ken Magnússon Árangursrík forritun og lausn verkefna School of Computer Science Reykjavík University Today we re going to cover Problem solving
More informationCOMP Data Structures
COMP 2140 - Data Structures Shahin Kamali Topic 5 - Sorting University of Manitoba Based on notes by S. Durocher. COMP 2140 - Data Structures 1 / 55 Overview Review: Insertion Sort Merge Sort Quicksort
More informationTotal Points: 60. Duration: 1hr
CS800 : Algorithms Fall 201 Nov 22, 201 Quiz 2 Practice Total Points: 0. Duration: 1hr 1. (,10) points Binary Heap. (a) The following is a sequence of elements presented to you (in order from left to right):
More informationLecture 19 Sorting Goodrich, Tamassia
Lecture 19 Sorting 7 2 9 4 2 4 7 9 7 2 2 7 9 4 4 9 7 7 2 2 9 9 4 4 2004 Goodrich, Tamassia Outline Review 3 simple sorting algorithms: 1. selection Sort (in previous course) 2. insertion Sort (in previous
More informationCS 137 Part 8. Merge Sort, Quick Sort, Binary Search. November 20th, 2017
CS 137 Part 8 Merge Sort, Quick Sort, Binary Search November 20th, 2017 This Week We re going to see two more complicated sorting algorithms that will be our first introduction to O(n log n) sorting algorithms.
More information16 Greedy Algorithms
16 Greedy Algorithms Optimization algorithms typically go through a sequence of steps, with a set of choices at each For many optimization problems, using dynamic programming to determine the best choices
More informationAlgorithms in Systems Engineering ISE 172. Lecture 12. Dr. Ted Ralphs
Algorithms in Systems Engineering ISE 172 Lecture 12 Dr. Ted Ralphs ISE 172 Lecture 12 1 References for Today s Lecture Required reading Chapter 6 References CLRS Chapter 7 D.E. Knuth, The Art of Computer
More informationAlgorithm Efficiency & Sorting. Algorithm efficiency Big-O notation Searching algorithms Sorting algorithms
Algorithm Efficiency & Sorting Algorithm efficiency Big-O notation Searching algorithms Sorting algorithms Overview Writing programs to solve problem consists of a large number of decisions how to represent
More informationSorting. There exist sorting algorithms which have shown to be more efficient in practice.
Sorting Next to storing and retrieving data, sorting of data is one of the more common algorithmic tasks, with many different ways to perform it. Whenever we perform a web search and/or view statistics
More information