CS126 Final Exam Review

Fall 2007

1 Asymptotic Analysis (Big-O)

Definition. f(n) is O(g(n)) if there exist constants c, n0 > 0 such that

    f(n) ≤ c·g(n)  for all n ≥ n0.

We have not proved any theorems about Big-O, so whenever we attempt to prove or disprove anything to do with Big-O, we must use the definition. Note the following derivations:

    f(n) ≤ c·g(n)               for all n ≥ n0
    f(n)/g(n) ≤ c               for all n ≥ n0 (when g(n) > 0)
    lim (n→∞) f(n)/g(n) ≤ c

Any polynomial a_k·n^k + a_(k-1)·n^(k-1) + ... + a_1·n + a_0 is O(n^k).

1.1 General Approaches to Big-O Proofs

Never state that a proof is "obvious by the graph." We are looking for mathematical proofs, not plausibility arguments. You must use the definition in any proof about Big-O.

1.1.1 Proving f(n) is O(g(n))

Prove the inequality by finding a suitable c and n0.

1.1.2 Proving f(n) is not O(g(n))

Always use a proof by contradiction. You can use the limit derivation above: show that the left-hand limit is unbounded, so it cannot be bounded by any constant c.

1.1.3 Inequalities that can be stated without proof

    c·log(n) ≤ n
    n^k ≤ k^n

for any constants c, k where k > 1.

Problem 1. Prove 3·sin²(nπ) is O(log(n)).

Problem 2. Prove 2^n is not O(n²).

Problem 3. Prove (n² − 1)^(1/2) is O(3n).

Problem 4. Use the definition of Big-O to prove the following statement: if f(x) is O(h(x)) and g(x) is O(h(x)), then a·f(x) + b·g(x) is also O(h(x)), for any constants a > 0, b > 0.
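To illustrate the approach in 1.1.1, here is a worked example (an added illustration, not one of the review problems) proving that 2n² + 3n is O(n²):

```latex
% Claim: 2n^2 + 3n is O(n^2).
% Strategy: find explicit constants c and n_0 satisfying the definition.
\begin{align*}
  2n^2 + 3n &\le 2n^2 + 3n^2 \quad \text{for all } n \ge 1 \\
            &= 5n^2
\end{align*}
% Taking c = 5 and n_0 = 1, we have 2n^2 + 3n \le c \cdot n^2
% for all n \ge n_0, so 2n^2 + 3n is O(n^2) by the definition.
```

The same pattern (bound every lower-order term by the dominant term, then add up the coefficients) works for any polynomial.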

1.2 Determining Big-O Runtimes of Code

You can always determine the Big-O runtime of a line of code by summing the total number of times that line is run. The Big-O runtime of a method is the maximum runtime over all the lines of code in the method. The line that runs the maximum number of times is typically in the innermost loop.

When we have nested loops, if the loops are independent of each other, we can multiply together the Big-O runtimes of the two loops to determine the number of times the code inside the loops will be run. If the inner loop's runtime depends on the outer loop, then we cannot simply multiply the two runtimes together; we must sum the total number of times that we reach the inside of the inner loop.

Example 1.

    for (int i = 0; i < n*n; i++) {
        for (int j = i; j < n; j++) {
            // (***) Determine the runtime of this line
        }
    }

In this case, it would be a mistake to say that the outer loop has runtime O(n²) and the inner loop has runtime O(n), so the total runtime is O(n³). The problem with this reasoning is that the number of times the inner loop runs depends on the iteration of the outer loop: note that we refer to i in the inner loop. Thus we must sum the total runtime of the line inside the inner loop. This is done as follows:

    Values of i              0        1        ...   n-1   n    ...  n²-1
    Values of j              0..n-1   1..n-1   ...   n-1   -    ...  -
    # of times (***) runs    n        n-1      ...   1     0    ...  0

It would seem, then, that since (***) is not run when i = n..n²-1, the runtime must be

    n + (n-1) + ... + 1 + 0 + ... + 0 = n(n+1)/2

which is O(n²). However, when i = n..n²-1, j is still assigned to be equal to i, and the comparison j < n is still made (in the inner for-loop's definition). These operations are O(1), and they are run whenever i = n..n²-1, which is n² - n times. Thus, the total runtime is
    O(n²) + (n² - n)·O(1) = O(n²) + O(n²) - O(n) = O(n²)

Some examples to remember (let c be a constant, and assume there is no way to exit the loop early):

    for (int i = 0; i < n; i++)        runs O(n) times
    for (int i = 0; i < n; i += c)     runs O(n) times
    for (int i = 0; i < n*n; i += c)   runs O(n²) times
    for (int i = 0; i < n; i = i*c)    runs O(log(n)) times (for c > 1)
    for (int i = n; i > 1; i = i/c)    runs O(log(n)) times (for c > 1)

1.3 Code Analysis

Problem 5. Find the run-time complexity on input of size n:

    for (double i = 1; i <= n; i *= 4.0/3.0) {
        System.out.println(i);
    }

Problem 6. Find the run-time complexity on input of size n:

    for (int i = n; i > 1; i = i - i/2) {
        for (int j = 1; j < Math.pow(n, 2); j = j + j) {
            System.out.println(i);
        }
    }

Problem 7. Find the run-time complexity on input of size n:

    for (int i = 1; i < n; i = i*2) {
        for (int j = 0; j < Math.pow(n, 3); j = j + i) {
            System.out.println(i);
        }
    }

Problem 8. Find the auxiliary space and run-time complexity on input of size n:

    int num1 = 0;
    int num2 = 0;
    int[] array = new int[n];
    for (int i = 1; i <= n; i++) {
        if (i == 1) {
            for (int j = i; j <= n; j = j*3) {
                System.out.println(j);
            }
        } else {
            System.out.println(i);
            num1++;
            num2++;
        }
    }
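The counting rules above can be checked empirically by instrumenting the loops. The sketch below (an added illustration; the helper names are mine) counts how many times the (***) line of Example 1 runs, and how many iterations a multiply-by-c loop makes:

```java
public class LoopCounts {
    // Count executions of the (***) line in Example 1 (section 1.2).
    static long countExample1(int n) {
        long count = 0;
        for (int i = 0; i < n * n; i++) {
            for (int j = i; j < n; j++) {
                count++; // the (***) line
            }
        }
        return count; // expected: n(n+1)/2
    }

    // Count iterations of "for (i = 1; i < n; i = i*c)" -- logarithmic growth.
    static int countGeometric(int n, int c) {
        int count = 0;
        for (int i = 1; i < n; i = i * c) {
            count++;
        }
        return count; // roughly log base c of n
    }

    public static void main(String[] args) {
        System.out.println(countExample1(6));        // 6*7/2 = 21
        System.out.println(countGeometric(1024, 2)); // 10
    }
}
```

For n = 6 the body runs 6 + 5 + 4 + 3 + 2 + 1 = 21 times, matching n(n+1)/2, and the doubling loop on n = 1024 makes 10 iterations, matching log₂(1024).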

2 Recursion

2.1 Recursive Definitions

For a recursive definition, we need a base case (or cases) to define the smallest possible instance, and a recursive case (or cases) to define all other possible instances.

Problem 9. Give a recursive definition of an ordered sublist.

Note. <4, <5, <1, <2, <9, <>>>>>> is not ordered. <1, <2, <4, <5, <9, <>>>>>> is ordered.

2.2 Recursive Programs

Problem 10. Provide a postcondition for the following recursive method.

    // pre: q is not null
    // post: ???
    public static void mystery(QueueInterface q) {
        if (q.isEmpty()) return;
        Object temp = q.dequeue();
        mystery(q);
        q.enqueue(temp);
    }
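As a reminder of the base-case/recursive-case pattern (a generic added illustration, not one of the review problems):

```java
public class RecursionDemo {
    // Base case: the smallest instance (n == 0) is answered directly.
    // Recursive case: every other instance is defined in terms of a
    // strictly smaller instance (n - 1), so the recursion terminates.
    static long factorial(int n) {
        if (n == 0) {
            return 1;                    // base case
        }
        return n * factorial(n - 1);     // recursive case
    }

    public static void main(String[] args) {
        System.out.println(factorial(5)); // 120
    }
}
```

Every recursive definition or method on the exam should be checked the same way: is there a base case, and does the recursive case always move toward it?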

3 Table ADT

3.1 Properties

An ADT Table consists of KeyedItems. A KeyedItem consists of a Comparable object and an Object. The Comparable object is the KeyedItem's key, and the Object is the KeyedItem's item or value. Comparable objects have a compareTo method.

In general, tables cannot be traversed. However, a table may be traversed using an iterator, if one exists or if you know the implementation of the table. In the latter case it is necessary to write an iterator method as part of the table's implementation.

Problem 11. Implement the following method, which merges the contents of two tables into one:

    public static void mergeTables(TableInterface t1, TableInterface t2) {
        // pre:  t1, t2 != null
        // post: all KeyedItems from t2 that are not already in t1 are
        //       added to t1. t2 remains unchanged.
        Iterator iterator = t2.getIterator();
    }
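A minimal sketch of what a KeyedItem might look like (this class body is an assumption for illustration; the course's actual class may differ):

```java
// Hypothetical sketch of a KeyedItem: a Comparable key paired with a value.
public class KeyedItem {
    private final Comparable key;   // the key, used for ordering/lookup
    private final Object value;     // the item/value associated with the key

    public KeyedItem(Comparable key, Object value) {
        this.key = key;
        this.value = value;
    }

    public Comparable getKey() { return key; }
    public Object getValue() { return value; }
}
```

The key is what compareTo is called on when a table implementation searches or orders its items; the value is opaque to the table.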

4 Trees and Searching

4.1 Binary Tree ADT

Note. A binary tree's left subtree is empty if leftSubtree().isEmpty() is true, not if leftSubtree() == null. The same applies to the right subtree.

Problem 12. Given a binary tree (BinaryTreeInterface), write a recursive method to count its number of empty subtrees.

4.2 Tree Traversals

It is important to be familiar with the four different types of traversals that were shown in lectures and tutorials.

Problem 13. State the level-order and the pre-order traversal for the following tree:

Problem 14. Using the following post-order and in-order traversals, construct the binary tree that they represent.

    In-order traversal:   1 2 4 8 16 32 64 128
    Post-order traversal: 1 4 2 64 32 16 128 8
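To illustrate one of the traversals (an added sketch using a minimal Node class of my own, not the course's BinaryTreeInterface), pre-order visits the root first, then the left subtree, then the right subtree:

```java
import java.util.ArrayList;
import java.util.List;

public class TraversalDemo {
    // Minimal node type, for illustration only.
    static class Node {
        int item;
        Node left, right;
        Node(int item, Node left, Node right) {
            this.item = item; this.left = left; this.right = right;
        }
    }

    // Pre-order: visit root, then recurse left, then recurse right.
    static void preOrder(Node t, List<Integer> out) {
        if (t == null) return;   // empty subtree: nothing to visit
        out.add(t.item);
        preOrder(t.left, out);
        preOrder(t.right, out);
    }

    public static void main(String[] args) {
        //        2
        //       / \
        //      1   3
        Node root = new Node(2, new Node(1, null, null), new Node(3, null, null));
        List<Integer> out = new ArrayList<>();
        preOrder(root, out);
        System.out.println(out); // [2, 1, 3]
    }
}
```

In-order and post-order differ only in where the `out.add` line sits relative to the two recursive calls, which is a useful way to remember all three.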

4.3 Binary Search

Suppose that we want to improve on the binary search algorithm. Instead of repeatedly dividing our array in halves, we will divide it into thirds and call the method recursively on the appropriate third. We will call this method triSearch. Implement the triSearch method below. What is the Big-O runtime of this method? Is it more efficient than binary search?

Problem 15. Implement the following method triSearch:

    // pre:  0 <= lo <= hi <= data.length, key != null, data != null,
    //       data is filled with non-null values
    // post: returns the ideal position of key in data from index lo
    //       to index hi
    public int triSearch(int lo, int hi, Comparable key, Comparable[] data) {
        // your code goes here
    }
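One possible sketch of the divide-into-thirds idea (an added illustration using int keys rather than Comparable for brevity; the small-range scan is my choice for handling ranges too short to cut into thirds):

```java
public class TriSearchDemo {
    // Returns the ideal position of key in sorted data[lo..hi), i.e. the
    // first index whose element is >= key.
    static int triSearch(int[] data, int lo, int hi, int key) {
        if (hi - lo < 3) {               // range too small to cut: scan it
            while (lo < hi && data[lo] < key) lo++;
            return lo;
        }
        int third = (hi - lo) / 3;
        int m1 = lo + third;             // first cut point
        int m2 = lo + 2 * third;         // second cut point
        if (key < data[m1]) {
            return triSearch(data, lo, m1, key);   // leftmost third
        } else if (key < data[m2]) {
            return triSearch(data, m1, m2, key);   // middle third
        } else {
            return triSearch(data, m2, hi, key);   // rightmost third
        }
    }

    public static void main(String[] args) {
        int[] data = {1, 3, 5, 7, 9, 11, 13};
        System.out.println(triSearch(data, 0, data.length, 7)); // 3
    }
}
```

Each call makes at most two comparisons and recurses on a third of the range, so the runtime satisfies T(n) = T(n/3) + O(1), which is O(log₃ n). Since log₃ n = log₂ n / log₂ 3, this is still O(log n): the same asymptotic class as binary search, differing only by a constant factor.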

4.4 Binary Search Trees

Recall that Binary Search Trees are special Binary Trees such that for every subtree, the following property holds: every key in the left subtree is less than the key at the root, and every key in the right subtree is greater than the key at the root.

Consider the following Binary Search Tree:

Problem 16. Show the tree after 5, 15, 20, and 25 are inserted.

Problem 17. Show the tree after 8 and 10 are removed from the resulting tree above.

5 Sorting

5.1 Selection Sort

Suppose we have the following array of ints:

Problem 18. What will the array look like after one pass of Selection Sort, if we choose to select the smallest item each time instead of the largest?

Problem 19. After 2 passes?

Problem 20. After 6 passes?

Problem 21. What is the best-case run-time for Selection Sort?

Problem 22. What is the worst-case run-time?
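For tracing the passes, here is a sketch of Selection Sort in the select-the-smallest variant the problems refer to (an added minimal implementation on an int array; the course's version may differ in details):

```java
import java.util.Arrays;

public class SelectionSortDemo {
    // Pass p: find the smallest element in a[p..] and swap it into
    // position p. After k passes, a[0..k-1] holds the k smallest
    // elements in sorted order.
    static void selectionSort(int[] a) {
        for (int p = 0; p < a.length - 1; p++) {
            int min = p;
            for (int j = p + 1; j < a.length; j++) {
                if (a[j] < a[min]) min = j;
            }
            int tmp = a[p]; a[p] = a[min]; a[min] = tmp;
        }
    }

    public static void main(String[] args) {
        int[] a = {29, 10, 14, 37, 13};
        selectionSort(a);
        System.out.println(Arrays.toString(a)); // [10, 13, 14, 29, 37]
    }
}
```

Note that the inner loop always scans the whole unsorted region, regardless of the input order; that observation is the key to Problems 21 and 22.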

5.2 Insertion Sort

Problem 23. Give an array of length n which will cause Insertion Sort to run in O(n) time, assuming that elements are to be sorted in ascending order.

Problem 24. Trace the Insertion Sort algorithm on the following array of length 4, showing the state of the array after each insertion. Also count the number of positions of the array that are considered sorted.

    Iteration   a[0]   a[1]   a[2]   a[3]   Amount Sorted
        0        3      2      4      1          1
        1
        2
        3

5.3 Mergesort and Quicksort

Consider the following input array:

Problem 25. Using Mergesort: what does the array look like just before the final call to merge?

Problem 26. Using Quicksort: what does the array look like just after the first call to partition, using the first element as the pivot?

5.3.1 Efficiency

We know that Mergesort and Quicksort both run in O(n log n) time in the best/average case. Assume that Quicksort is implemented such that the pivot element is the first element in the array.

Problem 27. When would Quicksort be preferred over Mergesort?

Problem 28. When would Mergesort be preferred over Quicksort?

5.4 Sorting Demos

See the links for the last tutorial for the PowerPoint slide demonstrations of the different sorting algorithms.
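For practicing partition traces, here is one common first-element-pivot partition scheme (an added sketch, Lomuto-style adapted to a first-element pivot; the course's partition may differ, so follow whatever scheme was taught when answering Problem 26):

```java
import java.util.Arrays;

public class PartitionDemo {
    // Partition a[lo..hi] around pivot a[lo]; returns the pivot's final
    // index. Elements <= pivot end up to its left, larger ones to its right.
    static int partition(int[] a, int lo, int hi) {
        int pivot = a[lo];
        int boundary = lo;           // a[lo+1..boundary] holds values <= pivot
        for (int j = lo + 1; j <= hi; j++) {
            if (a[j] <= pivot) {
                boundary++;
                int tmp = a[boundary]; a[boundary] = a[j]; a[j] = tmp;
            }
        }
        // Move the pivot between the two regions.
        int tmp = a[lo]; a[lo] = a[boundary]; a[boundary] = tmp;
        return boundary;
    }

    public static void main(String[] args) {
        int[] a = {5, 8, 1, 9, 3, 7};
        int p = partition(a, 0, a.length - 1);
        System.out.println(p + " " + Arrays.toString(a)); // 2 [3, 1, 5, 9, 8, 7]
    }
}
```

In the example, pivot 5 lands at index 2 with {3, 1} to its left and {9, 8, 7} to its right. A sorted or reverse-sorted input makes one side empty on every call, which is why a first-element pivot gives Quicksort its O(n²) worst case.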