Data Structures and Algorithms Key to Homework 1

January 31, 2005

1.5 Define an ADT for a set of integers (remember that a set may not contain duplicates). Your ADT should consist of the functions that can be performed on the sets, with each function defined in terms of its input and output.

Here are some possible operations.

(a) Insert. Input: a set $A$ and an integer $n$. Output: the set $A \cup \{n\}$.

(b) Delete. Input: a set $A$ and an integer $n$. Output: the set $A \setminus \{n\}$, if $n \in A$, and the set $A$ together with a message to the effect that $n$ isn't in $A$, if $n \notin A$.

(c) Member. Input: a set $A$ and an integer $n$. Output: true if $n \in A$, and false otherwise.

(d) Union. Input: two sets $A$ and $B$. Output: their union $A \cup B$. Note that implementing this may involve searching for duplicates.

(e) Intersection. Input: two sets $A$ and $B$. Output: their intersection $A \cap B$.

(f) Set difference. Input: two sets $A$ and $B$. Output: their difference $A \setminus B$. That is, the output is $\{a \in A : a \notin B\}$.

Note that in an object-oriented setting (e.g., Java), the set (or one of the sets in the case of union, intersection, and set difference) will probably be an implicit input, since the operation will most likely be a method on a set object.

1.13 A common problem for compilers and text editors is to determine if the parentheses (or other brackets) in a string are balanced and properly nested. For example, the string ((())())() contains properly nested pairs of parentheses, but the string )()( does not; and the string ()) does not contain properly matching parentheses.

(a) Give an algorithm that returns true if a string contains properly nested and balanced parentheses, and false otherwise. Hint: At no time while scanning a legal string from left to right will you have encountered more right parentheses than left parentheses.

The algorithm should scan the string from left to right, keeping track of the number of unmatched left parentheses. If, at any time, the number is negative, there's an unmatched right parenthesis and the algorithm can terminate with an error message. If, after the entire string has been scanned, the number of unmatched left parentheses is positive, there's an unmatched left parenthesis and the algorithm should terminate with an error. If, after the entire string has been scanned, the number is 0, the string is OK. See Parens.java on the class website for details.
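
For reference, here is a minimal Java sketch of the left-to-right counting scan described above. It is not the Parens.java posted on the class website; the class and method names are made up for illustration, and it handles only round parentheses.

    // A minimal sketch of the counting approach described above (not the
    // Parens.java from the class website; class and method names are my own).
    public class ParenCheck {
        // Returns true iff the parentheses in s are balanced and properly nested.
        public static boolean balanced(String s) {
            int unmatched = 0;                      // unmatched left parentheses so far
            for (int i = 0; i < s.length(); i++) {
                char ch = s.charAt(i);
                if (ch == '(') unmatched++;
                else if (ch == ')') unmatched--;
                if (unmatched < 0) return false;    // a right parenthesis with no match
            }
            return unmatched == 0;                  // leftovers are unmatched left parens
        }

        public static void main(String[] args) {
            System.out.println(balanced("((())())()"));  // true
            System.out.println(balanced(")()("));        // false
            System.out.println(balanced("())"));         // false
        }
    }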

3.3 Arrange the following expressions by growth rate from slowest to fastest: $4n^2$, $\log_3(n)$, $3^n$, $20n$, $2$, $\log_2(n)$, $n^{2/3}$. Where does $n!$ fit into this ordering?

$$2 \;<\; \log_3(n),\ \log_2(n) \;<\; n^{2/3} \;<\; 20n \;<\; 4n^2 \;<\; 3^n$$

(The two logarithms are listed together because they differ only by a constant factor and so have the same growth rate.) You can directly calculate that $6! = 720$, $3^6 = 729$, $7! = 5040$, and $3^7 = 2187$. So for $n \ge 7$, $n! > 3^n$, and $n!$ belongs at the end of the ordering, growing faster than $3^n$.

3.4 (a) Suppose that a particular algorithm has time complexity $T(n) = 3 \cdot 2^n$, and that executing an implementation of it on a particular machine takes $T$ seconds for $n$ inputs. Now suppose that we are presented with a machine that is 64 times as fast. How many inputs could we process on the new machine in $T$ seconds?

(b) Suppose that another algorithm has time complexity $T(n) = n^2$, and that executing an implementation of it on a particular machine takes $T$ seconds for $n$ inputs. Now suppose that we are presented with a machine that is 64 times as fast. How many inputs could we process on the new machine in $T$ seconds?

(c) Suppose that a third algorithm has time complexity $T(n) = 8n$, and that executing an implementation of it on a particular machine takes $T$ seconds for $n$ inputs. Now suppose that we are presented with a machine that is 64 times as fast. How many inputs could we process on the new machine in $T$ seconds?

The key to working these problems is observing that the runtime $T$ tells you roughly the number of instructions executed. For example, suppose $T = 1000$ seconds and the old machine executes $10^6$ instructions/second. Then in $T$ seconds, the old machine will execute $10^6 \cdot 10^3 = 10^9$ instructions, and the new machine will execute $64 \cdot 10^9$ instructions. Thus, in any given period of time the new machine will do 64 times as much work as the old machine. Hence, if $T_{old}(n)$ is the runtime of a program on the old machine, its runtime on the new machine will be $T_{new}(n) = \frac{1}{64} T_{old}(n)$.

In the following discussion suppose that $n_0$ is the problem size on the old machine, and $n_1$ is the problem size on the new machine.

(a) We have $T_{old}(n) = 3 \cdot 2^n$, and $T_{old}(n_0) = T$. We want to know for what value of $n = n_1$ we will have $T_{new}(n_1) = T$. Using our formulas, we have
$$3 \cdot 2^{n_0} = \frac{1}{64} \cdot 3 \cdot 2^{n_1}.$$
So $64 \cdot 2^{n_0} = 2^{n_1}$, and since $64 = 2^6$, we have $2^{6 + n_0} = 2^{n_1}$. Thus, $n_1 = 6 + n_0$. If, for example, we could previously solve a problem of size 100, we can now solve a problem of size 106.
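
As an informal sanity check of these answers (my own illustration, not part of the original key; the class name and the sample size $n_0 = 10$ are arbitrary), the following Java snippet prints the $n!$ versus $3^n$ crossover and confirms that the derived problem sizes take the same time on the 64-times-faster machine.

    // Verifies the n! vs. 3^n crossover at n = 7 and the speed-up results of 3.4.
    public class GrowthCheck {
        public static void main(String[] args) {
            // 3.3: n! overtakes 3^n at n = 7.
            long fact = 1, pow3 = 1;
            for (int n = 1; n <= 7; n++) {
                fact *= n;
                pow3 *= 3;
                System.out.println(n + "! = " + fact + ",  3^" + n + " = " + pow3);
            }

            // 3.4: on a machine that is 64 times as fast, the same amount of work
            // solves size n+6 for T(n) = 3*2^n, size 8n for T(n) = n^2,
            // and size 64n for T(n) = 8n.
            int n0 = 10;
            System.out.println(3 * Math.pow(2, n0 + 6) / 64 == 3 * Math.pow(2, n0)); // true
            System.out.println(Math.pow(8 * n0, 2) / 64 == Math.pow(n0, 2));         // true
            System.out.println(8.0 * (64 * n0) / 64 == 8.0 * n0);                    // true
        }
    }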

(b) Using reasoning similar to that in part (a), we have $n_0^2 = \frac{1}{64} n_1^2$. So $64 n_0^2 = n_1^2$, and $8 n_0 = n_1$. So we can solve a problem 8 times larger on the new machine.

(c) Reasoning as in part (a) again, we have $8 n_0 = \frac{1}{64} \cdot 8 n_1$, and $64 n_0 = n_1$. So we can solve a problem 64 times larger on the new machine.

3.6 Using the definitions of big-Oh and Omega, find the upper and lower bounds for the following expressions. Be sure to state appropriate values for $c$ and $n_0$.

(a) $c_1 n$
(b) $c_2 n^3 + c_3$

Since we're interested in runtimes, I'll assume that $c_1$ and $c_2$ are positive.

(a) $c_1 n$ is $O(n)$ and $\Omega(n)$. For if we let $c = c_1$ and $n_0 = 1$, then we have that $c_1 n \le c n$ for all $n > n_0$, and $c_1 n \ge c n$ for all $n > n_0$.

(b) $c_2 n^3 + c_3$ is $O(n^3)$ and $\Omega(n^3)$. First consider the case in which $c_3 \ge 0$. In this case we have
$$c_2 n^3 \le c_2 n^3 + c_3 \le (c_2 + c_3) n^3$$
for all $n$. The second inequality holds because $n \ge 1$ and hence $n^3 \ge 1$, so $c_3 n^3 \ge c_3$.

If $c_3 < 0$, then clearly $c_2 n^3 + c_3 \le c_2 n^3$ for all $n$. So we only need to find $c$ and $n_0$ so that $c n^3 \le c_2 n^3 + c_3$ for all $n \ge n_0$. To make things a little clearer, let $c_4 = -c_3$. Then $c_2 n^3 + c_3 = c_2 n^3 - c_4$, and let $T(n) = c_2 n^3 - c_4$. Now observe that if we choose $c > 0$ so that at some point $n_1 > 0$ we have $c n_1^3 < c_2 n_1^3 - c_4$, then we'll have $c n^3 < c_2 n^3 - c_4$ for all $n \ge n_1$. We know this because of the following observations. Consider the function $h(n) = c_2 n^3 - c n^3 - c_4 = (c_2 - c) n^3 - c_4$. Then $h(n) > 0$ iff $c n^3 < c_2 n^3 - c_4$. So if $c n_1^3 < c_2 n_1^3 - c_4$, then $h(n_1) > 0$. But since $n_1 > 0$, $n_1^3 > 0$, so in order for $h(n_1) > 0$ we must have $c_2 - c > 0$. But $c_2 - c$ is constant, and if $n_2 > n_1$, then $n_2^3 > n_1^3$. So if $n > n_1$, we must have $h(n) > 0$. In other words, $c n^3 < c_2 n^3 - c_4$ for all $n > n_1$.

So we only need to find a point $n_1$ where $T(n_1)$ is positive, and then choose $c$ so that $c n_1^3 < T(n_1)$. But we know that $T(n)$ is increasing (why?). So all we have to do is (say) solve the equation $T(n_1) = 1$. We can use basic algebra to get
$$n_1 = \left( \frac{1 + c_4}{c_2} \right)^{1/3}.$$
Of course this may not be a whole number, so we just choose
$$n_0 = \left\lceil \left( \frac{1 + c_4}{c_2} \right)^{1/3} \right\rceil.$$
Note that since $T(0) = -c_4 < 0$, $n_1 > 0$ and hence $n_0 \ge 1$. Also note that since $T(n)$ is increasing, we must have $T(n_0) \ge 1$. Now let
$$c = \frac{1}{2 n_0^3}.$$
Then $c > 0$ and
$$c n_0^3 = \frac{1}{2 n_0^3} \cdot n_0^3 = \frac{1}{2} < 1 \le T(n_0).$$
So by our preceding observations we have $c n^3 \le c_2 n^3 + c_3$ for all $n \ge n_0$.

3.9 Determine big-Theta for the following code fragments in the average case. Assume that all variables are of type int. [We also assume that $n \ge 1$. Note that for these problems the number of statements executed is the same in the best case, worst case, and average case.]

(a)

    a = b + c;
    d = a + e;

This is two statements. Hence it's $\Theta(1)$.

(b)

    sum = 0;
    for (i = 0; i < 3; i++)
        for (j = 0; j < n; j++)
            sum++;

The inner loop is executed $n$ times, so it executes $an + b$ statements for some constants $a$ and $b$ with $a > 0$. The outer loop executes 3 times. So when we add in the initialization of sum, the entire code segment will execute $c(an + b) + d$ statements for some constants $c$ and $d$ with $c > 0$. Multiplying this out, we see that the number of statements executed is $en + f$, for some constants $e$ and $f$ with $e > 0$. We saw in class that this function is $\Theta(n)$.

(c)

    sum = 0;
    for (i = 0; i < n*n; i++)
        sum++;

Let $m = n^2$. Then the number of statements executed is $am + b$ for some constants $a$ and $b$ with $a > 0$. So this is $\Theta(m)$, and since $m = n^2$, we see that this is $\Theta(n^2)$.

(d) Assume array A contains n values.

    for (i = 0; i < n; i++) {
        for (j = 0; j < n; j++)
            A[i] = DSutil.random(n);   // random takes constant time
        sort(A, n);                    // sort takes n log n time
    }

The inner loop executes $an + b$ statements for constants $a, b$ with $a > 0$. The call to sort takes time $n \log n$. So the total number of statements executed in the body of the outer loop is $c n \log(n) + d n + e$, for constants $c, d, e$ with $c > 0$. Since the outer loop is executed $n$ times, the total number of statements executed is $f n (c n \log(n) + d n + e) + g$. Multiplying this out, we see that the code is $\Theta(n^2 \log(n))$.

Note that this argument can be considerably simplified by using the rules on page 56 of the text (which also apply to big-$\Omega$ and big-$\Theta$). The inner loop is $\Theta(n)$. The sort is $\Theta(n \log n)$. So the body of the outer loop is $\Theta(n \log n)$ (by rule 3). Thus, by rule 4, the code is $\Theta(n^2 \log n)$.
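
For anyone who wants to experiment with fragment (d), here is a runnable Java stand-in (my own sketch, not the textbook's code): java.util.Random replaces DSutil.random and Arrays.sort replaces sort(A, n), which leaves the $\Theta(n^2 \log n)$ structure intact.

    // A runnable stand-in for fragment (d), for experimentation only. The original
    // uses the textbook's DSutil.random and a sort(A, n) routine; here java.util.Random
    // and Arrays.sort are substituted, which does not change the Theta(n^2 log n) shape.
    import java.util.Arrays;
    import java.util.Random;

    public class FragmentD {
        public static void main(String[] args) {
            int n = 1000;                       // problem size (arbitrary choice)
            int[] A = new int[n];
            Random rng = new Random();

            for (int i = 0; i < n; i++) {       // outer loop: n iterations
                for (int j = 0; j < n; j++)     // inner loop: Theta(n) constant-time assignments
                    A[i] = rng.nextInt(n);      // stand-in for DSutil.random(n)
                Arrays.sort(A);                 // stand-in for sort(A, n): Theta(n log n)
            }
            // Total work: n * (Theta(n) + Theta(n log n)) = Theta(n^2 log n).
            System.out.println("done, A[0] = " + A[0]);
        }
    }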