CSE373: Data Structures and Algorithms
Lecture 4: Asymptotic Analysis
Aaron Bauer, Winter 2014

Previously, on CSE 373

- We want to analyze algorithms for efficiency (in time and space)
- And do so generally and rigorously (not by timing an implementation)
- We will primarily consider worst-case running time
- Example: find an integer in a sorted array
  - Linear search: O(n)
  - Binary search: O(log n)
  - Had to solve a recurrence relation to see this
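
To make those two running times concrete, here is a minimal sketch of both searches (mine, not the original slides'; names are illustrative):

    class Search {
      // Linear search: scan every element, O(n) worst case.
      static int linearSearch(int[] arr, int target) {
        for (int i = 0; i < arr.length; i++)
          if (arr[i] == target) return i;
        return -1; // not found
      }

      // Binary search: halve the remaining range each step, O(log n) worst case.
      // Assumes arr is sorted in ascending order.
      static int binarySearch(int[] arr, int target) {
        int lo = 0, hi = arr.length; // search the half-open range [lo, hi)
        while (lo < hi) {
          int mid = lo + (hi - lo) / 2; // avoids overflow of lo + hi
          if (arr[mid] == target) return mid;
          else if (arr[mid] < target) lo = mid + 1;
          else hi = mid;
        }
        return -1; // not found
      }
    }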

Another example: sum array

Two obviously linear algorithms:

Iterative:

    int sum(int[] arr) {
      int ans = 0;
      for (int i = 0; i < arr.length; ++i)
        ans += arr[i];
      return ans;
    }

Recursive:
- Recurrence is T(n) = O(1) + T(n-1), i.e., k + k + ... + k for n times

    int sum(int[] arr) {
      return help(arr, 0);
    }
    int help(int[] arr, int i) {
      if (i == arr.length)
        return 0;
      return arr[i] + help(arr, i+1);
    }

What about a binary version?

    int sum(int[] arr) {
      return help(arr, 0, arr.length);
    }
    int help(int[] arr, int lo, int hi) {
      if (lo == hi)   return 0;
      if (lo == hi-1) return arr[lo];
      int mid = (hi + lo) / 2;
      return help(arr, lo, mid) + help(arr, mid, hi);
    }

Recurrence is T(n) = O(1) + 2T(n/2)
- Total calls are 1 + 2 + 4 + 8 + ... for log n levels, which sums to 2^(log n + 1) - 1, about 2n, so proportional to n (since 2^(log n) = n by the definition of logarithm)
- Easier explanation: it adds each number once while doing little else
- "Obvious": you can't do better than O(n); you have to read the whole array

Parallelism teaser

But suppose we could do two recursive calls at the same time
- Like having a friend do half the work for you!

    int sum(int[] arr) {
      return help(arr, 0, arr.length);
    }
    int help(int[] arr, int lo, int hi) {
      if (lo == hi)   return 0;
      if (lo == hi-1) return arr[lo];
      int mid = (hi + lo) / 2;
      return help(arr, lo, mid) + help(arr, mid, hi);
    }

If you have as many friends of friends as needed, the recurrence is now
T(n) = O(1) + 1T(n/2), which is O(log n): the same recurrence as for find

Really common recurrences

Should know how to solve recurrences, but also recognize some really common ones:

- T(n) = O(1) + T(n-1)     linear
- T(n) = O(1) + 2T(n/2)    linear
- T(n) = O(1) + T(n/2)     logarithmic
- T(n) = O(1) + 2T(n-1)    exponential
- T(n) = O(n) + T(n-1)     quadratic (see previous lecture)
- T(n) = O(n) + 2T(n/2)    O(n log n)

Note big-oh can also use more than one variable
- Example: can sum all elements of an n-by-m matrix in O(nm), as sketched below
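
A minimal sketch of that last point (my illustration, not the slides'): the nested loops touch each of the n*m entries exactly once, so the work is O(nm).

    class MatrixSum {
      // Sum an n-by-m matrix: the inner statement runs n*m times, so O(nm).
      static int sum(int[][] matrix) {
        int total = 0;
        for (int i = 0; i < matrix.length; i++)        // n rows
          for (int j = 0; j < matrix[i].length; j++)   // m columns
            total += matrix[i][j];
        return total;
      }
    }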

Asymptotic notation

About to show formal definition, which amounts to saying:
1. Eliminate low-order terms
2. Eliminate coefficients

Examples:
- 4n + 5                is O(n)
- 0.5 n log n + 2n + 7  is O(n log n)
- n^3 + 2^n + 3n        is O(2^n)
- n log(10n^2)          is O(n log n)  (since log(10n^2) = log 10 + 2 log n)

Big-Oh relates functions

We use O on a function f(n) (for example n^2) to mean the set of functions with asymptotic behavior less than or equal to f(n)

So (3n^2 + 17) is in O(n^2)
- 3n^2 + 17 and n^2 have the same asymptotic behavior

Confusingly, we also say/write:
- (3n^2 + 17) is O(n^2)
- (3n^2 + 17) = O(n^2)

But we would never say O(n^2) = (3n^2 + 17)

Big-O, formally

Definition: g(n) is in O( f(n) ) if there exist constants c and n0 such that g(n) <= c f(n) for all n >= n0

To show g(n) is in O( f(n) ), pick a c large enough to cover the constant factors and an n0 large enough to cover the lower-order terms

Example: Let g(n) = 3n^2 + 17 and f(n) = n^2
- c = 5 and n0 = 10 is more than good enough: for n >= 10, 17 <= n^2, so 3n^2 + 17 <= 4n^2 <= 5n^2

Note the definition is "less than or equal to", so 3n^2 + 17 is also in O(n^5) and O(2^n), etc.
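
Not from the slides, but a small sketch that can make the definition tangible: it numerically spot-checks the candidate pair (c, n0) above for the claim 3n^2 + 17 <= c * n^2. A finite check is not a proof, only an illustration of what the constants do.

    // Spot-check the Big-O witness (c, n0) for g(n) = 3n^2 + 17, f(n) = n^2.
    // Checking finitely many n proves nothing by itself; it just shows
    // the inequality g(n) <= c*f(n) holding from n0 onward.
    public class BigOCheck {
      public static void main(String[] args) {
        double c = 5, n0 = 10;
        for (long n = (long) n0; n <= 1_000_000; n *= 10) {
          double g = 3.0 * n * n + 17;
          double cf = c * (double) n * n;
          System.out.printf("n=%d  g(n)=%.0f  c*f(n)=%.0f  ok=%b%n",
                            n, g, cf, g <= cf);
        }
      }
    }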

More examples, using formal definition

Let g(n) = 1000n and f(n) = n^2
- A valid proof is to find valid c and n0
- The cross-over point is n = 1000 (where 1000n = n^2)
- So we can choose n0 = 1000 and c = 1: for n >= 1000, 1000n <= n * n = n^2
- Many other choices work, e.g., larger n0 and/or c

Definition: g(n) is in O( f(n) ) if there exist constants c and n0 such that g(n) <= c f(n) for all n >= n0

More examples, using formal definition

Let g(n) = n^4 and f(n) = 2^n
- A valid proof is to find valid c and n0
- We can choose n0 = 20 and c = 1 (note 20^4 = 160,000 <= 2^20 = 1,048,576, and 2^n keeps doubling while n^4 grows only polynomially)

Definition: g(n) is in O( f(n) ) if there exist constants c and n0 such that g(n) <= c f(n) for all n >= n0

What's with the c?

The constant multiplier c is what allows functions that differ only in their largest coefficient to have the same asymptotic complexity

Example: g(n) = 7n + 5 and f(n) = n
- For any choice of n0, we need a c > 7 to show g(n) is in O( f(n) ); e.g., c = 8 and n0 = 5 works, since 7n + 5 <= 8n when n >= 5

Definition: g(n) is in O( f(n) ) if there exist constants c and n0 such that g(n) <= c f(n) for all n >= n0

What you can drop

Eliminate coefficients because we don't have units anyway
- 3n^2 versus 5n^2 doesn't mean anything when we have not specified the cost of constant-time operations (can re-scale)

Eliminate low-order terms because they have vanishingly small impact as n grows

Do NOT ignore constants that are not multipliers
- n^3 is not O(n^2)
- 3^n is not O(2^n): the ratio 3^n / 2^n = 1.5^n grows without bound, so no constant c can cover it

(This all follows from the formal definition)

Big-O: Common Names (Again)

- O(1)        constant (same as O(k) for constant k)
- O(log n)    logarithmic (probing)
- O(n)        linear (single-pass)
- O(n log n)  "n log n" (mergesort)
- O(n^2)      quadratic (nested loops)
- O(n^3)      cubic (more nested loops)
- O(n^k)      polynomial (where k is any constant)
- O(k^n)      exponential (where k is any constant > 1)
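
As a rough illustration (mine, not the slides'), here are loop shapes that commonly produce a few of these bounds:

    class Shapes {
      // O(1): constant, independent of n
      static int first(int[] arr) { return arr[0]; }

      // O(n): linear, a single pass
      static int count(int[] arr, int target) {
        int c = 0;
        for (int x : arr) if (x == target) c++;
        return c;
      }

      // O(n^2): quadratic, nested loops over the same input
      static boolean hasDuplicate(int[] arr) {
        for (int i = 0; i < arr.length; i++)
          for (int j = i + 1; j < arr.length; j++)
            if (arr[i] == arr[j]) return true;
        return false;
      }
    }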

Big-O running times

(Table: running times for various input sizes and growth rates, for a processor capable of one million instructions per second.)
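
The table itself did not survive transcription, but under the slide's stated assumption (10^6 instructions per second) its entries are easy to recompute. A small sketch that prints them (the growth functions and input sizes are my choices, not necessarily the slide's):

    // Print the time (in seconds) to execute f(n) instructions at
    // 10^6 instructions/second, for several growth rates and input sizes.
    // 2^n overflows to Infinity for large n, which is an apt summary
    // of exponential running time.
    public class RunningTimes {
      public static void main(String[] args) {
        double speed = 1e6; // instructions per second (the slide's assumption)
        int[] sizes = {10, 100, 1000, 100000};
        for (int n : sizes) {
          double log2n = Math.log(n) / Math.log(2);
          System.out.printf(
              "n=%-7d log n: %.2e s  n: %.2e s  n log n: %.2e s  n^2: %.2e s  2^n: %.2e s%n",
              n, log2n / speed, n / speed, n * log2n / speed,
              (double) n * n / speed, Math.pow(2, n) / speed);
        }
      }
    }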

More Asymptotic Notation

Upper bound: O( f(n) ) is the set of all functions asymptotically less than or equal to f(n)
- g(n) is in O( f(n) ) if there exist constants c and n0 such that g(n) <= c f(n) for all n >= n0

Lower bound: Ω( f(n) ) is the set of all functions asymptotically greater than or equal to f(n)
- g(n) is in Ω( f(n) ) if there exist constants c and n0 such that g(n) >= c f(n) for all n >= n0

Tight bound: θ( f(n) ) is the set of all functions asymptotically equal to f(n)
- Intersection of O( f(n) ) and Ω( f(n) ) (use different c values)

Correct terms, in theory

A common error is to say O( f(n) ) when you mean θ( f(n) )
- Since a linear algorithm is also O(n^5), it's tempting to say "this algorithm is exactly O(n)"
- That doesn't mean anything; say it is θ(n)
- That means that it is not, for example, O(log n)

Less common notation:
- "little-oh": intersection of big-oh and not big-theta
  - For all c, there exists an n0 such that g(n) <= c f(n) for all n >= n0
  - Example: array sum is o(n^2) but not o(n)
- "little-omega": intersection of big-omega and not big-theta
  - For all c, there exists an n0 such that g(n) >= c f(n) for all n >= n0
  - Example: array sum is ω(log n) but not ω(n)

What we are analyzing

The most common thing to do is give an O or θ bound to the worst-case running time of an algorithm

Example: binary-search algorithm
- Common: θ(log n) running time in the worst case
- Less common: θ(1) in the best case (item is in the middle)
- Less common: algorithm is Ω(log log n) in the worst case (it is not really, really, really fast asymptotically)
- Less common (but very good to know): the find-in-sorted-array problem is Ω(log n) in the worst case
  - No algorithm can do better
  - Strictly, a problem cannot be O(f(n)) since you can always find a slower algorithm; saying so means "there exists an O(f(n)) algorithm"

Other things to analyze

- Space instead of time
  - Remember we can often use space to gain time
- Average case
  - Sometimes only if you assume something about the probability distribution of inputs
  - Sometimes uses randomization in the algorithm (will see an example with sorting)
- Sometimes an amortized guarantee
  - Average time over any sequence of operations (will discuss in a later lecture)

Summary

Analysis can be about:
- The problem or the algorithm (usually the algorithm)
- Time or space (usually time), or power, or dollars, or ...
- Best-, worst-, or average-case (usually worst)
- Upper-, lower-, or tight-bound (usually upper or tight)

Usually asymptotic is valuable

Asymptotic complexity focuses on behavior for large n and is independent of any computer / coding trick

But you can abuse it to be misled about trade-offs
- Example: n^(1/10) vs. log n
  - Asymptotically n^(1/10) grows more quickly
  - But the cross-over point is around 5 * 10^17, so if your input size is less than about 2^58, prefer the n^(1/10) algorithm

For small n, an algorithm with worse asymptotic complexity might be faster
- Here the constant factors can matter, if you care about performance for small n
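
A quick sketch (mine) that locates that cross-over numerically, assuming base-2 logarithms as is usual in this setting:

    // Find roughly where n^(1/10) overtakes log2(n) by doubling n.
    // Start past the tiny-n region, where n^(1/10) > log2(n) trivially.
    public class CrossOver {
      public static void main(String[] args) {
        for (double n = 16; n < 1e19; n *= 2) {
          double root = Math.pow(n, 0.1);           // n^(1/10)
          double log2 = Math.log(n) / Math.log(2);  // log2(n)
          if (root > log2) {
            System.out.printf("cross-over near n = %.2e%n", n);
            break;
          }
        }
      }
    }

This prints a value near 5.8 * 10^17 (n = 2^59), consistent with the slide's figure of about 5 * 10^17.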

Timing vs. Big-Oh Summary

Big-oh is an essential part of computer science's mathematical foundation
- Examine the algorithm itself, not the implementation
- Reason about (even prove) performance as a function of n

Timing also has its place
- Compare implementations
- Focus on data sets you care about (versus worst case)
- Determine what the constant factors really are