ASYMPTOTIC COMPLEXITY


"Simplicity is a great virtue but it requires hard work to achieve it and education to appreciate it. And to make matters worse: complexity sells better." - Edsger Dijkstra
ASYMPTOTIC COMPLEXITY. Lecture 10 CS2110 Spring 2018

What Makes a Good Algorithm? 2
Suppose you have two possible algorithms that do the same thing; which is better? What do we mean by "better"? Faster? Less space? Easier to code? Easier to maintain? Required for homework?
FIRST, aim for simplicity, ease of understanding, correctness. SECOND, worry about efficiency only when it is needed.
How do we measure the speed of an algorithm?

Basic Step: one constant time operation 3
Constant time operation: its time doesn't depend on the size or length of anything; it is always roughly the same, bounded above by some number.
Basic steps:
- input/output of a number
- access the value of a primitive-type variable, array element, or object field
- assign to a variable, array element, or object field
- do one arithmetic or logical operation
- a method call (not counting argument evaluation and execution of the method body)

Counting Steps 4
// Store sum of 1..n in sum
sum= 0;
// inv: sum = sum of 1..(k-1)
for (int k= 1; k <= n; k= k+1) {
    sum= sum + k;
}

Statement:      # times done
sum= 0;         1
k= 1;           1
k <= n          n+1
k= k+1;         n
sum= sum + k;   n
Total steps:    3n + 3

All basic steps take time 1. There are n loop iterations. Therefore, the loop takes time proportional to n. [Plot: running time grows linearly in n.]

Not all operations are basic steps 5
// Store n copies of 'c' in s
s= "";
// inv: s contains k-1 copies of 'c'
for (int k= 1; k <= n; k= k+1) {
    s= s + 'c';
}

Statement:      # times done
s= "";          1
k= 1;           1
k <= n          n+1
k= k+1;         n
s= s + 'c';     n
Total steps:    3n + 3

But concatenation is not a basic step: for each k, the catenation creates and fills k array elements.

String Concatenation 6
s= s + 'c'; is NOT constant time. It takes time proportional to 1 + length of s. [Diagram: s points to a String object holding a char[] of its current characters; the concatenation allocates a new String with a new, longer char[] and copies every existing character into it before appending 'c'.]

Not all operations are basic steps 7
// Store n copies of 'c' in s
s= "";
// inv: s contains k-1 copies of 'c'
for (int k= 1; k <= n; k= k+1) {
    s= s + 'c';
}

Statement:      # times done   # steps
s= "";          1              1
k= 1;           1              1
k <= n          n+1            1
k= k+1;         n              1
s= s + 'c';     n              k
Total steps:    n*(n-1)/2 + 3n + 3 (no longer 3n + 3)

Concatenation is not a basic step: for each k, the catenation creates and fills k array elements. [Plot: running time grows quadratically in n.]

Linear versus quadratic 8
// Store sum of 1..n in sum
sum= 0;
// inv: sum = sum of 1..(k-1)
for (int k= 1; k <= n; k= k+1)
    sum= sum + k;
Linear algorithm

// Store n copies of 'c' in s
s= "";
// inv: s contains k-1 copies of 'c'
for (int k= 1; k <= n; k= k+1)
    s= s + 'c';
Quadratic algorithm

In comparing the runtimes of these algorithms, the exact number of basic steps is not important. What's important is that one is linear in n (takes time proportional to n) and one is quadratic in n (takes time proportional to n²).
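The two loops above can be packaged as a runnable sketch (the class and method names are mine, not from the slides); the string-building loop is the quadratic one because each + copies the whole string so far:

```java
public class ConcatDemo {
    // Linear: one basic step per iteration.
    static int sum(int n) {
        int sum = 0;
        // inv: sum = sum of 1..(k-1)
        for (int k = 1; k <= n; k = k + 1) {
            sum = sum + k;
        }
        return sum;
    }

    // Quadratic: iteration k copies the k-1 characters already in s.
    static String copies(char c, int n) {
        String s = "";
        // inv: s contains k-1 copies of c
        for (int k = 1; k <= n; k = k + 1) {
            s = s + c;
        }
        return s;
    }

    public static void main(String[] args) {
        System.out.println(sum(100));        // 5050
        System.out.println(copies('c', 5));  // ccccc
    }
}
```

The standard fix for the quadratic loop is a StringBuilder, whose append does not recopy the characters accumulated so far.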

Looking at execution speed 9
[Plot: number of operations executed versus size n of the array, with curves for n*n ops, 2n + 2 ops, n + 2 ops, n ops, and constant time.] 2n+2, n+2, and n are all linear in n: proportional to n.

What do we want from a definition of runtime complexity? 10
[Plot: number of operations executed versus problem size n, with curves for n*n ops, 2+n ops, and 5 ops.]
1. Distinguish among cases for large n, not small n.
2. Distinguish among important cases, like: n*n basic operations; n basic operations; log n basic operations; 5 basic operations.
3. Don't distinguish among trivially different cases: 5 or 50 operations; n, n+2, or 4n operations.

"Big O" Notation 11 Formal definition: f(n) is O(g(n)) if there exist constants c > 0 and N 0 such that for all n N, f(n) c g(n) c g(n) f(n) Get out far enough (for n N) f(n) is at most c g(n) Intuitively, f(n) is O(g(n)) means that f(n) grows like g(n) or slower N

Prove that (2n² + n) is O(n²) 12
Formal definition: f(n) is O(g(n)) if there exist constants c > 0 and N ≥ 0 such that for all n ≥ N, f(n) ≤ c·g(n).
Example: Prove that (2n² + n) is O(n²).
Methodology: Start with f(n) and slowly transform it into c·g(n):
- Use =, <=, and < steps.
- At an appropriate point, choose N to help the calculation.
- At an appropriate point, choose c to help the calculation.

Prove that (2n² + n) is O(n²) 13
Formal definition: f(n) is O(g(n)) if there exist constants c > 0 and N ≥ 0 such that for all n ≥ N, f(n) ≤ c·g(n).
Example: Prove that (2n² + n) is O(n²).
    f(n)
  =    <definition of f(n)>
    2n² + n
  <=   <for n ≥ 1, n ≤ n²>
    2n² + n²
  =    <arithmetic>
    3·n²
  =    <definition of g(n) = n²>
    3·g(n)
Transform f(n) into c·g(n) using =, <=, < steps; choose N and c to help the calculation. Here, choose N = 1 and c = 3.
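The witnesses chosen in the proof can be spot-checked numerically. A small sketch (the class name, method name, and test range are mine, not from the slides):

```java
public class BigOCheck {
    // Does f(n) = 2n^2 + n stay at or below c*g(n) = 3n^2 at this n?
    static boolean boundHolds(long n) {
        return 2 * n * n + n <= 3 * n * n;
    }

    public static void main(String[] args) {
        // Spot-check the witnesses c = 3, N = 1 chosen in the proof above.
        for (long n = 1; n <= 1_000_000; n++) {
            if (!boundHolds(n)) throw new AssertionError("bound fails at n = " + n);
        }
        System.out.println("2n^2 + n <= 3n^2 holds for 1 <= n <= 1,000,000");
    }
}
```

A check like this cannot replace the proof (it only examines finitely many n), but it catches a badly chosen c or N immediately.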

Prove that 100 n + log n is O(n) 14
Formal definition: f(n) is O(g(n)) if there exist constants c > 0 and N ≥ 0 such that for all n ≥ N, f(n) ≤ c·g(n).
    f(n)
  =    <put in what f(n) is>
    100 n + log n
  <=   <we know log n ≤ n for n ≥ 1>
    100 n + n
  =    <arithmetic>
    101 n
  =    <g(n) = n>
    101·g(n)
Choose N = 1 and c = 101.

O( ) Examples 15
Let f(n) = 3n² + 6n - 7. f(n) is O(n²); f(n) is O(n³); f(n) is O(n⁴).
p(n) = 4 n log n + 34 n - 89. p(n) is O(n log n); p(n) is O(n²).
h(n) = 20·2ⁿ + 40n. h(n) is O(2ⁿ).
a(n) = 34. a(n) is O(1).
Only the leading term (the term that grows most rapidly) matters. If it's O(n²), it's also O(n³), etc.! However, we always use the smallest one.

Do NOT say or write f(n) = O(g(n)) 16
Formal definition: f(n) is O(g(n)) if there exist constants c > 0 and N ≥ 0 such that for all n ≥ N, f(n) ≤ c·g(n).
"f(n) = O(g(n))" is simply WRONG. Mathematically, it is a disaster. You see it sometimes, even in textbooks. Don't read such things.
Here's an example to show what happens when we use = this way. We know that n+2 is O(n) and n+3 is O(n). Suppose we use =: n+2 = O(n) and n+3 = O(n). But then, by transitivity of equality, we have n+2 = n+3. We have proved something that is false. Not good.

Problem-size examples 17
Suppose a computer can execute 1000 operations per second; how large a problem can we solve?

operations   1 second   1 minute   1 hour
n            1000       60,000     3,600,000
n log n      140        4893       200,000
n²           31         244        1897
3n²          18         144        1096
n³           10         39         153
2ⁿ           9          15         21

Commonly Seen Time Bounds 18
O(1)         constant      excellent
O(log n)     logarithmic   excellent
O(n)         linear        good
O(n log n)   n log n       pretty good
O(n²)        quadratic     maybe OK
O(n³)        cubic         maybe OK
O(2ⁿ)        exponential   too slow

Java Lists 19 java.util defines an interface List<E> implemented by multiple classes: ArrayList LinkedList

Linear search for v in b[0..] 20
// Store a value in i to truthify b[0..i-1] < v <= b[i..]
// Precondition: b is sorted
If v is in b, set i to the index of the first occurrence of v. If v is not in b, set i so that all elements of b that are < v are to the left of index i.
Methodology: 1. Define pre- and postconditions. 2. Draw the invariant as a combination of pre and post. 3. Develop the loop using the 4 loopy questions. Practice doing this!

Linear search for v in b[0..] 21
// Store a value in i to truthify b[0..i-1] < v <= b[i..]
// Precondition: b is sorted
pre:  b[0..b.length-1] sorted
post: b[0..i-1] < v; b[i..b.length-1] ≥ v
inv:  b[0..i-1] < v; b[i..b.length-1] sorted
Methodology: 1. Define pre- and postconditions. 2. Draw the invariant as a combination of pre and post. 3. Develop the loop using the 4 loopy questions. Practice doing this!

The Four Loopy Questions 22
Does it start right? Is {Q} init {P} true?
Does it continue right? Is {P && B} S {P} true?
Does it end right? Is P && !B => R true?
Will it get to the end? Does it make progress toward termination?

Linear search for v in b[0..] 23
// Store a value in i to truthify b[0..i-1] < v <= b[i..]
// Precondition: b is sorted
pre:  b[0..b.length-1] sorted
post: b[0..i-1] < v; b[i..b.length-1] ≥ v
inv:  b[0..i-1] < v; b[i..b.length-1] sorted

i= 0;
while (i < b.length && b[i] < v) {
    i= i+1;
}

Linear algorithm: O(b.length). Each iteration takes constant time. Worst case: b.length iterations.
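The loop above can be wrapped as a complete Java method (the class and method names are mine, not from the slides) so the postcondition can be checked on concrete inputs:

```java
public class LinearSearch {
    /** Return i such that b[0..i-1] < v <= b[i..]. Requires: b is sorted. */
    static int search(int[] b, int v) {
        int i = 0;
        // inv: b[0..i-1] < v and b[i..b.length-1] is sorted
        while (i < b.length && b[i] < v) {
            i = i + 1;
        }
        return i;
    }

    public static void main(String[] args) {
        int[] b = {1, 3, 3, 5, 7};
        System.out.println(search(b, 3)); // 1: index of first occurrence of 3
        System.out.println(search(b, 4)); // 3: all elements < 4 lie left of index 3
        System.out.println(search(b, 8)); // 5: every element is < 8
    }
}
```

Note how one specification covers both cases from the slide: v present (first occurrence) and v absent (insertion point).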

Binary search for v in b[0..] 24
// Store a value in i to truthify b[0..i-1] < v <= b[i..]
// Precondition: b is sorted
pre:  b[0..b.length-1] sorted
post: b[0..i-1] < v; b[i..b.length-1] ≥ v
inv:  b[0..k] < v; b[k+1..i-1] sorted; b[i..b.length-1] ≥ v
Methodology: 1. Define pre- and postconditions. 2. Draw the invariant as a combination of pre and post. 3. Develop the loop using the 4 loopy questions. Practice doing this!

Binary search for v in b[0..] 25
// Store a value in i to truthify b[0..i-1] < v <= b[i..]
// Precondition: b is sorted
pre: b[0..b.length-1] sorted
inv: b[0..k] < v; b[k+1..i-1] sorted; b[i..b.length-1] ≥ v

k= -1; i= b.length;

Make the invariant true initially.

Binary search for v in b[0..] 26
// Store a value in i to truthify b[0..i-1] < v <= b[i..]
// Precondition: b is sorted
post: b[0..i-1] < v; b[i..b.length-1] ≥ v
inv:  b[0..k] < v; b[k+1..i-1] sorted; b[i..b.length-1] ≥ v

k= -1; i= b.length;
while (k < i-1) {

Determine the loop condition B: !B && inv must imply post.

Binary search for v in b[0..] 27
// Store a value in i to truthify b[0..i-1] < v <= b[i..]
// Precondition: b is sorted
inv: b[0..k] < v; b[k+1..i-1] sorted; b[i..b.length-1] ≥ v

k= -1; i= b.length;
while (k < i-1) {
    int j= (k+i)/2; // k < j < i
    // Set one of k, i to j

Figure out how to make progress toward termination: envision cutting the size of b[k+1..i-1] in half.

Binary search for v in b[0..] 28
// Store a value in i to truthify b[0..i-1] < v <= b[i..]
// Precondition: b is sorted
inv: b[0..k] < v; b[k+1..i-1] sorted; b[i..b.length-1] ≥ v

k= -1; i= b.length;
while (k < i-1) {
    int j= (k+i)/2; // k < j < i
    if (b[j] < v) k= j;
    else i= j;
}

Figure out how to make progress toward termination: cut the size of b[k+1..i-1] in half. If b[j] < v, everything in b[0..j] is < v, so set k to j; otherwise b[j..] ≥ v, so set i to j.

Binary search for v in b[0..] 29
// Store a value in i to truthify b[0..i-1] < v <= b[i..]
// Precondition: b is sorted

k= -1; i= b.length;
while (k < i-1) {
    int j= (k+i)/2;
    if (b[j] < v) k= j;
    else i= j;
}

This algorithm is better than binary searches that stop when v is found:
1. Gives good info when v is not in b.
2. Works when b is empty.
3. Finds the first occurrence of v, not an arbitrary one.
4. Correctness, including making progress, is easily seen using the invariant.

Logarithmic: O(log(b.length)). Each iteration takes constant time. Worst case: log(b.length) iterations.
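The invariant-based binary search above, packaged as a complete Java method (the class and method names are mine, not from the slides):

```java
public class BinarySearch {
    /** Return i such that b[0..i-1] < v <= b[i..]. Requires: b is sorted. */
    static int search(int[] b, int v) {
        int k = -1;
        int i = b.length;
        // inv: b[0..k] < v and b[i..b.length-1] >= v
        while (k < i - 1) {
            int j = (k + i) / 2;   // k < j < i, so the loop makes progress
            if (b[j] < v) k = j;
            else i = j;
        }
        return i;
    }

    public static void main(String[] args) {
        int[] b = {1, 3, 3, 5, 7};
        System.out.println(search(b, 3));          // 1: first occurrence of 3
        System.out.println(search(b, 4));          // 3: all elements < 4 lie left of 3
        System.out.println(search(new int[]{}, 4)); // 0: works on the empty array
    }
}
```

On an empty array the loop body never runs (k = -1, i = 0, so k < i-1 is false), illustrating point 2 from the slide.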

Dutch National Flag Algorithm 30

Dutch National Flag Algorithm
Dutch national flag. Swap b[0..n-1] to put the reds first, then the whites, then the blues. That is, given precondition Q, swap values of b[0..n-1] to truthify postcondition R:
Q:  b[0..n-1]: ?
R:  b[0..n-1]: reds, whites, blues
P1: b[0..n-1]: reds, whites, blues, ?
P2: b[0..n-1]: reds, whites, ?, blues

Dutch National Flag Algorithm: invariant P1
Q:  b[0..n-1]: ?
R:  b[0..n-1]: reds, whites, blues
P1: b[0..h-1] reds; b[h..k-1] whites; b[k..p-1] blues; b[p..n-1] ?

h= 0; k= h; p= k;
while (p != n) {
    if (b[p] is blue) p= p+1;
    else if (b[p] is white) {
        swap b[p], b[k];
        p= p+1; k= k+1;
    } else { // b[p] is red
        swap b[p], b[h];
        swap b[p], b[k];
        p= p+1; h= h+1; k= k+1;
    }
}

Dutch National Flag Algorithm: invariant P2 33
Q:  b[0..n-1]: ?
R:  b[0..n-1]: reds, whites, blues
P2: b[0..h-1] reds; b[h..k-1] whites; b[k..p-1] ?; b[p..n-1] blues

h= 0; k= h; p= n;
while (k != p) {
    if (b[k] is white) k= k+1;
    else if (b[k] is blue) {
        p= p-1;
        swap b[k], b[p];
    } else { // b[k] is red
        swap b[k], b[h];
        h= h+1; k= k+1;
    }
}
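The invariant-P2 loop above, as a runnable Java sketch. The encoding 0 = red, 1 = white, 2 = blue and the class/method names are my assumptions, not from the slides:

```java
import java.util.Arrays;

public class DutchFlag {
    // Encoding assumed for this sketch: 0 = red, 1 = white, 2 = blue.
    static void sort3(int[] b) {
        int h = 0, k = 0, p = b.length;
        // inv (P2): b[0..h-1] reds, b[h..k-1] whites, b[k..p-1] ?, b[p..] blues
        while (k != p) {
            if (b[k] == 1) k = k + 1;                    // white: extend whites
            else if (b[k] == 2) { p = p - 1; swap(b, k, p); } // blue: move right
            else { swap(b, k, h); h = h + 1; k = k + 1; }     // red: move left
        }
    }

    static void swap(int[] b, int i, int j) {
        int t = b[i]; b[i] = b[j]; b[j] = t;
    }

    public static void main(String[] args) {
        int[] b = {2, 0, 1, 0, 2, 1};
        sort3(b);
        System.out.println(Arrays.toString(b)); // [0, 0, 1, 1, 2, 2]
    }
}
```

Each iteration either advances k or shrinks p, so the unknown segment b[k..p-1] loses one element per iteration and the loop runs exactly n times.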

Asymptotically, which algorithm is faster? 34
Invariant 1: b[0..h-1] reds; b[h..k-1] whites; b[k..p-1] blues; b[p..n-1] ?
h= 0; k= h; p= k;
while (p != n) {
    if (b[p] is blue) p= p+1;
    else if (b[p] is white) {
        swap b[p], b[k];
        p= p+1; k= k+1;
    } else { // b[p] is red
        swap b[p], b[h];
        swap b[p], b[k];
        p= p+1; h= h+1; k= k+1;
    }
}

Invariant 2: b[0..h-1] reds; b[h..k-1] whites; b[k..p-1] ?; b[p..n-1] blues
h= 0; k= h; p= n;
while (k != p) {
    if (b[k] is white) k= k+1;
    else if (b[k] is blue) {
        p= p-1;
        swap b[k], b[p];
    } else { // b[k] is red
        swap b[k], b[h];
        h= h+1; k= k+1;
    }
}

(Asymptotically they are the same: each loop shrinks the unknown segment by one element per iteration, so both are O(n). The difference is only in the constant factor: the invariant-2 loop does at most one swap per iteration, while the invariant-1 loop does up to two.)