Algorithm Complexity Analysis: Big-O Notation (Chapter 10.4) Dr. Yingwu Zhu


Measuring Algorithm Efficiency
- Space utilization: the amount of memory required
- Time efficiency: the amount of time required to accomplish the task
Since memory is rarely the limiting resource nowadays, time efficiency receives more emphasis. But in embedded computers and sensor nodes, space efficiency is still important.

Time Efficiency
Time efficiency depends on:
- the size of the input
- the speed of the machine
- the quality of the source code
- the quality of the compiler
The last three vary from one platform to another, so we cannot express time efficiency meaningfully in real time units such as seconds!

Time Efficiency
Instead, time efficiency is measured as the number of times the instructions are executed.
T(n) = the number of times the instructions are executed for an input of size n.
T(n) is the computing time of the algorithm as a function of the input size n.

Example: Calculating a Mean
Task                                      # times executed
1. Initialize the sum to 0                1
2. Initialize index i to 0                1
3. While i < n do the following           n + 1
4.   a) Add x[i] to sum                   n
5.   b) Increment i by 1                  n
6. Return mean = sum / n                  1
Total                                     3n + 4
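Summing the execution counts in the table confirms the total:

T(n) = 1 + 1 + (n + 1) + n + n + 1 = 3n + 4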

T(n): Order of Magnitude
As the input size n increases, T(n) = 3n + 4 grows at a rate proportional to n. Thus T(n) has "order of magnitude" n.

T(n): Order of Magnitude
Consider T(n) on an input of size n. If there is some constant C such that T(n) ≤ C·f(n) for all sufficiently large values of n, then T(n) is said to have order of magnitude f(n), written T(n) = O(f(n)). This is Big-Oh notation.

Big-Oh Notation
Another way of saying this: the complexity of the algorithm is O(f(n)).
Example: the Mean-Calculation Algorithm below.

Mean-Calculation Algorithm
[1] int sum = 0;
[2] int i = 0;
[3] while (i < n) {
[4]     sum += a[i];
[5]     i++;
    }
[6] return sum / n;

Big-Oh Notation
Example: for the Mean-Calculation Algorithm, T(n) = 3n + 4 is O(n).
Note that constants and multiplicative factors are ignored; the simple function is f(n) = n.
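As a quick check against the definition above (a worked step, with C = 7 chosen for convenience):

T(n) = 3n + 4 \le 3n + 4n = 7n \quad \text{for all } n \ge 1, \text{ so } T(n) = O(n) \text{ with } f(n) = n.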

Measuring T(n)
T(n) depends not only on the input size but also on the arrangement of the input items.
- Best case: not informative
- Average case: difficult to determine
- Worst case: usually used to measure an algorithm's performance

Simplifying the Complexity Analysis
- Identify the statement executed most often and determine its execution count.
- Ignore lower-degree terms; only the highest power of n affects the Big-O estimate.
- The Big-O estimate gives an approximate measure of the computing time of an algorithm for large inputs.
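For example (an illustrative count, not from the original slides): if a detailed analysis gave

T(n) = \tfrac{1}{2} n^2 + 3n + 4 \le \tfrac{1}{2} n^2 + 3n^2 + 4n^2 = 7.5\, n^2 \quad (n \ge 1),

then T(n) = O(n^2); the lower-degree terms 3n and 4 do not change the estimate.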

Get a Taste
// search for x in a[0..n-1]; assumes the elements in a[] are unique
int search(int a[], int n, int x) {
    for (int i = 0; i < n; i++)
        if (a[i] == x) return i;
    return -1;
}
Complexity analysis: can you figure out T(n)?
-- best case: ?
-- average case: ?
-- worst case: ?

Get a Taste
Answer:
- Best case: O(1)
- Average case: O(n)
- Worst case: O(n)
How do we get this?
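Briefly (a worked justification, assuming for the average case that x is present and equally likely to be at any of the n positions): the best case is x at index 0, costing 1 comparison; the worst case is x at the last index or absent, costing n comparisons; the average case costs

\frac{1}{n} \sum_{i=1}^{n} i = \frac{n+1}{2}

comparisons, which is still O(n).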

Simple Selection Sorting Algorithm (p. 554)
// Algorithm to sort x[0], ..., x[n-1] into ascending order
for (int i = 0; i < n-1; i++) {
    // On the ith pass, first find the smallest element in the sublist x[i], ..., x[n-1].
    int spos = i;
    int smallest = x[spos];
    for (int j = i+1; j < n; j++) {
        if (x[j] < smallest) {
            spos = j;
            smallest = x[j];
        } // end if
    } // end for
    x[spos] = x[i];
    x[i] = smallest;
} // end for

What's the worst-case T(n)? T(n) = O(n^2). How do we get that? Let's try!
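One way to see it (a worked count): on pass i the inner loop performs n - 1 - i comparisons, so over all passes

\sum_{i=0}^{n-2} (n - 1 - i) = (n-1) + (n-2) + \cdots + 1 = \frac{n(n-1)}{2} = O(n^2).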

Exercise 1
for (int k = 0; k < n; k++)
    for (int i = 0; i < n; i++)
        m[k][i] = a[k][i] + b[k][i];

Exercise 2
for (int k = 0; k < n; k++)
    for (int i = k+1; i < n; i++)
        m[k][i] = a[k][i] + b[k][i];

Exercise 3
n = 0;
sum = 0;
cin >> x;
while (x != -999) {
    n++;
    sum += x;
    cin >> x;
}
mean = sum / n;

Binary Search Algorithm
int bin_search(int a[], int n, int item) { // search a[0..n-1]
    bool found = false;
    int first = 0;
    int last = n - 1;
    int loc = -1;
    while (first <= last && !found) {
        loc = (first + last) / 2;
        if (item < a[loc]) last = loc - 1;
        else if (item > a[loc]) first = loc + 1;
        else found = true;
    } // end while
    return found ? loc : -1;
} // end bin_search

Binary Search Algorithm (see p. 556)
Step 1: identify the statement executed most often.
Step 2: determine the execution count for that statement (the worst case!).
T(n) = O(log2 n)
A better approach: reason about how binary search is actually performed.
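The halving argument in brief (a worked step, not spelled out on the slide): each iteration cuts the search range roughly in half, so after k iterations about n / 2^k elements remain; the loop stops when this reaches 1:

\frac{n}{2^k} \approx 1 \;\Longrightarrow\; k \approx \log_2 n, \quad \text{so } T(n) = O(\log_2 n).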

Big-Oh Notation
f(n) is usually a simple function, such as:
1, log2(log2 n), log2 n, n, n log2 n, n^2, n^3, ..., 2^n

How about recursive algorithms?
double power(double x, unsigned n) {
    if (n == 0) return 1.0;
    return x * power(x, n-1);
}

How to compute Big-Oh?
A recurrence relation expresses the computing time for an input of size n in terms of smaller-sized inputs.
How do we solve recurrence relations? Use the telescoping principle: repeatedly apply the relation until the anchor (base) case is reached.
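For the power function above (a sketch, assuming each call does a constant amount of work c in addition to its recursive call):

T(n) = T(n-1) + c = T(n-2) + 2c = \cdots = T(0) + n \cdot c = O(n).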

Exercise 5
double fun(double x, unsigned n) {
    if (n == 0) return 1.0;
    return x * fun(x, n/2);
}

Exercise 6
for (int i = 0; i < n * n; ++i) {
    sum = sum / n;
    for (int j = 0; j < i; ++j)
        cout << j;
}

Exercise 7
for (int j = 4; j < n; ++j) {
    cin >> val;
    for (int i = 0; i < j; ++i) {
        b = b * val;
        for (int k = 0; k < n; ++k)
            c = b + c;
    }
}

Exercise 8
for (int i = 1; i < n-1; i++) {
    temp = a[i];
    int j;
    for (j = i-1; j >= 0; j--)
        if (temp < a[j]) a[j+1] = a[j];
        else break;
    a[j+1] = temp;
}

Review of Lecture
- How to measure algorithm efficiency
- How to compute the time efficiency T(n)
- Big-Oh notation: for an algorithm, give its Big-Oh estimate
- Simplified analysis: non-recursive algorithms; recursive algorithms via the telescoping principle