Same idea, different implementation. Is it important?


Analysis of algorithms is the field that provides tools for evaluating the efficiency of different solutions. What is an efficient algorithm? Faster is better, but how do you measure time: a wall clock? the computer's clock? Using less space is also better, but if you need to move data out of main memory, that costs time too. Algorithm analysis should therefore be independent of specific implementations and coding tricks (programming language, control statements), of specific computers (hardware chip, OS, clock speed), and of any particular set of data, although the size of the data should matter.

One implementation of selection sort (only this outline appears on the slide):

    public void selectionSort(int[] data) {
        int maxIndex;   // index of the largest remaining integer
        // ... (body omitted on the original slide)
    }

    static void swap(int[] data, int i, int j) {
        // exchanges the contents of data[i] and data[j]
    }

A second, complete implementation:

    public void selectionSort(int[] theArray, int n) {
        for (int last = n - 1; last >= 1; last--) {
            int largest = indexOfLargest(theArray, last + 1);
            int temp = theArray[largest];
            theArray[largest] = theArray[last];
            theArray[last] = temp;
        }  // end for
    }  // end selectionSort

    private static int indexOfLargest(int[] theArray, int size) {
        int indexSoFar = 0;
        for (int currIndex = 1; currIndex < size; ++currIndex) {
            if (theArray[currIndex] > theArray[indexSoFar]) {
                indexSoFar = currIndex;
            }  // end if
        }  // end for
        return indexSoFar;  // index of largest item
    }

Same idea, different implementation. Is it important?

And here is insertion sort:

    public void insertionSort(int[] theArray, int n) {
        for (int unsorted = 1; unsorted < n; ++unsorted) {
            int nextItem = theArray[unsorted];
            int loc = unsorted;
            while ((loc > 0) && (theArray[loc - 1] > nextItem)) {
                // shift theArray[loc-1] to the right
                theArray[loc] = theArray[loc - 1];
                loc--;
            }
            // insert nextItem into the sorted region
            theArray[loc] = nextItem;
        }  // end for
    }  // end insertionSort

Both algorithms sort correctly. Is one better than the other?
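One way to approach that question is to count the operations each sort performs on the same inputs. The following is a minimal sketch, not part of the original notes; the SortCount class, its comparison counter, and the driver are illustrative assumptions:

    // Hypothetical driver: counts array-element comparisons in both sorts.
    public class SortCount {
        static long comparisons = 0;

        static void selectionSort(int[] a) {
            for (int last = a.length - 1; last >= 1; last--) {
                int largest = 0;
                for (int i = 1; i <= last; i++) {
                    comparisons++;                       // a[i] vs a[largest]
                    if (a[i] > a[largest]) largest = i;
                }
                int temp = a[largest]; a[largest] = a[last]; a[last] = temp;
            }
        }

        static void insertionSort(int[] a) {
            for (int unsorted = 1; unsorted < a.length; unsorted++) {
                int nextItem = a[unsorted];
                int loc = unsorted;
                while (loc > 0) {
                    comparisons++;                       // a[loc-1] vs nextItem
                    if (a[loc - 1] <= nextItem) break;
                    a[loc] = a[loc - 1];                 // shift right
                    loc--;
                }
                a[loc] = nextItem;
            }
        }

        public static void main(String[] args) {
            int n = 1000;
            int[] sorted = new int[n], reversed = new int[n];
            for (int i = 0; i < n; i++) { sorted[i] = i; reversed[i] = n - i; }

            comparisons = 0; selectionSort(sorted.clone());
            System.out.println("selection, sorted input:   " + comparisons);  // n(n-1)/2
            comparisons = 0; insertionSort(sorted.clone());
            System.out.println("insertion, sorted input:   " + comparisons);  // n-1
            comparisons = 0; insertionSort(reversed.clone());
            System.out.println("insertion, reversed input: " + comparisons);  // n(n-1)/2
        }
    }

On sorted input, insertion sort performs only n - 1 comparisons, while selection sort always performs n(n - 1)/2; on reversed input both perform n(n - 1)/2. Making that kind of statement precise is what the rest of these notes develop.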

Now set selection sort beside a different problem, the Towers of Hanoi:

    public void solveTowers(int n, char source, char dest, char spare) {
        if (n == 1) {
            System.out.println("Move top disk from " + source + " to " + dest);
        } else {
            solveTowers(n - 1, source, spare, dest);
            solveTowers(1, source, dest, spare);
            solveTowers(n - 1, spare, dest, source);
        }
    }

Two algorithms solving two different problems. Is one more difficult than the other?

Counting an algorithm's operations is a good way to assess its efficiency; an algorithm's execution time is related to the number of operations it requires in a worst-case scenario. Examples of worst-case scenarios:

- Searching a linked list: about as many operations as there are elements.
- Selection sort: to select each next largest item, we traverse most of the array again.
- The Towers of Hanoi: to solve an instance with n disks, we must solve two instances with n - 1 disks (see the move-counting sketch at the end of this passage).

We could also consider an average-case scenario, but that is harder to analyze.

A statement that the computer can execute in one instruction, or in a fixed number of instructions, counts as 1 step: O(1), read "order of one". This includes arithmetic, logical operations, and assignments, but not necessarily function calls, recursive steps, and so on. For example:

    // code with O(1) steps
    int i = 100;
    if ((i < n) && (n % 2 == 0))
        i = i / n;
    else
        i = (i + 1) * 2;

Each of these statements is O(1), so the overall order of this piece of code is k * O(1) = O(1), where k is the number of individual statements.

To determine the order of an algorithm, we need to determine how often each set of statements gets executed. To analyze a loop, first determine the order of the body of the loop, then multiply by the number of times the loop executes:

    for (int i = 0; i < n; i++) {
        // some sequence of O(1) steps
    }

The loop executes n times and its body is O(1), so the overall order is O(n) * O(1) = O(n).
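The Hanoi claim above can be checked directly: since solving n disks means solving two instances of n - 1 disks plus one actual move, the move count satisfies moves(n) = 2 * moves(n - 1) + 1 = 2^n - 1. A small sketch, assuming a counter in place of the println (the HanoiCount class is illustrative, not from the slides):

    // Hypothetical sketch: count Hanoi moves instead of printing them.
    public class HanoiCount {
        static long moves = 0;

        static void solveTowers(int n, char source, char dest, char spare) {
            if (n == 1) {
                moves++;                                  // one actual disk move
            } else {
                solveTowers(n - 1, source, spare, dest);  // move n-1 disks out of the way
                solveTowers(1, source, dest, spare);      // move the largest disk
                solveTowers(n - 1, spare, dest, source);  // move n-1 disks back on top
            }
        }

        public static void main(String[] args) {
            for (int n = 1; n <= 10; n++) {
                moves = 0;
                solveTowers(n, 'A', 'C', 'B');
                System.out.println("n = " + n + ": " + moves
                                   + " moves (2^n - 1 = " + ((1L << n) - 1) + ")");
            }
        }
    }

Adding a single disk doubles the work, which is what exponential growth means in practice.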

When loops are nested, we multiply the complexity of the outer loop by the complexity of the inner loop:

    for (int i = 0; i < n; i++) {
        for (int j = 0; j < n; j++) {
            // some sequence of O(1) steps
        }
    }

Both the inner and the outer loop have complexity O(n); multiplied together, the order becomes O(n^2).

To determine the order of a recursive algorithm, determine the order of the recursion (the number of times the recursive definition is followed) and multiply it by the order of the body of the recursive method. For example, consider the recursive method that computes the product of the integers from 1 to some n >= 1:

    public int fact(int n) {
        int result;
        if (n == 1)
            result = 1;
        else
            result = n * fact(n - 1);
        return result;
    }

The size of the problem is n, the number of values to be multiplied, and the operation of interest is the multiplication. The body of the method performs one multiplication and is therefore O(1). Each time the method is invoked recursively, n decreases by 1, so the method is called n times. The order of the entire algorithm is therefore n * O(1) = O(n).

An algorithm's time requirements can be measured as a function of the problem size n, and the growth rate of that function lets us compare algorithms. For example:

- Searching a sorted array requires time proportional to lg n: O(lg n).
- Searching a list requires time proportional to n: O(n).
- Selection sort requires time proportional to n^2: O(n^2).

This "Big-Oh" notation is also read as the "order of" the algorithm.

[Figure: growth-rate curves for 2^n, n^3, n^2, n log n, and n, plotting time against problem size n from 0 to 100.]

Algorithm efficiency is typically a concern for large problems only (as n grows).
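The original notes include a plot of these growth-rate curves that does not survive transcription; the following sketch (illustrative, not from the slides) tabulates the same functions so the relative growth can be seen numerically:

    // Hypothetical sketch: tabulate the growth-rate functions from the lost figure.
    public class GrowthRates {
        public static void main(String[] args) {
            System.out.printf("%6s %14s %14s %22s%n", "n", "n log2 n", "n^2", "2^n");
            for (int n = 10; n <= 50; n += 10) {
                double nLog2N = n * (Math.log(n) / Math.log(2));
                System.out.printf("%6d %14.0f %14d %22.0f%n",
                                  n, nLog2N, (long) n * n, Math.pow(2, n));
            }
        }
    }

Already at n = 50, 2^n exceeds n^2 by more than ten orders of magnitude.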

Definition of the order of an algorithm: algorithm A is order f(n), denoted O(f(n)), if constants k and n0 exist such that A requires no more than k * f(n) time units to solve a problem of size n >= n0.

Growth-rate function: a mathematical function used to specify an algorithm's order in terms of the size of the problem.

Big-Oh notation: a notation that uses the capital letter O to specify an algorithm's order, for example O(n), O(n^2 * log n), O(n^4), and in general O(f(n)).

Order of growth of some common functions:

    O(1) < O(log2 n) < O(n) < O(n * log2 n) < O(n^2) < O(n^3) < O(2^n)

Properties of growth-rate functions:

- A sum of orders is dominated by the larger order: O(f(n)) + O(g(n)) = O(f(n) + g(n)) = O(max(f(n), g(n))). Therefore you can ignore low-order terms: O(n^2 + n) = O(n^2).
- Multiplying orders means multiplying terms: O(f(n)) * O(g(n)) = O(f(n) * g(n)). Therefore you can ignore a multiplicative constant in the high-order term: O(12 * n^2) = O(n^2).

As a larger example, consider adding a CD to a growing collection:

    /**
     * Adds a CD to the collection of n CDs, increasing the
     * size of the collection if necessary.
     */
    public void addCD(String title, String artist, double cost, int tracks) {
        if (count == collection.length)
            increaseSize();
        collection[count] = new CD(title, artist, cost, tracks);
        totalCost += cost;
        count++;
    }

    /**
     * Increases the capacity of the collection by
     * creating a larger array and copying the contents over.
     */
    private void increaseSize() {
        CD[] temp = new CD[collection.length * 2];
        for (int cd = 0; cd < collection.length; cd++)
            temp[cd] = collection[cd];
        collection = temp;
    }
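The addCD example rewards analysis with the tools above: a single call is O(1) unless it triggers increaseSize, which copies all n elements, i.e. O(n). Because the capacity doubles each time, the expensive calls are rare, and the total copying cost over n adds stays below 2n. A self-contained sketch (the DoublingCost class and its counter are illustrative assumptions, not part of the original notes):

    // Hypothetical sketch: count the array-element copies performed by n calls
    // to an addCD-style doubling append. Doubling keeps the total below 2n,
    // i.e. O(1) per add on average, even though a single add can cost O(n).
    public class DoublingCost {
        static int[] data = new int[1];
        static int count = 0;
        static long copies = 0;

        static void add(int value) {
            if (count == data.length) {                  // full: double the capacity
                int[] temp = new int[data.length * 2];
                for (int i = 0; i < data.length; i++) {
                    temp[i] = data[i];
                    copies++;                            // one element copied
                }
                data = temp;
            }
            data[count++] = value;
        }

        public static void main(String[] args) {
            int n = 1_000_000;
            for (int i = 0; i < n; i++) add(i);
            System.out.println("adds = " + n + ", copies = " + copies);  // copies < 2n
        }
    }

This is why doubling is preferred over growing the array by a fixed amount, which would make the total copying cost O(n^2).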

An algorithm can require different amounts of time to solve different problems of the same size.

- Worst-case analysis: a determination of the maximum amount of time that an algorithm requires to solve problems of size n.
- Average-case analysis: a determination of the average amount of time that an algorithm requires to solve problems of size n.

Throughout an analysis, keep in mind that you are interested only in significant differences in efficiency. When choosing an ADT's implementation, consider how frequently particular ADT operations occur in a given application; some seldom-used but critical operations must still be efficient. If the problem size is always small, you can probably ignore an algorithm's efficiency altogether. Weigh the trade-offs between an algorithm's time requirements and its memory requirements, and compare algorithms for both style and efficiency. Order-of-magnitude analysis focuses on large problems.
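As a concrete instance of the worst-case versus average-case distinction, consider linear search. A minimal sketch (not from the original slides; the class and driver are assumptions for illustration):

    // Hypothetical sketch: linear search examines all n entries in the worst
    // case (target last or absent), but about n/2 on average when the target
    // is equally likely to be at any position.
    public class LinearSearch {
        static int search(int[] a, int target) {
            for (int i = 0; i < a.length; i++) {
                if (a[i] == target) return i;   // found after i + 1 comparisons
            }
            return -1;                          // worst case: n comparisons
        }

        public static void main(String[] args) {
            int n = 1000;
            int[] a = new int[n];
            for (int i = 0; i < n; i++) a[i] = i;

            long total = 0;
            for (int target = 0; target < n; target++) {
                total += search(a, target) + 1; // comparisons for each target position
            }
            System.out.println("worst case:   " + n + " comparisons");
            System.out.println("average case: " + (double) total / n
                               + " comparisons");        // about (n + 1) / 2
        }
    }

Both cases are O(n), which is exactly the sense in which order-of-magnitude analysis ignores a factor of two.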