Measuring Efficiency

Growth

Announcements

Measuring Efficiency

Recursive Computation of the Fibonacci Sequence

Our first example of tree recursion:

def fib(n):
    if n == 0:
        return 0
    elif n == 1:
        return 1
    else:
        return fib(n-2) + fib(n-1)

[Call tree diagram for fib(5)]

(Demo)

http://en.wikipedia.org/wiki/file:fibonacci.jpg
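
To see why tree recursion is expensive, the sketch below counts how many times fib is invoked. The count_calls wrapper is not part of the lecture code; the counts assume the fib definition from the slide above.

def fib(n):                       # same definition as on the slide
    if n == 0:
        return 0
    elif n == 1:
        return 1
    else:
        return fib(n-2) + fib(n-1)

def count_calls(f):
    """Return a version of f that tallies how many times it is called."""
    def counted(n):
        counted.call_count += 1
        return f(n)
    counted.call_count = 0
    return counted

fib = count_calls(fib)            # recursive calls now go through the counter
fib(20)                           # returns 6765
fib.call_count                    # 21891 calls just to compute fib(20)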

Memoization

Memoization

Idea: Remember the results that have been computed before.

def memo(f):
    cache = {}
    def memoized(n):
        if n not in cache:
            cache[n] = f(n)
        return cache[n]
    return memoized

Keys are arguments that map to return values.
Same behavior as f, if f is a pure function.

(Demo)
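
A short usage sketch (not from the slides), assuming the memo function above and the tree-recursive fib from the earlier slide:

fib = memo(fib)    # rebinding fib makes the recursive calls hit the cache too
fib(30)            # returns 832040 almost instantly; each fib(k) is computed at most once
fib(30)            # a repeated call is just a cache lookup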

Memoized Tree Recursion

[Call tree for fib(5): each node is labeled as a call to fib, found in cache, or skipped]

Space

The Consumption of Space

Which environment frames do we need to keep during evaluation?

- At any moment there is a set of active environments
- Values and frames in active environments consume memory
- Memory that is used for other values and frames can be recycled

Active environments:
- Environments for any function calls currently being evaluated
- Parent environments of functions named in active environments

(Demo) Interactive Diagram
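
A small sketch (not from the slides) that makes the cost of active frames visible: a linear-recursive function keeps one frame per pending call, so a deep enough call exceeds Python's default recursion limit (commonly 1000).

def countdown(n):
    """Return 0 after n recursive calls, each of which keeps a frame active."""
    if n == 0:
        return 0
    return countdown(n - 1)

countdown(100)      # fine: about 100 frames are active at the deepest point
countdown(100000)   # raises RecursionError: too many frames would be active at once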

Fibonacci Space Consumption

[Call tree for fib(5), assuming evaluation has reached a particular step: each frame is marked as having an active environment, as reclaimable, or as not yet created]

Time

Comparing Implementations

Implementations of the same functional abstraction can require different resources.

Problem: How many factors does a positive integer n have?
A factor k of n is a positive integer that evenly divides n.

def factors(n):
    ...

Time (number of divisions):
- Slow: Test each k from 1 through n: n divisions
- Fast: Test each k from 1 to √n; for every such k, n/k is also a factor! That takes floor(√n) divisions, the greatest integer less than or equal to √n.

Question: How many times does each implementation use division?

(Demo)
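
A minimal sketch of both approaches. The lecture only shows the signature def factors(n) and the name factors_fast; the bodies below, the name factors_slow, and the use of math.isqrt (Python 3.8+) are assumptions.

from math import isqrt

def factors_slow(n):
    """Count the factors of n by testing every k from 1 through n: n divisions."""
    total = 0
    for k in range(1, n + 1):
        if n % k == 0:
            total += 1
    return total

def factors_fast(n):
    """Count the factors of n by testing k up to sqrt(n): floor(sqrt(n)) divisions."""
    total = 0
    for k in range(1, isqrt(n) + 1):
        if n % k == 0:
            total += 1 if k == n // k else 2   # k and n // k pair up; count once if equal
    return total

factors_slow(576), factors_fast(576)   # both return 21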

Orders of Growth

Order of Growth

A method for bounding the resources used by a function by the "size" of a problem.

n: size of the problem
R(n): measurement of some resource used (time or space)

R(n) = Θ(f(n)) means that there are positive constants k₁ and k₂ such that

    k₁ · f(n) ≤ R(n) ≤ k₂ · f(n)

for all n larger than some minimum m.
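
For example (an illustration, not on the slide): if R(n) = 3n² + 5n, then R(n) = Θ(n²), since choosing k₁ = 3, k₂ = 4, and m = 5 gives 3n² ≤ 3n² + 5n ≤ 4n² for all n ≥ 5.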

Order of Growth of Counting Factors

Implementations of the same functional abstraction can require different amounts of time.

Problem: How many factors does a positive integer n have?
A factor k of n is a positive integer that evenly divides n.

def factors(n):
    ...

- Slow (test each k from 1 through n): Time Θ(n), Space Θ(1)
- Fast (test each k from 1 to √n; for every such k, n/k is also a factor!): Time Θ(√n), Space Θ(1)

Assumption: integers occupy a fixed amount of space.

(Demo)

Exponentiation

Exponentiation

Goal: one more multiplication lets us double the problem size.

def exp(b, n):
    if n == 0:
        return 1
    else:
        return b * exp(b, n-1)

b^n = 1              if n = 0
      b · b^(n-1)    otherwise

def square(x):
    return x * x

def exp_fast(b, n):
    if n == 0:
        return 1
    elif n % 2 == 0:
        return square(exp_fast(b, n//2))
    else:
        return b * exp_fast(b, n-1)

b^n = 1              if n = 0
      (b^(n/2))^2    if n is even
      b · b^(n-1)    if n is odd

(Demo)

Exponentiation

Goal: one more multiplication lets us double the problem size.

- exp(b, n), the linear recursion above: Time Θ(n), Space Θ(n)
- exp_fast(b, n), which squares the result of a half-sized problem: Time Θ(log n), Space Θ(log n)
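
The logarithmic bound follows because an even n is halved and an odd n becomes even on the next call, so there are at most about 2 · log₂(n) recursive calls. The counting helpers below are a sketch, not lecture code; they mirror exp and exp_fast but also report how many multiplications were performed.

def exp_count(b, n):
    """Return (b ** n, multiplications used) for the linear-recursive exp."""
    if n == 0:
        return 1, 0
    result, ops = exp_count(b, n - 1)
    return b * result, ops + 1

def exp_fast_count(b, n):
    """Return (b ** n, multiplications used) for the repeated-squaring exp_fast."""
    if n == 0:
        return 1, 0
    if n % 2 == 0:
        half, ops = exp_fast_count(b, n // 2)
        return half * half, ops + 1
    result, ops = exp_fast_count(b, n - 1)
    return b * result, ops + 1

exp_count(2, 100)[1]        # 100 multiplications
exp_fast_count(2, 100)[1]   # 9 multiplications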

Comparing Orders of Growth

Properties of Orders of Growth

Constants: Constant terms do not affect the order of growth of a process.
    Θ(n) = Θ(500 · n) = Θ((1/500) · n)

Logarithms: The base of a logarithm does not affect the order of growth of a process.
    Θ(log₂ n) = Θ(log₁₀ n) = Θ(ln n)

Nesting: When an inner process is repeated for each step in an outer process, multiply the steps in the outer and inner processes to find the total number of steps.

def overlap(a, b):
    count = 0
    for item in a:
        if item in b:
            count += 1
    return count

Outer: length of a
Inner: length of b
If a and b are both length n, then overlap takes Θ(n²) steps.

Lower-order terms: The fastest-growing part of the computation dominates the total.
    Θ(n²) = Θ(n² + n) = Θ(n² + 500 n + log₂ n + 1000)
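
The sketch below (overlap_counting is assumed, not from the slides) makes the nesting rule concrete by spelling out the scan that item in b performs and counting every element comparison.

def overlap_counting(a, b):
    """Like overlap, but also return how many element comparisons were made."""
    count, steps = 0, 0
    for item in a:              # outer process: len(a) iterations
        for other in b:         # inner process: up to len(b) comparisons each
            steps += 1
            if item == other:
                count += 1
                break
    return count, steps

overlap_counting(list(range(100)), list(range(100, 200)))   # (0, 10000): 100 * 100 steps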

Comparing orders of growth (n is the problem size)

- Θ(bⁿ): Exponential growth. Recursive fib takes Θ(φⁿ) steps, where φ = (1 + √5)/2 ≈ 1.618. Incrementing the problem scales R(n) by a constant factor.
- Θ(n²): Quadratic growth. E.g., overlap. Incrementing n increases R(n) by the problem size n.
- Θ(n): Linear growth. E.g., slow factors or exp.
- Θ(√n): Square root growth. E.g., factors_fast.
- Θ(log n): Logarithmic growth. E.g., exp_fast. Doubling the problem only increments R(n).
- Θ(1): Constant. The problem size doesn't matter.