CS221: Algorithms and Data Structures. Asymptotic Analysis. Alan J. Hu (Borrowing slides from Steve Wolfman)


Learning Goals
By the end of this unit, you will be able to:
- Define which program operations we measure in an algorithm in order to approximate its efficiency.
- Define input size and determine the effect (in terms of performance) that input size has on an algorithm.
- Give examples of common practical limits of problem size for each complexity class.
- Give examples of tractable, intractable, and undecidable problems.
- Given code, write a formula which measures the number of steps executed as a function of the size of the input (N).
(continued)

Learning Goals
By the end of this unit, you will be able to:
- Compute the worst-case asymptotic complexity of an algorithm (e.g., the worst possible running time based on the size of the input (N)).
- Categorize an algorithm into one of the common complexity classes.
- Explain the differences between best-, worst-, and average-case analysis.
- Describe why best-case analysis is rarely relevant and how worst-case analysis may never be encountered in practice.
- Given two or more algorithms, rank them in terms of their time and space complexity.

Today's Learning Goals/Outline
- Why, and on what criteria, you might want to compare algorithms.
- Performance (time, space) is a function of the inputs. We usually simplify that to be a function of the size of the input.
- What are worst-case, average-case, common-case, and best-case analysis?
- What is asymptotic analysis, and why do it?
- Examples of asymptotic behavior to build intuition.

Comparing Algorithms
Why? What do you judge them on?

Comparing Algorithms
Why? What do you judge them on? Many possibilities:
- Time (How long does it take to run?)
- Space (How much memory does it take?)
- Other attributes? Expensive operations (e.g., I/O); elegance, cleverness; energy, power; ease of programming, legal issues, etc.

Analyzing Runtime
Iterative Fibonacci:
    old2 = 1
    old1 = 1
    for (i=3; i<n; i++) {
        result = old2 + old1
        old1 = old2
        old2 = result
    }
How long does this take? A second? A minute?

Analyzing Runtime
Iterative Fibonacci:
    old2 = 1
    old1 = 1
    for (i=3; i<n; i++) {
        result = old2 + old1
        old1 = old2
        old2 = result
    }
How long does this take? A second? A minute? Runtime depends on n! Therefore, we will write it as a function of n. More generally, it will be a function of the input.

Analyzing Runtime
Iterative Fibonacci:
    old2 = 1
    old1 = 1
    for (i=3; i<n; i++) {
        result = old2 + old1
        old1 = old2
        old2 = result
    }
What machine do you run on? What language? What compiler? How was it programmed?

Analyzing Runtime
Iterative Fibonacci:
    old2 = 1
    old1 = 1
    for (i=3; i<n; i++) {
        result = old2 + old1
        old1 = old2
        old2 = result
    }
What machine do you run on? What language? What compiler? How was it programmed? We want to analyze the algorithm and ignore these details! Therefore, just count basic operations, like arithmetic, memory access, etc.

Analyzing Runtime
Iterative Fibonacci:
    old2 = 1
    old1 = 1
    for (i=3; i<n; i++) {
        result = old2 + old1
        old1 = old2
        old2 = result
    }
How many operations does this take?

Analyzing Runtime
Iterative Fibonacci:
    old2 = 1
    old1 = 1
    for (i=3; i<n; i++) {
        result = old2 + old1
        old1 = old2
        old2 = result
    }
How many operations does this take? If we're ignoring details, does it make sense to be so precise? We'll see later how to do this much more simply!

Run Time as a Function of Input
Run time of iterative Fibonacci is (depending on details of how we count and our implementation): 3 + (n-3)(6) + 1, which simplifies to 6n - 14.

Run Time as a Function of Input
Run time of iterative Fibonacci is (depending on details of how we count and our implementation): 3 + (n-3)(6) + 1, which simplifies to 6n - 14.
Since we've abstracted away exactly how long different operations take, and what computer we're running on, does it make sense to say 6n-14 instead of 6n-10, or 5n-20, or 3.14n-6.02?

Run Time as a Function of Input
Run time of iterative Fibonacci is (depending on details of how we count and our implementation): 3 + (n-3)(6) + 1, which simplifies to 6n - 14.
Since we've abstracted away exactly how long different operations take, and what computer we're running on, does it make sense to say 6n-14 instead of 6n-10, or 5n-20, or 3.14n-6.02? What matters is that it's linear in n. (We will formalize this soon.)
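As a concrete sketch, one counting convention that reproduces the tally 3 + (n-3)(6) + 1 = 6n - 14 is to charge one basic operation for each assignment, comparison, addition, and increment. This convention is an assumption (the slides don't pin it down), and other reasonable conventions give other constants, which is exactly the point:

```cpp
#include <cstdint>

// Iterative Fibonacci from the slides, instrumented with an operation
// counter. Charging 1 op per assignment, comparison, addition, and
// increment reproduces the slides' tally: 3 setup ops, 6 ops per
// iteration, and 1 final failed comparison = 3 + 6(n-3) + 1 = 6n - 14.
std::int64_t iterFib(int n, std::int64_t& ops) {
    ops = 0;
    std::int64_t old2 = 1;  ++ops;          // assignment
    std::int64_t old1 = 1;  ++ops;          // assignment
    std::int64_t result = 1;                // (not part of the slides' code)
    int i = 3;              ++ops;          // loop initialization
    while (true) {
        ++ops;                              // comparison i < n
        if (!(i < n)) break;
        result = old2 + old1;  ops += 2;    // addition + assignment
        old1 = old2;           ++ops;       // assignment
        old2 = result;         ++ops;       // assignment
        i++;                   ++ops;       // increment
    }
    return result;
}
```

Changing the convention (say, not counting the increment) only changes the constants, never the fact that the count is linear in n.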

Run Time as a Function of Input
What if we have lots of inputs? E.g., what is the run time for linear search in a list?

Run Time as a Function of Input
What if we have lots of inputs? E.g., what is the run time for linear search in a list? We could compute some complicated function f(key, list) = …, but that would be too complicated to compare.

Run Time as a Function of Size of Input
What if we have lots of inputs? E.g., what is the run time for linear search in a list? Instead, we usually simplify to take the run time only in terms of the size of the input.
- Intuitively, this is, e.g., the length of a list, etc.
- Formally, it's the number of bits of input.
This keeps our analysis simpler.

Run Time as a Function of Size of Input
But, which input? Different inputs of the same size have different run times. E.g., what is the run time of linear search in a list?
- If the item is the first in the list?
- If it's the last one?
- If it's not in the list at all?
What should we report?
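To make the contrast concrete, here is a minimal linear-search sketch (the function name and counting parameter are illustrative) that reports how many comparisons it actually made, which varies from 1 to n for the same list size:

```cpp
#include <vector>
#include <cstddef>

// Linear search: the comparison count depends on *where* (or whether)
// the key appears, not just on the list's size n.
// Returns the index of key, or -1 if it is absent.
int linearSearch(const std::vector<int>& list, int key, int& comparisons) {
    comparisons = 0;
    for (std::size_t i = 0; i < list.size(); ++i) {
        ++comparisons;                          // one comparison per element visited
        if (list[i] == key) return static_cast<int>(i);
    }
    return -1;  // worst case: n comparisons, key not found
}
```

For a 4-element list, finding the first element takes 1 comparison (best case), while finding the last element or missing entirely takes 4 (worst case).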

Which Run Time?
There are different kinds of analysis, e.g.:
- Best Case: mostly useless
- Worst Case: useful, pessimistic
- Average Case (Expected Time): useful, hard to do right
- Common Case: very useful, but ill-defined
- Amortized: useful; you'll see this in more advanced courses
- etc.

Multiple Inputs (or Sizes of Inputs)
Sometimes, it's handy to have the function be in terms of multiple inputs. E.g., for the run time of counting how many times string A appears in string B, it would make sense to write the result as a function of both A.length and B.length.
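As a sketch (a naive scan, not necessarily how the course would implement it), counting occurrences of A in B makes on the order of A.length × B.length character comparisons in the worst case, so its run time is naturally a function of both sizes:

```cpp
#include <string>
#include <cstddef>

// Naive count of how many times pattern A appears in text B
// (overlapping occurrences allowed). Worst case is roughly
// A.size() * B.size() character comparisons: a function of both inputs.
int countOccurrences(const std::string& A, const std::string& B) {
    if (A.empty() || A.size() > B.size()) return 0;
    int count = 0;
    for (std::size_t i = 0; i + A.size() <= B.size(); ++i) {
        if (B.compare(i, A.size(), A) == 0) ++count;  // compare A.size() chars at offset i
    }
    return count;
}
```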

Which BigFib is faster?
We saw an exponential-time, simple recursive Fibonacci, and a log-time, more complex Fibonacci.

Which BigFib is faster?
We saw an exponential-time, simple recursive Fibonacci, and a log-time, more complex Fibonacci. At n=5, the simple version is faster. At n=35, the complex version is faster. What's more important?
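The course's exact log-time version isn't reproduced here; as an assumed stand-in, the well-known fast-doubling identities give one O(log n) Fibonacci to contrast with the exponential simple recursion:

```cpp
#include <cstdint>
#include <utility>

// Exponential-time simple recursion: the call tree has ~phi^n nodes.
std::uint64_t fibSimple(int n) {
    if (n <= 2) return 1;  // convention: fib(1) = fib(2) = 1
    return fibSimple(n - 1) + fibSimple(n - 2);
}

// O(log n) fast doubling (one way to get a log-time Fibonacci; the
// course's "complex" version may differ). Returns {F(n), F(n+1)} using
//   F(2k)   = F(k) * (2*F(k+1) - F(k))
//   F(2k+1) = F(k)^2 + F(k+1)^2
std::pair<std::uint64_t, std::uint64_t> fibPair(int n) {
    if (n == 0) return {0, 1};
    std::pair<std::uint64_t, std::uint64_t> p = fibPair(n / 2);
    std::uint64_t a = p.first, b = p.second;
    std::uint64_t c = a * (2 * b - a);  // F(2k)
    std::uint64_t d = a * a + b * b;    // F(2k+1)
    if (n % 2 == 0) return {c, d};
    return {d, c + d};
}

std::uint64_t fibFast(int n) { return fibPair(n).first; }
```

For small n, the simple version's tiny constant factors win; as n grows, the exponential call count swamps everything, which is the crossover the slide describes.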

Scalability!
Computer science is about solving problems people couldn't solve before. Therefore, the emphasis is almost always on solving the big versions of problems. (In computer systems, people always talk about scalability: the ability of a solution to keep working when things get really big.)

Asymptotic Analysis
Asymptotic analysis is analyzing what happens to the run time (or other performance metric) as the input size n goes to infinity. The word comes from asymptotes: the limiting behavior of a function as something goes to infinity. This gives a solid mathematical way to capture the intuition of emphasizing scalable performance. It also makes the analysis a lot simpler!

Interpreters, Compilers, Linkers
Steve tells me that 221 students often find linker errors to be mysterious. So, what's a linker?

Separate Compilation
A compiler translates a program in a high-level language into machine language. A big program can be many millions of lines of code (e.g., Windows Vista was 50 MLoC), and compiling something that big takes hours or days. The source code is in many files, and most changes affect only a few files. Therefore, we compile each file separately!

Symbol Tables
How can you compile an incomplete program?
- Header files tell you the types of the missing functions. These are the .h files in C and C++ programs.
- The object code includes a list of missing functions, and where they are called.
- The object code also includes a list of all public functions declared in it.
These lists are called the symbol table.

Linking
The linker puts all these files together into a single executable file, using the symbol tables to hook up missing functions with their definitions. In C and C++, the executable starts with a function called main, just as in Java.
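As a single-file sketch (file names here are illustrative, not from the slides), the pieces the linker hooks up look like this; in a real project the declaration would live in a header and the definition in a separately compiled .cpp file:

```cpp
// --- what a header (say, mathstuff.h) would provide: a declaration only.
// The compiler can type-check calls to square() from this alone; the
// caller's object code just records square as a missing symbol.
int square(int x);

// --- what a separately compiled file (say, mathstuff.cpp) would provide:
// the definition. Its object file lists square as a public symbol, and
// the linker matches the missing symbol above to this definition.
int square(int x) { return x * x; }
```

A linker error like "undefined reference to square" means the first half existed at compile time but the linker never found an object file supplying the second half.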