Theory and Frontiers of Computer Science Fall 2013 Carola Wenk

We have seen so far:
- Computer Architecture and Digital Logic (Von Neumann Architecture, binary numbers, circuits)
- Introduction to Python (if, loops, functions)
- Algorithm Analysis (Min, Searching, Sorting; Runtimes)
- Linked Structures (Lists, Trees, Huffman Coding)
- Graphs (Adjacency Lists, BFS, Connected Components)
- Data Mining (Finding patterns, supervised learning)

The Big Picture [diagram: Input → Program → Output] So far, we have been designing algorithms for problems that meet given specifications. There are many programs that can implement a particular algorithm, but we can make our picture even more abstract.

The Big Picture [diagram: Input → Algorithm → Output] We can think even more abstractly: for any particular problem we can come up with many algorithms. A natural way to categorize algorithms is by the problems they solve.

The Big Picture [diagram: Problem; Input → ... → Output] We can think even more abstractly: for any particular problem we can come up with many algorithms. A natural way to categorize algorithms is by the problems they solve.

The Big Picture [diagram: Problem; Input → ... → Output] Then, for a particular problem, we are interested in finding an efficient algorithm. Is this always possible? What does efficient mean?

(Worst-Case) Asymptotic Runtime Analysis Usually, the performance of an algorithm depends on the actual input for any particular size n. Which inputs should we use to characterize runtime? [plot: running time versus input size] We define algorithm performance as conservatively as possible, on the worst-case inputs: no matter what, my algorithm takes at most c·n steps for an input of size n.
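
To make the worst-case convention concrete, here is a minimal Python sketch (not from the slides) that counts the steps linear search performs on a best-case input versus a worst-case input; the function name and the explicit step counter are illustrative choices.

```python
# A minimal sketch (not from the slides): count how many elements a linear
# search examines on a best-case input versus a worst-case input.

def linear_search_steps(lst, target):
    """Return (index or None, number of elements examined)."""
    steps = 0
    for i, x in enumerate(lst):
        steps += 1
        if x == target:
            return i, steps
    return None, steps

n = 1000
data = list(range(n))

print(linear_search_steps(data, 0))    # best case: first element, 1 step
print(linear_search_steps(data, -1))   # worst case: absent, all n steps
```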

Computational Complexity The field of computational complexity tries to categorize the difficulty of computational problems. It is a purely theoretical area of study, but has wide-ranging effects on the design and implementation of algorithms. [pictured: Alan Turing] A Turing Machine captures the essential components of computation: memory and state information. The Church-Turing Thesis states that everything algorithmically computable is computable by a Turing machine.
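
As a rough illustration of "memory and state information", the following minimal Python sketch simulates a Turing machine; the transition-table format and the example machine (which flips every bit of its input and then halts) are assumptions made for this sketch, not material from the slides.

```python
# A minimal sketch (not from the slides) of the two ingredients a Turing
# machine formalizes: a finite state control and an unbounded tape (memory).

def run_tm(transitions, tape_input, start="scan", halt="halt"):
    """transitions maps (state, symbol) -> (new_state, symbol_to_write, move)."""
    tape = dict(enumerate(tape_input))   # sparse, effectively unbounded tape
    head, state = 0, start
    while state != halt:
        symbol = tape.get(head, "_")               # "_" is the blank symbol
        state, write, move = transitions[(state, symbol)]
        tape[head] = write
        head += move                               # move is -1 (left) or +1 (right)
    return "".join(tape[i] for i in sorted(tape)).strip("_")

flip_bits = {
    ("scan", "0"): ("scan", "1", +1),
    ("scan", "1"): ("scan", "0", +1),
    ("scan", "_"): ("halt", "_", +1),
}

print(run_tm(flip_bits, "010011"))   # prints 101100
```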

Computational Complexity The field of computational complexity tries to categorize the difficulty of computational problems. It is a purely theoretical area of study, but has wide-ranging effects on the design and implementation of algorithms. [diagram: Turing-Computable Problems, divided into Efficiently Solvable and Intractable]

Computational Complexity The field of computational complexity tries to categorize the difficulty of computational problems. It is a purely theoretical area of study, but has wide-ranging effects on the design and implementation of algorithms. [diagram: Turing-Computable Problems, divided into Efficiently Solvable and Intractable, with question marks added]

Computational Complexity The field of computational complexity tries to categorize the difficulty of computational problems. It is a purely theoretical area of study, but has wide-ranging effects on the design and implementation of algorithms. [diagram: Turing-Computable Problems, divided into Polynomial-Time and Super-Polynomial, with question marks]

Polynomial Versus Exponential Time We adopt the convention that as long as an algorithm's running time is polynomial (or logarithmic) in the input size, it is efficient. Why is this a good criterion?

Polynomial Versus Exponential Time [examples: Selection Sort, Merge Sort, Minimum/Maximum/Linear Search, Binary Search] We adopt the convention that as long as an algorithm's running time is polynomial (or logarithmic) in the input size, it is efficient. Why is this a good criterion?

Example: Fibonacci numbers F(0) = 0; F(1) = 1; F(n) = F(n-1) + F(n-2) for n ≥ 2: 0, 1, 1, 2, 3, 5, 8, 13, 21, 34, 55, 89, ... Implement this recursion directly: [recursion tree for F(n), branching into F(n-1) and F(n-2) at each node, with depth between n/2 and n] The runtime is exponential: 2^(n/2) ≤ T(n) ≤ 2^n.
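
A short Python sketch of the two options discussed here (the call counter is an illustrative addition, not part of the slides): the direct recursion makes an exponential number of calls, while an iterative version needs only n additions.

```python
# The direct recursion versus an iterative computation of F(n),
# with a counter showing how quickly the number of recursive calls grows.

def fib_recursive(n, calls):
    calls[0] += 1
    if n < 2:
        return n
    return fib_recursive(n - 1, calls) + fib_recursive(n - 2, calls)

def fib_iterative(n):
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

for n in (10, 20, 30):
    calls = [0]
    value = fib_recursive(n, calls)
    print(f"F({n}) = {value} (iterative agrees: {fib_iterative(n) == value}), "
          f"recursive calls: {calls[0]}")
# The call count roughly doubles when n grows by 2, consistent with
# 2^(n/2) <= T(n) <= 2^n.
```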

Polynomial Versus Exponential Time [examples: Selection Sort, Merge Sort, Minimum/Maximum/Linear Search, Binary Search (polynomial); Recursive Fibonacci (exponential, O(2^n))] We adopt the convention that as long as an algorithm's running time is polynomial (or logarithmic) in the input size, it is efficient. Why is this a good criterion?

Polynomial versus Exponential Time Suppose we have two algorithms for the same problem, one with a polynomial and one with an exponential running time. Which algorithm is better according to our usual method of comparison? For all large n?

Polynomial versus Exponential Time Suppose we have two algorithms for the same problem, one with a polynomial and one with an exponential running time. Which algorithm is better according to our usual method of comparison? For all large n? For all large n, e.g., for all n ≥ 10^11.

Polynomial versus Exponential Time Actually, every polynomial is (eventually) upper bounded by any exponential. Lemma: For any constant c > 0 and any base a > 1, we have n^c ≤ a^n for all sufficiently large n.
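
The lemma can be checked numerically. The sketch below uses arbitrarily chosen values c = 10 and a = 1.5 (assumptions, not from the slides) and searches for the point past which the exponential dominates the polynomial.

```python
# A numerical check of the lemma with illustrative values: the polynomial n^c
# can be larger for a while, but past some point the exponential a^n wins for good.

c, a = 10, 1.5

# Last n in a generous range where the polynomial still matches or beats the
# exponential; for every larger n in the range the exponential is bigger.
last_poly_wins = max(n for n in range(1, 500) if n ** c >= a ** n)

n = last_poly_wins + 10
print(f"n^{c} >= {a}^n last holds at n = {last_poly_wins}")
print(f"at n = {n}: n^{c} = {n ** c:.2e}, {a}^n = {a ** n:.2e}")
```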

Computational Complexity The field of computational complexity tries to categorize the difficulty of computational problems. It is a purely theoretical area of study, but has wide-ranging effects on the design and implementation of algorithms. [diagram: Turing-Computable Problems, divided into Polynomial-Time and Super-Polynomial; the Halting Problem lies outside the Turing-computable problems]

Upper and Lower Bounds If we can come up with an algorithm that correctly solves a particular problem, then its worst-case running time is an upper bound. What would be more useful, though, is evidence that the problem cannot be solved in a given amount of time. In other words, to establish difficulty we need a lower bound on the running time of any algorithm for the problem. Upper Bound: algorithm A shows that the problem can be solved in T_A(n) time. Lower Bound: regardless of the algorithm, the problem cannot be solved in less than T*(n) time.

Upper and Lower Bounds If we can come up with an algorithm that correctly solves a particular problem, then its worst-case running time is an upper bound. What would be more useful, though, is evidence that the problem cannot be solved in a given amount of time. In other words, to establish difficulty we need a lower bound on the running time of any algorithm for the problem. Upper Bound: MergeSort for sorting a list; sorting can be done in O(n log n) time. Lower Bound: every sorting algorithm requires at least ??? time.

Upper and Lower Bounds If we can come up with an algorithm that correctly solves a particular problem, then its worst-case running time is an upper bound. What would be more useful, though, is evidence that the problem cannot be solved in a given amount of time. In other words, to establish difficulty we need a lower bound on the running time of any algorithm for the problem. Upper Bound: MergeSort for sorting a list; sorting can be done in O(n log n) time. Lower Bound: every sorting algorithm requires at least c·n time. Can we match the lower bound to the upper bound?

Lower Bound for Sorting We came up with an algorithm for sorting that took O(n log n) time; can we be sure that this is the fastest possible? Given a list of n distinct elements, consider what any algorithm for sorting actually does: Unsorted List (how many possible orderings?) → Sorted List (how many possible outputs?)

Lower Bound for Sorting Unsorted List (how many possible orderings?) → Sorted List (how many possible outputs?) Any correct sorting algorithm must be able to permute any input into the unique sorted list. Therefore any sorting algorithm must be able to apply any of the n! possible permutations necessary to produce the right answer.

Lower Bound for Sorting Any sorting algorithm must be able to apply any of the n! possible permutations necessary to produce the right answer. We can visualize the behavior of any sorting algorithm as a sequence of decisions based on comparing pairs of items: [decision tree: each internal node compares a pair of items, with Yes and No branches leading to further comparisons]
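
One way to see the decision tree concretely is to instrument an actual comparison sort. The sketch below (an illustration, not from the slides) runs insertion sort on every permutation of a small list and records the Yes/No outcome of each comparison; every permutation ends up with its own outcome sequence, i.e., its own leaf.

```python
# Run insertion sort on every permutation of a small list and record the
# Yes/No outcome of each comparison. Each permutation follows its own path
# through the decision tree, so the tree must have at least n! leaves.

from itertools import permutations

def insertion_sort_path(items):
    """Sort a copy of items and return (sorted list, tuple of comparison outcomes)."""
    a = list(items)
    path = []
    for i in range(1, len(a)):
        j = i
        while j > 0:
            smaller = a[j] < a[j - 1]          # compare one pair of items
            path.append("Yes" if smaller else "No")
            if not smaller:
                break
            a[j], a[j - 1] = a[j - 1], a[j]
            j -= 1
    return a, tuple(path)

n = 4
paths = {}
for perm in permutations(range(n)):
    result, path = insertion_sort_path(perm)
    assert result == sorted(perm)              # the algorithm really sorts
    paths[perm] = path

print(len(paths), "permutations,", len(set(paths.values())), "distinct paths")
# prints: 24 permutations, 24 distinct paths
```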

Lower Bound for Sorting For a list with n items, there are n! possible permutations. Any sorting algorithm must be able to reach all of these permutations by making a sequence of comparisons. The corresponding decision tree is a binary tree with one leaf per permutation. What does any of this tell us about the running time? The height of this decision tree is a lower bound on the running time of the algorithm. What is the minimum height of any binary decision tree with n! leaves?

Lower Bound for Sorting For a list with n items, there are n! possible permutations, and any sorting algorithm must be able to reach all of them by making a sequence of comparisons. In the decision tree: n! ≤ #leaves ≤ 2^height, so n! ≤ 2^height. This is equivalent to: log₂(n!) ≤ height.

Lower Bound for Sorting How large is n!? n! = n·(n-1)·(n-2)···1 = n···(n/2+1) · (n/2)·(n/2-1)···1 ≥ (n/2)·(n/2)···(n/2) · 1·1···1 = (n/2)^(n/2), since the first n/2 factors are each at least n/2 and the rest are at least 1. So n! ≥ (n/2)^(n/2). As before, n! ≤ #leaves ≤ 2^height, so log₂(n!) ≤ height.

Lower Bound for Sorting So n! ≥ (n/2)^(n/2), and from the decision tree log₂(n!) ≤ height. So: log₂((n/2)^(n/2)) ≤ log₂(n!) ≤ height.

Lower Bound for Sorting Since log₂((n/2)^(n/2)) = (n/2)·log₂(n/2), we have: (n/2)·log₂(n/2) ≤ log₂(n!) ≤ height. Therefore: (n/2)·log₂(n/2) ≤ height, or equivalently: (1/2)·n·log₂(n) - n/2 ≤ height. What does this tell us about Merge Sort?
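
It tells us that Merge Sort, at O(n log n), asymptotically matches this lower bound, so it is optimal among comparison-based sorts. The sketch below (assuming a standard top-down merge sort with an explicit comparison counter; the code is an illustration, not from the slides) compares the lower bound (1/2)·n·log₂(n) - n/2, the exact value log₂(n!), and the number of comparisons merge sort performs on a random input.

```python
# Compare the derived lower bound, log2(n!), and merge sort's actual
# comparison count on a random input. All three grow like n log n.

import math
import random

def merge_sort_count(a):
    """Return (sorted copy of a, number of element comparisons performed)."""
    if len(a) <= 1:
        return list(a), 0
    mid = len(a) // 2
    left, cl = merge_sort_count(a[:mid])
    right, cr = merge_sort_count(a[mid:])
    merged, comps, i, j = [], cl + cr, 0, 0
    while i < len(left) and j < len(right):
        comps += 1
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged, comps

for n in (16, 256, 4096):
    data = random.sample(range(10 * n), n)
    _, comps = merge_sort_count(data)
    lower = 0.5 * n * math.log2(n) - n / 2
    exact = math.log2(math.factorial(n))
    print(f"n={n}: lower bound ~{lower:.0f}, log2(n!) ~{exact:.0f}, "
          f"merge sort comparisons: {comps}")
```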

The Power of Lower Bounds Exponential-time algorithm, trivial lower bound: "I can't find an efficient algorithm, I guess I'm just dumb." [Garey and Johnson 79]

The Power of Lower Bounds Matching exponential-time bounds: "I can't find an efficient algorithm, because no such algorithm is possible." [Garey and Johnson 79]