CS 580: Algorithm Design and Analysis. Jeremiah Blocki Purdue University Spring 2018


Recap: Stable Matching Problem. Definition of a stable matching. Stable Roommate Problem: a stable matching does not always exist! Gale-Shapley algorithm (Propose-and-Reject). Proof that the algorithm terminates in O(n^2) steps. Proof that the algorithm outputs a stable matching. The matching is male-optimal: if there are multiple stable matchings, each man gets his best valid partner. The matching is female-pessimal: if there are multiple stable matchings, each woman gets her worst valid partner.

Extensions: Matching Residents to Hospitals. Ex: men correspond to hospitals, women to med-school residents. Variant 1. Some participants declare others as unacceptable (resident A is unwilling to work in Cleveland). Variant 2. Unequal number of men and women. Variant 3. Limited polygamy (hospital X wants to hire 3 residents). The Gale-Shapley algorithm still works, with minor modifications to the code to handle the variations!

Extensions: Matching Residents to Hospitals. Ex: men correspond to hospitals, women to med-school residents. Variant 1. Some participants declare others as unacceptable (resident A is unwilling to work in Cleveland). Variant 2. Unequal number of men and women. Variant 3. Limited polygamy (hospital X wants to hire 3 residents). Def. A matching S is unstable if there is a hospital h and resident r such that: h and r are acceptable to each other; and either r is unmatched or r prefers h to her assigned hospital; and either h does not have all its places filled or h prefers r to at least one of its assigned residents.

1.2 Five Representative Problems

Interval Scheduling Input. Set of jobs with start times and finish times. Goal. Find a maximum-cardinality subset of mutually compatible jobs (jobs that don't overlap). [Figure: jobs a-h on a time axis from 0 to 11.]

Interval Scheduling Input. Set of jobs with start times and finish times. Goal. Find a maximum-cardinality subset of mutually compatible jobs (jobs that don't overlap). Greedy choice. Select the job with the earliest finish time and eliminate incompatible jobs.

Interval Scheduling Input. Set of jobs with start times and finish times. Goal. Find a maximum-cardinality subset of mutually compatible jobs (jobs that don't overlap). Chapter 4: we will prove that this greedy algorithm always finds an optimal solution!
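The earliest-finish-time rule above can be sketched directly in Python. A hedged sketch; the function name and the sample instance are illustrative, not taken from the slides:

```python
# Earliest-finish-time greedy for (unweighted) interval scheduling:
# sort jobs by finish time, then repeatedly take the next job whose
# start is no earlier than the finish of the last selected job.
def interval_schedule(jobs):
    """jobs: list of (start, finish) pairs; returns a max-cardinality
    subset of mutually compatible jobs."""
    selected = []
    last_finish = float("-inf")
    for start, finish in sorted(jobs, key=lambda j: j[1]):
        if start >= last_finish:          # compatible with all selected jobs
            selected.append((start, finish))
            last_finish = finish
    return selected

# Example: interval_schedule([(1, 4), (3, 5), (0, 6), (4, 7), (8, 11)])
# selects (1, 4), (4, 7), and (8, 11).
```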

Weighted Interval Scheduling Input. Set of jobs with start times, finish times, and weights. Goal. Find a maximum-weight subset of mutually compatible jobs. The greedy algorithm no longer works! [Figure: weighted jobs with weights 23, 12, 20, 26, 13, 20, 11, 16 on a time axis from 0 to 11.]

Weighted Interval Scheduling Input. Set of jobs with start times, finish times, and weights. Goal. Find a maximum-weight subset of mutually compatible jobs. The problem can be solved using a technique called dynamic programming.
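As a preview of the dynamic-programming solution (developed properly later in the course), here is a hedged Python sketch; the function name and sample instance are illustrative:

```python
import bisect

# Dynamic programming for weighted interval scheduling: sort jobs by
# finish time; opt[j] = max(opt[j-1], w_j + opt[p(j)]), where p(j) is
# the number of jobs that finish by the time job j starts (found by
# binary search over the sorted finish times).
def max_weight_schedule(jobs):
    """jobs: list of (start, finish, weight); returns the maximum total
    weight of a set of mutually compatible jobs."""
    jobs = sorted(jobs, key=lambda j: j[1])
    finishes = [f for _, f, _ in jobs]
    opt = [0] * (len(jobs) + 1)
    for j, (s, f, w) in enumerate(jobs, start=1):
        p = bisect.bisect_right(finishes, s, 0, j - 1)  # jobs compatible with job j
        opt[j] = max(opt[j - 1], w + opt[p])
    return opt[-1]
```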

Bipartite Matching Input. Bipartite graph. Goal. Find a maximum-cardinality matching. [Figure: bipartite graph with left nodes A-E and right nodes 1-5.] Different from the Stable Matching Problem! How?

Bipartite Matching Input. Bipartite graph. Goal. Find a maximum-cardinality matching. The problem can be solved using network-flow algorithms.
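Network flow is covered later in the course; as an illustration that the problem is efficiently solvable, here is a hedged sketch of the classic augmenting-path method (Kuhn's algorithm), which matches what max-flow does on the associated unit-capacity network. All names are illustrative:

```python
# Augmenting-path maximum bipartite matching (Kuhn's algorithm).
# adj[u] lists the right-side vertices adjacent to left vertex u.
def max_bipartite_matching(adj, n_right):
    match_right = [None] * n_right  # match_right[v] = left vertex matched to v

    def try_augment(u, seen):
        # Try to match u, possibly re-matching previously matched vertices.
        for v in adj[u]:
            if v not in seen:
                seen.add(v)
                if match_right[v] is None or try_augment(match_right[v], seen):
                    match_right[v] = u
                    return True
        return False

    return sum(try_augment(u, set()) for u in range(len(adj)))
```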

Independent Set Input. Graph. Goal. Find a maximum-cardinality independent set (a subset of nodes such that no two are joined by an edge). [Figure: graph on nodes 1-7.] Brute-force algorithm: check every possible subset; running time 2^n steps. NP-complete: unlikely that an efficient algorithm exists! Positive: can easily check a claimed independent set of size k.
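The "positive" point above, that a claimed independent set is easy to verify even though finding a maximum one is believed hard, can be made concrete. A hedged Python sketch with illustrative names:

```python
# Verifying a claimed independent set takes polynomial time: just check
# that no edge joins two of the claimed nodes.
def is_independent_set(edges, nodes):
    """True iff no edge in `edges` has both endpoints in `nodes`."""
    s = set(nodes)
    return not any(u in s and v in s for u, v in edges)
```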

Competitive Facility Location Input. Graph with a weight on each node. Game. Two competing players alternate in selecting nodes; a player may not select a node if any of its neighbors has been selected. Goal. Select a maximum-weight subset of nodes. [Figure: path of 10 weighted nodes: 10, 1, 5, 15, 5, 1, 5, 1, 15, 10.] The second player can guarantee 20, but not 25.

Competitive Facility Location Input. Graph with a weight on each node. Game. Two competing players alternate in selecting nodes; a player may not select a node if any of its neighbors has been selected. Goal. Select a maximum-weight subset of nodes. PSPACE-complete: even harder than NP-complete! There is no short proof that a player can guarantee value B (unlike the previous problem). In the example, the second player can guarantee 20, but not 25.

Five Representative Problems Variations on a theme: independent set. Interval scheduling: O(n log n) greedy algorithm. Weighted interval scheduling: O(n log n) dynamic programming algorithm. Bipartite matching: O(n^k) max-flow-based algorithm. Independent set: NP-complete. Competitive facility location: PSPACE-complete.

Chapter 2 Basics of Algorithm Analysis Slides by Kevin Wayne. Copyright 2005 Pearson-Addison Wesley. All rights reserved.

2.1 Computational Tractability "For me, great algorithms are the poetry of computation. Just like verse, they can be terse, allusive, dense, and even mysterious. But once unlocked, they cast a brilliant new light on some aspect of computing." - Francis Sullivan

Computational Tractability "As soon as an Analytic Engine exists, it will necessarily guide the future course of the science. Whenever any result is sought by its aid, the question will arise - By what course of calculation can these results be arrived at by the machine in the shortest time?" - Charles Babbage (1864) [Figure: schematic of the Analytic Engine.]

Polynomial-Time Brute force. For many non-trivial problems, there is a natural brute-force search algorithm that checks every possible solution. This typically takes 2^N time or worse for inputs of size N (n! for stable matching with n men and n women) - unacceptable in practice. Desirable scaling property. When the input size doubles, the algorithm should slow down by at most some constant factor C. Def. An algorithm is poly-time if there exist constants c > 0 and d > 0 such that on every input of size N, its running time is bounded by c N^d steps. (Then the scaling property holds: choose C = 2^d.)

Worst-Case Analysis Worst-case running time. Obtain a bound on the largest possible running time of the algorithm on inputs of a given size N. Generally captures efficiency in practice. A draconian view, but it is hard to find an effective alternative. Average-case running time. Obtain a bound on the running time of the algorithm on a random input, as a function of the input size N. Hard (or impossible) to accurately model real instances by random distributions, and an algorithm tuned for a certain distribution may perform poorly on other inputs.

Worst-Case Polynomial-Time Def. An algorithm is efficient if its running time is polynomial. Justification: it really works in practice! Although 6.02 x 10^23 x N^20 is technically poly-time, it would be useless in practice. In practice, the poly-time algorithms that people develop almost always have low constants and low exponents, and breaking through the exponential barrier of brute force typically exposes some crucial structure of the problem. Exceptions. Some poly-time algorithms have high constants and/or exponents and are useless in practice. Some exponential-time (or worse) algorithms are widely used because the worst-case instances seem to be rare (e.g., the simplex method, Unix grep).

Why It Matters

2.2 Asymptotic Order of Growth

Asymptotic Order of Growth Upper bounds. T(n) is O(f(n)) if there exist constants c > 0 and n_0 ≥ 0 such that for all n ≥ n_0 we have T(n) ≤ c f(n). Lower bounds. T(n) is Ω(f(n)) if there exist constants c > 0 and n_0 ≥ 0 such that for all n ≥ n_0 we have T(n) ≥ c f(n). Tight bounds. T(n) is Θ(f(n)) if T(n) is both O(f(n)) and Ω(f(n)). Ex: T(n) = 32n^2 + 17n + 32. T(n) is O(n^2), O(n^3), Ω(n^2), Ω(n), and Θ(n^2). T(n) is not O(n), Ω(n^3), Θ(n), or Θ(n^3).
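A quick empirical companion to the example above. The constants c = 33 and n_0 = 19 witness T(n) = O(n^2), since 32n^2 + 17n + 32 ≤ 33n^2 exactly when n^2 ≥ 17n + 32; this hedged Python check confirms both the bound and that n_0 = 19 is tight for c = 33:

```python
# T(n) = 32n^2 + 17n + 32 from the example above; check the witnesses
# c = 33, n0 = 19 against the definition of O(n^2).
T = lambda n: 32 * n * n + 17 * n + 32
c, n0 = 33, 19
assert all(T(n) <= c * n * n for n in range(n0, 10_000))
assert T(n0 - 1) > c * (n0 - 1) ** 2   # the bound fails at n = 18, so n0 = 19 is tight
```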

Notation Slight abuse of notation: T(n) = O(f(n)). Not transitive: with f(n) = 5n^3 and g(n) = 3n^2, f(n) = O(n^3) = g(n), but f(n) ≠ g(n). Better notation: T(n) ∈ O(f(n)). Meaningless statement: "Any comparison-based sorting algorithm requires at least O(n log n) comparisons." The statement doesn't "type-check" - use Ω for lower bounds.

Properties Transitivity. If f = O(g) and g = O(h) then f = O(h). If f = Ω(g) and g = Ω(h) then f = Ω(h). If f = Θ(g) and g = Θ(h) then f = Θ(h). Additivity. If f = O(h) and g = O(h) then f + g = O(h). If f = Ω(h) and g = Ω(h) then f + g = Ω(h). If f = Θ(h) and g = O(h) then f + g = Θ(h).

Asymptotic Bounds for Some Common Functions Polynomials. a_0 + a_1 n + ... + a_d n^d is Θ(n^d) if a_d > 0. Polynomial time. Running time is O(n^d) for some constant d independent of the input size n. Logarithms. O(log_a n) = O(log_b n) for any constants a, b > 1, so we can avoid specifying the base. Logarithms. For every x > 0, log n = O(n^x): log grows slower than every polynomial. Exponentials. For every r > 1 and every d > 0, n^d = O(r^n): every exponential grows faster than every polynomial.

2.4 A Survey of Common Running Times

Linear Time: O(n) Linear time. Running time is proportional to the input size. Computing the maximum. Compute the maximum of n numbers a_1, ..., a_n.

  max ← a_1
  for i = 2 to n {
      if (a_i > max)
          max ← a_i
  }

Linear Time: O(n) Merge. Combine two sorted lists A = a_1, a_2, ..., a_n and B = b_1, b_2, ..., b_n into a sorted whole.

  i = 1, j = 1
  while (both lists are nonempty) {
      if (a_i ≤ b_j) append a_i to output list and increment i
      else append b_j to output list and increment j
  }
  append remainder of nonempty list to output list

Claim. Merging two lists of size n takes O(n) time. Pf. After each comparison, the length of the output list increases by 1.
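A runnable Python version of the merge step (illustrative names), which makes the O(n) claim easy to see: each loop iteration appends exactly one element to the output:

```python
# Merge two sorted lists into one sorted list in linear time.
def merge(a, b):
    out, i, j = [], 0, 0
    while i < len(a) and j < len(b):
        if a[i] <= b[j]:
            out.append(a[i]); i += 1
        else:
            out.append(b[j]); j += 1
    out.extend(a[i:])   # append the remainder of whichever list is nonempty
    out.extend(b[j:])
    return out
```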

O(n log n) Time O(n log n) time (also referred to as linearithmic time). Arises in divide-and-conquer algorithms. Sorting. Mergesort and heapsort are sorting algorithms that perform O(n log n) comparisons. Largest empty interval. Given n time-stamps x_1, ..., x_n at which copies of a file arrive at a server, what is the largest interval of time in which no copies of the file arrive? O(n log n) solution. Sort the time-stamps, then scan the sorted list in order, identifying the maximum gap between successive time-stamps.
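The sort-and-scan solution fits in a few lines; a hedged Python sketch (illustrative names, assumes at least two time-stamps):

```python
# Largest empty interval: sort the time-stamps (O(n log n)), then scan
# once for the maximum gap between successive time-stamps (O(n)).
def largest_gap(timestamps):
    xs = sorted(timestamps)
    return max(b - a for a, b in zip(xs, xs[1:]))
```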

Quadratic Time: O(n^2) Quadratic time. Enumerate all pairs of elements. Closest pair of points. Given a list of n points in the plane (x_1, y_1), ..., (x_n, y_n), find the pair that is closest. O(n^2) solution. Try all pairs of points (no need to take square roots):

  min ← (x_1 - x_2)^2 + (y_1 - y_2)^2
  for i = 1 to n {
      for j = i+1 to n {
          d ← (x_i - x_j)^2 + (y_i - y_j)^2
          if (d < min) min ← d
      }
  }

Remark. Ω(n^2) seems inevitable, but this is just an illusion (see Chapter 5).

Cubic Time: O(n^3) Cubic time. Enumerate all triples of elements. Set disjointness. Given n sets S_1, ..., S_n, each of which is a subset of {1, 2, ..., n}, is there some pair of them that is disjoint? O(n^3) solution. For each pair of sets, determine whether they are disjoint:

  foreach set S_i {
      foreach other set S_j {
          foreach element p of S_i {
              determine whether p also belongs to S_j
          }
          if (no element of S_i belongs to S_j)
              report that S_i and S_j are disjoint
      }
  }

Polynomial Time: O(n^k) Independent set of size k (k a constant). Given a graph, are there k nodes such that no two are joined by an edge? O(n^k) solution. Enumerate all subsets of k nodes:

  foreach subset S of k nodes {
      check whether S is an independent set
      if (S is an independent set)
          report S is an independent set
  }

Checking whether S is an independent set takes O(k^2) time. The number of k-element subsets is

  n(n-1)(n-2) ... (n-k+1) / (k(k-1)(k-2) ... (2)(1)) ≤ n^k / k!

so the total time is O(k^2 n^k / k!) = O(n^k). Poly-time for k = 17, but not practical.

Exponential Time Independent set. Given a graph, what is the maximum size of an independent set? O(n^2 2^n) solution. Enumerate all subsets:

  S* ← ∅
  foreach subset S of nodes {
      check whether S is an independent set
      if (S is the largest independent set seen so far)
          update S* ← S
  }
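A hedged Python sketch of the brute-force enumeration (illustrative names). It tries larger subsets first, so the first independent subset found is a maximum one; the exponential subset count is what makes this infeasible beyond small n:

```python
from itertools import combinations

# Brute-force maximum independent set: enumerate all subsets, largest
# first, and return the first independent one.  Exponential time.
def max_independent_set(n, edges):
    """Nodes are 0..n-1; edges is a list of (u, v) pairs."""
    for k in range(n, -1, -1):
        for subset in combinations(range(n), k):
            s = set(subset)
            if not any(u in s and v in s for u, v in edges):
                return s
```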

Heap Data Structure [Figure: min-heap containing 6, 11, 9, 44, 53, 10.] Min-heap order: for each node v in the tree, Parent(v).Value ≤ v.Value. Max-heap order: for each node v in the tree, Parent(v).Value ≥ v.Value.

Heap Insertion Heap.Insert(3). Place 3 at the next free leaf position, then Heapify-up: while the min-heap order (for each node v, Parent(v).Value ≤ v.Value) is violated, swap the new element with its parent. In the example, 3 swaps past 9 and then past 6, ending at the root. Theorem 2.12 [KT]: The procedure Heapify-up fixes the heap property and allows us to insert a new element into a heap of n elements in O(log n) time.

Heap Extract Minimum Heap.ExtractMin(). Remove and return the root (the minimum, here 3), move the last leaf to the root, then Heapify-down: while the min-heap order (for each node v, Parent(v).Value ≤ v.Value) is violated, swap the moved element with its smaller child. Theorem 2.13 [KT]: The procedure Heapify-down fixes the heap property and allows us to delete an element in a heap of n elements in O(log n) time.

Heap Summary Insert: O(log n). FindMin: O(1). Delete: O(log n). ExtractMin: O(log n). Thought question: can you design an O(n log n) sorting algorithm using heaps?
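One answer to the thought question: n Inserts followed by n ExtractMins sort a list in O(n log n) total time; this is exactly heapsort. A hedged sketch using Python's heapq module (illustrative names):

```python
import heapq

# Heapsort via the heap operations above: n Inserts at O(log n) each,
# then n ExtractMins at O(log n) each, for O(n log n) total.
def heap_sort(items):
    heap = []
    for x in items:
        heapq.heappush(heap, x)                    # Insert
    return [heapq.heappop(heap) for _ in items]    # ExtractMin, n times
```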