CS711008Z Algorithm Design and Analysis


CS711008Z Algorithm Design and Analysis. Lecture 7: Binary heap, binomial heap, and Fibonacci heap. Dongbo Bu, Institute of Computing Technology, Chinese Academy of Sciences, Beijing, China

Outline: Introduction to the priority queue. Various implementations of priority queues: Linked list: a list of n items is too long to support efficient ExtractMin and Insert operations simultaneously. Binary heap: use a tree rather than a linked list. Binomial heap: allow multiple trees rather than a single tree, to support an efficient Union operation. Fibonacci heap: implement DecreaseKey by simply cutting an edge rather than exchanging nodes, and keep the trees bushy by allowing each non-root node to lose at most one child.

Priority queue

Priority queue: motivation. A common scenario is to repeatedly extract the minimum from a set S of n numbers that changes dynamically. Here, "dynamically" means that we may also perform Insert, Delete, and DecreaseKey operations on S. The question is how to organize the data so that these operations are supported efficiently.

Priority queue. A priority queue is an abstract data type similar to a stack or a queue, except that each element has a priority associated with it. A min-oriented priority queue must support the following core operations:
1. H = MakeHeap(): create a new, empty heap H.
2. Insert(H, x): insert element x, together with its priority, into H.
3. ExtractMin(H): remove and return the element with the highest priority (smallest key).
4. DecreaseKey(H, x, k): decrease the priority of element x to k.
5. Union(H_1, H_2): return a new heap containing all elements of heaps H_1 and H_2, destroying the input heaps.
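To make the interface concrete, here is a minimal sketch of these operations in Python; the class and method names are illustrative, not part of the lecture.

```python
# A minimal min-oriented priority-queue interface (illustrative names only).
# The concrete heaps in the rest of the lecture realize these operations
# with different time bounds.
class MinPriorityQueue:
    def insert(self, item, key):          # add item with priority `key`
        raise NotImplementedError

    def extract_min(self):                # remove and return the item with the smallest key
        raise NotImplementedError

    def decrease_key(self, handle, key):  # lower the key of the item referenced by `handle`
        raise NotImplementedError

    def union(self, other):               # merge with another queue, destroying both inputs
        raise NotImplementedError
```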

Priority queues are very useful. They have extensive applications, for example: Dijkstra's shortest path algorithm, Prim's MST algorithm, Huffman coding, the A* search algorithm, and HeapSort.

An example: Dijkstra's algorithm

Dijkstra's algorithm [1959]

Dijkstra(G, s, t)
1: key(s) = 0; // key(u) stores an upper bound of the shortest distance from s to u
2: PQ.Insert(s);
3: for all nodes v ≠ s do
4:   key(v) = +∞;
5:   PQ.Insert(v); // n times
6: end for
7: S = {}; // S is the set of explored nodes
8: while S ≠ V do
9:   v* = PQ.ExtractMin(); // n times
10:  S = S ∪ {v*};
11:  for all v ∉ S with <v*, v> ∈ E do
12:    if key(v*) + d(v*, v) < key(v) then
13:      PQ.DecreaseKey(v, key(v*) + d(v*, v)); // m times
14:    end if
15:  end for
16: end while

Here PQ denotes a min-priority queue.
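As a concrete illustration of how the priority queue is used, here is a short Python sketch of Dijkstra's algorithm. It uses the standard heapq module, which is a binary heap without DecreaseKey, so decreased keys are handled by pushing a new entry and skipping stale ones (lazy deletion). The graph format and names are assumptions of this sketch, not taken from the slides.

```python
import heapq

def dijkstra(graph, s):
    """graph: dict mapping u -> list of (v, weight). Returns shortest distances from s.
    heapq has no DecreaseKey, so we push duplicates and skip stale entries (lazy deletion)."""
    key = {u: float('inf') for u in graph}
    key[s] = 0
    pq = [(0, s)]                      # (distance upper bound, node)
    explored = set()                   # the set S of the pseudocode
    while pq:
        d, u = heapq.heappop(pq)       # ExtractMin
        if u in explored:              # stale entry left by an earlier "DecreaseKey"
            continue
        explored.add(u)
        for v, w in graph[u]:
            if v not in explored and d + w < key[v]:
                key[v] = d + w
                heapq.heappush(pq, (key[v], v))   # stands in for DecreaseKey
    return key

# Tiny usage example (directed graph):
# g = {'s': [('a', 2), ('b', 5)], 'a': [('b', 1)], 'b': []}
# dijkstra(g, 's') == {'s': 0, 'a': 2, 'b': 3}
```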

Dijkstra's algorithm: an example. (A sequence of animation slides showing a run on a small example graph; the overlaid frames are not reproduced here.) Starting from S = {} with key(s) = 0 and all other keys +∞, each iteration extracts the unexplored node with the smallest key and relaxes its outgoing edges with DecreaseKey. In this example the nodes are extracted in the order s, a, b, c, d, e, f, t, after which S = V and the priority queue is empty.

Time complexity of Dijkstra's algorithm

Operation     Linked list   Binary heap   Binomial heap   Fibonacci heap
MakeHeap      1             1             1               1
Insert        1             log n         log n           1
ExtractMin    n             log n         log n           log n
DecreaseKey   1             log n         log n           1
Delete        n             log n         log n           log n
Union         1             n             log n           1
FindMin       n             1             log n           1
Dijkstra      O(n^2)        O(m log n)    O(m log n)      O(m + n log n)

Dijkstra's algorithm performs n Insert, n ExtractMin, and m DecreaseKey operations.

Implementing priority queue: array or linked list

Implementing priority queue: unsorted array. Unsorted array: 1 6 2 4. Unsorted linked list: head → 1 → 6 → 2 → 4. Operations: Insert: O(1); ExtractMin: O(n). Note: a list of n elements must be scanned entirely to find the minimum, so ExtractMin is slow.

Implementing priority queue: sorted array. Sorted array: 1 2 4 6. Sorted linked list: head → 1 → 2 → 4 → 6. Operations: Insert: O(n); ExtractMin: O(1). Note: inserting into a list of n elements may require shifting up to n elements to maintain the order, so Insert is slow.

Implementing priority queue: array or linked list

Operation     Linked list
Insert        O(1)
ExtractMin    O(n)
DecreaseKey   O(1)
Union         O(1)

Binary heap: from a linked list to a tree

Binary heap. Figure 1: R. W. Floyd [1964]

Binary heap: a complete binary tree. Basic idea: loosen the structure. Recall that the objective is to find the minimum; to achieve this it is not necessary to sort all elements. But don't loosen it too much: we still need order between some elements. Binary heap: elements are stored in a complete binary tree, i.e., a tree that is perfectly balanced except possibly for the bottom level. Heap order is required, i.e., every parent has a key smaller than its children's keys. Advantage: every root-to-leaf path has length O(log_2 n), rather than n as in a linked list, which makes the heap cheap to maintain.

Binary heap: an explicit implementation. Pointer representation: each node has pointers to its parent and to its two children. The following information is maintained: the number of elements n, and a pointer to the root node. Note: the last node can be found in O(log n) time.

Binary heap: an implicit implementation. Array representation: there is a one-to-one correspondence between a complete binary tree and an array. Binary tree → array: indices start from 1 for simplicity; the indices record the order in which the tree is traversed level by level. Array → binary tree: the k-th item has its two children at positions 2k and 2k + 1, and the parent of the k-th item is at position ⌊k/2⌋.
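The index arithmetic of the implicit representation is easy to state in code; a small sketch (1-based indices, as on the slide):

```python
# Implicit binary heap: 1-based index arithmetic (index 0 unused).
def parent(k):
    return k // 2          # floor(k / 2)

def left(k):
    return 2 * k

def right(k):
    return 2 * k + 1

# Example: the children of index 2 are at indices 4 and 5;
# the parent of index 7 is at index 3.
```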

Sorted array vs. binary heap. Sorted array: an array containing the n elements in increasing order. Binary heap: heap order means that only the order along (short) root-to-leaf paths, of length at most log n, is maintained. Note that inversions may still exist in the array representation of a heap.

Binary heap: primitive and other operations

Primitive: exchanging nodes to restore heap order. When heap order is violated, i.e., a parent has a value larger than exactly one of its children, we simply exchange the two to resolve the conflict. Figure 2: heap order is violated (15 > 13); exchanging the two nodes resolves the conflict.

Primitive: exchanging nodes to restore heap order. When heap order is violated, i.e., a parent has a value larger than both of its children, we exchange the parent with its smaller child to resolve the conflict. Figure 3: heap order is violated (20 is larger than both children); exchanging 20 with its smaller child (12) resolves the conflicts.

Binary heap: Insert operation. The new element is added as a new node at the end of the heap (the next free position on the bottom level). Since heap order might now be violated, the node is repeatedly exchanged with its parent until heap order is restored. Example: Insert(7).

Binary heap: ExtractMin operation. Replace the root element with the last node; then repeatedly exchange the element at the root with its smaller child until heap order is restored. Example: ExtractMin().
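The Insert and ExtractMin operations can be sketched compactly; this is a minimal illustrative implementation (1-based array, sift-up and sift-down), not the lecture's reference code.

```python
class BinaryMinHeap:
    """Implicit binary min-heap; a[0] is unused so the children of k are 2k and 2k+1."""
    def __init__(self):
        self.a = [None]

    def insert(self, x):
        self.a.append(x)                              # add at the end
        k = len(self.a) - 1
        while k > 1 and self.a[k] < self.a[k // 2]:   # sift up: swap with parent
            self.a[k], self.a[k // 2] = self.a[k // 2], self.a[k]
            k //= 2

    def extract_min(self):
        a = self.a
        root = a[1]
        a[1] = a[-1]                                  # move the last element to the root
        a.pop()
        k, n = 1, len(a) - 1
        while 2 * k <= n:                             # sift down: swap with the smaller child
            c = 2 * k
            if c + 1 <= n and a[c + 1] < a[c]:
                c += 1
            if a[k] <= a[c]:
                break
            a[k], a[c] = a[c], a[k]
            k = c
        return root

# h = BinaryMinHeap(); [h.insert(x) for x in (12, 6, 25, 9)]; h.extract_min() == 6
```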

Binary heap: DecreaseKey operation. Given a handle (pointer) to a node, decrease its key and then repeatedly exchange the node with its parent until heap order is restored. Example: DecreaseKey(ptr, 7).

Binary heap: analysis.
Theorem. In an implicit binary heap, any sequence of m Insert, DecreaseKey, and ExtractMin operations containing n Insert operations takes O(m log n) time.
Note: each operation touches at most log n nodes, on a path between the root and a leaf. If the array representation is used, the array must expand/contract dynamically; however, the total cost of expanding/contracting is O(n) (see TableInsert).
Theorem. In an explicit binary heap with n nodes, the Insert, DecreaseKey, and ExtractMin operations each take O(log n) time in the worst case.

Binary heap: heapify a set of items. Question: given a set of n elements, how can we construct a binary heap containing them? Solutions: (1) Simply Insert the elements one by one; this takes O(n log n) time. (2) Bottom-up heapifying; this takes only O(n) time. For i = n down to 1, repeatedly exchange the element at node i with its smaller child until the subtree rooted at node i is heap-ordered. (A sketch in code follows; see also the demo.)
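A sketch of the bottom-up construction (again 1-based and illustrative): sifting down from the last internal node to the root builds the heap in O(n) time.

```python
def build_heap(values):
    """Bottom-up heapify: returns a 1-based array satisfying the heap order."""
    a = [None] + list(values)
    n = len(a) - 1
    for i in range(n // 2, 0, -1):      # nodes n//2+1 .. n are leaves already
        k = i
        while 2 * k <= n:               # sift node k down to its place
            c = 2 * k
            if c + 1 <= n and a[c + 1] < a[c]:
                c += 1
            if a[k] <= a[c]:
                break
            a[k], a[c] = a[c], a[k]
            k = c
    return a

# build_heap([21, 9, 25, 4, 12, 6]) places the minimum, 4, at index 1.
```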

Binary heap: heapify.
Theorem. Given n elements, a binary heap containing them can be constructed in O(n) time.
Proof. There are at most n / 2^{h+1} nodes of height h, and it takes O(h) time to sink a node of height h. The total time is therefore \(\sum_{h=0}^{\log_2 n} \frac{n}{2^{h+1}} \cdot O(h) = O\big(n \sum_{h \ge 0} \frac{h}{2^{h+1}}\big) = O(n)\), since \(\sum_{h \ge 0} h/2^h = 2\).

Implementing priority queue: binary heap

Operation     Linked list   Binary heap
Insert        O(1)          O(log n)
ExtractMin    O(n)          O(log n)
DecreaseKey   O(1)          O(log n)
Union         O(1)          O(n)

Binary heap: Union operation. Given two binary heaps H_1 and H_2, merge them into a single binary heap. O(n) time is needed if we simply heapify all elements together. Question: is there a quicker way to union two heaps?

Binomial heap: using multiple trees rather than a single tree to support an efficient Union operation

Binomial heap. Figure 4: Jean Vuillemin [1978]

Binomial heap: efficient Union. Basic idea: loosen the structure. If multiple trees are allowed to represent a heap, Union can be implemented efficiently by simply putting the trees together. But don't loosen it too much: there should not be too many trees, otherwise it takes a long time to find the minimum among all root nodes. ExtractMin: simply find the minimum among the root nodes; note that each root holds the minimum of its tree, by heap order.

Why can't we loosen the structure too much? An extreme case of multiple trees: every node is its own tree; then it takes O(n) time to find the minimum. Solution: consolidating, i.e., two trees of the same size are merged into one, where the root with the larger key is linked below the root with the smaller key to keep the heap order. Note that after consolidating, at most log n trees remain. (A sketch of the link primitive follows.)
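The consolidating step is just one pointer operation per pair of trees; a minimal Python sketch of the link primitive (names are illustrative):

```python
class TreeNode:
    def __init__(self, key):
        self.key = key
        self.children = []          # roots of the subtrees
        self.rank = 0               # number of children (k for a B_k tree)

def link(t1, t2):
    """Link two heap-ordered trees of equal rank: the root with the larger key
    becomes a child of the root with the smaller key. Returns the new root."""
    assert t1.rank == t2.rank
    if t2.key < t1.key:
        t1, t2 = t2, t1
    t1.children.append(t2)
    t1.rank += 1
    return t1
```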

Binomial tree. Definition (Binomial tree). Binomial trees are defined recursively: a single node is itself a B_0 tree, and two B_k trees linked together form a B_{k+1} tree.

Binomial tree examples: B_0, B_1, B_2 (figures omitted)

Binomial tree example: B_3 (figure omitted)

Binomial tree example: B_4 (figure omitted)

Binomial tree example: B_5, formed by linking two B_4 trees (figure omitted)

Binomial tree: properties.
1. |B_k| = 2^k.
2. height(B_k) = k.
3. degree(B_k) = k (the degree of the root).
4. The i-th child of a node has degree i − 1.
5. Deleting the root of B_k yields the trees B_0, B_1, ..., B_{k−1}.
6. The binomial tree is so named because the number of nodes at each level of B_k is a binomial coefficient.

Binomial heap: a forest. Definition (Binomial heap). A binomial heap is a collection of binomial trees such that: each tree is heap-ordered, and for every k there is either 0 or 1 tree B_k. Note that the roots are organized in a doubly-linked circular list, and a pointer records the minimum root.

Binomial heap: properties.
1. A binomial heap with n nodes contains the binomial tree B_i iff b_i = 1, where b_k b_{k−1} ... b_1 b_0 is the binary representation of n. For example, n = 13 = 1101_2 gives the trees B_0, B_2, and B_3.
2. It contains at most ⌊log_2 n⌋ + 1 trees.
3. Its height is at most ⌊log_2 n⌋.
Thus it takes O(log n) time to find the minimum element by checking all the roots.

Union is efficient: example 1. Figure 5: an easy case, where no consolidating is needed; the trees of the two heaps simply form one root list.

Union is efficient: example 2 (I). Figure 6: consolidating two B_1 trees into a B_2 tree.

Union is efficient: example 2 (II). Figure 7: consolidating two B_2 trees into a B_3 tree.

Union is efficient: example 2 (III). After consolidating, a single B_4 tree remains. Time complexity: O(log n), since there are at most O(log n) trees.

Binomial heap: Insert operation

Insert(x)
1: Create a B_0 tree for x;
2: Change the pointer to the minimum root node if necessary;
3: while there are two B_k trees for some k do
4:   Link them together into one B_{k+1} tree;
5:   Change the pointer to the minimum root node if necessary;
6: end while
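A compact way to realize this pseudocode is to keep the roots in a dictionary indexed by rank, so that Insert behaves like binary addition with carries. The sketch below is illustrative only and builds on the TreeNode/link sketch from the previous section; it omits the minimum pointer and the circular root list.

```python
class BinomialHeap:
    """Roots kept in a dict keyed by rank, so Insert behaves like binary addition with carries.
    Assumes TreeNode and link from the earlier sketch."""
    def __init__(self):
        self.trees = {}                      # rank -> root of the (unique) B_rank tree

    def insert(self, key):
        t = TreeNode(key)                    # a new B_0 tree
        while t.rank in self.trees:          # two B_k trees exist: link them into one B_{k+1}
            t = link(self.trees.pop(t.rank), t)
        self.trees[t.rank] = t

    def find_min(self):
        return min(r.key for r in self.trees.values())

    def extract_min(self):
        r = min(self.trees.values(), key=lambda x: x.key)
        del self.trees[r.rank]
        for c in r.children:                 # the children are B_0, B_1, ..., B_{k-1}
            while c.rank in self.trees:      # re-insert each child, with carries
                c = link(self.trees.pop(c.rank), c)
            self.trees[c.rank] = c
        return r.key

# h = BinomialHeap(); [h.insert(x) for x in (9, 3, 7, 1)]; h.extract_min() == 1
```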

Insert operation: an example. Figure 8: an easy case, where no consolidating is needed; the new B_0 tree is simply added to the root list.

Insert operation: example 2 (I–V). Figures 9–12: inserting a new B_0 tree triggers a chain of consolidations, linking two B_0 trees, then two B_1 trees, then two B_2 trees, then two B_3 trees, until a single B_4 tree remains. Time complexity: O(log n) in the worst case, since there are at most log n trees.

Binomial heap: ExtractMin operation

ExtractMin()
1: Remove the minimum root node, and insert its children into the root list;
2: Change the pointer to the minimum root node if necessary;
3: while there are two B_k trees for some k do
4:   Link them together into one B_{k+1} tree;
5:   Change the pointer to the minimum root node if necessary;
6: end while

ExtractMin operation: an example (I–VI). Figures 13–16: the minimum root is removed and its children B_0, B_1, B_2, B_3 join the root list; consolidation then repeatedly links pairs of equal-rank trees until a single B_4 tree remains. Time complexity: O(log n).

Implementing priority queue: binomial heap

Operation     Linked list   Binary heap   Binomial heap
Insert        O(1)          O(log n)      O(log n)
ExtractMin    O(n)          O(log n)      O(log n)
DecreaseKey   O(1)          O(log n)      O(log n)
Union         O(1)          O(n)          O(log n)

Binomial heap: a more accurate analysis using the amortized technique

Amortized analysis of Insert: motivation. If one Insert takes a long time (say log n), the subsequent Insert operations shouldn't take long, since the expensive consolidation leaves only a few trees behind. It is therefore more accurate to examine the cost of a whole sequence of operations rather than each operation individually.

Amortized analysis of the Insert operation

Insert(x)
1: Create a B_0 tree for x;
2: Change the pointer to the minimum root node if necessary;
3: while there are two B_k trees for some k do
4:   Link them together into one B_{k+1} tree;
5:   Change the pointer to the minimum root node if necessary;
6: end while

Analysis: a single Insert operation takes time 1 + w, where w = #WHILE (the number of iterations of the while loop). To calculate the total running time of a sequence of operations, we express the running time of a single operation in terms of the change of a potential function. Consider the quantity Φ = #trees (the potential function). The changes of Φ during an Insert are: Φ increases by 1 (the new B_0 tree), and Φ decreases by w (each link removes one tree). Thus the running time of Insert can be rewritten in terms of Φ as 1 + w = 1 + (decrease in Φ).

Amortized analysis of ExtractMin

ExtractMin()
1: Remove the minimum root node, and insert its children into the root list;
2: Change the pointer to the minimum root node if necessary;
3: while there are two B_k trees for some k do
4:   Link them together into one B_{k+1} tree;
5:   Change the pointer to the minimum root node if necessary;
6: end while

Analysis: a single ExtractMin operation takes time d + w, where d denotes the degree of the removed root and w = #WHILE. Again we express the running time in terms of the potential function Φ = #trees. The changes during an ExtractMin are: Φ increases by d (the children become trees), and Φ decreases by w (each link removes one tree). Similarly, the running time can be rewritten in terms of Φ as d + w = d + (decrease in Φ).

Amortized analysis. Consider any sequence of n Insert and m ExtractMin operations. The total running time is at most n + m log n + (total decrease in #trees). Note that the total decrease in #trees is at most the total increase in #trees (the number of trees never becomes negative), which is at most n + m log n. Thus the total time is at most 2n + 2m log n. We therefore say that Insert takes O(1) amortized time and ExtractMin takes O(log n) amortized time.

Definition (Amortized time). If, for any sequence consisting of n_1 operations of type 1 and n_2 operations of type 2, the total time is O(n_1 T_1 + n_2 T_2), we say that operation 1 takes T_1 amortized time and operation 2 takes T_2 amortized time.
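The bookkeeping above is an instance of the standard potential method; stated once in symbols (this formalization is not on the slides, but is the usual way to justify the accounting):

```latex
\hat{c}_i = c_i + \Phi_i - \Phi_{i-1}
\qquad\Longrightarrow\qquad
\sum_{i=1}^{k} c_i
  = \sum_{i=1}^{k} \hat{c}_i - (\Phi_k - \Phi_0)
  \le \sum_{i=1}^{k} \hat{c}_i
  \quad\text{whenever } \Phi_k \ge \Phi_0 .
```

With Φ = #trees and Φ_0 = 0, an Insert has amortized cost (1 + w) + (1 − w) = 2 = O(1), and an ExtractMin has amortized cost at most (log n + w) + (log n − w) = 2 log n = O(log n), matching the bounds above.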

Intuition of the amortized analysis. The actual running time of an Insert operation is 1 + w; a large w means that this particular Insert takes a long time. But the w units of work are spent on decreasing #trees; if this cost is charged to the earlier operations that created those trees, the amortized time of Insert is only O(1). The actual running time of an ExtractMin operation is at most log n + w. Since at most log n new trees are created during an ExtractMin, its amortized time is still O(log n) even after it is charged the costs shifted from other operations for the trees it creates.

Implementing priority queue: binomial heap (amortized)

Operation     Linked list   Binary heap   Binomial heap   Binomial heap*
Insert        O(1)          O(log n)      O(log n)        O(1)
ExtractMin    O(n)          O(log n)      O(log n)        O(log n)
DecreaseKey   O(1)          O(log n)      O(log n)        O(log n)
Union         O(1)          O(n)          O(log n)        O(1)

* amortized cost

Binomial heap: DecreaseKey operation. Figure 17: DecreaseKey of a deep node (e.g., decreasing 17 to 1). Time: O(log n), since in the worst case we must exchange the node with its ancestors all the way up to the root. Question: is there a quicker way to decrease a key?

Fibonacci heap: an efficient implementation of DecreaseKey via simply cutting an edge rather than exchanging nodes

Fibonacci heap. Figure 18: Robert Tarjan [1986]

Fibonacci heap: an efficient DecreaseKey operation. Basic idea: loosen the structure. When heap order is violated, a simple solution is to cut off the offending node and insert it into the root list. But don't loosen it too much: cutting makes a tree no longer binomial, yet it should not deviate from a binomial tree too much. The technique used to achieve this is to allow every non-root node to lose at most one child. Figure 19: heap order is violated when DecreaseKey changes 13 to 2; it is easily restored by cutting off the node and inserting it into the root list.

Fibonacci heap: DecreaseKey

DecreaseKey(v, x)
1: key(v) = x;
2: if heap order is violated then
3:   u = v's parent;
4:   Cut the subtree rooted at node v, and insert it into the root list;
5:   Change the pointer to the minimum root node if necessary;
6:   while u is marked do
7:     Cut the subtree rooted at node u, and insert it into the root list;
8:     Change the pointer to the minimum root node if necessary;
9:     Unmark u;
10:    u = u's parent;
11:  end while
12:  Mark u (if u is not a root);
13: end if
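A sketch of the cut and cascading-cut mechanics in Python. The node fields and names are illustrative; the root list is kept as a plain Python list rather than a circular doubly linked list, and maintenance of the minimum pointer is omitted, so this does not reproduce the real pointer-level costs.

```python
class FibNode:
    def __init__(self, key):
        self.key = key
        self.parent = None
        self.children = []
        self.marked = False      # has this non-root node already lost a child?

def decrease_key(roots, v, new_key):
    """roots: list of tree roots. Decrease v's key; cut v (and cascade) if heap order breaks."""
    v.key = new_key
    u = v.parent
    if u is not None and v.key < u.key:          # heap order violated
        _cut(roots, v, u)
        while u.parent is not None and u.marked: # cascading cuts over marked ancestors
            w = u.parent
            _cut(roots, u, w)
            u = w
        if u.parent is not None:                 # mark u unless it is a root
            u.marked = True

def _cut(roots, child, parent):
    """Detach child from parent and move it to the root list (it becomes unmarked)."""
    parent.children.remove(child)
    child.parent = None
    child.marked = False
    roots.append(child)
```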

DecreaseKey: an example (I–VI). Figures 20–25: a sequence of DecreaseKey operations on a Fibonacci heap. Each violation of heap order is repaired by cutting off the decreased node and inserting it into the root list; when a marked ancestor loses a second child, it is cut as well (a cascading cut).

Fibonacci heap: Insert

Insert(x)
1: Create a tree for x, and insert it into the root list;
2: Change the pointer to the minimum root node if necessary;

Note: be lazy! Trees are consolidated only when extracting the minimum. Figure 26: Insert(6) creates a new one-node tree and inserts it into the root list.

Fibonacci heap: ExtractMin

ExtractMin()
1: Remove the minimum root node, and insert its children into the root list;
2: Change the pointer to the minimum root node if necessary;
3: while there are two roots u and v of the same degree do
4:   Consolidate the two trees into one;
5:   Change the pointer to the minimum root node if necessary;
6: end while
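The consolidating loop can be expressed with a table indexed by root degree; a minimal sketch, which works on any node objects carrying key, children, and parent fields (e.g. the FibNode sketch above). The minimum pointer and mark bookkeeping are again omitted.

```python
def consolidate(roots):
    """Repeatedly link roots of equal degree (smaller key stays on top) until all degrees differ.
    Returns the new list of roots."""
    by_degree = {}                            # degree -> root having that degree
    for t in roots:
        d = len(t.children)
        while d in by_degree:                 # two roots of the same degree: link them
            other = by_degree.pop(d)
            if other.key < t.key:
                t, other = other, t
            t.children.append(other)          # larger-key root becomes a child
            other.parent = t
            d += 1
        by_degree[d] = t
    return list(by_degree.values())
```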

ExtractMin: an example (I–VI). Figures 27–35: the minimum root is removed and its children join the root list; the roots are then consolidated repeatedly, linking any two roots of equal degree (keeping the smaller key on top), until no two roots have the same degree.

Fibonacci heap: an amortized analysis

Fibonacci heap: DecreaseKey

DecreaseKey(v, x)
1: key(v) = x;
2: if heap order is violated then
3:   u = v's parent;
4:   Cut the subtree rooted at node v, and insert it into the root list;
5:   Change the pointer to the minimum root node if necessary;
6:   while u is marked do
7:     Cut the subtree rooted at node u, and insert it into the root list;
8:     Change the pointer to the minimum root node if necessary;
9:     Unmark u;
10:    u = u's parent;
11:  end while
12:  Mark u (if u is not a root);
13: end if

DecreaseKey: analysis. The actual running time of a single operation is 1 + w, where w = #WHILE (the number of cascading cuts). To compute the total running time of a sequence of operations, we express the running time of a single operation in terms of the change of a potential function. Consider the potential function Φ = #trees + 2·#marks. The changes of Φ during a DecreaseKey are: Φ increases by 1 + 2 = 3 (cutting v adds one tree, and marking u adds one mark), and Φ decreases by (2·1 − 1)·w = w (each cascading cut adds one tree but removes one mark). Thus we can rewrite the running time in terms of Φ as 1 + w = 1 + (decrease in Φ). Intuition: a large w means that this DecreaseKey takes a long time; but if the w units of work are amortized over other operations, a DecreaseKey operation takes only O(1) amortized time.

Fibonacci heap: ExtractMin

ExtractMin()
1: Remove the minimum root node, and insert its children into the root list;
2: Change the pointer to the minimum root node if necessary;
3: while there are two roots u and v of the same degree do
4:   Consolidate the two trees into one;
5:   Change the pointer to the minimum root node if necessary;
6: end while

ExtractMin: analysis. The actual running time of a single operation is d + w, where d denotes the degree of the removed node and w = #WHILE. Again we express the running time in terms of the potential function Φ = #trees + 2·#marks. The changes of Φ during an ExtractMin are: Φ increases by at most d (the children become trees), and Φ decreases by w (each consolidation removes one tree). Thus the running time can be rewritten in terms of Φ as d + w = d + (decrease in Φ). Note that d ≤ d_max, where d_max denotes the maximum degree of a root node.

Fibonacci heap: Insert

Insert(x)
1: Create a tree for x, and insert it into the root list;
2: Change the pointer to the minimum root node if necessary;

Analysis: the actual running time is 1, and the changes of Φ during this operation are: Φ increases by 1, and Φ decreases by 0. Note: recall that a binomial heap consolidates trees in both the Insert and the ExtractMin operations. In contrast, the Fibonacci heap adopts a lazy strategy: tree consolidation is removed from Insert for the sake of efficiency, and no consolidation happens until an ExtractMin operation.

Fibonacci heap: amortized analysis. Consider any sequence of n Insert, m ExtractMin, and r DecreaseKey operations. The total running time is at most n + m·d_max + r + (total decrease in Φ). Note that the total decrease in Φ is at most the total increase in Φ, which is at most n + m·d_max + 3r. Thus the total running time is at most n + m·d_max + r + n + m·d_max + 3r = 2n + 2m·d_max + 4r. Hence Insert takes O(1) amortized time, DecreaseKey takes O(1) amortized time, and ExtractMin takes O(d_max) amortized time. In fact, ExtractMin takes O(log n) amortized time, since d_max can be upper-bounded by O(log n) (why?).

Fibonacci heap: bounding d_max

Fibonacci heap: bounding d_max. Recall that for a binomial tree with n nodes, the root degree d is exactly log_2 n, i.e., d = log_2 n. In contrast, a tree in a Fibonacci heap might have had several subtrees cut off, so it can have fewer than 2^d nodes for a root of degree d. However, the marking technique guarantees that any non-root node loses at most one child, which limits the deviation from a binomial tree: the root degree still satisfies d ≤ log_φ n ≈ 1.44 log_2 n, where φ = (1 + √5)/2.

Fibonacci heap: a property of node degrees. Recall that in a binomial tree, the i-th child of each node has degree exactly i − 1. For a tree in a Fibonacci heap, we will show that the i-th child of each node has degree ≥ i − 2.

Lemma. For any node w in a Fibonacci heap, its i-th child has degree ≥ i − 2.
Proof. Suppose u is currently the i-th child of w (children ordered by the time they were linked to w). Consider the moment when u was linked to w: the i − 1 children of w that precede u were already present, so degree(w) ≥ i − 1, and since only roots of equal degree are linked, degree(u) = degree(w) ≥ i − 1. Afterwards, u can lose at most one child; otherwise u itself would have been cut off and would no longer be a child of w. Thus degree(u) ≥ i − 2. (If w is not a root it can have lost at most one child; a root may have lost several, but this does not affect the argument.)

The smallest tree with root degree k in a Fibonacci heap. Let F_k be the smallest tree with root degree k in which, for every node, the i-th child has degree ≥ i − 2.

Example: B_1 versus F_0. Figure 36: |B_1| = 2^1, while |F_0| = 1 ≥ φ^0.

Example: B_2 versus F_1. Figure 37: |B_2| = 2^2, while |F_1| = 2 ≥ φ^1.

Example: B_3 versus F_2. Figure 38: |B_3| = 2^3, while |F_2| = 3 ≥ φ^2.

Example: B_4 versus F_3. Figure 39: |B_4| = 2^4, while |F_3| = 5 ≥ φ^3.

Example: B_5 versus F_4. Figure 40: |B_5| = 2^5, while |F_4| = 8 ≥ φ^4.

General shape of trees in a Fibonacci heap. Recall that a binomial tree B_{k+1} is the combination of two B_k trees. In contrast, F_{k+1} is the combination of an F_k tree and an F_{k−1} tree. We will show that although F_k is smaller than B_k, the difference is not too large; in fact, |F_k| ≥ φ^k ≈ 1.61^k.

Fibonacci numbers and the Fibonacci heap.
Definition (Fibonacci numbers). The Fibonacci sequence is 0, 1, 1, 2, 3, 5, 8, 13, 21, 34, ..., defined by the recurrence f_k = 0 if k = 0; f_k = 1 if k = 1; and f_k = f_{k−1} + f_{k−2} if k ≥ 2.
Recall that f_{k+2} ≥ φ^k, where φ = (1 + √5)/2 ≈ 1.618. Note that |F_k| = f_{k+2}; for example, |F_0| = f_2 = 1, |F_1| = f_3 = 2, |F_2| = f_4 = 3. Now consider a Fibonacci heap H with n nodes and let T be a tree in H with root degree d. We have n ≥ |T| ≥ |F_d| = f_{d+2} ≥ φ^d, hence d = O(log_φ n) = O(log n). So d_max = O(log n), and therefore the ExtractMin operation takes O(log n) amortized time.
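A tiny numeric sanity check of the inequality |F_k| = f_{k+2} ≥ φ^k for small k (this check is mine, not from the slides):

```python
# Verify f_{k+2} >= phi^k for k = 0..9.
phi = (1 + 5 ** 0.5) / 2

def fib(k):
    a, b = 0, 1
    for _ in range(k):
        a, b = b, a + b
    return a

for k in range(10):
    assert fib(k + 2) >= phi ** k
    print(k, fib(k + 2), round(phi ** k, 2))
```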

Implementing priority queue: Fibonacci heap

Operation     Linked list   Binary heap   Binomial heap   Binomial heap*   Fibonacci heap*
Insert        O(1)          O(log n)      O(log n)        O(1)             O(1)
ExtractMin    O(n)          O(log n)      O(log n)        O(log n)         O(log n)
DecreaseKey   O(1)          O(log n)      O(log n)        O(log n)         O(1)
Union         O(1)          O(n)          O(log n)        O(1)             O(1)

* amortized cost

Time complexity of Dijkstra's algorithm

Operation     Linked list   Binary heap   Binomial heap   Fibonacci heap
MakeHeap      1             1             1               1
Insert        1             log n         log n           1
ExtractMin    n             log n         log n           log n
DecreaseKey   1             log n         log n           1
Delete        n             log n         log n           log n
Union         1             n             log n           1
FindMin       n             1             log n           1
Dijkstra      O(n^2)        O(m log n)    O(m log n)      O(m + n log n)

Dijkstra's algorithm performs n Insert, n ExtractMin, and m DecreaseKey operations.