SKEW HEAPS: Self-Adjusting Heaps


Lecture Note 05 CSE40/50

In this handout we describe the skew heap data structure, a self-adjusting form of heap related to the leftist heap of Crane and Knuth. Skew heaps, in contrast to leftist heaps, use less space and are easier to implement, and yet in the amortized sense they are as efficient, to within a constant factor, as leftist heaps.

We will consider the following priority queue operations:

makeheap(h): Create a new, empty heap, named h.

findmin(h): Return a minimum key in h. If h is empty, return the special key "null". (This operation does not change the set of keys stored in h.)

insert(k, h): Insert key k into heap h. (Duplicate keys are allowed in the heap. That is, the key to be inserted is treated as "new".)

deletemin(h): Delete a minimum key from heap h and return it. If the heap is initially empty, return null.

union(h_1, h_2): Return the heap formed by joining (i.e., taking the union of) heaps h_1 and h_2. This operation destroys h_1 and h_2.

THE DATA STRUCTURE

There are several ways to implement heaps in a self-adjusting fashion. The one we shall discuss is called the skew heap, as proposed by Sleator and Tarjan, and is analogous to the leftist heap. A skew heap is a heap-ordered binary tree. That is, it is a binary tree with one key in each node such that, for each node x other than the root, the key at node x is no less than the key at the parent of x. To represent such a tree we store in each node x its associated key, denoted key(x), and two pointers, left(x) and right(x), to its left child and right child, respectively. If x has no left child we define left(x) = Λ; if x has no right child we define right(x) = Λ. Access to the tree is by a pointer to its root; we represent an empty tree by the pointer Λ. With this representation we can carry out the various heap operations as follows.

We perform makeheap(h) in O(1) time by initializing h to Λ.
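The representation just described can be rendered as a minimal, self-contained Python sketch. The class and function names here are our own illustrative choices, not part of the handout; `None` plays the role of the null pointer Λ.

```python
class Node:
    """One node of a heap-ordered binary tree."""
    def __init__(self, key):
        self.key = key     # key(x)
        self.left = None   # left(x); None stands in for the null pointer
        self.right = None  # right(x)

def makeheap():
    """makeheap: an empty heap is just the null pointer, created in O(1) time."""
    return None

def findmin(h):
    """findmin: heap order puts a minimum key at the root, so O(1) time.
    Returns None (playing the role of "null") on an empty heap."""
    return None if h is None else h.key
```

For example, a root holding key 2 with children holding 7 and 4 is a valid heap-ordered tree, and findmin returns 2 without modifying it.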
Since heap order implies that the root holds a minimum key in the tree, we can carry out findmin(h) in O(1) time by returning the key at the root, returning null if the heap is empty. We perform insert and deletemin using union. To carry out insert(k, h), we make k into a one-node heap and union it with h. To carry out deletemin(h), if h is not empty we replace h by the union of its left and right subtrees and return the key at the original root. (If h is originally empty we return null.)

To perform union(h_1, h_2), we form a single tree by traversing down the right paths of h_1 and h_2, merging them into a single right path with keys in nondecreasing order. First assume the left subtrees of nodes along the merge path do not change. (See Figure 1(a).) The time for the union operation is bounded by a constant times the length of the merge path.

To make union efficient, we must keep right paths short. In leftist heaps this is done by maintaining the invariant that, for any node x, the right path descending from x is a shortest path down to a missing node. Maintaining this invariant requires storing at every node the length of a shortest path down to a missing node; after the merge we walk back up the merge path, updating the shortest path lengths and swapping left and right children as necessary to maintain the leftist property. The length of the right path in a leftist heap of n nodes is at most log n, implying an O(log n) worst-case time bound for each of the heap operations, where n is the number of nodes in the heap or heaps involved.

Figure 1: A union of two skew heaps. (a) Merge of the right paths. (b) Swapping of children along the path formed by the merge.

In our self-adjusting version of this data structure, we perform the union operation by merging the right paths of the two trees and then swapping the left and right children of every node on the merge path except the lowest. (See Figure 1(b).) This makes the potentially long right path formed by the merge into a left path. We call the resulting data structure a skew heap.
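The merge-and-swap rule admits a very short recursive sketch in Python, given here as one possible rendering (names are illustrative). The standard recursive formulation swaps children at every node along the merge path, including the lowest; this differs from the description above only at that last node and does not affect the analysis.

```python
class Node:
    """One node of a heap-ordered binary tree; None is the empty heap."""
    def __init__(self, key):
        self.key = key
        self.left = None
        self.right = None

def union(h1, h2):
    """Merge two skew heaps: walk down the right paths, keeping keys in
    nondecreasing order, and swap the children of each node visited."""
    if h1 is None:
        return h2
    if h2 is None:
        return h1
    if h2.key < h1.key:                    # keep the smaller root on top
        h1, h2 = h2, h1
    h1.right = union(h1.right, h2)         # merge into the right subtree
    h1.left, h1.right = h1.right, h1.left  # then swap the children
    return h1

def insert(k, h):
    """insert: union a one-node heap containing k with h."""
    return union(Node(k), h)

def deletemin(h):
    """deletemin: return (minimum key, remaining heap); (None, None) if empty."""
    if h is None:
        return None, None
    return h.key, union(h.left, h.right)
```

Repeatedly calling deletemin on a heap built by insertions yields the keys in nondecreasing order, duplicates included.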

Exercise 1: Write a simple recursive algorithm for the union operation on skew heaps.

THE AMORTIZED ANALYSIS

In our analysis of skew heaps we shall use the following general approach. We associate with each collection S of skew heaps a real number Φ(S), called the potential of S. For any sequence of m operations with running times t_1, t_2, ..., t_m, we define the amortized time a_i of the i-th operation to be a_i = t_i + Φ_i − Φ_{i−1}, where Φ_i, for i = 1, 2, ..., m, is the potential of the skew heaps after the i-th operation and Φ_0 is the potential of the skew heaps before the first operation. The total running time of the sequence of operations is then

    Σ t_i = Σ (a_i − Φ_i + Φ_{i−1}) = Φ_0 − Φ_m + Σ a_i,

where each sum runs over i = 1, 2, ..., m. That is, the total running time equals the total amortized time plus the net decrease in potential from the initial to the final collection of skew heaps. In our analysis of skew heaps the potential will be zero initially and will remain nonnegative. This implies that the total amortized time is an upper bound on the actual running time over the entire sequence of operations.

We shall define the potential of a single skew heap; the potential of a collection of skew heaps is the sum of the potentials of its members. Intuitively, a heap with high potential is one subject to unusually time-consuming operations; the extra time spent corresponds to a drop in potential.

Definition 1: For any node x in a binary tree, we define the weight of x, denoted wt(x), to be the number of descendants of x, including x itself.

We use weights to partition nonroot nodes into two classes:

Definition 2: A nonroot node x is heavy if wt(x) > wt(parent(x))/2 and light otherwise. A root node is considered neither heavy nor light.

The following facts immediately follow from the above definitions. (Try to prove them yourself.)

Fact 1: Of the children of any node, at most one is heavy.
Fact 2: On any path from a node x down to a descendant y, there are at most log(wt(x)/wt(y)) light nodes, not counting x. In particular, any path in an n-node tree contains at most log n light nodes.

Definition 3: A nonroot node is called right if it is the right child of its parent; it is called left otherwise.

Definition 4: We define the potential of a skew heap to be the total number of right heavy nodes it contains.

Suppose we begin with no heaps and carry out a sequence of skew heap operations. The initial potential is zero and the final potential is nonnegative, so the total amortized time is an upper bound on the total actual time of the operations. The amortized time of a makeheap or findmin operation is O(1), since these operations require O(1) time and do not change the potential. The amortized times of the other operations depend on the amortized time for the union operation.

Consider a union of two heaps h_1 and h_2, containing n_1 and n_2 nodes, respectively. Let n = n_1 + n_2 be the total number of nodes in the two heaps. As a measure of the time for union, we shall charge one per node on the merge path. Thus the amortized time of union is the number of nodes on the merge path plus the change in potential. By Fact 2, the number of light nodes on the right paths of h_1 and h_2 is at most log n_1 and log n_2, respectively. Thus the total number of light nodes on the two paths is at most 2 log n. (Check this.) (See Figure 2.)

Figure 2: Analysis of right heavy nodes in a union. The right paths of h_1 and h_2 contain at most lg n_1 and lg n_2 light nodes and k_1 and k_2 heavy nodes, respectively; k_3 nodes become right heavy on the merge path, which contains at most lg n light nodes.

Let k_1 and k_2 be the number of heavy nodes on the right paths of h_1 and h_2, respectively, and let k_3 be the number of nodes that become right heavy children of nodes on the merge path. By Fact 1, every node counted by k_3 corresponds to a light node on the merge path. Thus Fact 2 implies that k_3 ≤ log n. The number of nodes on the merge path is at most 2 + log n_1 + k_1 + log n_2 + k_2 ≤ 2 + 2 log n + k_1 + k_2. (The first "2" counts the roots of h_1 and h_2.) Therefore the running time of the union operation is t ≤ 2 + 2 log n + k_1 + k_2. The change in potential caused by the union is ΔΦ ≤ k_3 − k_1 − k_2 ≤ log n − k_1 − k_2. Thus the amortized time of the union, t + ΔΦ, is at most 3 log n + 2.

We summarize the above result in the following theorem.

Theorem: The amortized time of an insert, deletemin, or union skew heap operation is O(log n), where n is the number of keys in the heap or heaps involved in the operation. The amortized time of a makeheap or findmin operation is O(1).

Proof: The analysis above gives the bound for makeheap, findmin, and union; the bound for insert and deletemin follows immediately from that of union.
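Definitions 1 through 4 translate directly into code. The following self-contained Python sketch (function names are ours, not the handout's) computes the potential of a heap by counting its right heavy nodes; a right child x is heavy exactly when wt(x) > wt(parent(x))/2.

```python
class Node:
    """One node of a binary tree; None is the empty tree."""
    def __init__(self, key, left=None, right=None):
        self.key = key
        self.left = left
        self.right = right

def wt(x):
    """Definition 1: the number of descendants of x, including x itself."""
    return 0 if x is None else 1 + wt(x.left) + wt(x.right)

def potential(h):
    """Definition 4: count the right heavy nodes (Definitions 2 and 3):
    right children x with wt(x) > wt(parent(x)) / 2. The root has no
    parent and is never counted."""
    if h is None:
        return 0
    count = potential(h.left) + potential(h.right)
    if h.right is not None and wt(h.right) > wt(h) / 2:
        count += 1
    return count
```

On a right path of three nodes, for instance, the middle node has weight 2 out of a total weight 3 and is right heavy, while the leaf has weight 1, exactly half its parent's weight 2, and so is light; the potential is 1.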

Bibliography: To see further improvements to the basic structure described above, the interested reader is referred to:

Sleator, D. D., and Tarjan, R. E., "Self-Adjusting Heaps," SIAM Journal on Computing 15(1), 1986, pp. 52-69.

Bernard Chazelle's papers cited below describe a novel self-adjusting priority queue that allows a controlled measure of error in order to accelerate performance. He shows many applications, including the fastest deterministic minimum spanning tree algorithm to date:

Chazelle, B., "The Soft Heap: An Approximate Priority Queue with Optimal Error Rate," JACM 47(6), Nov. 2000, pp. 1012-1027.

Chazelle, B., "A Minimum Spanning Tree Algorithm with Inverse-Ackermann Type Complexity," JACM 47(6), Nov. 2000, pp. 1028-1047.