Assignment 3 Solutions, COMP 251B


Dipinder S. Rekhi
March 24, 2005

1 Exercise 12.1-5

A binary search tree can be thought of as a sorted structure that is built using comparisons only. This informally suggests that the worst-case lower bound for building such a tree is Ω(n log n): if it were asymptotically lower, we would obtain a comparison-based sorting algorithm whose worst-case lower bound is asymptotically smaller than n log n, which is not the case, as proved in the book.

For a more rigorous proof we use decision trees. Consider a decision tree for a comparison-based algorithm that builds a binary search tree from a given sequence of n numbers. The inner nodes of this tree correspond to comparisons, and the edges leaving a node correspond to the different paths the algorithm takes depending on the outcome of that comparison. The leaves correspond to the BSTs obtained after following one particular path through the tree. Since there are n! ways of arranging the n input numbers, there are n! possible outcomes (assuming no two numbers are equal), so the decision tree has at least n! leaves. The decision tree is binary, because comparing two distinct numbers at an inner node has exactly two possible outcomes: the first number is either greater or smaller than the second. Let h be the length of the longest root-to-leaf path, which is also the height of the decision tree. Any binary tree of height h has at most 2^h leaves, so

    n! ≤ 2^h,  hence  h ≥ log(n!) = Ω(n log n).

Hence, in the worst case, constructing a BST takes Ω(n log n) time.
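The last step uses the standard estimate log(n!) = Ω(n log n); a short worked derivation of that step (not part of the original solution) is:

\[
\log(n!) \;=\; \sum_{k=1}^{n} \log k \;\ge\; \sum_{k=\lceil n/2 \rceil}^{n} \log k \;\ge\; \frac{n}{2}\,\log\frac{n}{2} \;=\; \Omega(n \log n).
\]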

2 Join Operation on Red-Black Trees

a) To update bh[T] in O(log n) time within RB-INSERT and RB-DELETE, we use the property that the number of black nodes on every root-to-leaf path is the same. After RB-INSERT or RB-DELETE does its usual work, we walk from the root to a leaf along the leftmost path (any path would do) and count the black nodes encountered, not counting the root. This count becomes the new value of bh[T]. Since the height of a red-black tree is O(log n), this walk takes O(log n) time, so it does not increase the asymptotic running time of RB-INSERT or RB-DELETE.

Computing the black height of each node while descending the tree is also simple. Given the black height of the parent, the child's black height depends only on its color: if the child is black, its black height is one less than its parent's, whereas if it is red, its black height equals its parent's. So each step takes O(1) time.

b) We are given bh[T1] ≥ bh[T2]. Consider the rightmost path of T1, i.e. the path obtained by always visiting the right child when descending from the root. As we go down any path in a red-black tree, the black heights of the nodes decrease (see part (a)); they decrease from bh[T1] down to zero, and every value in this range is attained by at least one node on the path. The point of following the rightmost path is that it contains the largest keys: when we encounter a black node of black height h on the rightmost path, it is the largest black node in the tree with black height h. Using this reasoning, we descend the rightmost path of T1 until we reach a black node of black height bh[T2]. Since the height of a red-black tree is O(log n), this takes O(log n) time.

c and d) These parts follow part (b) and fit into the overall scheme of the join operation. We may therefore assume that y is a black node and that the subtree rooted at y has the same black height as T2. Put node x in y's position: make y the left child of x and the root of T2 the right child of x. Color x red so that properties 1, 3 and 5 are maintained. To restore properties 2 and 4, set z = x, T = T1 and execute RB-INSERT-FIXUP(T, z). To get a better understanding of how this works, please refer to the book. This operation runs in O(log n) time.
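As an illustration of parts (a) and (b), here is a minimal Python sketch (not part of the original solution). It assumes nodes with color, left and right attributes; the NIL leaves are represented by None and are not counted, which only shifts every black height by the same constant.

    RED, BLACK = "red", "black"

    class Node:
        def __init__(self, key, color, left=None, right=None):
            self.key, self.color, self.left, self.right = key, color, left, right

    def black_height(root):
        # Part (a): walk the leftmost path and count black nodes, not counting the root.
        count, node = 0, root
        while node is not None:
            if node is not root and node.color == BLACK:
                count += 1
            node = node.left
        return count

    def largest_black_node_with_height(root, target):
        # Part (b): descend the rightmost path; the first black node whose
        # black height equals `target` is the largest such node in the tree.
        node, bh = root, black_height(root)
        while node is not None:
            if node.color == BLACK and bh == target:
                return node
            node = node.right
            if node is not None and node.color == BLACK:
                bh -= 1   # a black child has black height one less than its parent
        return None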

e and f) Part (e) is straightforward. When bh[T1] < bh[T2], the method is symmetric: we follow the leftmost path of T2 to find the smallest black node whose black height equals bh[T1], and then proceed as in the parts above. The join operation is simply a sequence of the steps above; since each step takes O(log n) time, the join can be performed in O(log n) time.

3 AVL Trees

a) Let n(h) be the minimum number of nodes in an AVL tree of height h; clearly n(0) = 0 and n(1) = 1. Now consider an AVL tree of height h > 1. By the AVL property, the root has two subtrees where either 1) both subtrees have height h−1, or 2) one has height h−1 and the other has height h−2. (If both subtrees had height h−2, the tree could not have height h.) So the number of nodes in such a tree is at least n(h−1) + n(h−2), i.e.

    n(h) ≥ n(h−1) + n(h−2).

This is the Fibonacci recurrence (see Problem 4-5, page 87 of the book), so n(h) ≥ F_h, the h-th Fibonacci number, where F_h ≈ ((1 + √5)/2)^h / √5. Hence

    n(h) ≥ ((1 + √5)/2)^h / √5.

Taking logs on both sides (writing n = n(h)),

    log n ≥ h · log((1 + √5)/2) − log √5,

so h = O(log n).
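As a small illustration (not part of the original solution), the exact minimum node counts can be tabulated directly from the recurrence; the sketch below also counts the root (the +1 term), which only strengthens the bound.

    def min_avl_nodes(h):
        # n(0) = 0 (empty tree), n(1) = 1 (single node); the sparsest AVL tree of
        # height h has one subtree of height h-1, one of height h-2, plus the root.
        if h <= 1:
            return h
        return min_avl_nodes(h - 1) + min_avl_nodes(h - 2) + 1

    # For example, min_avl_nodes(10) == 143, so any AVL tree with fewer than
    # 143 nodes has height at most 9; in general h = O(log n).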

b) We are given a subtree rooted at x whose left and right subtrees are height balanced, but x itself is not height balanced. Consider the case where the left subtree has height h and the right subtree has height h+2; the opposite case is symmetric, with left and right orientations exchanged. Let L be the left child and R the right child of x, and let α and β be the left and right children of R. If height(α) ≤ height(β), perform LEFT-ROTATE(T, x); otherwise (height(α) > height(β)), perform RIGHT-ROTATE(T, R) followed by LEFT-ROTATE(T, x), where T is the tree immediately after the insertion. Finally, update the stored heights accordingly.

c) The INSERT code is as follows:

    INSERT(x, z)
        if x = NULL                 (z becomes the root)
            root = z
            return
        if key[z] ≥ key[x]
            if RIGHT[x] = NULL
                RIGHT[x] = z
                h[x] = new height of x
                return
            else
                INSERT(RIGHT[x], z)
        else
            if LEFT[x] = NULL
                LEFT[x] = z
                h[x] = new height of x
                return
            else
                INSERT(LEFT[x], z)
        h[x] = max(h[LEFT[x]], h[RIGHT[x]]) + 1
        if not BALANCED(x)
            BALANCE(x)
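A compact Python rendering of parts (b) and (c) is sketched below (illustrative only; the node fields and helper names are my own, not the solution's). Rebalancing is done on the way back up the recursion, mirroring the pseudocode above.

    class AVLNode:
        def __init__(self, key):
            self.key, self.left, self.right, self.h = key, None, None, 1

    def height(node):
        return node.h if node else 0

    def update(node):
        node.h = 1 + max(height(node.left), height(node.right))

    def rotate_left(x):
        r = x.right
        x.right, r.left = r.left, x
        update(x); update(r)
        return r

    def rotate_right(x):
        l = x.left
        x.left, l.right = l.right, x
        update(x); update(l)
        return l

    def balance(x):
        # Part (b): right side too tall -> single or double left rotation; symmetric otherwise.
        if height(x.right) - height(x.left) > 1:
            if height(x.right.left) > height(x.right.right):
                x.right = rotate_right(x.right)
            return rotate_left(x)
        if height(x.left) - height(x.right) > 1:
            if height(x.left.right) > height(x.left.left):
                x.left = rotate_left(x.left)
            return rotate_right(x)
        return x

    def insert(x, key):
        # Part (c): descend as in a plain BST, then fix heights and balance going back up.
        if x is None:
            return AVLNode(key)
        if key >= x.key:
            x.right = insert(x.right, key)
        else:
            x.left = insert(x.left, key)
        update(x)
        return balance(x)

    # Usage: root = None; then root = insert(root, k) for each key k.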

d) Consider the function INSERT (see above) running on an AVL tree. Since we descend from the root to find the node whose child the new node z will become, and the height of an AVL tree is O(log n), this descent takes O(log n) time. Now consider the path from the root to the newly inserted node z, and let x be the first node that is imbalanced when going up this path from z. If there are other imbalanced nodes farther up the path, it is only because the subtree rooted at x is imbalanced (a node was added to it); so once we fix the imbalance at x, no further rotations are needed. The new height of x after the insertion is h+1, where h was its height before the insertion (we consider the case where adding the new node increases the heights of some nodes; otherwise there are no imbalances to worry about). After we perform a rotation at x, the height of the node in x's position goes back to h, and balance is restored. If you look carefully at how BALANCE works, you will see that the height of the node in x's position does not stay the same but decreases by one; this is because adding a single node cannot produce the situation where α and β have equal heights.

4 Planning a Company Party

Let C_T be the total conviviality of the tree rooted at T. In this tree, if T is invited to the party then none of its children is invited; if T is not invited, then some of its children may be invited and some may not. Let C_Ti be the cost of the tree rooted at T when T is invited, and C_Tn the cost when T is not invited. Clearly

    C_T = max(C_Tn, C_Ti).

In turn,

    C_Ti = C(T) + Σ_{x ∈ children(T)} C_xn,

where C(T) is the conviviality rating of T, and

    C_Tn = Σ_{x ∈ children(T)} C_x.

For a leaf node x, C_xi = C(x) and C_xn = 0. Using this and the above equations we can compute an optimal guest list in a bottom-up fashion.
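A short Python sketch of this bottom-up computation follows (illustrative only; the employee-tree representation and field names are assumptions, not from the original solution).

    class Employee:
        def __init__(self, conviviality, children=None):
            self.conviviality = conviviality
            self.children = children or []

    def party_cost(t):
        # Returns (C_Ti, C_Tn): best total conviviality when t is invited / not invited.
        invited = t.conviviality
        not_invited = 0
        for x in t.children:
            ci, cn = party_cost(x)
            invited += cn               # if t comes, its direct subordinates stay home
            not_invited += max(ci, cn)  # otherwise each child subtree picks its best option
        return invited, not_invited

    # The optimum for the whole company is max(party_cost(president)).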

5 Moving on a Checker Board

Let your current position on the board be x. If x is in the topmost row, the game is over and we output the total. Otherwise, let Y be the set of positions reachable from x in one move, and let C_x be the maximum number of dollars that can be earned by following an optimal path from x. Then the following recursion holds:

    C_x = max_{z ∈ Y} (p(x, z) + C_z).

Computing C_x directly from this recursion, and recovering the optimal path, can be time consuming because of redundancy in the recursion: many calls share the same subproblems, which get recomputed over and over. To avoid this, we note that C_x = 0 for every square x in the topmost row, and we start from the next-to-topmost row, computing the C_x values for all squares x in that row. We then go down one row and do the same calculations, and carry on like this until we reach the bottommost row. If the board has dimensions N×N, all of these calculations can be done in O(N^2) time.
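A brief Python sketch of this bottom-up computation is given below (illustrative only; it assumes the usual move set of up-left, straight up, and up-right, with the payoff p supplied as a function of the two squares, neither of which is fixed by the original statement).

    def max_earnings(n, p):
        # C[i][j] = best total earnable starting from square (i, j);
        # row n-1 is the topmost row, row 0 the bottommost.
        NEG_INF = float("-inf")
        C = [[0.0] * n for _ in range(n)]
        for i in range(n - 2, -1, -1):        # from next-to-topmost row down to the bottom
            for j in range(n):
                best = NEG_INF
                for dj in (-1, 0, 1):          # assumed moves: up-left, up, up-right
                    k = j + dj
                    if 0 <= k < n:
                        best = max(best, p((i, j), (i + 1, k)) + C[i + 1][k])
                C[i][j] = best
        return C                               # C[0][j] is the optimum from bottom-row square j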