COMP 250 Fall 2006 - Homework #4

1) (35 points) Manipulation of symbolic expressions

See http://www.mcb.mcgill.ca/~blanchem/250/hw4/treenodesolution.java

2) (10 points) Binary search trees

Consider a binary search tree that contains n nodes with keys 1, 2, 3, ..., n. The shape of the tree depends on the order in which the keys have been inserted into the tree.

a) In what order should the keys be inserted into the binary search tree to obtain a tree with minimal height?

There are several possible answers. One possible answer, assuming that n = 2^k - 1, is:

    (n+1)/2, (n+1)/4, 3(n+1)/4, (n+1)/8, 3(n+1)/8, 5(n+1)/8, 7(n+1)/8, ...

This fills the binary search tree layer by layer, resulting in a balanced tree of minimal height log(n).

b) On the tree obtained in (a), what would be the worst-case running time of a find, insert, or remove operation? Use big-O notation.

The worst-case running time of each of these operations is O(h), where h is the height of the tree. Here h = log(n), so the worst-case running time is O(log n). This means that having a balanced binary search tree is a good thing!

c) In what order should the keys be inserted into the binary search tree to obtain a tree with maximal height?

One possible answer is: 1, 2, 3, ..., n-1, n. This results in a very tall tree in which every node has only a right child. The height of this tree is n-1.

d) On the tree obtained in (c), what would be the worst-case running time of a find, insert, or remove operation? Use big-O notation.

The running time would be O(n), because the height is O(n).
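As a small illustration of part (a), here is a Java sketch (not part of the original solution; the class and method names are invented) that prints one valid insertion order for the keys 1..n: each range contributes its middle key before either half, so every key becomes the root of the subtree spanning its range. This is a different valid order from the layer-by-layer one above, but it yields a tree of minimal height for the same reason.

    import java.util.ArrayList;
    import java.util.List;

    public class MinHeightOrder {

        // Appends the keys lo..hi to 'order' so that the middle key of every
        // range comes before the keys of its two halves.
        static void addMedianFirst(int lo, int hi, List<Integer> order) {
            if (lo > hi) return;
            int mid = (lo + hi) / 2;            // middle key of the range lo..hi
            order.add(mid);                     // insert it before anything else in the range
            addMedianFirst(lo, mid - 1, order);
            addMedianFirst(mid + 1, hi, order);
        }

        public static void main(String[] args) {
            List<Integer> order = new ArrayList<>();
            addMedianFirst(1, 7, order);        // n = 7 = 2^3 - 1
            System.out.println(order);          // [4, 2, 1, 3, 6, 5, 7]
        }
    }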

3) (15 points) Finding the k-th element

Consider the following problem: given an unsorted array A[0...n-1] of distinct integers, find the k-th smallest element (with k = 0 giving the smallest element). For example, findKth([9 10 6 7 4 12], 0) = 4 and findKth([1 7 6 4 15 10], 4) = 10.

a) (10 points) Complete the following pseudocode for the findKth algorithm. Your algorithm should have similarities to quicksort and to binary search, and should call the partition algorithm described in class and used by the quicksort method. You can call partition without redefining it. Your algorithm should have the running time described in (b).

Algorithm findKth(A, start, stop, k)
Input: an unsorted array A[start...stop] of numbers, and a number k between 0 and stop-start
Output: returns the k-th smallest element of A[start...stop]

    /* Complete this pseudocode */
    if (k > stop-start) then
        print "Illegal value of k"; return      // not really needed
    else if (start == stop) then
        return A[start]
    pivot <- partition(A, start, stop)
    // Now all the elements in A[start...pivot-1] are smaller than or equal to
    // A[pivot], and all the elements in A[pivot+1...stop] are larger than or
    // equal to A[pivot].
    if (pivot == k) then return A[k]
    if (pivot > k) then return findKth(A, start, pivot-1, k)
    if (pivot < k) then return findKth(A, pivot+1, stop, k)

(Here k is used as an index into the whole array, so the top-level call is findKth(A, 0, n-1, k); because partition places A[pivot] at its final sorted position, comparing pivot with k directly tells us which side to recurse on.)

b) (5 points) Assuming that n is a power of two and that we are always lucky enough that the pivot chosen by the partition method always splits A[start...stop] into two equal halves, show that your algorithm runs in time O(n). You can assume that executing partition(A, start, stop) takes O(stop-start+1) time.

Assuming the pivot always splits the array in two, the running time of this recursive algorithm satisfies

    T(n) = T(n/2) + c*n + d    if n > 1
    T(1) = e

because partition takes time c*n (for some constant c), the other primitive operations take time d (for some constant d), and the single recursive call takes time T(n/2). By the master theorem (case 3), we obtain that T(n) is Θ(n).
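A runnable Java version of this selection algorithm is sketched below (it is not the official solution code). The partition here is a simple Lomuto-style partition that uses the last element as pivot; the partition routine seen in class may choose its pivot differently, but the selection logic is exactly the one in the pseudocode, with k an index into the whole array and the top-level call findKth(A, 0, A.length-1, k).

    public class FindKth {

        // Lomuto-style partition: moves A[stop] (the pivot) to its final sorted
        // position p, with smaller elements in A[start..p-1] and larger elements
        // in A[p+1..stop], and returns p.
        static int partition(int[] A, int start, int stop) {
            int pivotValue = A[stop];
            int i = start;
            for (int j = start; j < stop; j++) {
                if (A[j] < pivotValue) {
                    int tmp = A[i]; A[i] = A[j]; A[j] = tmp;
                    i++;
                }
            }
            int tmp = A[i]; A[i] = A[stop]; A[stop] = tmp;
            return i;
        }

        // Returns the k-th smallest element of A, searching within A[start..stop];
        // k is an index into the whole array (k = 0 gives the overall smallest).
        static int findKth(int[] A, int start, int stop, int k) {
            if (start == stop) return A[start];
            int pivot = partition(A, start, stop);
            if (pivot == k) return A[k];
            if (pivot > k) return findKth(A, start, pivot - 1, k);
            return findKth(A, pivot + 1, stop, k);
        }

        public static void main(String[] args) {
            System.out.println(findKth(new int[]{9, 10, 6, 7, 4, 12}, 0, 5, 0)); // 4
            System.out.println(findKth(new int[]{1, 7, 6, 4, 15, 10}, 0, 5, 4)); // 10
        }
    }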

4) (16 points) Tree traversals

Consider the following pair of recursive algorithms that call each other to traverse a binary tree.

Algorithm weirdPreOrder(TreeNode n)
    if (n != null) then
        print n.getValue()
        weirdPostOrder(n.getLeftChild())
        weirdPostOrder(n.getRightChild())

Algorithm weirdPostOrder(TreeNode n)
    if (n != null) then
        weirdPreOrder(n.getRightChild())
        weirdPreOrder(n.getLeftChild())
        print n.getValue()

a) (4 points) Write the output printed when weirdPreOrder(root) is executed on the following binary tree:

[tree figure not reproduced in this transcription]

Method executed        Result printed
Pre(7)                 7
Post(3)
Pre(6)                 6
Pre(2)                 2
Post(1)                1
Post(2)                2
Post(3) finishes       3
Post(9)
Pre(9)                 9
Pre(8)                 8
Post(7)                7
Post(9) finishes       9

Conclusion: the printed order is 7 6 2 1 2 3 9 8 7 9

b) (4 points) Write the output printed when weirdPostOrder(root) is executed.

Method executed        Result printed
Post(7)
Pre(9)
Print(9)               9
Post(8)
Pre(7)
Print(7)               7
Print(8)               8
Post(9)
Print(9)               9
Pre(3)
Print(3)               3
Post(2)
Pre(2)
Print(2)               2
Pre(1)
Print(1)               1
Print(2)               2
Post(6)
Pre(null)
Pre(null)
Print(6)               6
Print(7)               7

Conclusion: the printed order is 9 7 8 9 3 2 1 2 6 7
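For experimentation, here is a Java transcription of the two mutually recursive traversals (not part of the original solution). The TreeNode class is a minimal hypothetical stand-in that provides the getters used in the pseudocode, and the tree built in main is an arbitrary small example, not the tree from the assignment's figure.

    public class WeirdTraversal {

        // Minimal stand-in for the assignment's TreeNode class.
        static class TreeNode {
            int value;
            TreeNode left, right;
            TreeNode(int value, TreeNode left, TreeNode right) {
                this.value = value; this.left = left; this.right = right;
            }
            int getValue() { return value; }
            TreeNode getLeftChild() { return left; }
            TreeNode getRightChild() { return right; }
        }

        // Prints the node first, then traverses both subtrees with weirdPostOrder.
        static void weirdPreOrder(TreeNode n) {
            if (n != null) {
                System.out.print(n.getValue() + " ");
                weirdPostOrder(n.getLeftChild());
                weirdPostOrder(n.getRightChild());
            }
        }

        // Traverses the right subtree, then the left, with weirdPreOrder,
        // and prints the node last.
        static void weirdPostOrder(TreeNode n) {
            if (n != null) {
                weirdPreOrder(n.getRightChild());
                weirdPreOrder(n.getLeftChild());
                System.out.print(n.getValue() + " ");
            }
        }

        public static void main(String[] args) {
            // A small complete tree: root 1, children 2 and 3, grandchildren 4..7.
            TreeNode root = new TreeNode(1,
                    new TreeNode(2, new TreeNode(4, null, null), new TreeNode(5, null, null)),
                    new TreeNode(3, new TreeNode(6, null, null), new TreeNode(7, null, null)));
            weirdPreOrder(root);    // prints: 1 5 4 2 7 6 3
            System.out.println();
        }
    }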

c) (4 points) Consider the binary tree traversal algorithm below.

Algorithm queueTraversal(TreeNode n)
Input: a TreeNode n
Output: prints the value of each node in the binary tree rooted at n

    Queue q <- new Queue()
    q.enqueue(n)
    while (!q.empty()) do
        x <- q.dequeue()
        print x.getValue()
        if (x.getLeftChild() != null) then q.enqueue(x.getLeftChild())
        if (x.getRightChild() != null) then q.enqueue(x.getRightChild())

Question: Write the output printed when queueTraversal(root) is executed.

This produces a breadth-first (level-order) traversal of the tree: 7 3 9 2 6 8 9 1 2 7

d) (4 points) Consider the binary tree traversal algorithm below.

Algorithm stackTraversal(TreeNode n)
Input: a TreeNode n
Output: prints the value of each node in the binary tree rooted at n

    Stack s <- new Stack()
    s.push(n)
    while (!s.empty()) do
        x <- s.pop()
        print x.getValue()
        if (x.getLeftChild() != null) then s.push(x.getLeftChild())
        if (x.getRightChild() != null) then s.push(x.getRightChild())

Question: Write the output printed when stackTraversal(root) is executed. Which traversal method seen previously in class is this equivalent to?

This produces a depth-first traversal of the tree, equivalent to a preorder traversal that always visits the right child before the left child: 7 9 9 8 7 3 6 2 2 1
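The same two iterative traversals in Java (again a sketch rather than the official solution; it reuses the hypothetical TreeNode class from the previous example and uses java.util.ArrayDeque both as the queue and as the stack):

    import java.util.ArrayDeque;
    import java.util.Deque;

    public class IterativeTraversals {

        // Breadth-first (level-order) traversal: a FIFO queue hands back the
        // nodes level by level, left to right.
        static void queueTraversal(WeirdTraversal.TreeNode n) {
            if (n == null) return;
            Deque<WeirdTraversal.TreeNode> q = new ArrayDeque<>();
            q.addLast(n);
            while (!q.isEmpty()) {
                WeirdTraversal.TreeNode x = q.removeFirst();
                System.out.print(x.getValue() + " ");
                if (x.getLeftChild() != null) q.addLast(x.getLeftChild());
                if (x.getRightChild() != null) q.addLast(x.getRightChild());
            }
            System.out.println();
        }

        // Depth-first traversal with an explicit stack. The left child is pushed
        // before the right child, so the right subtree is popped (and visited)
        // first: a preorder traversal that goes right before left.
        static void stackTraversal(WeirdTraversal.TreeNode n) {
            if (n == null) return;
            Deque<WeirdTraversal.TreeNode> s = new ArrayDeque<>();
            s.push(n);
            while (!s.isEmpty()) {
                WeirdTraversal.TreeNode x = s.pop();
                System.out.print(x.getValue() + " ");
                if (x.getLeftChild() != null) s.push(x.getLeftChild());
                if (x.getRightChild() != null) s.push(x.getRightChild());
            }
            System.out.println();
        }
    }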

5) (12 points) Tree isomorphism

Two unordered binary trees A and B are said to be isomorphic if, by swapping the left and right subtrees of certain nodes of A, one can obtain a tree identical to B. For example, the following two trees are isomorphic, because starting from the first tree and exchanging the left and right subtrees of node 5 and of node 8, one obtains the second tree:

[figure not reproduced in this transcription]

On the other hand, the following two trees are not isomorphic, because it is impossible to rearrange one into the other:

[figure not reproduced in this transcription]

Question: Write a recursive algorithm that tests whether the trees rooted at two given TreeNodes are isomorphic. Hint: if your algorithm takes more than 10 lines to write, you're probably not doing the right thing.

Algorithm isIsomorphic(TreeNode A, TreeNode B)
Input: two TreeNodes A and B
Output: returns true if the trees rooted at A and B are isomorphic

    /* Complete this pseudocode */
    if (A == null and B == null) then return true
    if ((A == null and B != null) or (A != null and B == null)) then return false
    if (A.getKey() != B.getKey()) then return false
    boolean isoLL = isIsomorphic(A.getLeftChild(),  B.getLeftChild())
    boolean isoLR = isIsomorphic(A.getLeftChild(),  B.getRightChild())
    boolean isoRL = isIsomorphic(A.getRightChild(), B.getLeftChild())
    boolean isoRR = isIsomorphic(A.getRightChild(), B.getRightChild())
    return ((isoLL and isoRR) or (isoLR and isoRL))
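The same algorithm as a Java method (a sketch; it assumes the hypothetical TreeNode class from the traversal example above, with getValue() standing in for getKey()):

    // Returns true if the trees rooted at a and b are isomorphic, i.e. if one
    // can be turned into the other by swapping the left and right subtrees of
    // some of its nodes.
    static boolean isIsomorphic(WeirdTraversal.TreeNode a, WeirdTraversal.TreeNode b) {
        if (a == null && b == null) return true;            // two empty trees match
        if (a == null || b == null) return false;           // one empty, one not
        if (a.getValue() != b.getValue()) return false;     // roots must hold the same key
        // Either the subtrees match side by side, or they match after a swap at the root.
        return (isIsomorphic(a.getLeftChild(), b.getLeftChild())
                    && isIsomorphic(a.getRightChild(), b.getRightChild()))
            || (isIsomorphic(a.getLeftChild(), b.getRightChild())
                    && isIsomorphic(a.getRightChild(), b.getLeftChild()));
    }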

6) (12 points) Heaps

Let T be a heap storing n keys. Give pseudocode for an efficient algorithm that reports all the keys in T that are smaller than or equal to a given query key x (which is not necessarily in T). For example, given the heap below and the query x = 7, the algorithm should print 2 5 7 6. Note that the keys do not need to be reported in sorted order. Your algorithm should run in time O(k+1), where k is the number of keys reported. (So you can't afford to traverse the whole tree.)

[heap figure not reproduced in this transcription]

Algorithm printAtMost(key x, TreeNode n)
Input: a key x and a TreeNode n
Output: prints the keys of all nodes in the subtree rooted at n whose keys are at most x

    /* Complete this pseudocode */
    if (n != null and n.getKey() <= x) then
        print n.getKey()
        printAtMost(x, n.getLeftChild())
        printAtMost(x, n.getRightChild())
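In Java, the same traversal could look as follows (a sketch for a pointer-based min-heap, again reusing the hypothetical TreeNode class from the earlier example, with getValue() standing in for getKey()). Each call either reports a key no larger than x and recurses on the node's two children, or returns immediately; since every reported node's ancestors also have keys at most x, all k qualifying nodes are reached, and at most 2k + 1 calls are made in total, giving the required O(k+1) running time.

    // Prints every key <= x in the subtree rooted at n. Because n is the root of
    // a min-heap subtree, a node whose key exceeds x has only larger keys below
    // it, so the recursion can stop there without looking any deeper.
    static void printAtMost(int x, WeirdTraversal.TreeNode n) {
        if (n != null && n.getValue() <= x) {
            System.out.print(n.getValue() + " ");
            printAtMost(x, n.getLeftChild());
            printAtMost(x, n.getRightChild());
        }
    }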