National Taiwan University — Department of Electrical Engineering
Algorithms, Fall 01 — Handout #1, November, 01
TAs: Zhi-en in and Yen-hun iu
Sample Solutions to Homework #

1. (15) (a) See Figure 1.

[Figure: heapsort trace — (a) heap built by Build-Max-Heap; (b) first exchange followed by Max-Heapify; (c)–(n) the remaining exchange and Max-Heapify steps; (o) sorted sequence]

Figure 1: Illustration for Problem 1(a).

(b) See Figure 2. (c) See Figure 3.

2. (20) (a) See Figure 4.
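The steps illustrated in Figure 1 — Build-Max-Heap followed by repeated root exchanges and Max-Heapify — can be sketched in Python (a minimal sketch of ours, not the handout's graded code):

```python
def max_heapify(a, i, heap_size):
    # Sift a[i] down until the max-heap property holds (0-based indexing).
    while True:
        left, right = 2 * i + 1, 2 * i + 2
        largest = i
        if left < heap_size and a[left] > a[largest]:
            largest = left
        if right < heap_size and a[right] > a[largest]:
            largest = right
        if largest == i:
            return
        a[i], a[largest] = a[largest], a[i]
        i = largest

def heapsort(a):
    n = len(a)
    # Build-Max-Heap: heapify every internal node, bottom-up.
    for i in range(n // 2 - 1, -1, -1):
        max_heapify(a, i, n)
    # Repeatedly exchange the root (maximum) with the last leaf,
    # shrink the heap, and restore the heap property.
    for end in range(n - 1, 0, -1):
        a[0], a[end] = a[end], a[0]
        max_heapify(a, 0, end)
    return a
```

Each Max-Heapify call costs O(lg n), so the whole sort runs in O(n lg n) time, in place.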
[Figure: step-by-step array snapshots with index i and key x marked]

Figure 2: Illustration for Problem 1(b).

Figure 3: Illustration for Problem 1(c). (a) The array A and the auxiliary array C. (b) The auxiliary array after accumulation. (c) The output array and the auxiliary array after filling in one element. (d) The output array and the auxiliary array after filling in the second element. (e) The output array and the auxiliary array after filling in the third element. (f) The sorted result.

(b) Since Y [1, 1] and Y [m, n] are the smallest and the largest elements in Y, respectively, the two statements are true.
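The accumulate-and-fill procedure of Figure 3 (Problem 1(c)) is counting sort; a minimal Python sketch (ours), assuming integer keys in 0..k:

```python
def counting_sort(a, k):
    # c[v] first counts the keys equal to v, then is accumulated into
    # "number of keys <= v", as in Figure 3(b).
    c = [0] * (k + 1)
    for v in a:
        c[v] += 1
    for v in range(1, k + 1):
        c[v] += c[v - 1]
    out = [0] * len(a)
    # Fill the output from the back of the input so that equal keys
    # keep their relative order (the sort is stable).
    for v in reversed(a):
        c[v] -= 1
        out[c[v]] = v
    return out
```

The two passes over C and the two passes over the input give the O(n + k) running time.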
Figure 4: Sample table for Problem 2(a).

(c) We first record the smallest value and then replace it with infinity. The infinity must then be moved to a proper position to maintain the Young-tableau property. This goal is achieved by recursion: the recursion compares the current position with its right and down neighbors, exchanges the smaller of the two with the current position, treats the position holding the smaller value as the new current position, and recurses. It stops when the right and down values are both infinity. After the recursion, the recorded value is returned as the minimum. The method is shown as follows:

Framework:
Step 1. Record Y [1, 1], i.e., v ← Y [1, 1].
Step 2. Replace Y [1, 1] with ∞, then call MOVE-DOWN(1, 1).
Step 3. Return v.

MOVE-DOWN(i, j):  (assume Y [i, j] = ∞ if i > m or j > n)
    if Y [i+1, j] = ∞ and Y [i, j+1] = ∞
        then RETURN
    if Y [i+1, j] < Y [i, j+1]
        then swap Y [i, j] with Y [i+1, j]
             call MOVE-DOWN(i+1, j)
        else swap Y [i, j] with Y [i, j+1]
             call MOVE-DOWN(i, j+1)

Time complexity: T(p) = T(p−1) + Θ(1) = O(p), where p bounds the length of the remaining path. Hence, T(m + n) = O(m + n).

(d) We first put the new element at Y [m, n]. The added element must then be moved to a proper position to maintain the Young-tableau property. This goal is also achieved by recursion: the recursion compares the current position with its up and left neighbors. If both of them are larger than the current value, we exchange the current position with the larger of the two; otherwise, we exchange it with the one whose value is larger than the current value. After the exchange, we recurse from the new position. It stops when the up and left values are both smaller than the current value. The method is shown as follows:

Framework:
Step 1. Replace Y [m, n] with the new element.
Step 2. Call MOVE-UP(m, n).
MOVE-UP(i, j):  (assume Y [i, j] = −∞ if i < 1 or j < 1)
    if Y [i−1, j] < Y [i, j] and Y [i, j−1] < Y [i, j]
        then RETURN
    if Y [i−1, j] ≥ Y [i, j] and Y [i, j−1] ≥ Y [i, j]
        then if Y [i−1, j] > Y [i, j−1]
                 then swap Y [i, j] with Y [i−1, j]
                      call MOVE-UP(i−1, j)
                 else swap Y [i, j] with Y [i, j−1]
                      call MOVE-UP(i, j−1)
        else if Y [i−1, j] ≥ Y [i, j]
                 then swap Y [i, j] with Y [i−1, j]
                      call MOVE-UP(i−1, j)
                 else swap Y [i, j] with Y [i, j−1]
                      call MOVE-UP(i, j−1)

Time complexity: T(p) = T(p−1) + Θ(1) = O(p). Hence, T(m + n) = O(m + n).

(e) Step 1. Insert the n² values into the tableau; this needs n² · O(n + n) = O(n³) time.
Step 2. Extract the n² values from the tableau with EXTRACT-MIN; this also needs n² · O(n + n) = O(n³) time.
Combining Steps 1 and 2, we obtain an O(n³)-time method for sorting n² numbers.

(f) Key idea: see Figure 5 for an illustration. For any position [i, j], we have Y [p, q] ≤ Y [i, j] for all p ≤ i and q ≤ j; that is, the gray positions are not greater than Y [i, j]. Using this property, we can compare the target value with Y [i, j] and determine whether the target falls among the gray or the white positions.

Figure 5: Illustration for Problem 2(f).

Framework:
Step 1. Call Determine(value, m, 1).

Determine(value, i, j):  (assume Y [i, j] = ∞ if i < 1 or j < 1 or i > m or j > n)
    if Y [i, j] = ∞
        then RETURN false
    if value = Y [i, j]
        then RETURN true
    else if value > Y [i, j]
        then RETURN Determine(value, i, j+1)
    else RETURN Determine(value, i−1, j)
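The MOVE-DOWN and Determine procedures above can be written as runnable Python (a sketch with our naming; the tableau is a list of m rows of n entries, with empty slots holding infinity):

```python
INF = float('inf')

def extract_min(Y):
    # EXTRACT-MIN via the MOVE-DOWN idea: record Y[0][0], replace it
    # with infinity, and sift the infinity toward the bottom-right.
    m, n = len(Y), len(Y[0])
    v = Y[0][0]
    Y[0][0] = INF
    i = j = 0
    while True:
        down = Y[i + 1][j] if i + 1 < m else INF
        right = Y[i][j + 1] if j + 1 < n else INF
        if down == INF and right == INF:
            return v
        if down < right:
            Y[i][j], Y[i + 1][j] = down, Y[i][j]
            i += 1
        else:
            Y[i][j], Y[i][j + 1] = right, Y[i][j]
            j += 1

def contains(Y, value):
    # Determine, iteratively: start at the bottom-left corner; each
    # comparison discards one full row or one full column, so at most
    # m + n - 1 entries are ever inspected.
    m, n = len(Y), len(Y[0])
    i, j = m - 1, 0
    while i >= 0 and j < n:
        if Y[i][j] == value:
            return True
        if Y[i][j] < value:
            j += 1   # everything in column j (rows <= i) is smaller
        else:
            i -= 1   # everything in row i (columns >= j) is larger
    return False
```

Both routines touch O(m + n) entries, matching the bounds derived in the text.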
Time complexity: T(p) = T(p−1) + Θ(1) = O(p). Hence, T(m + n) = O(m + n).

3. (5) Construct the counting array C used in counting sort in O(n + k) time. Since C[a] denotes the number of input integers that fall into the range [0, a], the answer to a query is simply C[b] − C[a−1], where C[−1] is defined as 0. Therefore, each query can be answered in O(1) time.

4. (10) (a) See Figure 6.

Figure 6: The operation of Radix-Sort.

(b) See Figure 7.

5. (20) (a) Since there are n red jugs and n blue jugs, comparing each red jug with each blue jug takes Θ(n²) comparisons in the worst case.

(b) The computation of the algorithm can be viewed in terms of a decision tree. Every internal node is labelled with two jugs (one red, one blue) and has three outgoing edges (red jug smaller than, the same size as, or larger than the blue jug). The leaves are labelled with a unique matching of jugs. The height of the decision tree equals the worst-case number of comparisons the algorithm has to make to determine the matching. To bound that height, we first count the possible matchings for n red and n blue jugs. If we label the red jugs from 1 to n and likewise the blue jugs from 1 to n before starting the comparisons, every outcome of the algorithm can be represented as a set {(i, π(i)) : 1 ≤ i ≤ n and π is a permutation on {1, ..., n}}, which contains the pairs of red jugs (first component) and blue jugs (second component) that are matched up. Since every permutation π corresponds to a different outcome, there must be exactly n! different results.
Figure 7: The operation of Bucket-Sort. (a) The input array A[1..10]. (b) The array B[0..9] of sorted lists (buckets). The sorted output consists of a concatenation, in order, of the lists B[0], B[1], ..., B[9].

Therefore, the height h of the decision tree can be bounded as follows: every tree with a branching factor of 3 (every inner node has at most three children) has at most 3^h leaves, and the decision tree must have at least n! leaves. It follows that

    3^h ≥ n! ≥ (n/e)^n,  and hence  h ≥ n log₃ n − n log₃ e = Ω(n lg n).

(c) So any algorithm solving the problem must use Ω(n lg n) comparisons.

Input: R, the red jugs, labelled 1, 2, ..., n; B, the blue jugs, labelled 1, 2, ..., n.
Output: n distinct pairs (i, j) indicating that red jug i and blue jug j have the same volume.

Pseudocode:

MATCH-JUGS(R, B)
 1  if |R| = 0
 2      then return
 3  if |R| = 1
 4      then let R = {r} and B = {b}
 5           output (r, b)
 6           return
 7  else r ← a randomly chosen jug in R
 8       compare r to every jug of B
 9       B< ← the set of jugs in B that are smaller than r
10       B> ← the set of jugs in B that are larger than r
11       b ← the one jug in B with the same size as r
12       compare b to every jug of R − {r}
13       R< ← the set of jugs in R that are smaller than b
14       R> ← the set of jugs in R that are larger than b
15       output (r, b)
16       MATCH-JUGS(R<, B<)
17       MATCH-JUGS(R>, B>)

The procedure will be called only with inputs that can be matched, which means |R| = |B|.
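A runnable sketch of MATCH-JUGS (ours): jugs are (label, volume) pairs, and the "comparisons" are ordinary volume comparisons, never between two jugs of the same color.

```python
import random

def match_jugs(reds, blues):
    # reds/blues: lists of (label, volume); volumes are distinct within
    # each color, and the two multisets of volumes are equal.
    if not reds:
        return []
    if len(reds) == 1:
        return [(reds[0][0], blues[0][0])]
    r = random.choice(reds)                      # line 7: random red pivot
    b_small = [b for b in blues if b[1] < r[1]]  # lines 8-10: partition B
    b_large = [b for b in blues if b[1] > r[1]]
    b = next(b for b in blues if b[1] == r[1])   # line 11: r's partner
    r_small = [x for x in reds if x != r and x[1] < b[1]]  # lines 12-14
    r_large = [x for x in reds if x != r and x[1] > b[1]]
    return ([(r[0], b[0])]
            + match_jugs(r_small, b_small)
            + match_jugs(r_large, b_large))
```

As in quicksort, the random pivot makes the expected number of comparisons O(n lg n).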
6. (10) The correctness of the algorithm can be proved as follows. Once the jug r is randomly picked from R, there will be a matching among the jugs with volume smaller than r, which are in the sets R< and B<, and likewise among the jugs larger than r, which are in the sets R> and B>. Since |R<| + |R>| < |R| in every recursive step, the size of the first parameter shrinks with every recursive call, and the recursion terminates when that size reaches 0 or 1.

The analysis of the expected number of comparisons is similar to that of the quicksort algorithm. To analyze the expected number of comparisons, we first order the red and blue jugs as r₁, ..., rₙ and b₁, ..., bₙ, where r_i < r_{i+1} and b_i < b_{i+1} for i = 1, ..., n−1, and r_i = b_i. Then the indicator random variable can be defined as follows:

    X_ij = I{red jug r_i is compared to blue jug b_j}.

As in quicksort, a given pair r_i and b_j is compared at most once: when comparing r_i to every jug in B, jug r_i is not put into either R< or R>, and when comparing b_i to every jug in R − {r_i}, jug b_i is not put into either B< or B>. The total number of comparisons is X = Σ_{i=1}^{n−1} Σ_{j=i+1}^{n} X_ij. The expected value of X can then be calculated as

    E[X] = Σ_{i=1}^{n−1} Σ_{j=i+1}^{n} Pr{r_i is compared to b_j}.

Once we choose a jug r_k such that r_i < r_k < b_j, we put r_i into R< and b_j into B>, so r_i and b_j will never be compared again. Jugs r_i and b_j are compared if and only if the first jug chosen in R_ij = {r_i, ..., r_j} is either r_i or r_j. Any jug in R_ij is equally likely to be the first one chosen; since |R_ij| = j − i + 1, the probability that any given jug is the first one chosen in R_ij is 1/(j − i + 1). Similarly to the analysis of quicksort, we can prove that the expected number of comparisons is O(n lg n).

The worst-case number of comparisons is Θ(n²): in the worst case we always choose the largest or smallest jug to partition the sets, which reduces the set sizes by only 1. The worst-case running time obeys the recurrence T(n) = T(n−1) + Θ(n), so the worst-case number of comparisons is T(n) = Θ(n²).
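The double sum above can be evaluated directly. A quick numeric sanity check (ours, not part of the handout) that E[X] = Σ_{i<j} 2/(j − i + 1) indeed stays below 2n·H_n = O(n lg n):

```python
def expected_comparisons(n):
    # E[X] = sum over pairs i < j of 2/(j - i + 1).
    # Group pairs by gap = j - i: there are n - gap pairs with that gap.
    return sum((n - gap) * 2.0 / (gap + 1) for gap in range(1, n))

def harmonic(n):
    # H_n = 1 + 1/2 + ... + 1/n
    return sum(1.0 / k for k in range(1, n + 1))
```

Each term (n − gap)·2/(gap + 1) is at most 2n/(gap + 1), so the whole sum is at most 2n(H_n − 1).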
We can optimize by splitting the input into pairs and comparing the elements of each pair. After n/2 comparisons, we have reduced the candidates for the minimum and for the maximum to n/2 each. Furthermore, those two sets are disjoint, so we now have two problems, one for the minimum and one for the maximum, each of size n/2. The total number of comparisons is

    n/2 + 2(n/2 − 1) = n/2 + n − 2 = 3n/2 − 2.

This assumes that n is even. If n is odd, we need one additional comparison to determine whether the last element is a candidate minimum or maximum. Hence the ceiling.

7. (10) This problem can be solved by a binary-search approach. Let p be the median of X[1..n] and q be the median of Y [1..n]. If p > q, then the median of the two arrays lies in X[1..n/2] or Y [n/2..n]. If p ≤ q, then the median of the two arrays lies in X[n/2..n] or Y [1..n/2]. Therefore, we can use recursion to solve this problem: for any instance with two sorted arrays X[1..n] and Y [1..n], we first compare their medians and then follow the rules above to cut the two arrays to half their original size. After dividing, we take the new, smaller arrays as the new instance and apply the same strategy recursively. Since this binary-search approach halves the problem size in every recursion, the time complexity is

    T(n) = T(n/2) + Θ(1) = O(lg n).
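The pairing strategy for the simultaneous minimum and maximum described above can be sketched with an explicit comparison counter (a sketch of ours; assumes a non-empty list):

```python
def min_max(a):
    # Returns (min, max, number_of_comparisons). Elements are processed
    # in pairs: one comparison inside the pair, then one comparison
    # against the running min and one against the running max.
    n = len(a)
    comparisons = 0
    if n % 2:                 # odd n: seed both extremes from a[0]
        lo = hi = a[0]
        start = 1
    else:                     # even n: seed from the first pair
        comparisons += 1
        lo, hi = (a[0], a[1]) if a[0] < a[1] else (a[1], a[0])
        start = 2
    for i in range(start, n - 1, 2):
        comparisons += 1
        small, big = (a[i], a[i + 1]) if a[i] < a[i + 1] else (a[i + 1], a[i])
        comparisons += 1
        if small < lo:
            lo = small
        comparisons += 1
        if big > hi:
            hi = big
    return lo, hi, comparisons
```

For even n this performs exactly 1 + 3(n/2 − 1) = 3n/2 − 2 comparisons, matching the count above.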
8. (10) The optimal y-coordinate for Professor Olay's east–west oil pipeline is as follows: if n is even, it is either on the oil well whose y-coordinate is the lower median, on the one whose y-coordinate is the upper median, or anywhere between them; if n is odd, it is on the oil well whose y-coordinate is the median.

Proof: We examine various cases. In each case, we start with the pipeline at a particular y-coordinate and see what happens when we move it. We denote by s the sum of the lengths of the north–south spurs with the pipeline at the starting location, and by s′ the sum after moving the pipeline.

We start with the case in which n is even. Let us start with the pipeline somewhere on or between the two oil wells whose y-coordinates are the lower and upper medians. If we move the pipeline by a vertical distance d without crossing either of the median wells, then n/2 of the wells become d farther from the pipeline and n/2 become d closer, and s′ = s + dn/2 − dn/2 = s; thus, all locations on or between the two medians are equally good.

Now suppose that the pipeline goes through the oil well whose y-coordinate is the upper median. What happens when we increase the y-coordinate of the pipeline by d > 0 units, so that it moves above the oil well that achieves the upper median? All oil wells whose y-coordinates are at or below the upper median become d units farther from the pipeline, and there are at least n/2 + 1 such oil wells (the upper median, and every well at or below the lower median). There are at most n/2 − 1 oil wells whose y-coordinates are above the upper median, and each of these becomes at most d units closer to the pipeline when it moves up. Thus, we have a lower bound on s′ of

    s′ ≥ s + d(n/2 + 1) − d(n/2 − 1) = s + 2d > s.

We conclude that moving the pipeline up from the oil well at the upper median increases the total spur length.
A symmetric argument shows that if we start with the pipeline going through the oil well whose y-coordinate is the lower median and move it down, then the total spur length increases. We see, therefore, that when n is even, an optimal placement of the pipeline is anywhere on or between the two medians.

Now we consider the case when n is odd. We start with the pipeline going through the oil well whose y-coordinate is the median, and we consider what happens when we move it up by d > 0 units. All oil wells at or below the median become d units farther from the pipeline, and there are at least (n + 1)/2 such wells (the one at the median and the (n − 1)/2 at or below it). There are at most (n − 1)/2 oil wells above the median, and each of these becomes at most d units closer to the pipeline. We get a lower bound on s′ of

    s′ ≥ s + d(n + 1)/2 − d(n − 1)/2 = s + d > s,

and we conclude that moving the pipeline up from the oil well at the median increases the total spur length. A symmetric argument shows that moving the pipeline down from the median also increases the total spur length, and so the optimal placement of the pipeline is on the median.

Since we know we are looking for the median, we can use the linear-time median-finding algorithm. Thus, the optimal location can be determined in linear time.

9. (10) (b), (d).

10. (10) One possible algorithm works in the following steps.
(a) Insert the sequences one by one into the radix tree.
(b) Traverse the radix tree with a pre-order tree walk, and write down the sequences as they are visited.

The correctness of the algorithm can be justified by proving that the pre-order tree walk visits the sequences in monotonically increasing order. According to the structure of radix trees and the definition of "lexicographically less than", for any node i in a radix tree, we have

    (the sequence at i) < (any sequence in the left subtree of i) < (any sequence in the right subtree of i).
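The radix-tree algorithm of Problem 10 can be sketched with nested dicts as tree nodes (our naming; the '0' child is the left subtree, the '1' child the right):

```python
def radix_tree_sort(strings):
    # Insert each binary string into a radix tree, then emit the stored
    # strings with a pre-order walk: node first, then the '0' subtree,
    # then the '1' subtree -- exactly lexicographic order.
    root = {}
    for s in strings:
        node = root
        for bit in s:
            node = node.setdefault(bit, {})
        node['$'] = s            # mark that a string ends at this node
    out = []
    def walk(node):
        if '$' in node:
            out.append(node['$'])
        for bit in ('0', '1'):   # left subtree before right subtree
            if bit in node:
                walk(node[bit])
    walk(root)
    return out
```

Insertion touches one node per bit and the walk touches each node once, so the whole sort runs in Θ(n) time, where n is the total number of bits.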
Figure 8: Illustration for Problems 11(a) and (b). (a) Binary search tree. (b) Red-black tree.

Figure 9: Illustration for Problem 11(c). (a) Binary search tree. (b) Red-black tree.

Therefore, the pre-order tree walk does visit the sequences in monotonically increasing order. The timing bound can be derived as follows: inserting the sequences takes Θ(n) time, and the pre-order tree walk takes Θ(n) time. In conclusion, the algorithm runs in Θ(n) time.

11. (20)
(a) See Figure 8(a).
(b) See Figure 8(b).
(c) See Figure 9(a) for the binary search tree. Yes; see Figure 9(b) for the red-black tree.
(d) See Figure 10(a) for the red-black tree after the insertion, and Figure 10(b) for the legal red-black tree.
(e) See Figure 11(a) for the red-black tree after the deletion, and Figure 11(b) for the legal red-black tree.

12. (20)
(a) If we know N_{h−1} and N_{h−2}, we can determine N_h. Since this N_h-node tree must have height h, the root must have a child of height h−1. To minimize the total number of nodes in this
Figure 10: Illustration for Problem 11(d). (a) Inserting key 3 into the original red-black tree. (b) Legal red-black tree.

Figure 11: Illustration for Problem 11(e). (a) Deleting a key from the original red-black tree. (b) Legal red-black tree.

tree, we would have this subtree contain N_{h−1} nodes. By the property of an AVL tree, if one child has height h−1, the minimum height of the other child is h−2. By creating a tree with a root whose left subtree has N_{h−1} nodes and whose right subtree has N_{h−2} nodes, we have constructed the AVL tree of height h with the fewest nodes possible. This AVL tree has a total of N_{h−1} + N_{h−2} + 1 nodes. The base cases are N_0 = 1 and N_1 = 2. From here, we can iteratively construct N_h by using the recurrence N_h = N_{h−1} + N_{h−2} + 1 derived above:

    N_h = N_{h−1} + N_{h−2} + 1
    N_{h−1} = N_{h−2} + N_{h−3} + 1
    N_h = (N_{h−2} + N_{h−3} + 1) + N_{h−2} + 1 > 2N_{h−2}.

So N_h more than doubles each time h increases by 2, which gives

    N_h > 2^{h/2}
    lg N_h > h/2
    2 lg N_h > h
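The recurrence and the bound N_h > 2^{h/2} can be checked numerically (a small sketch of ours):

```python
def min_avl_nodes(h):
    # Fewest nodes in an AVL tree of height h:
    # N_h = N_{h-1} + N_{h-2} + 1, with N_0 = 1 and N_1 = 2.
    a, b = 1, 2            # N_0, N_1
    if h == 0:
        return a
    for _ in range(h - 1):
        a, b = b, a + b + 1
    return b
```

Squaring both sides of N_h > 2^{h/2} gives the integer-friendly form N_h² > 2^h used in the check below.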
Figure 12: Illustration for Problem 12(b): Case 1 (right rotate), Case 2 (left rotate), Case 3 (left rotate), Case 4 (right rotate).

and hence h = O(lg N_h). Therefore, an AVL tree with n nodes has height O(lg n).

(b) See Figure 12 for the four cases of BALANCE(x).

(c) Pseudocode:

AVL-INSERT(x, z)
1  TREE-INSERT(x, z)
2  BALANCE(x)

(d) AVL insertions are binary-search-tree insertions plus at most two rotations. Since binary-search-tree insertions take O(h) time, rotations take O(1) time, and AVL trees have h = O(lg n), AVL insertions take O(lg n) time.

13. (20) Dynamic Programming. Please finish this on 3.

14. (20) Dynamic Programming. Please finish this on 3.

15. (20) DIY.