COMP 271 Design and Analysis of Algorithms
Spring 2009 Midterm Exam Solutions
1. Multiple Choice
1.1 (b) 1.2 (d) 1.3 (c) 1.4 (e) 1.5 (d)
2. (a) If n ≤ 3 we can solve the problem trivially. Let m = ⌊n/2⌋. We look at the two elements A[m], A[m + 1]. There could be the following cases:
(i) If A[m] = 0 and A[m + 1] = 1, then let k = m and we are done;
(ii) If A[m] = A[m + 1] = 1, then k < m, so we recursively solve the problem on A[1..m];
(iii) If A[m] = A[m + 1] = 0, then, similarly to the case above, we recursively solve the problem on A[m..n].
In any case we either terminate or reduce the problem size by half. So the running time of the algorithm is O(log n).
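The following Python sketch illustrates the search in (a). It uses 0-based indices rather than the 1-based A[1..n] of the solution, and the function name find_transition is invented for this example; it is a minimal sketch, not part of the original solution.

```python
def find_transition(A):
    """Given a 0/1 array A consisting of some 0s followed by some 1s
    (at least one of each), return the index k with A[k] == 0 and
    A[k + 1] == 1.  Indices are 0-based.  Runs in O(log n) time."""
    lo, hi = 0, len(A) - 1              # the boundary k always lies in [lo, hi - 1]
    while hi - lo > 2:                  # constant-size ranges handled by the final scan
        m = (lo + hi) // 2
        if A[m] == 0 and A[m + 1] == 1:
            return m                    # found the 0 -> 1 boundary
        elif A[m] == 1:                 # both A[m] and A[m + 1] are 1: k < m
            hi = m
        else:                           # both are 0: k >= m
            lo = m
    for m in range(lo, hi):             # trivial case: scan the small remaining window
        if A[m] == 0 and A[m + 1] == 1:
            return m
    raise ValueError("A is not of the form 0...0 1...1")
```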
(b) We keep probing A[1], A[2], A[4], A[8], ..., A[2^i], ... until we find the first j such that A[2^j] = 1. Then we directly apply the algorithm in (a) to A[2^(j-1) .. 2^j]. The running time of the probing part is O(j), and the running time of applying the algorithm in (a) is O(log 2^j) = O(j). Since 2^(j-1) ≤ k, we have j = O(log k), and the total running time is O(log k).
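A matching Python sketch of the doubling ("galloping") search in (b) is below. Again the indexing is 0-based, the function name is invented, and capping the probes at len(A) - 1 is an assumption added so the sketch also works on a finite array; the exam setting treats the array as long enough that A[2^j] is always a valid probe.

```python
def find_transition_unbounded(A):
    """0/1 array of 0s followed by 1s (at least one of each); return k with
    A[k] == 0 and A[k + 1] == 1 while touching only O(log k) entries."""
    n = len(A)
    # Gallop: probe positions 1, 2, 4, 8, ... until we see a 1.
    lo, hi = 0, 1
    while A[hi] == 0:
        if hi == n - 1:
            raise ValueError("array contains no 1")
        lo, hi = hi, min(2 * hi, n - 1)
    # Invariant: A[lo] == 0 and A[hi] == 1, so the boundary lies in [lo, hi).
    # Binary search over the bracketed range, as in part (a).
    while hi - lo > 1:
        m = (lo + hi) // 2
        if A[m] == 0:
            lo = m
        else:
            hi = m
    return lo
```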
3. (a) Let X denote the random variable counting the number of bad pivots before a good one. Let X_i be the indicator variable such that X_i = 1 if the i-th pivot is drawn and is bad, and X_i = 0 otherwise. We have X = Σ_{i≥1} X_i. Let B[1..m] denote the array of the remaining m elements in sorted order; then a pivot is good if and only if it is chosen from B[m/4 .. 3m/4]. Therefore the probability that a pivot is bad is 1/2. Since the i-th pivot is drawn only when the first i − 1 pivots were all bad, Pr[X_i = 1] = Pr[the first i pivots are all bad] = (1/2)^i. So, by linearity of expectation, E[X] = Σ_{i≥1} (1/2)^i = 1.
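As a quick numerical sanity check (not part of the original solution), the following Python sketch draws uniform pivot ranks and counts bad pivots before the first good one; with a good-pivot probability of about 1/2 per draw, the sample mean comes out close to the expectation of 1 derived above. The function name and the choice m = 1000 are arbitrary.

```python
import random

def bad_pivots_before_good(m=1000):
    """Draw uniform pivot ranks from {1, ..., m} until one lands in the
    middle half [m/4, 3m/4]; return how many bad pivots came first."""
    bad = 0
    while True:
        r = random.randint(1, m)            # rank of the chosen pivot
        if m / 4 <= r <= 3 * m / 4:         # good pivot
            return bad
        bad += 1                            # bad pivot, draw again

trials = 100_000
print(sum(bad_pivots_before_good() for _ in range(trials)) / trials)  # ~1.0
```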
(b) Suppose the algorithm terminates after finding l good pivots. Every time we find a good pivot, the number of remaining elements decreases to at most 3/4 of its current value, so after l good pivots at most (3/4)^l · n elements remain. Once (3/4)^l · n < 1 the algorithm must have terminated, and therefore l = O(log n).
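Spelling out the last arithmetic step (a worked restatement, not in the original text):

\[
\left(\tfrac{3}{4}\right)^{l} n < 1
\iff l > \log_{4/3} n = \frac{\ln n}{\ln(4/3)} = O(\log n).
\]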
(c) By (a), the expected number of bad pivots between two consecutive good pivots is O(1), and by (b) there are O(log n) good pivots, so by linearity of expectation the expected total number of bad pivots is O(log n). So the total number of pivots the algorithm draws is O(log n) in expectation.
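Writing X^(j) for the number of bad pivots drawn before the j-th good pivot (taken to be 0 if fewer than j good pivots are ever drawn), and using the deterministic bound l ≤ c log n from (b), the combination step can be written out as follows; X^(j) and c are notation introduced here, not in the original:

\[
\#\text{pivots} \;\le\; \sum_{j=1}^{c \log n} \bigl(1 + X^{(j)}\bigr),
\qquad
\mathbf{E}[\#\text{pivots}] \;\le\; c\log n \;+\; \sum_{j=1}^{c\log n} \mathbf{E}\bigl[X^{(j)}\bigr] \;\le\; 2\,c\log n \;=\; O(\log n),
\]

since E[X^(j)] ≤ 1 by part (a).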
4. Pick any vertex s ∈ V and run a DFS traversal of the graph G. When the DFS travels through a back edge (u, v), we declare that G contains a cycle. Next, starting from u, we follow the pred pointers and output the vertices until we reach v. These vertices form a cycle.
Correctness: If the graph has a cycle, then there must be a back edge. If the graph doesn't have a cycle, it is a forest, so there won't be any back edges.
Running time: The running time of the algorithm is O(V). This is because all the edges traversed, except the last back edge, form a forest spanning a subgraph of G, so we traverse at most O(V) edges.
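A Python sketch of this procedure for an undirected graph given as adjacency lists is below; the function name, the dictionary-based pred pointers, and the loop over all start vertices (to cover disconnected inputs) are choices made for this example, not details from the original solution.

```python
def find_cycle(adj):
    """Return a list of vertices forming a cycle in an undirected graph
    (adj[u] = iterable of neighbours of u), or None if it is acyclic.
    DFS stops at the first back edge, so O(V) edges are traversed."""
    pred = {}                                   # DFS-tree parent pointers
    visited = set()

    def dfs(u):
        visited.add(u)
        for w in adj[u]:
            if w not in visited:
                pred[w] = u
                cycle = dfs(w)
                if cycle is not None:
                    return cycle
            elif w != pred.get(u):              # back edge (u, w): cycle found
                cycle = [u]
                while cycle[-1] != w:           # follow pred pointers back to w
                    cycle.append(pred[cycle[-1]])
                return cycle
        return None

    for s in adj:                               # restart DFS on unvisited vertices
        if s not in visited:
            cycle = dfs(s)
            if cycle is not None:
                return cycle
    return None
```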
5. Let e = (u, v) be the edge with the second smallest weight. Let T be any MST of G. If T does not include e, then T ∪ {e} must contain a cycle C. Since C must contain at least three edges, and only the edge of smallest weight can be lighter than e, there must be an edge e' ∈ C, e' ≠ e, such that the weight of e' is larger than the weight of e. Then (T ∪ {e}) \ {e'} is a spanning tree of smaller total weight than T, contradicting that T is an MST. Hence every MST of G contains e.
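The weight comparison in this exchange step, written out (T' is notation introduced here for the swapped tree):

\[
T' = \bigl(T \cup \{e\}\bigr) \setminus \{e'\}
\quad\Longrightarrow\quad
w(T') = w(T) + w(e) - w(e') < w(T).
\]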