Chapter 7
Exercise 7.1-1
The coloring is similar to that in the book, while the indexing is the same.
13 19 9 5 12 8 7 4 21 2 6 11
13 19 9 5 12 8 7 4 21 2 6 11
13 19 9 5 12 8 7 4 21 2 6 11
9 19 13 5 12 8 7 4 21 2 6 11
9 5 13 19 12 8 7 4 21 2 6 11
9 5 13 19 12 8 7 4 21 2 6 11
9 5 8 19 12 13 7 4 21 2 6 11
9 5 8 7 12 13 19 4 21 2 6 11
9 5 8 7 4 13 19 12 21 2 6 11
9 5 8 7 4 13 19 12 21 2 6 11
9 5 8 7 4 2 19 12 21 13 6 11
9 5 8 7 4 2 6 12 21 13 19 11
9 5 8 7 4 2 6 11 21 13 19 12
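The trace above can be reproduced with a direct transcription of Partition into Python (a sketch using 0-based indices, so the returned index 7 corresponds to position 8 in the book's 1-based numbering):

```python
def partition(A, p, r):
    """Lomuto partition scheme: A[r] is the pivot."""
    x = A[r]                      # pivot, here 11
    i = p - 1                     # boundary of the "<= pivot" region
    for j in range(p, r):
        if A[j] <= x:
            i += 1
            A[i], A[j] = A[j], A[i]
        print(A)                  # snapshot after each iteration, as in the trace
    A[i + 1], A[r] = A[r], A[i + 1]
    return i + 1

A = [13, 19, 9, 5, 12, 8, 7, 4, 21, 2, 6, 11]
q = partition(A, 0, len(A) - 1)
print(A, q)                       # pivot 11 ends at index 7 (position 8, 1-based)
```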
Exercise 7.1-2
Available in the latest revision of the IM.
Exercise 7.1-3
Available in the latest revision of the IM.
Exercise 7.1-4
Available in the latest revision of the IM.
Exercise 7.2-1
Available in the latest revision of the IM.
Exercise 7.2-2
Available in the latest revision of the IM.
If we use the partitioning scheme introduced in Exercise 7.1-2, then the running time is Θ(n lg n).
Exercise 7.2-3
Available in the latest revision of the IM.
Exercise 7.2-4
Available in the latest revision of the IM.
Nearly sorted input pushes quicksort toward the worst-case scenario rather than the best case, as hinted in the IM.
Exercise 7.2-5
Available in the latest revision of the IM.
Exercise 7.2-6
Available in the latest revision of the IM.
In this exercise, we are asked to derive an approximate probability, so we can ignore floors, ceilings, and the pivot in the calculations. The analysis in the IM follows this assumption.
Exercise 7.3-1
Available in the latest revision of the IM.
Exercise 7.3-2
Available in the latest revision of the IM.
The analysis of the best case in the IM is a bit inaccurate. To sort an array of n elements, Randomized-Quicksort must find the sorted position of every element. Each call to Randomized-Partition fixes one pivot, and the leaves of the recursion tree are calls on singleton subarrays. In the best case, the recursion tree is as balanced as possible at every level. The number of pivots matches the number of non-singleton subarrays and equals ⌈n/2⌉ with the most balanced splits, hence we have ⌊n/2⌋ leaves. Despite ending with the same conclusion as the IM, namely Θ(n) calls to Random, it is important to be accurate in the analysis.
Exercise 7.4-1
We guess that T(n) ≥ cn² for some constant c > 0. Substituting this guess into recurrence (7.1) yields

T(n) ≥ max{cq² + c(n−1−q)² : 0 ≤ q ≤ n−1} + Θ(n) = c · max{q² + (n−1−q)² : 0 ≤ q ≤ n−1} + Θ(n).

The expression q² + (n−1−q)² attains its maximum value (n−1)² over 0 ≤ q ≤ n−1 at the endpoints q = 0 and q = n−1 (see Exercise 7.4-3), as explained in the book. Continuing our analysis of T(n), we obtain

T(n) ≥ c(n−1)² + Θ(n) = cn² − c(2n−1) + Θ(n) ≥ cn²,

by picking the constant c small enough that the Θ(n) term dominates the c(2n−1) term. Therefore, T(n) = Ω(n²).
Exercise 7.4-2
Available in the latest revision of the IM.
Exercise 7.4-3
A quadratic polynomial in q has at most two distinct zeros. As explained in the book, q ≤ n−1 implies that 2q(q−(n−1)) ≤ 0. Since q² + (n−1−q)² = (n−1)² + 2q(q−(n−1)), the expression attains its maximum, over all valid values of q, exactly when 2q(q−(n−1)) = 0. The two distinct zeros are q = 0 and q = n−1.
Exercise 7.4-4
This bound and Lemma 7.1 allow us to conclude that the expected running time of Randomized-Quicksort is Ω(nlgn).
Exercise 7.4-5
Insertion sort requires O(k²) time to sort an array of size k. The total time spent by insertion sort is O(n/k) · O(k²) = O(nk), since we can have at most n/k subarrays of size k. The expected number of levels in the recursion tree is O(lg(n/k)). Therefore, the total time spent partitioning subarrays is O(n lg(n/k)). Summing these terms, the randomized version of this sorting algorithm runs in O(nk + n lg(n/k)) expected time.
The modified algorithm has the same asymptotic running time as standard randomized quicksort when O(nk + n lg(n/k)) = O(n lg n). The largest asymptotic value of k, as a function of n, that satisfies this condition is k = Θ(lg n). Notice that k cannot be asymptotically larger than Θ(lg n); otherwise nk = ω(n lg n) and we would make things worse. On the other hand, plugging k = lg n into our formula for the expected running time gives

O(n lg n + n lg(n/lg n)) = O(n lg n + n lg n − n lg lg n) = O(n lg n).
In practice, you would need to find the best k experimentally, since performance depends on many machine-related factors. Usually, you would select the largest k among all values that give a measurable performance advantage.
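A minimal Python sketch of this scheme (the function names and the default cutoff k=8 are my own choices): quicksort returns early on subarrays shorter than k, and one insertion-sort pass over the nearly sorted array finishes the job.

```python
import random

def insertion_sort(A):
    for i in range(1, len(A)):
        key, j = A[i], i - 1
        while j >= 0 and A[j] > key:
            A[j + 1] = A[j]
            j -= 1
        A[j + 1] = key

def coarse_quicksort(A, p, r, k):
    """Randomized quicksort that leaves subarrays of fewer than k elements unsorted."""
    if r - p + 1 < k:
        return
    i = random.randint(p, r)           # randomized pivot choice
    A[i], A[r] = A[r], A[i]
    x, b = A[r], p - 1                 # Lomuto partition around A[r]
    for j in range(p, r):
        if A[j] <= x:
            b += 1
            A[b], A[j] = A[j], A[b]
    A[b + 1], A[r] = A[r], A[b + 1]
    coarse_quicksort(A, p, b, k)
    coarse_quicksort(A, b + 2, r, k)

def hybrid_sort(A, k=8):
    coarse_quicksort(A, 0, len(A) - 1, k)
    insertion_sort(A)                  # nearly sorted input: O(nk) total

data = [5, 3, 8, 1, 9, 2, 7, 4, 6, 0] * 3
hybrid_sort(data)
print(data)
```

Every element ends up within its own unsorted block of fewer than k positions, which is why the final insertion-sort pass costs only O(nk).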
Exercise 7.4-6
If the median falls among the smallest or the largest αn numbers, then we get a split worse than α-to-(1−α). This happens exactly when at least two of the three randomly selected numbers fall within the same one of these two ranges. We have the following 4 disjoint events:
All three numbers are among the lowest αn values.
All three numbers are among the largest αn values.
Exactly two numbers are among the lowest αn values.
Exactly two numbers are among the largest αn values.
Therefore, summing the probabilities of these four events, the total probability is approximately

α³ + α³ + 3α²(1−α) + 3α²(1−α) = 6α² − 4α³.
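A quick Monte Carlo sanity check of this analysis (the helper and the closed form 6α² − 4α³, obtained by summing the four event probabilities, are mine):

```python
import random

def bad_split_prob(alpha, trials=200_000, seed=42):
    """Estimate the probability that the median of three uniform samples
    lands among the smallest or largest alpha-fraction of values."""
    rng = random.Random(seed)
    bad = 0
    for _ in range(trials):
        median = sorted(rng.random() for _ in range(3))[1]
        if median < alpha or median > 1 - alpha:
            bad += 1
    return bad / trials

alpha = 0.25
print(bad_split_prob(alpha))   # close to 6*alpha**2 - 4*alpha**3 = 0.3125
```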
Problem 7-1
a.
Initialization: A=(13, 19, 9, 5, 12, 8, 7, 4, 11, 2, 6, 21) p=1 r=12 x=13 i=0 j=13
After Iteration 1: A=(6, 19, 9, 5, 12, 8, 7, 4, 11, 2, 13, 21) i=1 j=11
After Iteration 2: A=(6, 2, 9, 5, 12, 8, 7, 4, 11, 19, 13, 21) i=2 j=10
After Iteration 3: A=(6, 2, 9, 5, 12, 8, 7, 4, 11, 19, 13, 21) i=10 j=9 RETURN 9
b.
Hoare-Partition produces a balanced split here instead of hitting the worst-case scenario, as Partition does. It also distributes elements equal to the pivot between the two sides, so it works well even in the presence of duplicates.
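A small experiment supporting the claim about duplicates, with 0-based sketches of both schemes (helper names are mine): on an array of identical keys, the Lomuto-style Partition returns the last index, i.e. the worst-case 1-to-(n−1) split, while Hoare-Partition stops near the middle.

```python
def lomuto_partition(A, p, r):
    x, i = A[r], p - 1                 # pivot A[r]
    for j in range(p, r):
        if A[j] <= x:                  # equal keys all satisfy this test
            i += 1
            A[i], A[j] = A[j], A[i]
    A[i + 1], A[r] = A[r], A[i + 1]
    return i + 1

def hoare_partition(A, p, r):
    x, i, j = A[p], p - 1, r + 1       # pivot A[p]
    while True:
        j -= 1
        while A[j] > x:
            j -= 1
        i += 1
        while A[i] < x:
            i += 1
        if i >= j:
            return j
        A[i], A[j] = A[j], A[i]        # equal keys alternate sides

equal = [7] * 8
print(lomuto_partition(equal[:], 0, 7))   # 7: everything on one side
print(hoare_partition(equal[:], 0, 7))    # 3: an almost perfect split
```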
c.
We focus on the first iteration and show what happens to the indices i and j, as well as what property of A ensures that neither index ever falls outside the range [p, r]. Initially, before entering the while loop, we have x=A[p], i=p−1 and j=r+1. The body of the loop runs as follows:
Index j is decremented before reading A, so p≤j≤r on first access.
It is decremented until it stumbles across an element not greater than x. Notice that j cannot pass position p, since A[p]=x stops the scan there at the latest; hence we know that p≤j after line 7.
Index i is incremented before reading A, so p≤i≤r on first access.
It is incremented until it stumbles across an element not smaller than x. Since A[p]=x stops the scan immediately, we know that i=p after line 10.
At this point, we have two possibilities at line 11:
If i=j, then the function returns p. This occurs when all other elements are greater than x.
Otherwise, we must have i<j, so A[i] and A[j] are swapped. After the swap, A[i]≤x≤A[j], and exactly these values guard i and j from running out of range in the next iteration: j stops at position i at the latest, and i stops at position j at the latest. Furthermore, if either scan (or both) stops at one of these guarding positions, then we would surely have i>j and the function would return j.
d.
Again, we focus our attention on the first iteration. From part (c), we know that p≤j at termination. Since j is decremented before A is accessed, we also have j≤r. Now, based on the fact that p<r, there are two scenarios:
If A[p]≥A[r], then these values are exchanged, since at line 11 i=p and j=r. In the next iteration, j will be decremented at least once.
Otherwise, j is immediately decremented below r.
In either case, p≤j<r holds at termination of the first iteration.
e.
Let's prove the next invariant:
At the beginning of the while loop at lines 4-13, i<j, the subarray A[p:i] contains elements less than or equal to the pivot, and the subarray A[j:r] contains elements greater than or equal to the pivot.
Initialization
Before entering the loop for the first time, we have x=A[p], i=p−1 and j=r+1. The empty subarrays A[p:i] and A[j:r] satisfy the invariant and i<j.
Maintenance
The repeat-until loop at lines 5-7 decreases j up to some position j′<j. We know that every element of A[j′+1:j] is greater than or equal to x. So, A[j′+1:j]∪A[j:r]=A[j′+1:r] has only elements greater than or equal to x.
The repeat-until loop at lines 8-10 increases i up to some position i′>i. We know that every element of A[i:i′−1] is less than or equal to x. So, A[p:i]∪A[i:i′−1]=A[p:i′−1] has only elements less than or equal to x.
Postcondition: Let i=i′ and j=j′. After line 10, but before entering line 11, we have that the subarray A[p:i−1] contains elements less than or equal to the pivot, and the subarray A[j+1:r] contains elements greater than or equal to the pivot. Furthermore, A[i]≥x and A[j]≤x.
If i<j at line 11, then A[i] and A[j] are exchanged, so that the subarray A[p:i] contains elements less than or equal to the pivot and the subarray A[j:r] contains elements greater than or equal to the pivot. Hence, the invariant is reestablished for the next iteration.
Termination
The while loop must eventually terminate, since the two indices move closer to each other in every iteration. Termination happens when i≥j at line 11. In that case, no elements are exchanged, so the postcondition stated in the previous section remains true. We can distinguish the following cases:
If i=j, then A[i]=A[j], thus the subarray A[p:j] contains elements less than or equal to the pivot and the subarray A[j+1:r] contains elements greater than or equal to the pivot.
Otherwise, i>j⟹i−1≥j, so the above conclusion again follows.
In both cases, it is true that every element of A[p:j] is less than or equal to every element of A[j+1:r].
f.
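In the Quicksort rewrite that uses Hoare-Partition, the returned index j is not a fixed pivot position, so both recursive calls include it: we recurse on A[p:j] and A[j+1:r]. A 0-based Python sketch (function names are mine):

```python
def hoare_partition(A, p, r):
    x, i, j = A[p], p - 1, r + 1
    while True:
        j -= 1
        while A[j] > x:
            j -= 1
        i += 1
        while A[i] < x:
            i += 1
        if i >= j:
            return j
        A[i], A[j] = A[j], A[i]

def hoare_quicksort(A, p, r):
    if p < r:
        q = hoare_partition(A, p, r)
        hoare_quicksort(A, p, q)       # the pivot is NOT in its final place,
        hoare_quicksort(A, q + 1, r)   # so A[q] stays in the left subproblem

A = [13, 19, 9, 5, 12, 8, 7, 4, 11, 2, 6, 21]
hoare_quicksort(A, 0, len(A) - 1)
print(A)
```

Part (d) guarantees p ≤ j < r whenever p < r, so both recursive calls strictly shrink and the recursion terminates.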
Problem 7-2
Available in the latest revision of the IM.
There is an omission in the IM for part (c): Randomized-Partition' also differs at line 2, where A[i] must be exchanged with A[p] instead of A[r].
In part (d), the IM does not cover the best-case scenario: when all element values are equal, Quicksort' runs in linear time.
Problem 7-3
Available in the latest revision of the IM.
Problem 7-4
Available in the latest revision of the IM.
Problem 7-5
Available in the latest revision of the IM.
Problem 7-6
a.
The three randomly selected elements must include zi itself, one element smaller than zi, and one element larger than zi. These elements can be chosen in any order, so we count the number of 3-element sets satisfying these constraints among all n(n−1)(n−2)/6 possible 3-element sets. There are i−1 choices for the smaller element and n−i choices for the larger one, hence

pi = (i−1)(n−i) / (n(n−1)(n−2)/6) = 6(i−1)(n−i) / (n(n−1)(n−2)).
b.
We need to find the limit of p⌊(n+1)/2⌋ / (1/n) as n→∞. Observe that i=⌊(n+1)/2⌋ tends to n/2 as n increases. Therefore, we have

lim n·p⌊(n+1)/2⌋ = lim 6(n/2 − 1)(n/2) / ((n−1)(n−2)) = 6/4 = 3/2,

so in the limit the median-of-3 method is 1.5 times more likely than the ordinary implementation to select the median.
c.
For convenience, suppose n is a multiple of 3. We need to find the limit of (∑ pi for n/3 ≤ i ≤ 2n/3) / (1/3) as n→∞. Following the hint from the book, we approximate the sum by an integral:

∑ pi ≈ ∫ from n/3 to 2n/3 of 6(x−1)(n−x) / (n(n−1)(n−2)) dx ≈ (6/n³) ∫ from n/3 to 2n/3 of x(n−x) dx = (6/n³) [nx²/2 − x³/3] from n/3 to 2n/3 = (6/n³)(10n³/81 − 7n³/162) = 13/27.
Thus, in the limit, the median-of-3 method increases the likelihood of getting a good split by a factor of (13/27)/(1/3) = 13/9 compared with the ordinary implementation.
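Both limits can be checked numerically from the formula for pi derived in part (a); the helper name and the choice n = 30000 are mine:

```python
def p(i, n):
    """Probability that the median of 3 randomly chosen elements is z_i."""
    return 6 * (i - 1) * (n - i) / (n * (n - 1) * (n - 2))

n = 30_000                            # a large n, divisible by 3
# Part (b): p_{n/2} / (1/n) tends to 3/2.
print(n * p(n // 2, n))               # ~1.5
# Part (c): the chance of a good split tends to 13/27 = (13/9) * (1/3).
good = sum(p(i, n) for i in range(n // 3, 2 * n // 3 + 1))
print(3 * good)                       # ~13/9 = 1.444...
```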
d.
The Ω(n lg n) lower bound on quicksort's running time holds even if the algorithm chooses the best possible pivot (the median of the subarray) in every partitioning step. Thus, any technique that merely improves the chances of getting good splits cannot affect the asymptotic lower bound of quicksort.
Problem 7-7
a.
The Fuzzy-Partition function is similar to Partition' from Problem 7-2, part (b). The rest of the code of randomized quicksort uses this partitioning function instead of the ordinary version.
b.
Observe that as the intervals overlap more and more, the "equal" region grows, and the intervals in it are omitted from subsequent recursive calls. When all intervals overlap, a single top-level call to Fuzzy-Partition leaves the whole array fuzzy-sorted, so the expected running time reduces to Θ(n).
Otherwise, the algorithm runs in the usual Θ(n lg n) expected time.