Divide and Conquer. Textbook Reading: Chapters 4, 7 & 33.4
- Laurel Daniels
2 Overview
Design principle: divide and conquer
Proof technique: induction, induction, induction
Analysis technique: recurrence relations
Problems: sorting (Merge Sort and Quick Sort), selection, matrix multiplication, finding the two closest points
3 Merge Sort
MergeSort(A, ℓ, r)
  if r ≤ ℓ then return
  m = ⌊(ℓ + r)/2⌋
  MergeSort(A, ℓ, m)
  MergeSort(A, m + 1, r)
  Merge(A, ℓ, m, r)
4 Merging Two Sorted Lists
Merge(A, ℓ, m, r)
  n1 = m − ℓ + 1
  n2 = r − m
  for i = 1 to n1 do L[i] = A[ℓ + i − 1]
  for j = 1 to n2 do R[j] = A[m + j]
  L[n1 + 1] = R[n2 + 1] = +∞
  i = j = 1
  for k = ℓ to r do
    if L[i] ≤ R[j] then A[k] = L[i]; i = i + 1
    else A[k] = R[j]; j = j + 1
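As a concrete sketch, the two procedures above translate to Python as follows (0-indexed arrays; `math.inf` plays the role of the +∞ sentinels; the function names are my own):

```python
import math

def merge(A, lo, mid, hi):
    # Copy A[lo..mid] and A[mid+1..hi] aside, appending +infinity sentinels
    # so that neither sublist runs out before the loop finishes.
    L = A[lo:mid + 1] + [math.inf]
    R = A[mid + 1:hi + 1] + [math.inf]
    i = j = 0
    for k in range(lo, hi + 1):
        if L[i] <= R[j]:
            A[k] = L[i]; i += 1
        else:
            A[k] = R[j]; j += 1

def merge_sort(A, lo=0, hi=None):
    if hi is None:
        hi = len(A) - 1
    if hi <= lo:              # zero or one element: already sorted
        return
    mid = (lo + hi) // 2
    merge_sort(A, lo, mid)
    merge_sort(A, mid + 1, hi)
    merge(A, lo, mid, hi)
```

For example, `merge_sort(a)` sorts the list `a` in place.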
5–7 Divide and Conquer
Three steps:
Divide the input into smaller parts.
Recurse: solve the same problem on these smaller parts.
Combine the solutions computed by the recursive calls to obtain the final solution.
This can be viewed as a reduction technique, reducing the original problem to simpler problems to be solved in the divide and/or combine steps.
Example: Once we unfold the recursion of Merge Sort, we are left with nothing but Merge steps. Thus, we reduce sorting to the simpler problem of merging sorted lists.
8 Loop Invariants
... are a technique for proving the correctness of an iterative algorithm. The invariant states conditions that should hold before and after each iteration.
Correctness proof using a loop invariant:
Initialization: Prove the invariant holds before the first iteration.
Maintenance: Prove each iteration maintains the invariant.
Termination: Prove that the correctness of the invariant after the last iteration implies correctness of the algorithm.
9 Correctness of Merge
Loop invariant for the final loop of Merge:
A[ℓ .. k − 1] ∪ L[i .. n1] ∪ R[j .. n2] is the set of elements originally in A[ℓ .. r].
A[ℓ .. k − 1], L[i .. n1], and R[j .. n2] are sorted.
x ≤ y for all x ∈ A[ℓ .. k − 1] and y ∈ L[i .. n1] ∪ R[j .. n2].
10 Correctness of Merge
Initialization: A[ℓ .. m] is copied to L[1 .. n1]. A[m + 1 .. r] is copied to R[1 .. n2]. i = 1, j = 1, k = ℓ, so all three conditions of the invariant hold trivially.
11 Correctness of Merge
Termination: The loop ends with k = r + 1, so by the invariant, A[ℓ .. r] contains all items it contained initially, in sorted order.
12–14 Correctness of Merge
Maintenance (case L[i] ≤ R[j]; the case A[k] = R[j] is symmetric):
A[k′] ≤ L[i] for all k′ < k,
L[i] ≤ L[i′] for all i′ > i, and
L[i] ≤ R[j] ≤ R[j′] for all j′ > j,
so copying L[i] to A[k] and incrementing i and k maintains all three conditions of the invariant.
15–18 Correctness of Merge Sort
Lemma: Merge Sort correctly sorts any input array.
Proof by induction on n.
Base case (n ≤ 1): An empty or one-element array is already sorted. Merge Sort does nothing.
Inductive step (n > 1): The left and right halves each have size less than n. By the inductive hypothesis, the recursive calls sort them correctly. Merge correctly merges the two sorted sequences.
MergeSort(A, ℓ, r)
  if r ≤ ℓ then return
  m = ⌊(ℓ + r)/2⌋
  MergeSort(A, ℓ, m)
  MergeSort(A, m + 1, r)
  Merge(A, ℓ, m, r)
19–20 Correctness of Divide and Conquer Algorithms
Divide and conquer algorithms are the algorithmic incarnation of induction:
Base case: Solve trivial instances directly, without recursing.
Inductive step: Reduce the solution of a given instance to the solution of smaller instances, by recursing.
Induction is the natural proof method for divide and conquer algorithms.
21–23 Recurrence Relations
A recurrence relation defines the value f(n) of a function f(·) for argument n in terms of the values of f(·) for arguments smaller than n.
Examples:
Fibonacci numbers:
F_n = 1 if n = 0 or n = 1; F_n = F_{n−1} + F_{n−2} otherwise.
Binomial coefficients:
B(n, k) = 1 if k = 0 or k = n; B(n, k) = B(n − 1, k − 1) + B(n − 1, k) otherwise.
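Both recurrences can be evaluated directly with memoization; a minimal sketch (using the slide's convention F_0 = F_1 = 1):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fib(n):
    # Fibonacci numbers with the slide's base case F_0 = F_1 = 1
    return 1 if n <= 1 else fib(n - 1) + fib(n - 2)

@lru_cache(maxsize=None)
def binom(n, k):
    # Binomial coefficients via Pascal's rule
    return 1 if k == 0 or k == n else binom(n - 1, k - 1) + binom(n - 1, k)
```

Without the memoization, both recursions would take exponential time; with it, each value is computed once.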
24–31 A Recurrence for Merge Sort
Analysis:
If n = 0 or n = 1, we spend constant time to figure out that there is nothing to do and then exit.
If n > 1, we
spend constant time to determine the middle index m,
make one recursive call on the left half, which has size ⌈n/2⌉,
make one recursive call on the right half, which has size ⌊n/2⌋, and
spend linear time to merge the two resulting sorted sequences.
Recurrence:
T(n) = Θ(1) if n ≤ 1; T(n) = T(⌈n/2⌉) + T(⌊n/2⌋) + Θ(n) if n > 1.
32–36 A Recurrence for Binary Search
BinarySearch(A, ℓ, r, x)
  if r < ℓ then return "no"
  m = ⌊(ℓ + r)/2⌋
  if x = A[m] then return "yes"
  if x < A[m] then return BinarySearch(A, ℓ, m − 1, x)
  else return BinarySearch(A, m + 1, r, x)
Analysis:
If n = 0, we spend constant time to answer "no".
If n > 0, we
spend constant time to find the middle element and compare it to x, and
make one recursive call on one of the two halves, which has size at most ⌊n/2⌋.
Recurrence:
T(n) = Θ(1) if n = 0; T(n) = T(⌊n/2⌋) + Θ(1) if n > 0.
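A direct Python transcription of the pseudocode (0-indexed; returning booleans instead of "yes"/"no"):

```python
def binary_search(A, x, lo=0, hi=None):
    # Returns True iff x occurs in the sorted array A[lo..hi].
    if hi is None:
        hi = len(A) - 1
    if hi < lo:
        return False                      # empty range: answer "no"
    mid = (lo + hi) // 2
    if x == A[mid]:
        return True                       # answer "yes"
    if x < A[mid]:
        return binary_search(A, x, lo, mid - 1)
    return binary_search(A, x, mid + 1, hi)
```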
37–39 Simplified Recurrence Notation
The recurrences we use to analyze algorithms all have a base case of the form T(n) ≤ c for all n ≤ n0. The exact choices of c and n0 affect the value of T(n) for any n by only a constant factor.
Floors and ceilings usually affect the value of T(n) by only a constant factor.
So we are lazy and write
Merge Sort: T(n) = 2T(n/2) + Θ(n)
Binary search: T(n) = T(n/2) + Θ(1)
40–42 Solving Recurrences
Given two algorithms A and B with running times
T_A(n) = 2T_A(n/2) + Θ(n)
T_B(n) = 3T_B(n/2) + Θ(lg n),
which one is faster?
A recurrence for T(n) precisely defines T(n), but it is hard for us to look at the recurrence and say how fast the function grows.
We want a closed-form expression for T(n), that is, one of the form T(n) ∈ Θ(n), T(n) ∈ Θ(n²), ..., one that does not depend on T(n′) for any n′ < n.
43–45 Methods for Solving Recurrences
Substitution:
Guess the solution (intuition, experience, black magic, recursion trees, trial and error).
Use induction to prove that the guess is correct.
Recursion trees:
Draw a tree that visualizes how the recurrence unfolds.
Sum up the costs of the nodes in the tree to
obtain an exact answer if the analysis is done rigorously enough, or
obtain a guess that can then be verified rigorously using substitution.
Master Theorem:
Cookbook recipe for solving common recurrences.
Immediately tells us the solution after we verify some simple conditions to determine which case of the theorem applies.
46–51 Substitution: Merge Sort
Lemma: The running time of Merge Sort is in O(n lg n).
Recurrence: T(n) = 2T(n/2) + O(n), that is, T(n) ≤ 2T(n/2) + an, for some a > 0 and all n ≥ n0.
Guess: T(n) ≤ cn lg n, for some c > 0 and all n ≥ 2.
Base case: For 2 ≤ n < 4, T(n) ≤ c′ ≤ c′n ≤ c′n lg n, for some c′ > 0. Thus T(n) ≤ cn lg n as long as c ≥ c′.
52–59 Substitution: Merge Sort
Inductive step (n ≥ 4):
T(n) ≤ 2T(n/2) + an
     ≤ 2c(n/2) lg(n/2) + an
     = cn(lg n − 1) + an
     = cn lg n + (a − c)n
     ≤ cn lg n, for all c ≥ a.
Notes:
We only proved the upper bound. The lower bound, T(n) ∈ Ω(n lg n), can be proven analogously.
Since the base case is valid only for n ≥ 2 and we use the inductive hypothesis for n/2 in the inductive step, the inductive step is valid only for n ≥ 4. Hence, a base case for n < 4.
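The guess can also be sanity-checked numerically. The sketch below instantiates the exact Merge Sort recurrence with all constants set to 1 (a hypothetical choice) and verifies T(n) ≤ 2n lg n over a range of n:

```python
import math
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # Exact Merge Sort recurrence with constants set to 1:
    # T(0) = T(1) = 1; T(n) = T(ceil(n/2)) + T(floor(n/2)) + n for n > 1
    if n <= 1:
        return 1
    return T((n + 1) // 2) + T(n // 2) + n

# The guess T(n) <= c*n*lg(n) holds here with c = 2 for all 2 <= n < 2000
assert all(T(n) <= 2 * n * math.log2(n) for n in range(2, 2000))
```

Such a check cannot prove the bound, but it quickly exposes a wrong guess.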
60–64 Substitution: Binary Search
Lemma: The running time of binary search is in O(lg n).
Recurrence: T(n) = T(n/2) + O(1), that is, T(n) ≤ T(n/2) + a, for some a > 0 and all n ≥ n0.
Guess: T(n) ≤ c lg n, for some c > 0 and all n ≥ 2.
Base case: For 2 ≤ n < 4, T(n) ≤ c′ ≤ c′ lg n, for some c′ > 0. Thus T(n) ≤ c lg n as long as c ≥ c′.
65–70 Substitution: Binary Search
Inductive step (n ≥ 4):
T(n) ≤ T(n/2) + a
     ≤ c lg(n/2) + a
     = c(lg n − 1) + a
     = c lg n + (a − c)
     ≤ c lg n, for all c ≥ a.
71–77 Substitution and Asymptotic Notation
Why did we expand the Merge Sort recurrence T(n) = 2T(n/2) + O(n) to T(n) ≤ 2T(n/2) + an, and the claim T(n) ∈ O(n lg n) to T(n) ≤ cn lg n?
If we're not careful, we may prove nonsensical results:
Recurrence: T(n) = T(n − 1) + n. (Note that T(n) ∈ Θ(n²)!)
Claim: T(n) ∈ O(n).
Base case (n = 1): T(n) = T(1) = 1 = O(n).
Inductive step (n > 1): T(n) = T(n − 1) + n = O(n − 1) + n = O(n) + n = O(n).
The flaw is that "O(n)" hides a different constant in every step. Making the constant explicit, the claim becomes T(n) ≤ cn, and the proof breaks:
Base case (n = 1): T(n) = T(1) = 1 ≤ c ≤ cn.
Inductive step (n > 1): T(n) = T(n − 1) + n ≤ c(n − 1) + n = cn + (n − c) > cn for n > c!
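The nonsense is easy to see numerically: the sketch below computes T(n) = T(n − 1) + n iteratively and confirms it equals the arithmetic series n(n + 1)/2, which is quadratic, not linear.

```python
def T(n):
    # T(1) = 1; T(n) = T(n - 1) + n, unrolled into a loop
    t = 1
    for k in range(2, n + 1):
        t += k
    return t

# T(n) equals the arithmetic series n(n + 1)/2 ...
assert all(T(n) == n * (n + 1) // 2 for n in range(1, 500))
# ... so T(n)/n grows without bound, i.e. T(n) is not in O(n)
assert T(10**4) / 10**4 > 5000
```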
78–91 A Recursion Tree for Merge Sort
Recurrence: T(n) = 2T(n/2) + Θ(n), that is, T(n) ≤ 2T(n/2) + an.
Strategy: Expand the recurrence all the way down to the base case.
The root costs an and has two children T(n/2); these cost an/2 each and have four children T(n/4) of cost an/4 each; and so on, down to base cases of cost Θ(1).
Every level of the tree contributes a total cost of an, and there are lg n + 1 levels.
Solution: T(n) ∈ Θ(n lg n).
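The level-by-level accounting can be reproduced in a few lines (with the constant a set to 1 and sizes halved exactly, ignoring floors and ceilings):

```python
def level_costs(n):
    # Level i of the recursion tree has 2**i subproblems of size n / 2**i;
    # with a = 1, each subproblem of size s contributes cost s.
    costs = []
    size, count = n, 1
    while size >= 1:
        costs.append(count * size)      # total cost of level i
        size, count = size / 2, count * 2
    return costs

costs = level_costs(1024)
assert all(c == 1024 for c in costs)    # every level contributes n
assert len(costs) == 11                 # lg n + 1 = 11 levels
assert sum(costs) == 11 * 1024          # total cost Theta(n lg n)
```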
92–99 A Recursion Tree for Binary Search
Recurrence: T(n) = T(n/2) + Θ(1), that is, T(n) ≤ T(n/2) + a.
The tree is a path: the root of cost a has a single child T(n/2) of cost a, which has a single child T(n/4), and so on.
Every level of the tree contributes a cost of a, and there are lg n + 1 levels.
Solution: T(n) ∈ Θ(lg n).
100–107 A Less Obvious Recursion Tree
Recurrence: T(n) = T(n/3) + T(2n/3) + Θ(n), that is, T(n) ≤ T(n/3) + T(2n/3) + an.
The subproblem sizes along the leftmost path are n, n/3, n/9, n/27, ..., so this path has length log_3 n; along the rightmost path they are n, 2n/3, 4n/9, 8n/27, ..., so this path has length log_{3/2} n.
Every full level of the tree contributes a total cost of an, so the first log_3 n levels contribute at least an·log_3 n and all log_{3/2} n levels contribute at most an·log_{3/2} n.
Solution: T(n) ∈ Θ(n lg n).
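An integer-size version of this recurrence (again with the hypothetical constant a = 1) can be tabulated to check that T(n)/(n lg n) stays bounded, consistent with Θ(n lg n):

```python
import math
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # T(n) = T(n/3) + T(2n/3) + n with integer subproblem sizes; T(n) = 1 for n <= 1
    if n <= 1:
        return 1
    return T(n // 3) + T(2 * n // 3) + n

# The ratio T(n) / (n lg n) stays within constant bounds as n grows
ratios = [T(n) / (n * math.log2(n)) for n in (10**3, 10**4, 10**5)]
assert all(0.5 < r < 3 for r in ratios)
```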
108 Sometimes Only Substitution Will Do
Recurrence: T(n) = T(n/2) + T(2n/3) + Θ(n)
The ith level of the recursion tree, if fully populated, costs Θ(n(7/6)^i); the shallowest leaf is at depth log_2 n and the deepest at depth log_{3/2} n.
Lower bound: T(n) ∈ Ω(n^(1 + log_2(7/6))) ⊆ Ω(n^1.22)
Upper bound: T(n) ∈ O(n^(1 + log_{3/2}(7/6))) ⊆ O(n^1.38)
The recursion tree gives only these bounds; a tight bound requires substitution.
109–122 Master Theorem
Master Theorem: Let a ≥ 1 and b > 1, let f(n) be a positive function and let T(n) be given by the following recurrence:
T(n) = a·T(n/b) + f(n).
(i) If f(n) ∈ O(n^(log_b a − ε)), for some ε > 0, then T(n) ∈ Θ(n^(log_b a)).
(ii) If f(n) ∈ Θ(n^(log_b a)), then T(n) ∈ Θ(n^(log_b a) lg n).
(iii) If f(n) ∈ Ω(n^(log_b a + ε)) and a·f(n/b) ≤ c·f(n), for some ε > 0 and c < 1 and for all n ≥ n0, then T(n) ∈ Θ(f(n)).
Example 1: Merge Sort again. T(n) = 2T(n/2) + Θ(n). Here a = 2, b = 2, and f(n) ∈ Θ(n) = Θ(n^(log_2 2)), so case (ii) gives T(n) ∈ Θ(n lg n).
Example 2: Matrix multiplication. T(n) = 7T(n/2) + Θ(n²). Here a = 7, b = 2, and f(n) ∈ Θ(n²) ⊆ O(n^(log_2 7 − ε)) for all 0 < ε ≤ log_2 7 − 2, so case (i) gives T(n) ∈ Θ(n^(log_2 7)) ⊆ Θ(n^2.81).
Example 3: T(n) = T(n/2) + n. Here a = 1, b = 2, and f(n) = n ∈ Ω(n^(log_2 1 + ε)) for all 0 < ε ≤ 1; moreover, 1·f(n/2) = n/2 ≤ f(n)/2, so case (iii) gives T(n) ∈ Θ(n).
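For driving functions that are plain polynomials, f(n) = Θ(n^k), the case analysis can be packaged as a small helper (a hypothetical utility, not part of the lecture; for polynomial f the regularity condition of case (iii) holds automatically):

```python
import math

def master(a, b, k):
    # Solves T(n) = a*T(n/b) + Theta(n**k) by the Master Theorem.
    crit = math.log(a, b)              # critical exponent log_b(a)
    if math.isclose(k, crit):          # case (ii): f matches the critical exponent
        return f"Theta(n^{crit:.3g} lg n)"
    if k < crit:                       # case (i): f grows polynomially slower
        return f"Theta(n^{crit:.3g})"
    return f"Theta(n^{k})"             # case (iii): f dominates

print(master(2, 2, 1))   # Merge Sort
print(master(7, 2, 2))   # matrix multiplication
print(master(1, 2, 1))   # Example 3
```

Note that this helper covers only polynomial f(n); cases such as f(n) = n lg n fall outside the theorem's three cases entirely.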
123 Master Theorem: Proof

T(n) = a T(n/b) + f(n).

Recursion tree: level i (for i = 0, 1, ..., log_b n − 1) consists of a^i subproblems of size n/b^i and contributes a^i f(n/b^i); the bottom level consists of a^(log_b n) constant-size subproblems and contributes a^(log_b n) Θ(1) = Θ(n^(log_b a)).

Case 1: f(n) ∈ O(n^(log_b a − ε)).

    a^i f(n/b^i) ∈ O(a^i (n/b^i)^(log_b a − ε)) = O(n^(log_b a − ε) (b^ε)^i)

    T(n) = Θ(n^(log_b a)) + O(n^(log_b a − ε) Σ_{i=0}^{log_b n − 1} (b^ε)^i)
         = Θ(n^(log_b a)) + O(n^(log_b a − ε) (b^ε)^(log_b n))     [geometric sum]
         = Θ(n^(log_b a)) + O(n^(log_b a − ε) n^ε)
         = Θ(n^(log_b a)) + O(n^(log_b a))
         = Θ(n^(log_b a)).
131 Master Theorem: Proof

Case 2: f(n) ∈ Θ(n^(log_b a)).

    a^i f(n/b^i) ∈ Θ(a^i (n/b^i)^(log_b a)) = Θ(n^(log_b a))

Each of the log_b n levels of the recursion tree contributes Θ(n^(log_b a)), and the leaves contribute Θ(n^(log_b a)), so

    T(n) = Θ(n^(log_b a) log_b n) = Θ(n^(log_b a) lg n).
136 Master Theorem: Proof

Case 3: f(n) ∈ Ω(n^(log_b a + ε)) and a f(n/b) ≤ c f(n) for some c < 1.

Claim: a^i f(n/b^i) ≤ c^i f(n).

For i = 0: a^0 f(n/b^0) = f(n) = c^0 f(n).

For i > 0:

    a^i f(n/b^i) = a^(i−1) · a f((n/b^(i−1))/b)
                 ≤ a^(i−1) · c f(n/b^(i−1))
                 = c · a^(i−1) f(n/b^(i−1))
                 ≤ c · c^(i−1) f(n)        [inductive hypothesis]
                 = c^i f(n).

So the levels of the recursion tree contribute at most f(n), c f(n), c² f(n), ..., and the leaves contribute Θ(n^(log_b a)).

    T(n) ∈ Ω(n^(log_b a) + f(n)) = Ω(f(n))

    T(n) ∈ O(n^(log_b a) + Σ_{i=0}^{∞} c^i f(n))
         = O(n^(log_b a) + f(n) Σ_{i=0}^{∞} c^i)
         = O(n^(log_b a) + f(n)/(1 − c))
         = O(f(n)),                        [since f(n) ∈ Ω(n^(log_b a + ε))]

so T(n) ∈ Θ(f(n)).
150 Quick Sort

QuickSort(A, ℓ, r)
  if r ≤ ℓ then return
  m = Partition(A, ℓ, r)
  QuickSort(A, ℓ, m − 1)
  QuickSort(A, m + 1, r)

[Figure: Partition rearranges the subarray around a pivot p so that all elements ≤ p precede p and all elements ≥ p follow it.]

Worst case: T(n) = Θ(n) + T(n − 1) = Θ(n²)
Best case: T(n) = Θ(n) + 2T(n/2) = Θ(n lg n)
Average case: T(n) = Θ(n lg n)
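The Quick Sort above can be sketched as runnable Python (0-based indices; we use a Lomuto-style partition with A[hi] as the pivot — one common choice, since the slide leaves Partition abstract here):

```python
# A runnable 0-based sketch of QuickSort, using a Lomuto-style partition
# with A[hi] as the pivot (a common choice; the slide leaves Partition
# abstract at this point).
def quicksort(A, lo=0, hi=None):
    if hi is None:
        hi = len(A) - 1
    if hi <= lo:
        return
    m = partition(A, lo, hi)
    quicksort(A, lo, m - 1)   # recurse on elements <= pivot
    quicksort(A, m + 1, hi)   # recurse on elements >= pivot

def partition(A, lo, hi):
    i = lo - 1
    for j in range(lo, hi):
        if A[j] <= A[hi]:
            i += 1
            A[i], A[j] = A[j], A[i]
    A[i + 1], A[hi] = A[hi], A[i + 1]  # place pivot at its final position
    return i + 1

A = [5, 2, 9, 1, 5, 6]
quicksort(A)
print(A)  # -> [1, 2, 5, 5, 6, 9]
```

The worst case above (an already-sorted input with this pivot choice) makes every partition maximally unbalanced, matching the T(n) = Θ(n) + T(n − 1) recurrence.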
159 Two Partitioning Algorithms

HoarePartition(A, ℓ, r)
  x = A[r]
  i = ℓ − 1
  j = r + 1
  while True do
    repeat i = i + 1 until A[i] ≥ x
    repeat j = j − 1 until A[j] ≤ x
    if i < j then swap A[i] and A[j]
    else return j

Loop invariant: A[ℓ..i] ≤ x, A[j..r] ≥ x, and the elements between positions i and j are unexamined.

LomutoPartition(A, ℓ, r)
  i = ℓ − 1
  for j = ℓ to r − 1 do
    if A[j] ≤ A[r] then
      i = i + 1
      swap A[i] and A[j]
  swap A[i + 1] and A[r]
  return i + 1

Loop invariant: A[ℓ..i] ≤ x, A[i+1..j−1] > x, A[j..r−1] unexamined, and A[r] = x.

HoarePartition is more efficient in practice. LomutoPartition has some properties that make average-case analysis easier. A variant of HoarePartition that receives the pivot x as an extra parameter, HoarePartition(A, ℓ, r, x), is more convenient for worst-case Quick Sort.
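The Hoare scheme above can be rendered in Python as follows (a sketch, 0-based indices). One deliberate deviation: we take the default pivot from the left end, as in CLRS's presentation of Hoare's scheme, which guarantees the returned index satisfies j < r whenever r > l; the optional x parameter mirrors the variant that receives the pivot explicitly.

```python
# A 0-based Python rendering of HoarePartition (a sketch). The default
# pivot is A[l] (CLRS-style, which keeps the recursion well-founded);
# the optional x parameter mirrors the variant used later for selection.
def hoare_partition(A, l, r, x=None):
    if x is None:
        x = A[l]                  # pivot value
    i, j = l - 1, r + 1
    while True:
        i += 1
        while A[i] < x:           # repeat i = i + 1 until A[i] >= x
            i += 1
        j -= 1
        while A[j] > x:           # repeat j = j - 1 until A[j] <= x
            j -= 1
        if i < j:
            A[i], A[j] = A[j], A[i]
        else:
            return j              # now A[l..j] <= x and A[j+1..r] >= x

A = [5, 3, 8, 1, 9, 2]
j = hoare_partition(A, 0, len(A) - 1)
assert max(A[:j + 1]) <= min(A[j + 1:])
```

Note that Hoare's scheme only splits the array into a "small" side and a "large" side; unlike Lomuto's, it does not place the pivot at its final sorted position.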
163 Selection

Goal: Given an unsorted array A and an integer 1 ≤ k ≤ n, find the kth smallest element in A.

First idea: Sort the array and then return the element in the kth position. O(n lg n) time.

We can find the minimum (k = 1) or the maximum (k = n) in O(n) time! It would be nice if we were able to find the kth smallest element, for any k, in O(n) time.
167 The Sorting Idea Isn't Half Bad, But...

To find the kth smallest element, we don't need to sort the input completely. We only need to verify that there are exactly k − 1 elements smaller than the element we return.

Partition the array around a pivot p; let L be the part of the array up to and including p, and let R be the part after p.

If |L| = k, then p is the kth smallest element.
If |L| > k, then the kth smallest element in L is the kth smallest element in A.
If |L| < k, then the (k − |L|)th smallest element in R is the kth smallest element in A.
172 Quick Select

QuickSelect(A, ℓ, r, k)
  if r ≤ ℓ then return A[ℓ]
  m = Partition(A, ℓ, r)
  if m − ℓ + 1 = k then return A[m]
  else if m − ℓ + 1 > k then return QuickSelect(A, ℓ, m − 1, k)
  else return QuickSelect(A, m + 1, r, k − (m + 1 − ℓ))

Worst case: T(n) = Θ(n) + T(n − 1) = Θ(n²)
Best case: T(n) = Θ(n) + T(n/2) = Θ(n)
Average case: T(n) = Θ(n)
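Quick Select translates directly into Python; the sketch below (0-based, iterative rather than recursive, with a Lomuto-style partition standing in for the slides' abstract Partition) returns the kth smallest element:

```python
# A runnable sketch of QuickSelect: kth smallest with 1 <= k <= len(A).
# We iterate instead of recursing and use a Lomuto-style partition in
# place of the slides' abstract Partition.
def quickselect(A, k):
    """Return the kth smallest element of A (1 <= k <= len(A))."""
    lo, hi = 0, len(A) - 1
    while True:
        if hi <= lo:
            return A[lo]
        m = lomuto_partition(A, lo, hi)
        rank = m - lo + 1            # pivot's rank within A[lo..hi]
        if rank == k:
            return A[m]
        elif rank > k:
            hi = m - 1               # answer lies left of the pivot
        else:
            lo, k = m + 1, k - rank  # answer lies right; adjust k

def lomuto_partition(A, lo, hi):
    i = lo - 1
    for j in range(lo, hi):
        if A[j] <= A[hi]:
            i += 1
            A[i], A[j] = A[j], A[i]
    A[i + 1], A[hi] = A[hi], A[i + 1]
    return i + 1

print(quickselect([7, 2, 9, 4, 1], 3))  # -> 4
```

Note that quickselect permutes its input in place; pass a copy if the original order matters.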
179 Worst-Case Selection

QuickSelect(A, ℓ, r, k)
  if r ≤ ℓ then return A[ℓ]
  p = FindPivot(A, ℓ, r)
  m = HoarePartition(A, ℓ, r, p)
  if m − ℓ + 1 ≥ k then return QuickSelect(A, ℓ, m, k)
  else return QuickSelect(A, m + 1, r, k − (m + 1 − ℓ))

If we could guarantee that p is the median of A[ℓ..r], then we'd recurse on at most n/2 elements:

    T(n) = Θ(n) + T(n/2) = Θ(n).

Alas, finding the median is selection!
183 Making Do With An Approximate Median

If there are at least εn elements smaller than p and at least εn elements greater than p, for some constant ε > 0, then

    T(n) ≤ Θ(n) + T((1 − ε)n) = Θ(n).
184 Finding An Approximate Median

FindPivot(A, ℓ, r)
  n′ = ⌊(r − ℓ)/5⌋ + 1
  for i = 0 to n′ − 1 do
    InsertionSort(A, ℓ + 5i, min(ℓ + 5i + 4, r))
    if ℓ + 5i + 4 ≤ r then B[i + 1] = A[ℓ + 5i + 2]
    else B[i + 1] = A[⌊(ℓ + 5i + r)/2⌋]
  return QuickSelect(B, 1, n′, ⌈n′/2⌉)

Each B[i + 1] is the median of group i of (at most) five elements; the returned pivot is the median of these group medians, an approximate median of A[ℓ..r].
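FindPivot can be sketched compactly in Python. For brevity the sketch sorts each group of five with the built-in sort instead of InsertionSort, and picks the median of the group medians by sorting the (short) list of medians rather than recursing into QuickSelect as the real algorithm does — the asymptotics, not this code, are the point of the slide.

```python
# A compact sketch of FindPivot (median of medians). Simplifications:
# built-in sort per group instead of InsertionSort, and the median of
# medians is taken by sorting the medians rather than via QuickSelect.
def find_pivot(A, l, r):
    """Return an approximate median of A[l..r] (inclusive bounds)."""
    medians = []
    for g in range(l, r + 1, 5):
        group = sorted(A[g:min(g + 5, r + 1)])
        medians.append(group[(len(group) - 1) // 2])  # lower median of the group
    medians.sort()
    return medians[(len(medians) - 1) // 2]           # median of the medians

print(find_pivot(list(range(25)), 0, 24))  # -> 12
```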
185 Finding An Approximate Median

[Figure: the n′ groups of five, each sorted, with the median of medians highlighted.]

There are at least ⌈n′/2⌉ medians smaller than or equal to the median of medians. For at least ⌈n′/2⌉ − 2 of these medians (excluding the median of medians' own group and the possibly incomplete last group), there are two more elements in each of their groups that are smaller than the median of medians.

With n′ = ⌈n/5⌉, the total number of elements smaller than the median of medians is therefore at least

    3 (⌈⌈n/5⌉/2⌉ − 2) ≥ 3n/10 − 6.

The same analysis holds for counting the number of elements greater than the median of medians.
193 Worst-Case Selection: Analysis

FindPivot (excluding the recursive call to QuickSelect) and Partition take O(n) time.

FindPivot recurses on ⌈n/5⌉ ≤ n/5 + 1 elements.

QuickSelect itself recurses on at most n − (3n/10 − 6) = 7n/10 + 6 elements.

    T(n) ≤ T(n/5 + 1) + T(7n/10 + 6) + O(n)
197 Worst-Case Selection: Analysis

Claim: T(n) ∈ O(n), that is, T(n) ≤ cn, for some c > 0 and all n ≥ 1.

Base case (n < 80): We already observed that the running time is in O(n²) in the worst case. Since n ∈ O(1), also n² ∈ O(1), so T(n) ≤ c′ ≤ cn for c sufficiently large.

Inductive step (n ≥ 80): For some constant a > 0,

    T(n) ≤ T(n/5 + 1) + T(7n/10 + 6) + an
         ≤ c(n/5 + 1) + c(7n/10 + 6) + an
         = 9cn/10 + 7c + an
         = cn − (cn/10 − 7c − an)
         ≤ cn,

where the last step holds because cn/10 − 7c − an ≥ 0 for all n ≥ 80 whenever c ≥ 80a.
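The linearity of this recurrence can also be checked numerically. The sketch below uses the constants from the standard presentation, T(n) = T(n/5 + 1) + T(7n/10 + 6) + n with base case n < 80; it is a sanity check, not a proof.

```python
# Numeric sanity check that T(n) = T(n/5 + 1) + T(7n/10 + 6) + n stays
# linear (constants as in the standard presentation; base case n < 80).
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    if n < 80:
        return n * n  # any constant-bounded cost on the base cases works
    return T(n // 5 + 1) + T(7 * n // 10 + 6) + n

for n in [10**3, 10**5, 10**7]:
    print(n, T(n) / n)  # the ratio stays bounded by a constant
```

Since 1/5 + 7/10 = 9/10 < 1, the per-level work shrinks geometrically, which is exactly why the induction with c ≥ 80a goes through.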
More informationBinary Heap Algorithms
CS Data Structures and Algorithms Lecture Slides Wednesday, April 5, 2009 Glenn G. Chappell Department of Computer Science University of Alaska Fairbanks CHAPPELLG@member.ams.org 2005 2009 Glenn G. Chappell
More informationApplied Algorithm Design Lecture 5
Applied Algorithm Design Lecture 5 Pietro Michiardi Eurecom Pietro Michiardi (Eurecom) Applied Algorithm Design Lecture 5 1 / 86 Approximation Algorithms Pietro Michiardi (Eurecom) Applied Algorithm Design
More informationTHE SCHEDULING OF MAINTENANCE SERVICE
THE SCHEDULING OF MAINTENANCE SERVICE Shoshana Anily Celia A. Glass Refael Hassin Abstract We study a discrete problem of scheduling activities of several types under the constraint that at most a single
More informationLemma 5.2. Let S be a set. (1) Let f and g be two permutations of S. Then the composition of f and g is a permutation of S.
Definition 51 Let S be a set bijection f : S S 5 Permutation groups A permutation of S is simply a Lemma 52 Let S be a set (1) Let f and g be two permutations of S Then the composition of f and g is a
More informationChapter 13: Query Processing. Basic Steps in Query Processing
Chapter 13: Query Processing! Overview! Measures of Query Cost! Selection Operation! Sorting! Join Operation! Other Operations! Evaluation of Expressions 13.1 Basic Steps in Query Processing 1. Parsing
More informationWell-Separated Pair Decomposition for the Unit-disk Graph Metric and its Applications
Well-Separated Pair Decomposition for the Unit-disk Graph Metric and its Applications Jie Gao Department of Computer Science Stanford University Joint work with Li Zhang Systems Research Center Hewlett-Packard
More informationBinary Search Trees. Data in each node. Larger than the data in its left child Smaller than the data in its right child
Binary Search Trees Data in each node Larger than the data in its left child Smaller than the data in its right child FIGURE 11-6 Arbitrary binary tree FIGURE 11-7 Binary search tree Data Structures Using
More informationCSE 326, Data Structures. Sample Final Exam. Problem Max Points Score 1 14 (2x7) 2 18 (3x6) 3 4 4 7 5 9 6 16 7 8 8 4 9 8 10 4 Total 92.
Name: Email ID: CSE 326, Data Structures Section: Sample Final Exam Instructions: The exam is closed book, closed notes. Unless otherwise stated, N denotes the number of elements in the data structure
More informationThe Running Time of Programs
CHAPTER 3 The Running Time of Programs In Chapter 2, we saw two radically different algorithms for sorting: selection sort and merge sort. There are, in fact, scores of algorithms for sorting. This situation
More informationa 11 x 1 + a 12 x 2 + + a 1n x n = b 1 a 21 x 1 + a 22 x 2 + + a 2n x n = b 2.
Chapter 1 LINEAR EQUATIONS 1.1 Introduction to linear equations A linear equation in n unknowns x 1, x,, x n is an equation of the form a 1 x 1 + a x + + a n x n = b, where a 1, a,..., a n, b are given
More informationFor example, we have seen that a list may be searched more efficiently if it is sorted.
Sorting 1 Many computer applications involve sorting the items in a list into some specified order. For example, we have seen that a list may be searched more efficiently if it is sorted. To sort a group
More informationCSE373: Data Structures and Algorithms Lecture 3: Math Review; Algorithm Analysis. Linda Shapiro Winter 2015
CSE373: Data Structures and Algorithms Lecture 3: Math Review; Algorithm Analysis Linda Shapiro Today Registration should be done. Homework 1 due 11:59 pm next Wednesday, January 14 Review math essential
More informationClosest Pair of Points. Kleinberg and Tardos Section 5.4
Closest Pair of Points Kleinberg and Tardos Section 5.4 Closest Pair of Points Closest pair. Given n points in the plane, find a pair with smallest Euclidean distance between them. Fundamental geometric
More informationNear Optimal Solutions
Near Optimal Solutions Many important optimization problems are lacking efficient solutions. NP-Complete problems unlikely to have polynomial time solutions. Good heuristics important for such problems.
More information1) The postfix expression for the infix expression A+B*(C+D)/F+D*E is ABCD+*F/DE*++
Answer the following 1) The postfix expression for the infix expression A+B*(C+D)/F+D*E is ABCD+*F/DE*++ 2) Which data structure is needed to convert infix notations to postfix notations? Stack 3) The
More informationUnit 1 Number Sense. In this unit, students will study repeating decimals, percents, fractions, decimals, and proportions.
Unit 1 Number Sense In this unit, students will study repeating decimals, percents, fractions, decimals, and proportions. BLM Three Types of Percent Problems (p L-34) is a summary BLM for the material
More informationSYSTEMS OF PYTHAGOREAN TRIPLES. Acknowledgements. I would like to thank Professor Laura Schueller for advising and guiding me
SYSTEMS OF PYTHAGOREAN TRIPLES CHRISTOPHER TOBIN-CAMPBELL Abstract. This paper explores systems of Pythagorean triples. It describes the generating formulas for primitive Pythagorean triples, determines
More informationWhat Is Recursion? Recursion. Binary search example postponed to end of lecture
Recursion Binary search example postponed to end of lecture What Is Recursion? Recursive call A method call in which the method being called is the same as the one making the call Direct recursion Recursion
More informationApproximation Algorithms
Approximation Algorithms or: How I Learned to Stop Worrying and Deal with NP-Completeness Ong Jit Sheng, Jonathan (A0073924B) March, 2012 Overview Key Results (I) General techniques: Greedy algorithms
More informationMathematical Induction. Lecture 10-11
Mathematical Induction Lecture 10-11 Menu Mathematical Induction Strong Induction Recursive Definitions Structural Induction Climbing an Infinite Ladder Suppose we have an infinite ladder: 1. We can reach
More information1. Give the 16 bit signed (twos complement) representation of the following decimal numbers, and convert to hexadecimal:
Exercises 1 - number representations Questions 1. Give the 16 bit signed (twos complement) representation of the following decimal numbers, and convert to hexadecimal: (a) 3012 (b) - 435 2. For each of
More informationComplexity Theory. IE 661: Scheduling Theory Fall 2003 Satyaki Ghosh Dastidar
Complexity Theory IE 661: Scheduling Theory Fall 2003 Satyaki Ghosh Dastidar Outline Goals Computation of Problems Concepts and Definitions Complexity Classes and Problems Polynomial Time Reductions Examples
More informationStudent Outcomes. Lesson Notes. Classwork. Discussion (10 minutes)
NYS COMMON CORE MATHEMATICS CURRICULUM Lesson 5 8 Student Outcomes Students know the definition of a number raised to a negative exponent. Students simplify and write equivalent expressions that contain
More informationIntroduction to Scheduling Theory
Introduction to Scheduling Theory Arnaud Legrand Laboratoire Informatique et Distribution IMAG CNRS, France arnaud.legrand@imag.fr November 8, 2004 1/ 26 Outline 1 Task graphs from outer space 2 Scheduling
More informationThe Goldberg Rao Algorithm for the Maximum Flow Problem
The Goldberg Rao Algorithm for the Maximum Flow Problem COS 528 class notes October 18, 2006 Scribe: Dávid Papp Main idea: use of the blocking flow paradigm to achieve essentially O(min{m 2/3, n 1/2 }
More informationOutline. Introduction Linear Search. Transpose sequential search Interpolation search Binary search Fibonacci search Other search techniques
Searching (Unit 6) Outline Introduction Linear Search Ordered linear search Unordered linear search Transpose sequential search Interpolation search Binary search Fibonacci search Other search techniques
More informationI. GROUPS: BASIC DEFINITIONS AND EXAMPLES
I GROUPS: BASIC DEFINITIONS AND EXAMPLES Definition 1: An operation on a set G is a function : G G G Definition 2: A group is a set G which is equipped with an operation and a special element e G, called
More informationSubstitute 4 for x in the function, Simplify.
Page 1 of 19 Review of Eponential and Logarithmic Functions An eponential function is a function in the form of f ( ) = for a fied ase, where > 0 and 1. is called the ase of the eponential function. The
More informationHomework until Test #2
MATH31: Number Theory Homework until Test # Philipp BRAUN Section 3.1 page 43, 1. It has been conjectured that there are infinitely many primes of the form n. Exhibit five such primes. Solution. Five such
More informationBinary Multiplication
Binary Multiplication Q: How do we multiply two numbers? eg. 12, 345 6, 789 111105 987600 8641500 + 74070000 83, 810, 205 10111 10101 10111 00000 1011100 0000000 + 101110000 111100011 Pad, multiply and
More informationIntroduction to Automata Theory. Reading: Chapter 1
Introduction to Automata Theory Reading: Chapter 1 1 What is Automata Theory? Study of abstract computing devices, or machines Automaton = an abstract computing device Note: A device need not even be a
More informationElementary Number Theory and Methods of Proof. CSE 215, Foundations of Computer Science Stony Brook University http://www.cs.stonybrook.
Elementary Number Theory and Methods of Proof CSE 215, Foundations of Computer Science Stony Brook University http://www.cs.stonybrook.edu/~cse215 1 Number theory Properties: 2 Properties of integers (whole
More information1 Review of Newton Polynomials
cs: introduction to numerical analysis 0/0/0 Lecture 8: Polynomial Interpolation: Using Newton Polynomials and Error Analysis Instructor: Professor Amos Ron Scribes: Giordano Fusco, Mark Cowlishaw, Nathanael
More informationSample Questions Csci 1112 A. Bellaachia
Sample Questions Csci 1112 A. Bellaachia Important Series : o S( N) 1 2 N N i N(1 N) / 2 i 1 o Sum of squares: N 2 N( N 1)(2N 1) N i for large N i 1 6 o Sum of exponents: N k 1 k N i for large N and k
More informationLecture 4 Online and streaming algorithms for clustering
CSE 291: Geometric algorithms Spring 2013 Lecture 4 Online and streaming algorithms for clustering 4.1 On-line k-clustering To the extent that clustering takes place in the brain, it happens in an on-line
More informationCOMP 250 Fall 2012 lecture 2 binary representations Sept. 11, 2012
Binary numbers The reason humans represent numbers using decimal (the ten digits from 0,1,... 9) is that we have ten fingers. There is no other reason than that. There is nothing special otherwise about
More informationChapter 6: Episode discovery process
Chapter 6: Episode discovery process Algorithmic Methods of Data Mining, Fall 2005, Chapter 6: Episode discovery process 1 6. Episode discovery process The knowledge discovery process KDD process of analyzing
More informationData Structures and Algorithms Written Examination
Data Structures and Algorithms Written Examination 22 February 2013 FIRST NAME STUDENT NUMBER LAST NAME SIGNATURE Instructions for students: Write First Name, Last Name, Student Number and Signature where
More informationChapter 3. Distribution Problems. 3.1 The idea of a distribution. 3.1.1 The twenty-fold way
Chapter 3 Distribution Problems 3.1 The idea of a distribution Many of the problems we solved in Chapter 1 may be thought of as problems of distributing objects (such as pieces of fruit or ping-pong balls)
More informationGREATEST COMMON DIVISOR
DEFINITION: GREATEST COMMON DIVISOR The greatest common divisor (gcd) of a and b, denoted by (a, b), is the largest common divisor of integers a and b. THEOREM: If a and b are nonzero integers, then their
More informationMath 55: Discrete Mathematics
Math 55: Discrete Mathematics UC Berkeley, Fall 2011 Homework # 5, due Wednesday, February 22 5.1.4 Let P (n) be the statement that 1 3 + 2 3 + + n 3 = (n(n + 1)/2) 2 for the positive integer n. a) What
More informationLecture 6 Online and streaming algorithms for clustering
CSE 291: Unsupervised learning Spring 2008 Lecture 6 Online and streaming algorithms for clustering 6.1 On-line k-clustering To the extent that clustering takes place in the brain, it happens in an on-line
More information