A Gentle Introduction to Algorithms: Part II

Contents of Part I:
1. Merge: (to merge two sorted lists into a single sorted list)
2. Bubble Sort
3. Merge Sort
4. The Big-O, Big-Θ, Big-Ω notations: asymptotic bounds

5. Basics of analysis of complexity:

The two basic techniques of programming are iteration (such as for, while, and repeat loops) and recursion. In this section, we will study how to do the accounting for the time-complexity of an algorithm that contains iterations or recursions (or both). If you are analyzing a fairly complex algorithm, this is the portion that requires the most training and care. Therefore I can only go into these topics at a very superficial level. If you need to learn more about this topic, the best thing is to do a lot of examples from reference [1].

5.1. Iterations:

The main tool in evaluating the time-complexity of an iterative algorithm is the summation of a series. In our analysis of BUBBLE, we came up with a time-complexity function that looked like:

    f(n) = Σ_{i=1}^{n-1} c·i = c·n(n-1)/2 = Θ(n²),

and was computed by the well-known formula for the sum of an arithmetic series. Other useful summation formulas include the following:

    geometric series: 1 + x + x² + … = Σ_{i=0}^{∞} x^i = 1/(1 − x), if |x| < 1

    harmonic series: Σ_{i=1}^{n} 1/i = ln n + O(1), where O(1) denotes a constant.

Finally, due to the linearity of summation, the following is useful: Σ_i Θ(f(i)) = Θ(Σ_i f(i)); note that on the left side the Θ-notation applies to f(i), while on the right side it applies to the whole sum as a function of the parameter n.
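The arithmetic-series count above can be checked empirically. The sketch below is a plausible reconstruction of BUBBLE (the Part I code is not reproduced here, so the two-loop form without an early-exit optimization is an assumption), instrumented to count comparisons and compare the total against n(n−1)/2:

```python
import random

def bubble_sort_comparisons(arr):
    """Bubble sort sketch; returns (sorted list, number of comparisons).

    Assumes the plain two-loop form with no early exit, so the comparison
    count is exactly (n-1) + (n-2) + ... + 1 = n(n-1)/2 for every input.
    """
    a = list(arr)
    n = len(a)
    comparisons = 0
    for i in range(n - 1):           # pass i bubbles the i-th largest into place
        for j in range(n - 1 - i):   # n-1-i comparisons on this pass
            comparisons += 1
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
    return a, comparisons

n = 50
data = [random.randint(0, 999) for _ in range(n)]
sorted_a, comps = bubble_sort_comparisons(data)
assert sorted_a == sorted(data)
assert comps == n * (n - 1) // 2   # the arithmetic-series sum, regardless of input order
```

Note that the count is input-independent in this form: the sum Σ c·i counts comparisons, not swaps, which is why the analysis yields Θ(n²) even on already-sorted input.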
5.2. Recurrences: computing the performance of recursive algorithms.

One of the most powerful programming techniques is recursion; a particularly useful algorithmic technique built on recursion is divide-and-conquer. We saw its power in the design of Mergesort, which was much more efficient than the more obvious Bubblesort. However, it is sometimes not as easy to analyze the performance of recursive algorithms. Our analysis of Mergesort used a rather brute-force technique, sometimes called building a recursion tree. The performance of most recursive methods (and especially divide-and-conquer) can be written in the following form:

    T(n) = f(n)                        if n = 1
    T(n) = a·T(n/b) + D(n) + C(n)      if n > 1

Here, f(n) is the cost of computing the base case of the recursion. At each step of the recursion, we divide the problem into a equal-sized sub-problems, each of size n/b, and it takes us D(n) units of time to divide the problem into these smaller parts. Finally, it takes us C(n) units of time to merge, i.e. combine, the solutions of the a sub-problems into the solution of the original problem. The expression above is called a recurrence, and there are three standard methods for solving recurrences.

5.2.1. The Substitution Method

We first guess the solution, and then we prove that our guess is correct using mathematical induction.

Example: T(n) = 2T(⌊n/2⌋) + n

This looks quite similar to the recurrence for our Mergesort, so we are tempted to guess that the asymptotic upper bound here is O(n lg n). This is reinforced by the observation that as n increases in size, ⌊n/2⌋ differs from n/2 by less than one, and such a bounded perturbation cannot change the asymptotic behavior. We now use induction to prove it: we must show that T(n) ≤ c·n·lg n for some properly chosen value of c.

Base case: Note that if we choose n = 1, then c·n·lg n = c·1·lg 1 = 0 for all c, which cannot bound T(1). Therefore we shall use a larger value of n, e.g. n = 2. This is allowed since the asymptotic notation only requires that T(n) ≤ c·n·lg n be true for all n > n₀, where n₀ is of our choosing. Thus, at n = 2, c·n·lg n = 2c, and we can easily find a c that guarantees that the inequality
will hold.

Inductive step: Assume that the claim holds for ⌊n/2⌋, i.e. T(⌊n/2⌋) ≤ c·⌊n/2⌋·lg(⌊n/2⌋). Then

    T(n) = 2T(⌊n/2⌋) + n
         ≤ 2(c·⌊n/2⌋·lg(⌊n/2⌋)) + n
         ≤ c·n·lg(n/2) + n
         = c·n·lg n − c·n·lg 2 + n
         = c·n·lg n − (c − 1)·n
         ≤ c·n·lg n,

which is true whenever c ≥ 1.

Obviously, good guesswork is needed, along with some cleverness, to successfully apply this method. For example, consider:

Example: T(n) = 2T(⌊n/2⌋ + b) + n

This looks similar to the previous example, except for the addition of the constant b. We may guess that T(n) is O(n lg n), since asymptotically b is negligible compared to n/2, and any additional computational burden due to it can be absorbed by a suitably chosen c in our O-notation. You can use the substitution method to prove that this is indeed the case, but it will require some trickery (see the hint in the exercises).

A variation of the substitution method uses transformations (change of variables). The following is an example:

    T(n) = 2T(√n) + lg n

The square-root term makes it difficult to guess a solution to this recurrence, so we try a substitution: m = lg n, which gives

    T(2^m) = 2T(2^(m/2)) + m.

Renaming S(m) = T(2^m), we get

    S(m) = 2S(m/2) + m,

and we know, from the proof earlier, that S(m) = O(m lg m) = O(lg n · lg lg n).

5.2.2. The Iteration Method

This method does not require us to guess a solution. Here, we repeatedly expand the recurrence and attempt to identify a pattern that allows us to use algebraic techniques to simplify the solution. The formulas for summation of series (arithmetic, geometric, harmonic) are very useful in this method. In the following example, I use the identity ⌊⌊n/4⌋/4⌋ = ⌊n/16⌋ (prove it!).

Example: T(n) = 3T(⌊n/4⌋) + n
    T(n) = n + 3T(⌊n/4⌋)
         = n + 3(⌊n/4⌋ + 3T(⌊n/16⌋))
         = n + 3(⌊n/4⌋ + 3(⌊n/16⌋ + 3T(⌊n/64⌋)))
         = n + 3⌊n/4⌋ + 9⌊n/16⌋ + 27T(⌊n/64⌋)
         = n + Σ_{j=1}^{i−1} 3^j·⌊n/4^j⌋ + 3^i·T(⌊n/4^i⌋),

which will hit the boundary condition T(1) when n = 4^i, that is, when i = log₄ n, i.e. after log₄ n steps. Finally, using ⌊n/4^j⌋ ≤ n/4^j, we get:

    T(n) ≤ n·Σ_{i=0}^{∞} (3/4)^i + 3^{log₄ n}·Θ(1)
         = n·Σ_{i=0}^{∞} (3/4)^i + n^{log₄ 3}·Θ(1)     [note: 3^{log₄ n} = n^{log₄ 3} (prove it!)]
         = 4n + O(n^{log₄ 3})
         = O(n).

Note that when we evaluated the complexity of the merge sort, we did a similar type of bookkeeping. In some cases, it really helps to make a picture of the effort involved in each step in the form of a tree (sometimes called a recursion tree).

5.2.3. The Master Method

All recurrences of the form T(n) = aT(n/b) + f(n) can be solved by the application of a powerful theorem, called the master theorem. We will present the theorem, but not its proof, and then show some examples.

Master Theorem: Given two constants a ≥ 1 and b > 1, and a recurrence

    T(n) = aT(n/b) + f(n),

where n/b can be interpreted as either ⌊n/b⌋ or ⌈n/b⌉. Then T(n) can be bounded asymptotically as follows:

(a) If f(n) = O(n^{log_b a − ε}) for some constant ε > 0, then T(n) = Θ(n^{log_b a}).
(b) If f(n) = Θ(n^{log_b a}), then T(n) = Θ(n^{log_b a}·lg n).
(c) If f(n) = Ω(n^{log_b a + ε}) for some constant ε > 0, and if a·f(n/b) ≤ c·f(n) for some constant c < 1 and all sufficiently large n, then T(n) = Θ(f(n)).

The intuitive idea behind the theorem is that we can estimate the asymptotic bound on the recurrence by comparing f(n) with n^{log_b a}: if f(n) is smaller, as in case (a), then the
recurrence is bounded by n^{log_b a}; if the two are equal, as in case (b), then T(n) is Θ(f(n)·lg n); finally, if f(n) is larger than n^{log_b a}, as in case (c), then T(n) is bounded by f(n).

Example: T(n) = 9T(n/3) + n

Here, a = 9, b = 3, f(n) = n; log₃ 9 = 2; setting ε = 1, f(n) = n = O(n^{2−1}). Thus, by case (a), T(n) = Θ(n²).

Example: T(n) = 2T(n/2) + n·lg n

Here a = 2, b = 2, thus n^{log_b a} = n. f(n) = n·lg n, so clearly cases (a) and (b) do not apply. How about case (c)? It can be shown that for any positive ε, however small, n^{1+ε} grows faster than n·lg n, so f(n) is not Ω(n^{1+ε}). Therefore none of the three cases of the master theorem applies, and some other means must be found to solve this recurrence.

Exercises:

(a) Provide a tight asymptotic bound for T(n) = 2T(⌊n/2⌋) + 5.

(b) Prove, using (i) substitution and (ii) iteration, that T(n) = 2T(⌊n/2⌋ + b) + n is O(n·lg n).

[Solution for (i): Showing that T(n) ≤ c(n − 2b)·lg(n − 2b) is sufficient, since c(n − 2b)·lg(n − 2b) ≤ c·n·lg n. Assume that the statement is true for ⌊n/2⌋ + b, that is: T(⌊n/2⌋ + b) ≤ c(n/2 − b)·lg(n/2 − b). Now,

    T(n) = 2T(⌊n/2⌋ + b) + n
         ≤ 2c(n/2 − b)·lg(n/2 − b) + n
         = c(n − 2b)·lg((n − 2b)/2) + n
         = c(n − 2b)·lg(n − 2b) − c(n − 2b)·lg 2 + n
         = c(n − 2b)·lg(n − 2b) − (c − 1)·n + 2bc
         ≤ c(n − 2b)·lg(n − 2b),

as long as 2bc − (c − 1)·n < 0, which is certainly true when c ≥ 2 and n > 4b. Note that we show the inductive step for n assuming it holds for ⌊n/2⌋ + b, so we must also be certain that ⌊n/2⌋ + b < n, which is true when n > 2b. To complete the solution, we need the boundary case, which in this case will probably be at n = 2b, since when n < 2b the argument ⌊n/2⌋ + b on the right-hand side is larger than n, and the recursion no longer shrinks.]
(c) Find an upper bound for T(n) = T(n/2 + 1) + n.

(d) Using the iteration method (draw the recursion tree), provide a tight bound for the recurrence T(n) = 4T(⌊n/2⌋) + n.

(e) Use the master theorem to give bounds for:
(i) T(n) = 4T(n/2) + n
(ii) T(n) = 4T(n/2) + n²
(iii) T(n) = 4T(n/2) + n³
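The two worked recurrences of this section — T(n) = 2T(⌊n/2⌋) + n from the substitution method and T(n) = 3T(⌊n/4⌋) + n from the iteration method — can be sanity-checked numerically. The sketch below (the base cost T(1) = 1 is an assumption; the arguments above work for any constant base cost) evaluates both recurrences with floors, exactly as written, and confirms the derived bounds over a range of n:

```python
from functools import lru_cache
import math

@lru_cache(maxsize=None)
def T_sub(n):
    # Substitution-method example: T(n) = 2*T(floor(n/2)) + n, T(1) = 1 (assumed base cost)
    return 1 if n <= 1 else 2 * T_sub(n // 2) + n

@lru_cache(maxsize=None)
def T_iter(n):
    # Iteration-method example: T(n) = 3*T(floor(n/4)) + n, with unit cost for n <= 1
    return 1 if n <= 1 else 3 * T_iter(n // 4) + n

# T_sub(n) <= c*n*lg n holds with c = 2 for all n >= 2 (the induction above, with
# this base cost), and the geometric-series bound gives T_iter(n) <= 4n.
assert all(T_sub(n) <= 2 * n * math.log2(n) for n in range(2, 5000))
assert all(T_iter(n) <= 4 * n for n in range(1, 5000))
```

Such a numeric check never replaces the induction, but it is a quick way to catch a wrong guess (or a mis-copied constant) before investing effort in a proof.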
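The case analysis of the master theorem can also be mechanized for the common special case f(n) = n^k, i.e. a pure power with no logarithmic factor (the n·lg n example above falls outside this helper, matching the caveat in the text). The function name and the output format below are my own; the logic is just the comparison of k with log_b a:

```python
import math

def master_theorem_poly(a, b, k):
    """Classify T(n) = a*T(n/b) + n**k by the master theorem.

    Only handles f(n) = n**k. In case (c) the regularity condition
    a*f(n/b) <= c*f(n) holds automatically, with c = a / b**k < 1.
    """
    e = math.log(a, b)              # the critical exponent log_b(a)
    if math.isclose(k, e):
        return f"Theta(n^{e:g} * lg n)"   # case (b): f matches n^(log_b a)
    if k < e:
        return f"Theta(n^{e:g})"          # case (a): leaves dominate
    return f"Theta(n^{k:g})"              # case (c): root dominates

print(master_theorem_poly(9, 3, 1))   # the 9T(n/3) + n example; prints Theta(n^2)
print(master_theorem_poly(2, 2, 1))   # Mergesort-like; prints Theta(n^1 * lg n)
```

The three return branches correspond directly to cases (a), (b), (c) of the theorem; `math.isclose` papers over floating-point noise in log_b a when the exponent is actually rational.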