
Recurrence Relations

Many algorithms, particularly divide and conquer algorithms, have time complexities which are naturally modeled by recurrence relations. A recurrence relation is an equation which is defined in terms of itself.

Why are recurrences good things?

1. Many natural functions are easily expressed as recurrences:

   a_n = a_{n-1} + 1, a_1 = 1  →  a_n = n  (polynomial)
   a_n = 2 a_{n-1},   a_1 = 1  →  a_n = 2^{n-1}  (exponential)
   a_n = n a_{n-1},   a_1 = 1  →  a_n = n!  (weird function)

2. It is often easy to find a recurrence as the solution of a counting problem. Solving the recurrence can be done for many special cases as we will see, although it is somewhat of an art.
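
As a quick sanity check (added here, not part of the original notes), the short Python sketch below evaluates each of the three recurrences for small n and compares it against the stated closed form; the helper name check_closed_form is made up for illustration.

    from math import factorial

    def check_closed_form(step, closed, n_max=10):
        """Evaluate a recurrence with a_1 = 1 and compare it against a closed form."""
        a = {1: 1}
        for n in range(2, n_max + 1):
            a[n] = step(n, a)
        return all(a[n] == closed(n) for n in range(1, n_max + 1))

    print(check_closed_form(lambda n, a: a[n - 1] + 1, lambda n: n))             # a_n = n
    print(check_closed_form(lambda n, a: 2 * a[n - 1], lambda n: 2 ** (n - 1)))  # a_n = 2^(n-1)
    print(check_closed_form(lambda n, a: n * a[n - 1], lambda n: factorial(n)))  # a_n = n!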

Recursion is Mathematical Induction!

In both, we have general and boundary conditions, with the general condition breaking the problem into smaller and smaller pieces. The initial or boundary condition terminates the recursion.

As we will see, induction provides a useful tool to solve recurrences: guess a solution and prove it by induction.

   T_n = 2 T_{n-1} + 1,  T_0 = 0

   n    0  1  2  3  4   5   6   7
   T_n  0  1  3  7  15  31  63  127

Guess what the solution is?

Prove T_n = 2^n − 1 by induction:

1. Show that the basis is true: T_0 = 2^0 − 1 = 0.
2. Now assume it is true for T_{n-1}.
3. Using this assumption, show: T_n = 2 T_{n-1} + 1 = 2(2^{n-1} − 1) + 1 = 2^n − 1.
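
A tiny Python check (added, not in the notes) that builds the table from the recurrence and confirms the guessed closed form 2^n − 1:

    # Build T_0..T_10 from T_n = 2*T_{n-1} + 1, T_0 = 0,
    # then compare each entry against the guess 2^n - 1.
    T = [0]
    for n in range(1, 11):
        T.append(2 * T[-1] + 1)
    print(T[:8])                                        # [0, 1, 3, 7, 15, 31, 63, 127]
    print(all(T[n] == 2 ** n - 1 for n in range(11)))   # True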

Solving Recurrences

No general procedure for solving recurrence relations is known, which is why it is an art. My approach is:

Realize that linear, finite history, constant coefficient recurrences always can be solved. Check out any combinatorics or differential equations book for a procedure.

Consider a_n = 2 a_{n-1} + 2 a_{n-2} + 1, a_1 = 1, a_2 = 1. It has history = 2, degree = 1, and coefficients of 2, 2, and 1. Thus it can be solved mechanically! Proceed:

Find the characteristic equation, e.g. α² − 2α − 2 = 0.
Solve to get roots, which appear in the exponents.
Take care of repeated roots and inhomogeneous parts.
Find the constants to finish the job.

   a_n = −1/3 − (1 − √3)^n (1 + √3)/3 + (1 + √3)^n (−1 + √3)/3

Systems like Mathematica and Maple have packages for doing this.
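
A numerical spot check (added, not from the notes) that the closed form above really satisfies a_n = 2 a_{n-1} + 2 a_{n-2} + 1 with a_1 = a_2 = 1; floating-point comparison with a small tolerance stands in for exact arithmetic.

    from math import isclose, sqrt

    def a_closed(n):
        """Closed form: -1/3 - (1-sqrt(3))^n (1+sqrt(3))/3 + (1+sqrt(3))^n (sqrt(3)-1)/3."""
        r = sqrt(3)
        return -1 / 3 - (1 - r) ** n * (1 + r) / 3 + (1 + r) ** n * (r - 1) / 3

    a = {1: 1, 2: 1}
    for n in range(3, 15):
        a[n] = 2 * a[n - 1] + 2 * a[n - 2] + 1          # the recurrence itself

    print(all(isclose(a[n], a_closed(n)) for n in range(1, 15)))   # True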

Guess a solution and prove by induction

To guess the solution, play around with small values for insight. Note that you can do inductive proofs with big-O notation; just be sure you use it right.

Example: Show that T(n) ≤ c n lg n for the recurrence T(n) = 2 T(⌊n/2⌋) + n, for large enough c and n. Assume that it is true for ⌊n/2⌋, then

   T(n) ≤ 2 c ⌊n/2⌋ lg(⌊n/2⌋) + n
        ≤ c n lg(⌊n/2⌋) + n        (dropping floors makes it bigger)
        = c n (lg n − 1) + n       (log of a quotient, and lg 2 = 1)
        = c n lg n − c n + n
        ≤ c n lg n                 whenever c > 1

Starting with basis cases T(2) = 4, T(3) = 5 lets us complete the proof for c ≥ 2.
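
The substitution argument above can be sanity-checked numerically. The sketch below is added, not from the notes; it assumes the recurrence T(n) = 2 T(⌊n/2⌋) + n with base value T(1) = 1, which reproduces the basis cases T(2) = 4 and T(3) = 5, and checks the bound with c = 2.

    from functools import lru_cache
    from math import log2

    @lru_cache(maxsize=None)
    def T(n):
        """Assumed recurrence: T(1) = 1, T(n) = 2*T(n // 2) + n."""
        return 1 if n == 1 else 2 * T(n // 2) + n

    c = 2
    print(all(T(n) <= c * n * log2(n) for n in range(2, 5000)))   # True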

Try backsubstituting until you know what is going on

Also known as the iteration method. Plug the recurrence back into itself until you see a pattern.

Example: T(n) = 3 T(⌊n/4⌋) + n. Try backsubstituting:

   T(n) = n + 3(⌊n/4⌋ + 3 T(⌊n/16⌋))
        = n + 3⌊n/4⌋ + 9(⌊n/16⌋ + 3 T(⌊n/64⌋))
        = n + 3⌊n/4⌋ + 9⌊n/16⌋ + 27 T(⌊n/64⌋)

The (3/4)^i n term should now be obvious. Although there are only log_4 n terms before we get to T(1), it doesn't hurt to sum them all, since this is a fast growing geometric series:

   T(n) ≤ n Σ_{i=0}^{∞} (3/4)^i + Θ(n^{log_4 3}) T(1) = 4n + o(n) = O(n)
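
The 4n bound from the geometric series can be checked directly (added sketch; the base cases T(n) = n for n < 4 are an assumption, since the notes stop the iteration at T(1)):

    from functools import lru_cache

    @lru_cache(maxsize=None)
    def T(n):
        """T(n) = 3*T(n // 4) + n, with assumed base cases T(n) = n for n < 4."""
        return n if n < 4 else 3 * T(n // 4) + n

    # Every value stays below the 4n bound derived from the geometric series.
    print(all(T(n) <= 4 * n for n in range(1, 100000)))   # True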

Recursion Trees

Drawing a picture of the backsubstitution process gives you an idea of what is going on. We must keep track of two things: (1) the size of the remaining argument to the recurrence, and (2) the additive stuff to be accumulated during this call.

Example: T(n) = 2 T(n/2) + n²

[Figure: recursion tree with nodes T(n), T(n/2), T(n/2), T(n/4) (four copies), ... and the corresponding additive terms n², (n/2)², (n/4)², ... at each level.]

The remaining arguments are on the left, the additive terms on the right. Although this tree has height lg n, the total sum at each level decreases geometrically, so:

   T(n) = Σ_{i=0}^{∞} n²/2^i = n² Σ_{i=0}^{∞} 1/2^i = Θ(n²)

The recursion tree framework made this much easier to see than with algebraic backsubstitution.
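
A short sketch (added) that tabulates the per-level contributions of this recursion tree and confirms they sum to roughly 2n², i.e. Θ(n²); a power of two is used for n to keep the arithmetic exact.

    def level_sums(n):
        """Per-level work of T(n) = 2T(n/2) + n^2: level i has 2^i nodes of size n/2^i."""
        sums, size, nodes = [], n, 1
        while size >= 1:
            sums.append(nodes * size * size)   # total additive work at this level
            size //= 2
            nodes *= 2
        return sums

    s = level_sums(1024)
    print(s[:4])                   # [1048576, 524288, 262144, 131072] -- halves each level
    print(sum(s) / 1024 ** 2)      # just under 2, matching the geometric series bound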

See if you can use the Master Theorem to provide an instant asymptotic solution

The Master Theorem: Let a ≥ 1 and b > 1 be constants, let f(n) be a function, and let T(n) be defined on the nonnegative integers by the recurrence

   T(n) = a T(n/b) + f(n)

where we interpret n/b to mean either ⌊n/b⌋ or ⌈n/b⌉. Then T(n) can be bounded asymptotically as follows:

1. If f(n) = O(n^{log_b a − ε}) for some constant ε > 0, then T(n) = Θ(n^{log_b a}).
2. If f(n) = Θ(n^{log_b a}), then T(n) = Θ(n^{log_b a} lg n).
3. If f(n) = Ω(n^{log_b a + ε}) for some constant ε > 0, and if a f(n/b) ≤ c f(n) for some constant c < 1 and all sufficiently large n, then T(n) = Θ(f(n)).

Examples of the Master Theorem

Which case of the Master Theorem applies?

T(n) = 4 T(n/2) + n. Reading from the equation, a = 4, b = 2, and f(n) = n. Is n = O(n^{log_2 4 − ε}) = O(n^{2 − ε})? Yes, so case 1 applies and T(n) = Θ(n²).

T(n) = 4 T(n/2) + n². Reading from the equation, a = 4, b = 2, and f(n) = n². Is n² = O(n^{log_2 4 − ε}) = O(n^{2 − ε})? No if ε > 0, but it is true if ε = 0, so case 2 applies and T(n) = Θ(n² log n).

T(n) = 4 T(n/2) + n³. Reading from the equation, a = 4, b = 2, and f(n) = n³. Is n³ = Ω(n^{log_2 4 + ε}) = Ω(n^{2 + ε})? Yes, for 0 < ε ≤ 1, so case 3 might apply. Is 4(n/2)³ ≤ c n³? Yes, for c ≥ 1/2, so there exists a c < 1 to satisfy the regularity condition, and case 3 applies: T(n) = Θ(n³).
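
For driving functions of the form f(n) = n^k, the three cases can be applied mechanically. The sketch below is added for illustration; it only handles polynomial f(n), relies on exact float comparison of the exponents, and reproduces the three examples above. The function name master_polynomial is made up.

    from math import log2

    def master_polynomial(a, b, k):
        """Classify T(n) = a*T(n/b) + n^k for polynomial driving functions."""
        crit = log2(a) / log2(b)               # log_b a, the critical exponent
        if k < crit:
            return f"case 1: Theta(n^{crit:g})"
        if k == crit:
            return f"case 2: Theta(n^{crit:g} log n)"
        # k > crit: the regularity condition a*(n/b)^k <= c*n^k holds with c = a/b^k < 1
        return f"case 3: Theta(n^{k})"

    print(master_polynomial(4, 2, 1))   # case 1: Theta(n^2)
    print(master_polynomial(4, 2, 2))   # case 2: Theta(n^2 log n)
    print(master_polynomial(4, 2, 3))   # case 3: Theta(n^3)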

Why should the Master Theorem be true?

Consider T(n) = a T(n/b) + f(n).

Suppose f(n) is small enough. Say f(n) = 0, i.e. T(n) = a T(n/b). Then we have a recursion tree where the only contribution is at the leaves. There will be log_b n levels, with a^l leaves at level l, so

   T(n) = a^{log_b n} = n^{log_b a}    (Theorem 2.9 in CLR)

So long as f(n) is small enough that it is dwarfed by this, we have case 1 of the Master Theorem!
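
The identity a^{log_b n} = n^{log_b a} behind this leaf count can be verified numerically (added check, values chosen arbitrarily):

    from math import isclose, log

    a, b = 7, 2
    for n in (16, 1024, 10 ** 6):
        lhs = a ** (log(n) / log(b))    # a^(log_b n): the number of leaves
        rhs = n ** (log(a) / log(b))    # n^(log_b a): the same quantity, rewritten
        print(isclose(lhs, rhs))        # True each time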

Suppose f(n) is large enough

If we draw the recursion tree for T(n) = a T(n/b) + f(n):

[Figure: recursion tree with f(n) at the root and f(n/b) at each of its children T(n/b), and so on.]

If f(n) is a big enough function, the one top call can be bigger than the sum of all the little calls.

Example: f(n) = n³ > (n/3)³ + (n/3)³ + (n/3)³. In fact, since a (n/3)³ = a n³/27, this holds unless a ≥ 27!

In case 3 of the Master Theorem, the additive term dominates. In case 2, both parts contribute equally, which is why the log pops up. It is (usually) what we want to have happen in a divide and conquer algorithm.

Famous Algorithms and their Recurrences

Matrix Multiplication

The standard matrix multiplication algorithm for two n × n matrices is O(n³). Strassen discovered a divide-and-conquer algorithm which takes T(n) = 7 T(n/2) + O(n²) time. Since O(n^{lg 7}) dwarfs O(n²), case 1 of the Master Theorem applies and T(n) = O(n^{2.81}). This has been "improved" by more and more complicated recurrences until the current best is O(n^{2.38}).
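
A minimal sketch of Strassen's seven-multiplication scheme (added, not from the notes): plain nested lists, n restricted to a power of two, and no cutoff to the classical algorithm, so it is illustrative rather than practical.

    def add(A, B):
        return [[x + y for x, y in zip(r, s)] for r, s in zip(A, B)]

    def sub(A, B):
        return [[x - y for x, y in zip(r, s)] for r, s in zip(A, B)]

    def strassen(A, B):
        """Multiply two n x n matrices (n a power of two) with 7 recursive products."""
        n = len(A)
        if n == 1:
            return [[A[0][0] * B[0][0]]]
        h = n // 2
        quad = lambda M, i, j: [row[j * h:(j + 1) * h] for row in M[i * h:(i + 1) * h]]
        A11, A12, A21, A22 = (quad(A, i, j) for i in (0, 1) for j in (0, 1))
        B11, B12, B21, B22 = (quad(B, i, j) for i in (0, 1) for j in (0, 1))

        M1 = strassen(add(A11, A22), add(B11, B22))
        M2 = strassen(add(A21, A22), B11)
        M3 = strassen(A11, sub(B12, B22))
        M4 = strassen(A22, sub(B21, B11))
        M5 = strassen(add(A11, A12), B22)
        M6 = strassen(sub(A21, A11), add(B11, B12))
        M7 = strassen(sub(A12, A22), add(B21, B22))

        C11 = add(sub(add(M1, M4), M5), M7)
        C12 = add(M3, M5)
        C21 = add(M2, M4)
        C22 = add(sub(add(M1, M3), M2), M6)
        return [r1 + r2 for r1, r2 in zip(C11, C12)] + \
               [r1 + r2 for r1, r2 in zip(C21, C22)]

    print(strassen([[1, 2], [3, 4]], [[5, 6], [7, 8]]))   # [[19, 22], [43, 50]]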

Polygon Triangulation

Given a polygon in the plane, add diagonals so that each face is a triangle. None of the diagonals are allowed to cross.

Triangulation is an important first step in many geometric algorithms. The simplest algorithm might be to try each pair of points and check if they see each other. If so, add the diagonal and recur on both halves, for a total of O(n³). However, Chazelle gave an algorithm which runs in T(n) = 2 T(n/2) + O(√n) time. Since n^{1/2} = O(n^{1 − ε}), by case 1 of the Master Theorem, Chazelle's algorithm is linear, i.e. T(n) = O(n).

Sorting

The classic divide and conquer recurrence is Mergesort's T(n) = 2 T(n/2) + O(n), which divides the data into equal-sized halves and spends linear time merging the halves after they are sorted.

Since n = Θ(n^{log_2 2}) = Θ(n), but n is not O(n^{1 − ε}), case 2 of the Master Theorem applies and T(n) = O(n log n).

In case 2, the divide and merge steps balance out perfectly, as we usually hope for from a divide-and-conquer algorithm.
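
A standard mergesort in Python (added sketch) whose structure mirrors the recurrence: two recursive calls on the halves plus a linear-time merge.

    def mergesort(xs):
        """Sort a list: T(n) = 2*T(n/2) + O(n), so O(n log n) overall."""
        if len(xs) <= 1:
            return xs
        mid = len(xs) // 2
        left, right = mergesort(xs[:mid]), mergesort(xs[mid:])
        # Linear-time merge of the two sorted halves.
        merged, i, j = [], 0, 0
        while i < len(left) and j < len(right):
            if left[i] <= right[j]:
                merged.append(left[i]); i += 1
            else:
                merged.append(right[j]); j += 1
        return merged + left[i:] + right[j:]

    print(mergesort([5, 2, 9, 1, 5, 6]))   # [1, 2, 5, 5, 6, 9]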

Mergesort Animations

[Figure: step-by-step mergesort animation frames, reproduced from Sedgewick.]

Approaches to Algorithm Design

Incremental: The job is partly done; do a little more, and repeat until done. A good example of this approach is insertion sort.

Divide-and-Conquer: A recursive technique. Divide the problem into sub-problems of the same kind (divide). For subproblems that are really small (trivial), solve them directly; else solve them recursively (conquer). Combine the subproblem solutions to solve the whole thing (combine). A good example of this approach is Mergesort.
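
A minimal insertion sort (added sketch) to illustrate the incremental approach: after the k-th pass, the first k+1 elements form a sorted prefix.

    def insertion_sort(xs):
        """Incremental: extend a sorted prefix one element at a time (O(n^2) worst case)."""
        xs = list(xs)
        for k in range(1, len(xs)):
            item = xs[k]
            j = k - 1
            while j >= 0 and xs[j] > item:   # shift larger elements one slot right
                xs[j + 1] = xs[j]
                j -= 1
            xs[j + 1] = item
        return xs

    print(insertion_sort([5, 2, 9, 1, 5, 6]))   # [1, 2, 5, 5, 6, 9]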