Solved Problems. Chapter 14: Probability review

Chapter 14

Solved Problems

14.1 Probability review

Problem 14.1. Let X and Y be two N_0-valued random variables such that X = Y + Z, where Z is a Bernoulli random variable with parameter p in (0,1), independent of Y. Only one of the following statements is true. Which one?

(a) X + Z and Y + Z are independent
(b) X has to be 2N_0 = {0, 2, 4, 6, ...}-valued
(c) The support of Y is a subset of the support of X
(d) E[(X + Y)Z] = E[(X + Y)] E[Z]
(e) none of the above

The correct answer is (c).

(a) False. Simply take Y = 0, so that Y + Z = Z and X + Z = 2Z.

(b) False. Take Y = 0.

(c) True. For m in the support of Y (so that P[Y = m] > 0), we have

P[X = m] >= P[Y = m, Z = 0] = P[Y = m] P[Z = 0] = P[Y = m] (1 - p) > 0.

Therefore, m is in the support of X.

(d) False. Take Y = 0; then E[(X + Y)Z] = E[Z^2] = p, while E[X + Y] E[Z] = p^2.

(e) False.

Problem 14.2. A fair die is tossed and its outcome is denoted by X, i.e.,

X ~ ( 1 2 3 4 5 6 ; 1/6 1/6 1/6 1/6 1/6 1/6 ).

After that, X independent fair coins are tossed and the number of heads obtained is denoted by Y. Compute:

1. P[Y = 4].
2. P[X = 5 | Y = 4].
3. E[Y].
4. E[XY].

Solution.

1. For k = 1, ..., 6, conditionally on X = k, Y has the binomial distribution with parameters k and 1/2. Therefore,

P[Y = i | X = k] = C(k, i) 2^{-k} for 0 <= i <= k, and P[Y = i | X = k] = 0 for i > k,

and so, by the law of total probability,

P[Y = 4] = sum_{k=1}^{6} P[Y = 4 | X = k] P[X = k] = (1/6) ( C(4,4)/2^4 + C(5,4)/2^5 + C(6,4)/2^6 ) [ = 29/384 ].   (14.1)

2. By the (idea behind the) Bayes formula,

P[X = 5 | Y = 4] = P[X = 5, Y = 4] / P[Y = 4] = P[Y = 4 | X = 5] P[X = 5] / P[Y = 4] = ( C(5,4)/2^5 ) / ( C(4,4)/2^4 + C(5,4)/2^5 + C(6,4)/2^6 ) [ = 10/29 ].

3. Since E[Y | X = k] = k/2 (the expectation of a binomial with n = k and p = 1/2), the law of total probability implies that

E[Y] = sum_{k=1}^{6} E[Y | X = k] P[X = k] = (1/6) sum_{k=1}^{6} k/2 [ = 7/4 ].

4. By the same reasoning,

E[XY] = sum_{k=1}^{6} E[XY | X = k] P[X = k] = sum_{k=1}^{6} E[kY | X = k] P[X = k] = (1/12) sum_{k=1}^{6} k^2 [ = 91/12 ].

Last Updated: December 14, 2010. Intro to Stochastic Processes: Lecture Notes.
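The four values above can be double-checked by exact enumeration; a minimal sketch (not part of the original notes), using Python's fractions module to keep the arithmetic exact:

```python
from fractions import Fraction
from math import comb

# Die-and-coins problem above: X is uniform on {1,...,6};
# given X = k, Y has the binomial distribution with parameters k and 1/2.
sixth = Fraction(1, 6)
pY4 = sum(sixth * Fraction(comb(k, 4), 2**k) for k in range(1, 7))
pX5_given_Y4 = sixth * Fraction(comb(5, 4), 2**5) / pY4
EY = sum(sixth * Fraction(k, 2) for k in range(1, 7))
EXY = sum(sixth * k * Fraction(k, 2) for k in range(1, 7))

print(pY4, pX5_given_Y4, EY, EXY)  # 29/384 10/29 7/4 91/12
```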

Problem 14.3.

1. An urn contains 1 red ball and 10 blue balls. Other than their color, the balls are indistinguishable, so if one is to draw a ball from the urn without peeking - all the balls will be equally likely to be selected. If we draw 5 balls from the urn at once and without peeking, what is the probability that this collection of 5 balls contains the red ball?
2. We roll two fair dice. What is the probability that the sum of the outcomes equals exactly 7?
3. Assume that A and B are disjoint events, i.e., assume that A ∩ B = ∅. Moreover, let P[A] = a > 0 and P[B] = b > 0. Calculate P[A ∪ B] and P[A ∩ B], using the values a and b.

Solution.

1. P["the red ball is selected"] = C(10,4)/C(11,5) = 5/11.
2. There are 36 possible outcomes (pairs of numbers) of the above roll. Out of those, the following have the sum equal to 7: (1,6), (2,5), (3,4), (4,3), (5,2), (6,1). Since the dice are fair, all outcomes are equally likely. So, the probability is 6/36 = 1/6.
3. According to the axioms of probability:

P[A ∪ B] = P[A] + P[B] = a + b,   P[A ∩ B] = P[∅] = 0.

Problem 14.4.

1. Consider an experiment which consists of 2 independent coin-tosses. Let the random variable X denote the number of heads appearing. Write down the probability mass function of X.
2. There are 10 balls in an urn numbered 1 through 10. You randomly select 3 of those balls. Let the random variable Y denote the maximum of the three numbers on the extracted balls. Find the probability mass function of Y. You should simplify your answer to a fraction that does not involve binomial coefficients. Then calculate: P[Y >= 7].
3. A fair die is tossed 7 times. We say that a toss is a success if a 5 or 6 appears; otherwise it's a failure. What is the distribution of the random variable X representing the number of successes out of the 7 tosses? What is the probability that there are exactly 3 successes? What is the probability that there are no successes?
4. The number of misprints per page of text is commonly modeled by a Poisson distribution.
It is given that the parameter of this distribution is λ = 0.2 for a particular book. Find the probability that there are exactly 2 misprints on a given page in the book. How about the probability that there are 2 or more misprints?
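The first two answers of the urn-and-dice problem above lend themselves to brute-force verification; a small Python sketch (labeling ball 0 as the red ball is my own convention):

```python
from fractions import Fraction
from itertools import combinations, product
from math import comb

# Draw 5 of 11 balls; ball 0 plays the role of the red one.
draws = list(combinations(range(11), 5))
p_red = Fraction(sum(1 for d in draws if 0 in d), len(draws))
print(p_red)  # 5/11, i.e. C(10,4)/C(11,5)
assert p_red == Fraction(comb(10, 4), comb(11, 5))

# Sum of two fair dice equals 7.
rolls = list(product(range(1, 7), repeat=2))
p_seven = Fraction(sum(1 for a, b in rolls if a + b == 7), len(rolls))
print(p_seven)  # 1/6
```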

Solution.

1. p_0 = P[{(T,T)}] = 1/4, p_1 = P[{(T,H),(H,T)}] = 1/2, p_2 = P[{(H,H)}] = 1/4, and p_k = 0 for all other k.

2. The random variable Y can take the values in the set {3, 4, ..., 10}. For any i, the triplet resulting in Y attaining the value i must consist of the ball numbered i and a pair of balls with lower numbers. So,

p_i = P[Y = i] = C(i-1, 2) / C(10, 3) = (i-1)(i-2)/240, i = 3, ..., 10.

Since the balls are numbered 1 through 10, we have

P[Y >= 7] = P[Y = 7] + P[Y = 8] + P[Y = 9] + P[Y = 10] = (30 + 42 + 56 + 72)/240 = 200/240 = 5/6.

3. X has a binomial distribution with parameters n = 7 and p = 1/3, i.e., X ~ b(7, 1/3).

P[X = 3] = C(7,3) (1/3)^3 (2/3)^4 = 560/2187,
P[X = 0] = (2/3)^7 = 128/2187.

4. Let X denote the random variable which stands for the number of misprints on a given page. Then

P[X = 2] = (0.2)^2 e^{-0.2} / 2! = 0.02 e^{-0.2},
P[X >= 2] = 1 - P[X < 2] = 1 - (P[X = 0] + P[X = 1]) = 1 - ( (0.2)^0 e^{-0.2}/0! + (0.2)^1 e^{-0.2}/1! ) = 1 - 1.2 e^{-0.2} ≈ 0.0175.
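The pmf of the maximum, the binomial probability, and the Poisson tail computed above can all be confirmed numerically; a short sketch:

```python
from fractions import Fraction
from itertools import combinations
from math import comb, exp

# Y = maximum of 3 balls drawn from {1,...,10}.
triples = list(combinations(range(1, 11), 3))
pmf = {i: Fraction(sum(1 for t in triples if max(t) == i), len(triples))
       for i in range(3, 11)}
assert all(pmf[i] == Fraction((i - 1) * (i - 2), 240) for i in pmf)
p_tail = sum(pmf[i] for i in range(7, 11))
print(p_tail)  # 5/6

# X ~ b(7, 1/3): exactly 3 successes.
p_three = Fraction(comb(7, 3)) * Fraction(1, 3)**3 * Fraction(2, 3)**4
print(p_three)  # 560/2187

# Poisson(0.2): probability of 2 or more misprints.
print(1 - 1.2 * exp(-0.2))  # roughly 0.0175
```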

Problem 14.5. Let X and Y be two Bernoulli random variables with the same parameter p = 1/2. Can the support of their sum be equal to {0, 1}? How about the case where p is not necessarily equal to 1/2? Note that no particular dependence structure between X and Y is assumed.

Solution. Let p_ij, i = 0,1, j = 0,1, be defined by p_ij = P[X = i, Y = j]. These four numbers effectively specify the full dependence structure of X and Y (in other words, they completely determine the distribution of the random vector (X, Y)). Since we are requiring that both X and Y be Bernoulli with parameter p, we must have

p = P[X = 1] = P[X = 1, Y = 0] + P[X = 1, Y = 1] = p_10 + p_11.   (14.2)

Similarly, we must have

1 - p = p_00 + p_01,   (14.3)
p = p_01 + p_11,       (14.4)
1 - p = p_00 + p_10.   (14.5)

Suppose now that the support of X + Y equals {0, 1}. Then p_00 > 0 and p_01 + p_10 > 0, but p_11 = 0 (why?). Then, the relation (14.2) implies that p_10 = p. Similarly, p_01 = p by relation (14.4). Relations (14.3) and (14.5) tell us that p_00 = 1 - 2p. When p = 1/2, this implies that p_00 = 0 - a contradiction with the fact that 0 is in the support of X + Y. When p < 1/2, there is still hope. We construct X and Y as follows: let X be a Bernoulli random variable with parameter p. Then, we define Y depending on the value of X. If X = 1, we set Y = 0. If X = 0, we set Y = 0 with probability (1-2p)/(1-p) and Y = 1 with probability p/(1-p). How do we know that Y is Bernoulli with parameter p? We use the law of total probability:

P[Y = 0] = P[Y = 0 | X = 0] P[X = 0] + P[Y = 0 | X = 1] P[X = 1] = ((1-2p)/(1-p))(1-p) + p = 1 - p.

Similarly,

P[Y = 1] = P[Y = 1 | X = 0] P[X = 0] + P[Y = 1 | X = 1] P[X = 1] = (p/(1-p))(1-p) = p.

14.2 Random Walks

Problem 14.6. Let {X_n}_{n in N_0} be a symmetric simple random walk. For n in N, the average of the random walk on the interval [0, n] is defined by

A_n = (1/n) sum_{k=1}^{n} X_k.

1. Is {A_n} a simple random walk (not necessarily symmetric)? Explain carefully using the definition.
2. Compute the covariance Cov(X_k, X_l) = E[(X_k - E[X_k])(X_l - E[X_l])], for k <= l in N.

3. Compute the variance of A_n, for n in N. (Note: You may need to use the following identities:

sum_{k=1}^{n} k = n(n+1)/2 and sum_{k=1}^{n} k^2 = n(n+1)(2n+1)/6, for n in N.)

Solution.

1. No it is not. The distribution of A_2 = (X_1 + X_2)/2 = ξ_1 + ξ_2/2 is

A_2 ~ ( -3/2 -1/2 1/2 3/2 ; 1/4 1/4 1/4 1/4 ).

In particular, its support is {-3/2, -1/2, 1/2, 3/2}. For n = 1, A_1 = X_1, which has the support {-1, 1}. Therefore, the support of the difference A_2 - A_1 cannot be {-1, 1} (indeed, the difference must be able to take fractional values). This is in contradiction with the definition of the random walk, which states that all increments must have distributions supported by {-1, 1}.

2. We write X_k = sum_{i=1}^{k} ξ_i and X_l = sum_{j=1}^{l} ξ_j, where {ξ_n}_{n in N} are the (iid) increments of {X_n}_{n in N_0}. Since E[X_k] = E[X_l] = 0, we have

Cov(X_k, X_l) = E[X_k X_l] = E[ (sum_{i=1}^{k} ξ_i)(sum_{j=1}^{l} ξ_j) ].

When the sum above is expanded, the terms of the form E[ξ_i ξ_j], for i ≠ j, will disappear because ξ_i and ξ_j are independent for i ≠ j. The only terms left are those of the form E[ξ_i ξ_i]. Their value is 1, and there are k of them (remember k <= l). Therefore,

Cov(X_k, X_l) = k.

3. Let B_n = sum_{k=1}^{n} X_k = n A_n. We know that E[A_n] = 0, and that Var[A_n] = Var[B_n]/n^2, so it suffices to compute

Var[B_n] = E[B_n^2] = E[ (sum_{k=1}^{n} X_k)(sum_{l=1}^{n} X_l) ].

We expand the sum on the right and group together equal (k = l) and different (k ≠ l) indices to get

Var[B_n] = sum_{k=1}^{n} Var[X_k] + 2 sum_{k<l<=n} Cov(X_k, X_l) = sum_{k=1}^{n} k + 2 sum_{k<l<=n} k.

For each k = 1, 2, ..., n-1, there are exactly n - k values of l such that k < l <= n. Therefore, sum_{k<l<=n} k = sum_{k=1}^{n-1} k(n-k), and so

Var[B_n] = sum_{k=1}^{n} k + 2 sum_{k=1}^{n-1} k(n-k) = n(n+1)/2 + 2 n(n-1)(n+1)/6 = n(n+1)(2n+1)/6.

Therefore,

Var[A_n] = Var[B_n]/n^2 = (n+1)(2n+1)/(6n).

Another way to approach this computation uses the representation X_k = sum_{i=1}^{k} ξ_i, so

Var[B_n] = E[ (sum_{k=1}^{n} X_k)^2 ] = E[ (sum_{k=1}^{n} sum_{i=1}^{k} ξ_i)^2 ] = E[ (sum_{i=1}^{n} (n-i+1) ξ_i)^2 ] = sum_{i=1}^{n} (n-i+1)^2 = sum_{i=1}^{n} i^2 = n(n+1)(2n+1)/6,

where we use the fact that E[ξ_i ξ_j] = 0 for i ≠ j.

Problem 14.7. Let {X_n}_{n in N_0} be a simple random walk. The area A_n under the random walk on the interval [0, n] is defined by

A_n = sum_{k=1}^{n-1} X_k + (1/2) X_n.

(Note: Draw a picture of a path of a random walk and convince yourself that the expression above is really the area under the graph.) Compute the expectation and variance of A_n.

Problem 14.8. Let {X_n}_{n in N_0} be a simple symmetric random walk (X_0 = 0, p = q = 1/2). What is the probability that X_n will visit all the points in the set {1, 2, 3, 4} at least once during the time-interval n in {0, ..., 10}? (Note: write your answer as a sum of binomial coefficients; you don't have to evaluate it any further.)

Solution. The random walk will visit all the points in {1, 2, 3, 4} if and only if its maximum M_10 during {0, ..., 10} is at least equal to 4. Since we start at X_0 = 0, visiting all of {1, 2, 3, 4} at least once is the same as visiting 4 at least once. Therefore, the required probability is P[M_10 >= 4]. We know (from the lecture notes) that

P[M_n = l] = P[X_n = l] + P[X_n = l + 1].

Therefore,

P[M_10 >= 4] = sum_{l=4}^{10} ( P[X_10 = l] + P[X_10 = l+1] )
            = P[X_10 = 4] + 2 P[X_10 = 6] + 2 P[X_10 = 8] + 2 P[X_10 = 10]
            = 2^{-10} ( C(10,7) + 2 C(10,8) + 2 C(10,9) + 2 C(10,10) ) [ = 232/1024 = 29/128 ].

Problem 14.9. Let {X_n}_{n in N_0} be a simple symmetric random walk (X_0 = 0, p = q = 1/2). What is the probability that X_n will not visit the point 5 during the time-interval n in {0, ..., 10}?

Solution. The random walk will not visit the point 5 if and only if its maximum M_10 during {0, ..., 10} is at most equal to 4. Therefore, the required probability is P[M_10 <= 4]. We know (from the lecture notes) that P[M_n = l] = P[X_n = l] + P[X_n = l + 1]. Therefore,

P[M_10 <= 4] = sum_{l=0}^{4} ( P[X_10 = l] + P[X_10 = l+1] )
            = P[X_10 = 0] + 2 P[X_10 = 2] + 2 P[X_10 = 4]
            = 2^{-10} ( C(10,5) + 2 C(10,6) + 2 C(10,7) ) [ = 912/1024 = 57/64 ].

Problem 14.10. A fair coin is tossed repeatedly and the record of the outcomes is kept. Tossing stops the moment the total number of heads obtained so far exceeds the total number of tails by 3. For example, a possible sequence of tosses could look like HHTTTHTHHTHH. What is the probability that the length of such a sequence is at most 10?

Solution. Let X_n, n in N_0, be the number of heads minus the number of tails obtained so far. Then {X_n}_{n in N_0} is a simple symmetric random walk, and we stop tossing the coin when X hits 3 for the first time. This will happen during the first 10 tosses if and only if M_10 >= 3, where M_n denotes the (running) maximum of X. According to the reflection principle,

P[M_10 >= 3] = P[X_10 >= 3] + P[X_10 >= 4] = 2 ( P[X_10 = 4] + P[X_10 = 6] + P[X_10 = 8] + P[X_10 = 10] )
            = 2^{-9} ( C(10,7) + C(10,8) + C(10,9) + C(10,10) ) [ = 11/32 ].

14.3 Generating functions

Problem 14.11. If P(s) is the generating function of the random variable X, then the generating function of 2X + 1 is

(a) P(s + 1)
(b) P(s) + 1
(c) 2P(s + 1)
(d) sP(s^2)
(e) none of the above

The correct answer is (d): using the representation P_Y(s) = E[s^Y], we get

P_{2X+1}(s) = E[s^{2X+1}] = E[s (s^2)^X] = s E[(s^2)^X] = s P_X(s^2).

Alternatively, we can use the fact that

P[2X + 1 = k] = 0 for k in {0, 2, 4, ...}, and P[2X + 1 = k] = p_{(k-1)/2} for k in {1, 3, 5, ...},

where p_n = P[X = n], n in N_0, so that

P_{2X+1}(s) = p_0 s + 0 s^2 + p_1 s^3 + 0 s^4 + p_2 s^5 + ... = s P_X(s^2).

Problem 14.12. Let P(s) be the generating function of the sequence (p_0, p_1, ...) and Q(s) the generating function of the sequence (q_0, q_1, ...). If the sequence {r_n}_{n in N_0} is defined by

r_n = 0 for n <= 1, and r_n = sum_{k=1}^{n-1} p_k q_{n-k-1} for n > 1,

then its generating function is given by (Note: Careful! {r_n}_{n in N_0} is not exactly the convolution of {p_n}_{n in N_0} and {q_n}_{n in N_0}.)

(a) P(s)Q(s) - p_0 q_0
(b) (P(s) - p_0)(Q(s) - q_0)
(c) s (P(s) - p_0) Q(s)
(d) s P(s) (Q(s) - q_0)
(e) s (P(s) - p_0)(Q(s) - q_0)

The correct answer is (c). Let R(s) be the generating function of the sequence {r_n}_{n in N_0}, and let r̂ be the convolution r̂_n = sum_{k=0}^{n} p_k q_{n-k} of the sequences {p_n}_{n in N_0} and {q_n}_{n in N_0}, so that the generating function of r̂ is P(s)Q(s). For n >= 1,

r̂_n = sum_{k=0}^{n} p_k q_{n-k} = r_{n+1} + p_0 q_n.

Therefore, for s in (0, 1),

P(s)Q(s) = sum_{n=0}^{∞} r̂_n s^n = r̂_0 + sum_{n=1}^{∞} (r_{n+1} + p_0 q_n) s^n
         = p_0 q_0 + (1/s)(R(s) - r_1 s - r_0) + p_0 (Q(s) - q_0) = p_0 Q(s) + (1/s) R(s).

Therefore, R(s) = s (P(s) - p_0) Q(s).

Problem 14.13. A fair coin and a fair 6-sided die are thrown repeatedly until the first time a 6 appears on the die. Let X be the number of heads obtained (we are including the heads that may have occurred together with the first 6) in the count. The generating function of X is

(a) s/(6 - 5s)
(b) s/(6 - 5s) + s^2/(6 - 5s)
(c) (1 + s)/(7 - 5s)
(d) (1 + s)/(7 - 4s)
(e) None of the above

Solution. The correct answer is (c). X can be seen as a random sum of 1/2-Bernoulli random variables with the number of summands given by G, the number of tries it takes to get the first 6. G is clearly geometrically distributed with parameter 1/6. The generating function of G is P_G(s) = s/(6 - 5s), so the generating function of the whole random sum X is

P_X(s) = P_G( (1 + s)/2 ) = (1 + s)/(7 - 5s).

Problem 14.14. The generating function of the N_0-valued random variable X is given by

P_X(s) = (1 - sqrt(1 - s^2)) / s.

1. Compute p_0 = P[X = 0].
2. Compute p_1 = P[X = 1].
3. Does E[X] exist? If so, find its value; if not, explain why not.

Solution.

1. p_0 = P_X(0), so p_0 = 0.

2. p_1 = P'_X(0), and

P'_X(s) = (1 - sqrt(1 - s^2)) / (s^2 sqrt(1 - s^2)),

so p_1 = 1/2.

3. If E[X] existed, it would be equal to P'_X(1). However,

lim_{s→1-} P'_X(s) = +∞,

so E[X] (and, equivalently, P'_X(1)) does not exist.

Problem 14.15. Let N be a random time, independent of {ξ_n}_{n in N}, where {ξ_n}_{n in N} is a sequence of mutually independent Bernoulli ({0,1}-valued) random variables with parameter p_B in (0,1). Suppose that N has a geometric distribution g(p_g) with parameter p_g in (0,1). Compute the distribution of the random sum

Y = sum_{k=1}^{N} ξ_k.

(Note: You can think of Y as a binomial random variable with random n.)

Solution. Independence between N and {ξ_n}_{n in N} allows us to use the fact that the generating function P_Y(s) of Y is given by P_Y(s) = P_N(P_B(s)), where P_N(s) = p_g/(1 - q_g s) is the generating function of N (geometric distribution) and P_B(s) = q_B + p_B s is the generating function of each ξ_k (Bernoulli distribution). Therefore,

P_Y(s) = p_g / (1 - q_g (q_B + p_B s)) = ( p_g/(1 - q_g q_B) ) / ( 1 - (q_g p_B/(1 - q_g q_B)) s ) = p_Y / (1 - q_Y s),

where p_Y = p_g/(1 - q_g q_B) and q_Y = 1 - p_Y. P_Y can be recognized as the generating function of a geometric random variable with parameter p_Y.

Problem 14.16. Six fair gold coins are tossed, and the total number of tails is recorded; let's call this number N. Then, a set of three fair silver coins is tossed N times. Let X be the total number of times at least two heads are observed (among the N tosses of the set of silver coins). (Note: A typical outcome of such a procedure would be the following: out of the six gold coins 4 were tails and 2 were heads. Therefore N = 4, and the 4 tosses of the set of three silver coins may look something like {HHT, THT, TTT, HTH}, so that X = 2 in this state of the world.) Find the generating function and the pmf of X. You don't have to evaluate binomial coefficients.

Solution. Let H_k, k in N, be the number of heads on the k-th toss of the set of three silver coins. The distribution of H_k is binomial, so P[H_k >= 2] = P[H_k = 2] + P[H_k = 3] = 3/8 + 1/8 = 1/2.
Let ξ_k be the indicator

ξ_k = 1_{H_k >= 2} = 1 if H_k >= 2, and 0 if H_k < 2.

The generating function P_ξ(s) of each ξ_k is the generating function of a Bernoulli random variable with p = 1/2, i.e., P_ξ(s) = (1 + s)/2. The random variable N has a binomial distribution with parameters n = 6 and p = 1/2. Therefore, P_N(s) = ((1 + s)/2)^6. By a theorem from the notes, the generating function of the value of the random sum X is given by

P_X(s) = P_N(P_ξ(s)) = ( (1 + (1 + s)/2)/2 )^6 = ( (3 + s)/4 )^6.

Therefore, X is a binomial random variable with parameters n = 6 and p = 1/4, i.e.,

p_k = P[X = k] = C(6, k) (1/4)^k (3/4)^{6-k}, k = 0, ..., 6.

Problem 14.17. Tony Soprano collects his cut from the local garbage management companies. During a typical day he can visit a geometrically distributed number of companies with parameter p = 0.1. According to many years' worth of statistics gathered by his consigliere Silvio Dante, the amount he collects from the i-th company is random with the following distribution:

X_i ~ ( $1000 $2000 $3000 ; 0.1 0.6 0.3 ).

The amounts collected from different companies are independent of each other, and of the number of companies visited.

1. Find the (generating function of) the distribution of the amount of money S that Tony will collect on a given day.
2. Compute E[S] and P[S > 0].

Solution.

1. The number N of companies visited has the generating function P_N(s) = 0.1/(1 - 0.9s), and the generating function of the amount X_i collected from each one is

P_X(s) = 0.1 s^1000 + 0.6 s^2000 + 0.3 s^3000.

The total amount collected is given by the random sum S = sum_{i=1}^{N} X_i, so the generating function of S is given by

P_S(s) = P_N(P_X(s)) = 0.1 / ( 1 - 0.9 (0.1 s^1000 + 0.6 s^2000 + 0.3 s^3000) ).

2. The expectation of S is given by

E[S] = P'_S(1) = P'_N(P_X(1)) P'_X(1) = E[N] E[X_1] = 9 · $2200 = $19,800.

To compute P[S > 0], we note that S = 0 if and only if no collections have been made, i.e., if the value of N is equal to 0. Since N is geometric with parameter p = 0.1,

P[S > 0] = 1 - P[S = 0] = 1 - P[N = 0] = 1 - p = 0.9.

Problem 14.18. Let X be an N_0-valued random variable, and let P(s) be its generating function. Then the generating function of 4X is given by

1. (P(s))^4,
2. P(P(P(P(s)))),
3. P(s^4),
4. 4P(s),
5. none of the above.

(3)

Problem 14.19. Let X and Y be two N_0-valued random variables, let P_X(s) and P_Y(s) be their generating functions, and let Z = X - Y, V = X + Y, W = XY. Then

1. P_X(s) = P_Z(s) P_Y(s),
2. P_X(s) P_Y(s) = P_Z(s),
3. P_W(s) = P_X(P_Y(s)),
4. P_Z(s) P_V(s) = P_X(s) P_Y(s),
5. none of the above.

(5)

Problem 14.20. Let X be an N_0-valued random variable, let P(s) be its generating function, and let Q(s) = P(s)/(1 - s). Then

1. Q(s) is a generating function of a random variable,
2. Q(s) is a generating function of a non-decreasing sequence of non-negative numbers,
3. Q(s) is a concave function on (0, 1),
4. Q(0) = 1,
5. none of the above.

(2)
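Tony's daily take is a random sum, so E[S] and P[S > 0] can be sanity-checked by simulation. In the sketch below the payout distribution ($1000, $2000, $3000 with probabilities 0.1, 0.6, 0.3) and p = 0.1 are assumptions taken from the problem as reconstructed above:

```python
import random

random.seed(0)

# N ~ geometric on {0, 1, 2, ...} with P[N = 0] = 0.1; S = sum of N iid payouts.
def one_day(p=0.1):
    total = 0
    while random.random() > p:  # each "failure" = one more company visited
        total += random.choices([1000, 2000, 3000], weights=[0.1, 0.6, 0.3])[0]
    return total

days = [one_day() for _ in range(100_000)]
mean_S = sum(days) / len(days)
p_positive = sum(d > 0 for d in days) / len(days)
print(mean_S)      # close to E[S] = 9 * 2200 = 19800
print(p_positive)  # close to P[S > 0] = 0.9
```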

14.4 Random walks - advanced methods

Problem 14.21. Let {X_n}_{n in N_0} be a simple random walk with p = P[X_1 = 1] in (0,1). Then

(a) X will visit every position l in {..., -2, -1, 0, 1, 2, ...} sooner or later, with probability 1.
(b) E[X_n^2] < E[X_{n+1}^2], for all n in N_0.
(c) E[X_n] < E[X_{n+1}], for all n in N_0.
(d) X_n has a binomial distribution.
(e) None of the above.

The correct answer is (b).

(a) False. When p < 1/2, the probability of ever hitting the level l = 1 is less than 1.

(b) True. X_{n+1} = X_n + ξ_{n+1}, so that, by the independence of X_n and ξ_{n+1}, we have

E[X_{n+1}^2] = E[X_n^2] + 2 E[X_n] E[ξ_{n+1}] + E[ξ_{n+1}^2] > E[X_n^2],

because 2 E[X_n] E[ξ_{n+1}] = 2n(2p - 1)^2 >= 0 and E[ξ_{n+1}^2] = 1 > 0.

(c) False. E[X_{n+1}] - E[X_n] = 2p - 1, which could be negative if p < 1/2.

(d) False. The support of X_n is {-n, -n+2, ..., n-2, n}, while the support of any binomial distribution has the form {0, 1, ..., N}, for some N in N.

(e) False.

Problem 14.22. Throughout the problem, we let τ_k be the first hitting time of the level k, for k in Z = {..., -2, -1, 0, 1, 2, ...}, for a simple random walk {X_n}_{n in N_0} with p = P[X_1 = 1] in (0,1). Furthermore, let P_k be the generating function of the distribution of τ_k.

1. When p = 1/2, we have

(a) P[τ_1 < ∞] = 1, E[τ_1] < ∞,
(b) P[τ_1 < ∞] = 0, E[τ_1] < ∞,
(c) P[τ_1 < ∞] = 1, E[τ_1] = ∞,
(d) P[τ_1 < ∞] in (0,1), E[τ_1] = ∞,
(e) none of the above.

2. Either one of the following 4 random times is not a stopping time for a simple random walk {X_n}_{n in N_0}, or they all are. Choose the one which is not in the first case, or choose (e) if you think they all are.

(a) the first hitting time of the level 4,

(b) the first time n such that X_n >= X_{n-1} >= X_{n-2},
(c) the first time the walk hits the level 3 or the first time the walk sinks below -5, whatever happens first,
(d) the second time the walk crosses the level 5 or the third time the walk crosses the level 1, whatever happens last,
(e) none of the above.

3. We know that P_1 (the generating function of the distribution of the first hitting time of level 1) satisfies the following equation:

P_1(s) = ps + qs P_1(s)^2.

The generating function P_{-1} of the first hitting time τ_{-1} of the level -1 then satisfies

(a) P_{-1}(s) = ps + qs P_{-1}(s)^2,
(b) P_{-1}(s) = qs + ps P_{-1}(s)^2,
(c) P_{-1}(s) = ps - qs P_{-1}(s)^2,
(d) P_{-1}(s) = ps + qs P_{-1}(s),
(e) none of the above.

4. Let P_2 be the generating function of the first hitting time of the level 2. Then

(a) P_2(s) = ps + qs P_2(s)^4,
(b) P_2(s) = ps + qs P_2(s)^2,
(c) P_2(s) = 2 P_1(s),
(d) P_2(s) = P_1(s)^2,
(e) none of the above.

Solution.

1. The correct answer is (c); see the table at the very end of Lecture 7.
2. The correct answer is (e); first, second, or ... hitting times are stopping times, and so are their minima or maxima. Note that for two stopping times T_1 and T_2, the one that happens first is min(T_1, T_2) and the one that happens last is max(T_1, T_2).
3. The correct answer is (b); hitting -1 is the same as hitting 1, if you switch the roles of p and q.
4. The correct answer is (d); to hit 2, you must first hit 1 and then hit 2, starting from 1. The times between these two are independent, and distributed like τ_1.
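The functional equation P_1(s) = ps + qs P_1(s)^2 has the well-known closed-form solution P_1(s) = (1 - sqrt(1 - 4pqs^2))/(2qs); a quick numerical check (the value p = 0.6 is an arbitrary choice):

```python
from math import sqrt

p, q = 0.6, 0.4

def P1(s):
    # closed-form generating function of tau_1 for a simple random walk
    return (1 - sqrt(1 - 4 * p * q * s * s)) / (2 * q * s)

# P1 solves the quadratic functional equation P1 = p s + q s P1^2 ...
for s in (0.1, 0.5, 0.9):
    assert abs(P1(s) - (p * s + q * s * P1(s) ** 2)) < 1e-12

# ... and P1(1) = P[tau_1 < infinity] = 1 here, since p >= 1/2.
print(P1(1.0))
```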

14.5 Branching processes

Problem 14.23. Let {Z_n}_{n in N_0} be a simple branching process which starts from one individual. Each individual has exactly three children, each of whom survives until reproductive age with probability 0 < p < 1, and dies before he/she is able to reproduce with probability q = 1 - p, independently of his/her siblings. The children that reach reproductive age reproduce according to the same rule.

1. Write down the generating function for the offspring distribution.
2. For what values of p will the population go extinct with probability 1? (Hint: You don't need to compute much. Just find the derivative P'(1) and remember the picture from class.)

Solution.

1. The number of children who reach reproductive age is binomially distributed with parameters 3 and p. Therefore, P(s) = (ps + q)^3.
2. We know that the extinction happens with probability 1 if and only if the graphs of the functions s and P(s) meet only at s = 1, for s in [0, 1]. They will meet once again somewhere on [0, 1) if and only if P'(1) > 1. Therefore, the extinction happens with certainty if P'(1) = 3p(p + q)^2 = 3p <= 1, i.e., if p <= 1/3.

Problem 14.24. Let {Z_n}_{n in N_0} be a branching process with offspring distribution (p_0, p_1, ...). We set P(s) = sum_{n=0}^{∞} p_n s^n. The extinction is inevitable if

(a) p_0 > 0
(b) P(1/2) > 1/4
(c) P'(1) < 1
(d) P'(1) >= 1
(e) None of the above

The correct answer is (c).

(a) False. Take p_0 = 1/3, p_1 = 0, p_2 = 2/3, and p_n = 0, n > 2. Then the equation P(s) = s reads 1/3 + (2/3)s^2 = s, and s = 1/2 is its smallest solution. Therefore, the probability of extinction is 1/2.

(b) False. Take p_0 = 0, p_1 = 1/2, p_2 = 1/2, and p_n = 0, n > 2. Then the equation P(s) = s reads (1/2)s + (1/2)s^2 = s, and s = 0 is its smallest solution. Therefore, the probability of extinction is 0, but P(1/2) = 1/4 + 1/8 > 1/4.

(c) True. The condition P'(1) < 1, together with the convexity of the function P, implies that the graph of P lies strictly above the 45-degree line for s in (0, 1). Therefore, the only solution of the equation P(s) = s on [0, 1] is s = 1.

(d) False. Take the same example as in (a), so that P'(1) = 4/3 > 1. On the other hand, the extinction equation P(s) = s reads 1/3 + (2/3)s^2 = s. Its smallest solution is s = 1/2.

(e) False.

Problem 14.25. Bacteria reproduce by cell division. In a unit of time, a bacterium will either die (with probability 1/4), stay the same (with probability 1/4), or split into 2 parts (with probability 1/2). The population starts with 100 bacteria at time n = 0.

1. Write down the expression for the generating function of the distribution of the size of the population at time n in N_0. (Note: you can use n-fold composition of functions.)
2. Compute the extinction probability for the population.
3. Let m_n be the largest possible number of bacteria at time n. Find m_n and compute the probability that there are exactly m_n bacteria in the population at time n, n in N_0.
4. Given that there are 1000 bacteria in the population at time 50, what is the expected number of bacteria at time 51?

Solution.

1. The generating function for the offspring distribution is P(s) = 1/4 + (1/4)s + (1/2)s^2. We can think of the population Z_n at time n as the sum of 100 independent populations, each one produced by one of the 100 bacteria in the initial population. The generating function for the size of generation n of the population produced by a single bacterium is the n-fold composition

P^{(n)}(s) = P(P(... P(s) ...)).

Therefore, since sums of independent random variables correspond to products in the world of generating functions, the generating function of Z_n is given by

P_{Z_n}(s) = [ P^{(n)}(s) ]^{100}.

2. As in the practice problem, the extinction probability of the entire population (which starts from 100 bacteria) is the 100-th power of the extinction probability p* of the population which starts from a single bacterium. To compute p*, we write down the extinction equation

s = 1/4 + (1/4)s + (1/2)s^2.
Its solutions are s = 1 and s = 1/2, so p* = 1/2. Therefore, the extinction probability for the entire population is (1/2)^100.
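The single-bacterium extinction probability 1/2 can also be estimated by direct simulation; a rough Monte Carlo sketch (not from the notes; the truncation constants are arbitrary):

```python
import random

random.seed(1)

# Offspring pmf: 0, 1, 2 with probabilities 1/4, 1/4, 1/2.
def offspring():
    u = random.random()
    return 0 if u < 0.25 else (1 if u < 0.5 else 2)

def line_dies_out(cap=500, max_generations=200):
    z = 1
    for _ in range(max_generations):
        if z == 0:
            return True
        if z > cap:  # a population this large essentially never dies out
            return False
        z = sum(offspring() for _ in range(z))
    return z == 0

trials = 10_000
est = sum(line_dies_out() for _ in range(trials)) / trials
print(est)  # should be close to 1/2
```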

3. The maximum size of the population at time n will occur if each of the initial 100 bacteria splits into 2 each time, so m_n = 100 · 2^n. For that to happen, there must be 100 splits to produce the first generation, 200 for the second, 400 for the third, ..., and 2^{n-1} · 100 for the n-th generation. Each one of those splits happens with probability 1/2 and is independent of the other splits in its generation, given the previous generations. It is clear that for Z_n = m_n, we must have Z_{n-1} = m_{n-1}, ..., Z_1 = m_1, so

P[Z_n = m_n] = P[Z_n = m_n, Z_{n-1} = m_{n-1}, ..., Z_1 = m_1, Z_0 = m_0]
= P[Z_n = m_n | Z_{n-1} = m_{n-1}] P[Z_{n-1} = m_{n-1} | Z_{n-2} = m_{n-2}] ... P[Z_0 = m_0]
= (1/2)^{m_{n-1}} (1/2)^{m_{n-2}} ... (1/2)^{m_0} = (1/2)^{100(1 + 2 + ... + 2^{n-1})} = (1/2)^{100(2^n - 1)}.

4. The expected number of offspring of each of the 1000 bacteria is given by P'(1) = 5/4. Therefore, the total expected number of bacteria at time 51 is 1000 · 5/4 = 1250.

Problem 14.26. A branching process starts from 10 individuals, and each reproduces according to the probability distribution (p_0, p_1, p_2, ...), where p_0 = 1/4, p_1 = 1/4, p_2 = 1/2, p_n = 0, for n > 2. The extinction probability for the whole population is equal to

(a) 1/2
(b) 1/32
(c) 1/20
(d) 1/100
(e) 1/1024

The correct answer is (e). The extinction probability for the population starting from each of the 10 initial individuals is given by the smallest solution of 1/4 + (1/4)s + (1/2)s^2 = s, which is s = 1/2. Therefore, the extinction probability for the whole population is (1/2)^10 = 1/1024.

Problem 14.27. A (solitaire) game starts with N = 3 silver dollars in the pot. At each turn the number of silver dollars in the pot is counted (call it K) and the following procedure is repeated K times: a die is thrown, and according to the outcome the following four things can happen:

If the outcome is 1 or 2, the player takes 1 silver dollar from the pot.

If the outcome is 3, nothing happens.
If the outcome is 4, the player puts 1 extra silver dollar in the pot (you can assume that the player has an unlimited supply of silver dollars).

If the outcome is 5 or 6, the player puts 2 extra silver dollars in the pot.

If there are no silver dollars in the pot, the game stops.

1. Compute the expected number of silver dollars in the pot after turn n in N.
2. Compute the probability that the game will stop eventually.
3. Let m_n be the maximal possible number of silver dollars in the pot after the n-th turn. What is the probability that the actual number of silver dollars in the pot after n turns is equal to m_n - 1?

Solution. The number of silver dollars in the pot follows a branching process {Z_n}_{n in N_0} with Z_0 = 3 and offspring distribution whose generating function P(s) is given by

P(s) = 1/3 + (1/6)s + (1/6)s^2 + (1/3)s^3.

1. The generating function P_{Z_n} of Z_n is (P(P(... P(s) ...)))^3, so

E[Z_n] = P'_{Z_n}(1) = 3 (P'(1))^n = 3 (1/6 + 2/6 + 1)^n = 3 (3/2)^n = 3^{n+1}/2^n.

2. The probability p that the game will stop is the extinction probability of the process. We solve the extinction equation P(s) = s to get the extinction probability corresponding to the case Z_0 = 1, where there is 1 silver dollar in the pot:

1/3 + (1/6)s + (1/6)s^2 + (1/3)s^3 = s, i.e., 2s^3 + s^2 - 5s + 2 = (s - 1)(s + 2)(2s - 1) = 0.

The smallest solution in [0, 1] of the equation above is 1/2. To get the true (corresponding to Z_0 = 3) extinction probability we raise it to the third power to get p = 1/8.

3. The maximal number m_n of silver dollars in the pot is achieved if we roll 5 or 6 each time. In that case, there will be 3 rolls in the first turn, 9 = 3^2 in the second, 27 = 3^3 in the third, etc. That means that after n turns, there will be at most m_n = 3^{n+1} silver dollars in the pot. In order to have exactly m_n - 1 silver dollars in the pot after n turns, we must keep getting 5s and 6s throughout the first n - 1 turns. The probability of that is (1/3)^{3 + 3^2 + ... + 3^{n-1}} = (1/3)^{(3^n - 3)/2}. After that, we must get a 5 or a 6 in 3^n - 1 throws and a 4 in a single throw during turn n. There are 3^n possible ways to choose the order of the throw which produces the single 4, so the latter probability is 3^n (1/6) (1/3)^{3^n - 1} = (1/2)(1/3)^{3^n - n}. We multiply the two probabilities to get

P[Z_n = m_n - 1] = (1/2) (1/3)^{(3^{n+1} - 3)/2 - n}.

Problem 14.28.
It is a well-known fact(oid) that armadillos always have identical quadruplets (four offspring). Each of the 4 little armadillos has a 1/3 chance of becoming a doctor, a lawyer or a scientist, independently of its siblings. A doctor armadillo will reproduce further with probability 2/3, a lawyer with probability 1/3 and a scientist with probability 1/4, again, independently of everything else. If it reproduces at all, an armadillo reproduces only once in its life, and then leaves the armadillo scene. (For the purposes of this problem assume that armadillos reproduce asexually.) Let us call the armadillos who have offspring fertile.

1. What is the distribution of the number of fertile offspring? Write down its generating function.
2. What is the generating function for the number of great-grandchildren an armadillo will have? What is its expectation? (Note: do not expand powers of sums.)
3. Let the armadillo population be modeled by a branching process, and let's suppose that it starts from exactly one individual at time 0. Is it certain that the population will go extinct sooner or later?

Solution.

1. Each armadillo is fertile with probability p, where

p = P[Fertile] = P[Fertile | Lawyer] P[Lawyer] + P[Fertile | Doctor] P[Doctor] + P[Fertile | Scientist] P[Scientist] = (1/3)(1/3) + (2/3)(1/3) + (1/4)(1/3) = 5/12.

Therefore, the number of fertile offspring is binomial with n = 4 and p = 5/12. The generating function of this distribution is

P(s) = ( (7 + 5s)/12 )^4.

2. To get the number of great-grandchildren, we first compute the generating function of the number of fertile grandchildren. This is simply given by the composition of P with itself, i.e., P(P(s)). Finally, the number of great-grandchildren is the number of fertile grandchildren multiplied by 4. Therefore, its generating function is given by

Q(s) = P(P(s^4)) = ( ( 7 + 5 ((7 + 5s^4)/12)^4 )/12 )^4.

To compute the expectation, we need to evaluate Q'(1):

Q'(s) = (P(P(s^4)))' = P'(P(s^4)) P'(s^4) 4s^3 and P'(s) = (5/3) ((7 + 5s)/12)^3,

so that

Q'(1) = 4 P'(1) P'(1) = 4 (5/3)^2 = 100/9.

3. We need to consider the population of fertile armadillos. Its offspring distribution has generating function P(s) = ((7 + 5s)/12)^4, so the population will go extinct with certainty if and only if the extinction probability is 1, i.e., if s = 1 is the smallest solution of the extinction equation s = P(s). We know, however, that P'(1) = 4 · 5/12 = 5/3 > 1, so there exists a positive solution of P(s) = s which is smaller than 1. Therefore, it is not certain that the population will become extinct sooner or later.

Problem 14.29. Branching in alternating environments.
Suppose that a branching process {Z_n}_{n in N_0} is constructed in the following way: it starts with one individual. The individuals in odd generations reproduce according to an offspring distribution with generating function P_odd(s), and those in even generations according to an offspring distribution with generating function P_even(s). All independence assumptions are the same as in the classical case.

1. Find an expression for the generating function P_{Z_n} of Z_n.
2. Derive the extinction equation.

1. Since n = 0 is an even generation, the distribution of the number of offspring of the initial individual is given by P_even(s). Each of the Z_1 individuals in the generation n = 1 reproduces according to the generating function P_odd, so

P_{Z_2}(s) = P_{sum_{k=1}^{Z_1} Z_{1,k}}(s) = P_even(P_odd(s)).

Similarly, P_{Z_3}(s) = P_even(P_odd(P_even(s))), and, in general, for n in N_0,

P_{Z_{2n}}(s) = P_even(P_odd(P_even(... P_odd(s) ...)))   [2n P's],  and
P_{Z_{2n+1}}(s) = P_even(P_odd(P_even(... P_even(s) ...)))   [2n+1 P's].   (14.6)

2. To compute the extinction probability, we must evaluate the limit p = P[E] = lim_n P[Z_n = 0] = lim_n P_{Z_n}(0). Since the limit exists (see the derivation in the classical case in the notes), we know that any subsequence also converges towards the same limit. In particular, p = lim_n x_n, where x_n = P[Z_{2n} = 0] = P_{Z_{2n}}(0). By the expression for P_{Z_{2n}} above, we know that x_{n+1} = P_even(P_odd(x_n)), and so

p = lim_n x_{n+1} = lim_n P_even(P_odd(x_n)) = P_even(P_odd(lim_n x_n)) = P_even(P_odd(p)),

which identifies

p = P_even(P_odd(p))   (14.7)

as the extinction equation. I will leave it up to you to figure out whether p is characterized as the smallest positive solution to (14.7).

Problem 14.10. In a branching process, the offspring distribution is given by its generating function

P(s) = as^2 + bs + c,

where a, b, c > 0.

(i) Find the extinction probability for this branching process.

(ii) Give a condition for sure extinction.

(i) By Proposition 7.5 in the lecture notes, the extinction probability is the smallest non-negative solution to P(x) = x. So, in this case:

ax^2 + bx + c = x, i.e., ax^2 + (b - 1)x + c = 0.

The possible solutions are

x_{1,2} = (1/(2a)) ( -(b - 1) ± sqrt((b - 1)^2 - 4ac) ).

Recall that it is necessary that P(1) = a + b + c = 1, so that b - 1 = -(a + c). Hence,

x_{1,2} = (1/(2a)) ( (a + c) ± sqrt((a + c)^2 - 4ac) ) = (1/(2a)) ( (a + c) ± sqrt((a - c)^2) ) = ( (a + c) ± |a - c| ) / (2a).

Now, we distinguish two cases. If a > c, then |a - c| = a - c, so

x_1 = (1/(2a)) (a + c + a - c) = 1  and  x_2 = (1/(2a)) (a + c - a + c) = c/a < 1.

The solution we are looking for is c/a. If a <= c, then |a - c| = c - a, so

x_1 = (1/(2a)) (a + c + c - a) = c/a >= 1  and  x_2 = (1/(2a)) (a + c - c + a) = 1.

The solution we are looking for is 1, i.e., in this case there is sure extinction.

(ii) The discussion above immediately yields the answer to question (ii): the necessary and sufficient condition for certain extinction is a <= c.

Problem 14.11. The purpose of this problem is to describe a class of offspring distributions (pmfs) for which an expression for P_{Z_n}(s) can be obtained. An N_0-valued distribution is said to be of fractional-linear type if its generating function P has the following form

P(s) = (as + b) / (1 - cs),   (14.8)

for some constants a, b, c >= 0. In order for P to be a generating function of a probability distribution we must have P(1) = 1, i.e., a + b + c = 1, which will be assumed throughout the problem.

1. What (familiar) distributions correspond to the following special cases (in each case identify the distribution and its parameters): (a) c = 0, (b) a = 0.

2. Let A be the following matrix

A = [  a  b ]
    [ -c  1 ],

and let

A^n = [ a^(n)  b^(n) ]
      [ c^(n)  d^(n) ]

be its n-th power (using matrix multiplication, of course). Using mathematical induction prove that

P(P(... P(s) ...))   [n P's]   = (a^(n) s + b^(n)) / (c^(n) s + d^(n)).   (14.9)

3. Take a = 0 and b = c = 1/2. Show inductively that

A^n = 2^(-n) [ -(n - 1)   n     ]
             [ -n         n + 1 ].   (14.10)

Use that to write down the generating function of Z_n in the linear-fractional form (14.8).

4. Find the extinction probability as a function of a, b and c in the general case (don't forget to use the identity a + b + c = 1!).

Note: It seems that not everybody is familiar with the principle of mathematical induction. It is a logical device you can use if you have a conjecture and want to prove that it is true for all n in N. Your conjecture will typically look like: "For each n in N, the statement P(n) holds,"

where P(n) is some assertion which depends on n. (For example, P(n) could be

1 + 2 + ... + n = n(n + 1)/2.   (14.11)

) The principle of mathematical induction says that you can prove that your statement is true for all n in N by doing the following two things:

1. Prove the statement for n = 1, i.e., prove P(1). (induction basis)
2. Prove that the implication P(n) => P(n + 1) always holds, i.e., prove the statement for n + 1 if you are, additionally, allowed to use the statement for n as a hypothesis. (inductive step)

As an example, let us prove the statement (14.11) above. For n = 1, P(1) reads 1 = 1 · 2/2, which is evidently true. Supposing that P(n) holds, i.e., that

1 + 2 + ... + n = n(n + 1)/2,

we can add n + 1 to both sides to get

1 + 2 + ... + n + (n + 1) = n(n + 1)/2 + (n + 1) = (n(n + 1) + 2(n + 1))/2 = (n + 1)(n + 2)/2,

which is exactly P(n + 1). Therefore, we managed to prove P(n + 1) using P(n) as a crutch. The principle of mathematical induction says that this is enough to be able to conclude that P(n) holds for each n, i.e., that (14.11) is a true statement for all n in N.

Back to the solution of the problem:

1. (a) When c = 0, P(s) = as + b - a Bernoulli distribution with success probability a = 1 - b. (b) For a = 0, P(s) = b/(1 - cs) - a geometric distribution with success probability b = 1 - c.

2. For n = 1, a^(1) = a, b^(1) = b, c^(1) = -c and d^(1) = 1, so clearly

P(s) = (a^(1) s + b^(1)) / (c^(1) s + d^(1)).

Suppose that the equality (14.9) holds for some n in N. Then

A^(n+1) = A^n A = [ a^(n)  b^(n) ] [  a  b ]   =   [ a a^(n) - c b^(n)   b a^(n) + b^(n) ]
                  [ c^(n)  d^(n) ] [ -c  1 ]       [ a c^(n) - c d^(n)   b c^(n) + d^(n) ].

On the other hand, by the inductive assumption,

P(P(... P(s) ...))   [n+1 P's]   = (a^(n) P(s) + b^(n)) / (c^(n) P(s) + d^(n))
  = (a^(n) (as + b)/(1 - cs) + b^(n)) / (c^(n) (as + b)/(1 - cs) + d^(n))
  = ((a a^(n) - c b^(n)) s + b a^(n) + b^(n)) / ((a c^(n) - c d^(n)) s + b c^(n) + d^(n)).

Therefore, (14.9) also holds for n + 1. By induction, (14.9) holds for all n in N.

3. Here

A = [  0    1/2 ]   = 2^(-1) [  0  1 ]
    [ -1/2  1   ]            [ -1  2 ],

so the statement holds for n = 1 (induction basis). Suppose that (14.10) holds for some n. Then

A^(n+1) = A^n A = 2^(-n) [ -(n - 1)   n     ] · 2^(-1) [  0  1 ]   = 2^(-(n+1)) [ -n         n + 1 ]
                         [ -n         n + 1 ]          [ -1  2 ]                [ -(n + 1)   n + 2 ],

which is exactly (14.10) for n + 1 (inductive step). Thus, (14.10) holds for all n in N. By the previous part, the generating function of Z_n is given by

P_{Z_n}(s) = (-(n - 1)s + n) / (-ns + (n + 1)).

We divide the numerator and the denominator by (n + 1) to get the above expression into the form dictated by (14.8):

P_{Z_n}(s) = ( ((1 - n)/(n + 1)) s + n/(n + 1) ) / ( 1 - (n/(n + 1)) s ).

Note that a = (1 - n)/(n + 1), b = n/(n + 1) and c = n/(n + 1), so that a + b + c = (1 - n + n + n)/(n + 1) = 1, as required.

4. For the extinction probability we need to find the smallest solution of

s = (as + b) / (1 - cs)

in [0, 1]. The equation above transforms into a quadratic equation after we multiply both sides by 1 - cs:

s - cs^2 = as + b, i.e., cs^2 + (a - 1)s + b = 0.   (14.12)

We know that s = 1 is a solution and that a - 1 = -(b + c), so we can factor (14.12) as

cs^2 + (a - 1)s + b = (s - 1)(cs - b).

If c = 0, the only solution is s = 1. If c != 0, the solutions are 1 and b/c. Therefore, the extinction probability P[E] is given by

P[E] = 1, if c = 0;  P[E] = min(1, b/c), otherwise.
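The special case in part 3 can be checked numerically. The sketch below (not part of the original notes; the function names are my own) iterates the offspring generating function P(s) = (1/2)/(1 - s/2) to compute P[Z_n = 0] and compares it with the closed form P_{Z_n}(0) = n/(n + 1), then estimates the same probability by simulating the branching process directly:

```python
import random

def P(s):
    # offspring generating function for the case a = 0, b = c = 1/2:
    # P(s) = (1/2) / (1 - s/2), i.e. p_k = (1/2)^(k+1), k = 0, 1, 2, ...
    return 0.5 / (1.0 - 0.5 * s)

def extinct_by(n):
    # P[Z_n = 0] is the n-fold composition P(P(...P(0)...))
    x = 0.0
    for _ in range(n):
        x = P(x)
    return x

# compare with the closed form P_{Z_n}(0) = n/(n+1) derived above
for n in range(1, 8):
    assert abs(extinct_by(n) - n / (n + 1)) < 1e-12

def offspring():
    # sample from p_k = (1/2)^(k+1): count "heads" before the first "tail"
    k = 0
    while random.random() < 0.5:
        k += 1
    return k

def z_n(n):
    # one run of the branching process up to generation n
    z = 1
    for _ in range(n):
        z = sum(offspring() for _ in range(z))
    return z

random.seed(0)
trials = 20000
est = sum(z_n(4) == 0 for _ in range(trials)) / trials  # should be near 4/5
```

The deterministic check is exact up to floating-point error; the Monte Carlo estimate fluctuates around 4/5 with a standard error of about 0.003 at 20000 trials.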

Problem 14.12. (Too hard to appear on an exam, but see if you can understand its solution. Don't worry if you can't.) For a branching process {Z_n}_{n in N_0}, denote by S the total number of individuals that ever lived, i.e., set

S = sum_{n=0}^infinity Z_n = 1 + sum_{n=1}^infinity Z_n.

(i) Assume that the offspring distribution has the generating function given by P(s) = p + qs. Find the generating function P_S in this case.

(ii) Assume that the offspring distribution has the generating function given by P(s) = p/(1 - qs). Find P_S in this case.

(iii) Find the general expression for E[S] and calculate this expectation in the special cases (i) and (ii).

In general, the r.v. S satisfies the relationship

P_S(s) = E[s^(1 + sum_{n>=1} Z_n)] = sum_{k=0}^infinity E[s^(1 + sum_{n>=1} Z_n) | Z_1 = k] P[Z_1 = k] = sum_{k=0}^infinity s E[s^(sum_{n>=1} Z_n) | Z_1 = k] P[Z_1 = k].

When Z_1 = k, the expression sum_{n>=1} Z_n counts the total number of individuals in k separate and independent branching processes - one for each of the k members of the generation at time n = 1. Since this random variable is the sum of k independent random variables, each of which has the same distribution as S (why?), we have

E[s^(sum_{n>=1} Z_n) | Z_1 = k] = [P_S(s)]^k.

Consequently, P_S is a solution of the following equation:

P_S(s) = s sum_{k=0}^infinity [P_S(s)]^k P[Z_1 = k] = s P(P_S(s)).

(i) In this case, P_S(s) = s (p + q P_S(s)), so that P_S(s) = sp / (1 - sq).

(ii) Here, P_S(s) must satisfy P_S(s)(1 - q P_S(s)) - sp = 0, i.e.,

q P_S(s)^2 - P_S(s) + sp = 0.

Solving the quadratic, we get as the only sensible solution (the one that has P_S(0) = 0):

P_S(s) = (1 - sqrt(1 - 4spq)) / (2q).

Note that P_S(1) < 1 if p < q. Does that surprise you? It should not: it is possible that S = +infinity.

(iii) Using our main recursion for P_S(s), we get

P_S'(s) = (s P(P_S(s)))' = P(P_S(s)) + s P'(P_S(s)) P_S'(s).

So, for s = 1,

E[S] = P_S'(1) = P(P_S(1)) + P'(P_S(1)) P_S'(1) = 1 + mu E[S].

If mu >= 1, E[S] = +infinity. If mu < 1,

E[S] = 1 / (1 - mu).

14.2 Markov chains - classification of states

Problem 14.13. Let {X_n}_{n in N_0} be a simple symmetric random walk, and let {Y_n}_{n in N_0} be a random process whose value at time n in N_0 is equal to the amount of time (number of steps, including possibly n) the process {X_n}_{n in N_0} has spent above 0, i.e., in the set {1, 2, ...}. Then

(a) {Y_n}_{n in N_0} is a Markov process
(b) Y_n is a function of X_n for each n in N.
(c) X_n is a function of Y_n for each n in N.
(d) Y_n is a stopping time for each n in N_0.
(e) None of the above

The correct answer is (e).

(a) False. The event {Y_3 = 1} corresponds to exactly two possible paths of the random walk - {X_1 = 1, X_2 = 0, X_3 = -1} and {X_1 = -1, X_2 = 0, X_3 = 1} - each occurring with probability 1/8. In the first case, there is no way that Y_4 = 2, and in the second one, Y_4 = 2 if and only if, additionally, X_4 = 2. Therefore,

P[Y_4 = 2 | Y_3 = 1] = P[Y_4 = 2 and Y_3 = 1] / P[Y_3 = 1] = 4 P[X_1 = -1, X_2 = 0, X_3 = 1, X_4 = 2] = 1/4.   (14.13)

On the other hand,

P[Y_4 = 2 | Y_3 = 1, Y_2 = 1, Y_1 = 1, Y_0 = 0] = P[Y_4 = 2 and X_1 = 1, X_2 = 0, X_3 = -1] / P[X_1 = 1, X_2 = 0, X_3 = -1] = 0.

Since the two conditional probabilities differ, {Y_n}_{n in N_0} is not a Markov process.

(b) False. Y_n is a function of the entire past X_0, X_1, ..., X_n, but not of the individual value X_n.

(c) False. Except in trivial cases, it is impossible to know the value of X_n if you only know how many of the past n values are positive.

(d) False. The fact that, e.g., Y_10 = 1 (meaning that X hits 1 exactly once in its first 10 steps and immediately returns to 0) cannot possibly be known at time 1.

(e) True.

Problem 14.14. Suppose that all classes of a Markov chain are recurrent, and let i, j be two states such that i -> j. Then

(a) for each state k, either i -> k or j -> k
(b) j -> i
(c) p_ji > 0 or p_ij > 0
(d) sum_{n=1}^infinity p^(n)_jj < infinity
(e) None of the above

The correct answer is (b).

(a) False. Take a chain with two states 1, 2, where p_11 = p_22 = 1, and set i = j = 1, k = 2.

(b) True. Recurrent classes are closed, so i and j belong to the same class. Therefore j -> i.

(c) False. Take a chain with 4 states 1, 2, 3, 4, where p_12 = p_23 = p_34 = p_41 = 1, and set i = 1, j = 3.

(d) False. That would mean that j is transient.

(e) False.
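The counterexample in (a) can be verified by brute force: there are only 16 equally likely four-step paths, so both conditional probabilities can be computed exactly. A small sketch in exact rational arithmetic (not part of the original notes; helper names are my own):

```python
from fractions import Fraction
from itertools import product

def occupation(steps):
    # Y_k = number of times among 1..k (including k) at which the walk is >= 1
    x, y, ys = 0, 0, [0]
    for s in steps:
        x += s
        y += (x >= 1)
        ys.append(y)
    return ys

paths = list(product([-1, 1], repeat=4))   # all 16 equally likely 4-step walks
prob = Fraction(1, len(paths))

def P(event):
    # exact probability of an event defined on (Y_0, ..., Y_4)
    return sum(prob for p in paths if event(occupation(p)))

# P[Y_4 = 2 | Y_3 = 1] = 1/4 ...
num = P(lambda y: y[4] == 2 and y[3] == 1)
den = P(lambda y: y[3] == 1)
assert num / den == Fraction(1, 4)

# ... but conditioning on the whole history {Y_1 = 1, Y_2 = 1, Y_3 = 1} gives 0,
# so {Y_n} is not a Markov process
num2 = P(lambda y: y[4] == 2 and y[1:4] == [1, 1, 1])
den2 = P(lambda y: y[1:4] == [1, 1, 1])
assert den2 > 0 and num2 == 0
```

Exhaustive enumeration like this is a convenient way to test Markov-property claims for any short horizon before attempting a proof.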

Problem 14.15. Which of the following statements is true? Give a short explanation (or a counterexample where appropriate) for your choice. Here, {X_n}_{n in N_0} is a Markov chain with state space Z.

1. {X_n^2}_{n in N_0} must be a Markov chain.
2. {X_n^2}_{n in N_0} does not have to be a Markov chain.
3. There exists a set T and a function f : S -> T such that {f(X_n)}_{n in N_0} is not a Markov chain.
4. There exists a set T and a function f : S -> T which is not a bijection (one-to-one and onto) such that {f(X_n)}_{n in N_0} is a Markov chain.
5. None of the above.

Problem 14.16. Two containers are filled with ping-pong balls. The red container has 100 red balls, and the blue container has 100 blue balls. Other than their color, the balls are indistinguishable, so if one is to draw a ball from a container without peeking, all the balls in it are equally likely to be selected. In each step a container is selected; red with probability 1/2 and blue with probability 1/2. Then, a ball is selected from it - all balls in the container are equally likely to be selected - and placed in the other container. If the selected container is empty, no ball is transferred. Once there are 100 blue balls in the red container and 100 red balls in the blue container, the game stops (and we stay in that state forever). We decide to model the situation as a Markov chain.

1. What is the state space S we can use? How large is it?
2. What is the initial distribution?
3. What are the transition probabilities between states? Don't write the matrix, it is way too large; just write a general expression for p_ij, i, j in S.
4. How many transient states are there?

There are many ways in which one can solve this problem. Below is just one of them.

1. In order to describe the situation being modeled, we need to keep track of the number of balls of each color in each container. Therefore, one possibility is to take the set of all quadruplets (r, b, R, B), r, b, R, B in {0, 1, 2, ..., 100}; this state space would have 101^4 elements.
We know, however, that the total number of red balls and the total number of blue balls are each always equal to 100, so the knowledge of the composition of the red (say) container is enough to reconstruct the contents of the blue container. In other words, we can use the number of balls of each color in the red container only as our state, i.e., S = {(r, b) : r, b = 0, 1, ..., 100}. This state space has 101 × 101 = 10201 elements.

2. The initial distribution of the chain is deterministic: P[X_0 = (100, 0)] = 1 and P[X_0 = i] = 0, for i in S \ {(100, 0)}. In the vector notation, a^(0) = (0, 0, ..., 1, 0, 0, ..., 0), where the 1 is at the place corresponding to (100, 0).

3. Let us consider several separate cases, under the understanding that p_ij = 0 for all i, j not mentioned explicitly below:

(a) One of the containers is empty. In that case, we are either in (0, 0) or in (100, 100). Let us describe the situation for (0, 0) first. If we choose the red container - and that happens with probability 1/2 - we stay in (0, 0): p_{(0,0),(0,0)} = 1/2. If the blue container is chosen, a ball of either color will be selected with probability 100/200 = 1/2, so p_{(0,0),(1,0)} = p_{(0,0),(0,1)} = 1/4. By the same reasoning, p_{(100,100),(100,100)} = 1/2 and p_{(100,100),(99,100)} = p_{(100,100),(100,99)} = 1/4.

(b) We are in the state (0, 100). By the description of the model, this is an absorbing state, so p_{(0,100),(0,100)} = 1.

(c) All other states. Suppose we are in the state (r, b) with (r, b) not in {(0, 100), (0, 0), (100, 100)}. If the red container is chosen, the probability of getting a red ball is r/(r + b), so

p_{(r,b),(r-1,b)} = (1/2) r/(r + b)  and  p_{(r,b),(r,b-1)} = (1/2) b/(r + b).

In the blue container there are 100 - r red and 100 - b blue balls. Thus,

p_{(r,b),(r+1,b)} = (1/2) (100 - r)/(200 - r - b)  and  p_{(r,b),(r,b+1)} = (1/2) (100 - b)/(200 - r - b).

4. The state (0, 100) is clearly recurrent (it is absorbing). Let us show that all other states (10200 of them) are transient. It will be enough to show that i -> (0, 100), for each i in S, because (0, 100) is in a recurrent class by itself. To do that, we just need to find a path from any state (r, b) to (0, 100) through the transition graph:

(r, b) -> (r - 1, b) -> ... -> (0, b) -> (0, b + 1) -> ... -> (0, 100).
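The transition probabilities derived in part 3 are easy to encode and sanity-check. The following sketch (not part of the original notes; names are my own) builds the transition row for every state in exact arithmetic and verifies that each row sums to 1:

```python
from fractions import Fraction

N = 100                      # balls of each color
HALF = Fraction(1, 2)        # probability of picking either container

def transitions(state):
    # transition row for state (r, b) = (red, blue) balls in the red container
    r, b = state
    if state == (0, N):                       # absorbing terminal state
        return {state: Fraction(1)}
    out = {}
    def add(s, p):
        if p:                                 # skip zero-probability moves
            out[s] = out.get(s, Fraction(0)) + p
    if r + b == 0:                            # red container empty: no transfer
        add(state, HALF)
    else:
        add((r - 1, b), HALF * Fraction(r, r + b))
        add((r, b - 1), HALF * Fraction(b, r + b))
    if (N - r) + (N - b) == 0:                # blue container empty: no transfer
        add(state, HALF)
    else:                                     # blue holds N-r red, N-b blue balls
        add((r + 1, b), HALF * Fraction(N - r, 2 * N - r - b))
        add((r, b + 1), HALF * Fraction(N - b, 2 * N - r - b))
    return out

states = [(r, b) for r in range(N + 1) for b in range(N + 1)]
assert len(states) == 101 * 101 == 10201
assert all(sum(transitions(s).values()) == 1 for s in states)   # rows stochastic
assert transitions((0, 0))[(1, 0)] == Fraction(1, 4)            # matches case (a)
```

Note that the general formula in case (c) automatically covers states with r = 0 or b = 0 other than the three special ones, since the corresponding probabilities vanish.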

Problem 14.17. Let {Z_n}_{n in N_0} be a branching process (with state space S = {0, 1, 2, 3, 4, ...} = N_0) with the offspring probability given by p_0 = 1/2, p_2 = 1/2. Classify the states (find classes), and describe all closed sets.

If i = 0, then i is an absorbing state (and, therefore, a class in itself). For i in N, p_ij > 0 if and only if j is of the form 2k for some k <= i (out of i individuals, i - k die and the remaining k have two children each). In particular, no one ever communicates with an odd i in N. Therefore, each odd i is a class in itself. An even i > 0 communicates with all even numbers from 0 to 2i; in particular, i -> i + 2 and i + 2 -> i. Therefore, all positive even numbers intercommunicate and form a class.

Clearly, {0} is a closed set. Every other closed set will contain a positive even number; indeed, after one step we are in an even state, no matter where we start from, and this even number could be positive, unless the initial state was 0. Therefore, any closed set different from {0} will have to contain the set of all even numbers 2N_0 = {0, 2, 4, ...}, and 2N_0 is a closed set itself. On the other hand, if we add any collection of odd numbers to 2N_0, the resulting set will still be closed (exiting it would mean hitting an odd number outside of it). Therefore, closed sets are exactly the sets of the form C = {0} or C = 2N_0 ∪ D, where D is any set of odd numbers.

Problem 14.18. Which of the following statements is true? Give a short explanation (or a counterexample where appropriate) for your choice. {X_n}_{n in N_0} is a Markov chain with state space S.

1. p^(n)_ij >= f^(n)_ij for all i, j in S and all n in N.
2. If states i and j intercommunicate, then there exists n in N such that p^(n)_ij > 0 and p^(n)_ji > 0.
3. If all rows of the transition matrix are equal, then all states belong to the same class.
4. If f_ij < 1 and f_ji < 1, then i and j do not intercommunicate.
5. If P^n -> I, then all states are recurrent.
(Note: We say that a sequence {A_n}_{n in N} of matrices converges to the matrix A, and we denote it by A_n -> A, if (A_n)_ij -> A_ij, as n -> infinity, for all i, j.)

1. TRUE. p^(n)_ij is the probability that X_n = j (given that X_0 = i). On the other hand, f^(n)_ij is the probability that X_n = j and that X_k != j for all k = 1, ..., n - 1 (given X_0 = i). Clearly, the second event is a subset of the first, so f^(n)_ij <= p^(n)_ij.

2. FALSE. Consider the following Markov chain.


Generating Functions

Generating Functions Chapter 10 Generating Functions 10.1 Generating Functions for Discrete Distributions So far we have considered in detail only the two most important attributes of a random variable, namely, the mean and

More information

Probability Theory. Florian Herzog. A random variable is neither random nor variable. Gian-Carlo Rota, M.I.T..

Probability Theory. Florian Herzog. A random variable is neither random nor variable. Gian-Carlo Rota, M.I.T.. Probability Theory A random variable is neither random nor variable. Gian-Carlo Rota, M.I.T.. Florian Herzog 2013 Probability space Probability space A probability space W is a unique triple W = {Ω, F,

More information

CONTINUED FRACTIONS AND PELL S EQUATION. Contents 1. Continued Fractions 1 2. Solution to Pell s Equation 9 References 12

CONTINUED FRACTIONS AND PELL S EQUATION. Contents 1. Continued Fractions 1 2. Solution to Pell s Equation 9 References 12 CONTINUED FRACTIONS AND PELL S EQUATION SEUNG HYUN YANG Abstract. In this REU paper, I will use some important characteristics of continued fractions to give the complete set of solutions to Pell s equation.

More information

MATH10040 Chapter 2: Prime and relatively prime numbers

MATH10040 Chapter 2: Prime and relatively prime numbers MATH10040 Chapter 2: Prime and relatively prime numbers Recall the basic definition: 1. Prime numbers Definition 1.1. Recall that a positive integer is said to be prime if it has precisely two positive

More information

Binomial lattice model for stock prices

Binomial lattice model for stock prices Copyright c 2007 by Karl Sigman Binomial lattice model for stock prices Here we model the price of a stock in discrete time by a Markov chain of the recursive form S n+ S n Y n+, n 0, where the {Y i }

More information

The Exponential Distribution

The Exponential Distribution 21 The Exponential Distribution From Discrete-Time to Continuous-Time: In Chapter 6 of the text we will be considering Markov processes in continuous time. In a sense, we already have a very good understanding

More information

5. Continuous Random Variables

5. Continuous Random Variables 5. Continuous Random Variables Continuous random variables can take any value in an interval. They are used to model physical characteristics such as time, length, position, etc. Examples (i) Let X be

More information

Tenth Problem Assignment

Tenth Problem Assignment EECS 40 Due on April 6, 007 PROBLEM (8 points) Dave is taking a multiple-choice exam. You may assume that the number of questions is infinite. Simultaneously, but independently, his conscious and subconscious

More information

Homework 4 - KEY. Jeff Brenion. June 16, 2004. Note: Many problems can be solved in more than one way; we present only a single solution here.

Homework 4 - KEY. Jeff Brenion. June 16, 2004. Note: Many problems can be solved in more than one way; we present only a single solution here. Homework 4 - KEY Jeff Brenion June 16, 2004 Note: Many problems can be solved in more than one way; we present only a single solution here. 1 Problem 2-1 Since there can be anywhere from 0 to 4 aces, the

More information

Chapter 31 out of 37 from Discrete Mathematics for Neophytes: Number Theory, Probability, Algorithms, and Other Stuff by J. M.

Chapter 31 out of 37 from Discrete Mathematics for Neophytes: Number Theory, Probability, Algorithms, and Other Stuff by J. M. 31 Geometric Series Motivation (I hope) Geometric series are a basic artifact of algebra that everyone should know. 1 I am teaching them here because they come up remarkably often with Markov chains. The

More information

Math 55: Discrete Mathematics

Math 55: Discrete Mathematics Math 55: Discrete Mathematics UC Berkeley, Fall 2011 Homework # 5, due Wednesday, February 22 5.1.4 Let P (n) be the statement that 1 3 + 2 3 + + n 3 = (n(n + 1)/2) 2 for the positive integer n. a) What

More information

Modern Optimization Methods for Big Data Problems MATH11146 The University of Edinburgh

Modern Optimization Methods for Big Data Problems MATH11146 The University of Edinburgh Modern Optimization Methods for Big Data Problems MATH11146 The University of Edinburgh Peter Richtárik Week 3 Randomized Coordinate Descent With Arbitrary Sampling January 27, 2016 1 / 30 The Problem

More information

Notes on Probability Theory

Notes on Probability Theory Notes on Probability Theory Christopher King Department of Mathematics Northeastern University July 31, 2009 Abstract These notes are intended to give a solid introduction to Probability Theory with a

More information

LEARNING OBJECTIVES FOR THIS CHAPTER

LEARNING OBJECTIVES FOR THIS CHAPTER CHAPTER 2 American mathematician Paul Halmos (1916 2006), who in 1942 published the first modern linear algebra book. The title of Halmos s book was the same as the title of this chapter. Finite-Dimensional

More information

Mathematical Induction. Mary Barnes Sue Gordon

Mathematical Induction. Mary Barnes Sue Gordon Mathematics Learning Centre Mathematical Induction Mary Barnes Sue Gordon c 1987 University of Sydney Contents 1 Mathematical Induction 1 1.1 Why do we need proof by induction?.... 1 1. What is proof by

More information

Statistics 100A Homework 8 Solutions

Statistics 100A Homework 8 Solutions Part : Chapter 7 Statistics A Homework 8 Solutions Ryan Rosario. A player throws a fair die and simultaneously flips a fair coin. If the coin lands heads, then she wins twice, and if tails, the one-half

More information

WRITING PROOFS. Christopher Heil Georgia Institute of Technology

WRITING PROOFS. Christopher Heil Georgia Institute of Technology WRITING PROOFS Christopher Heil Georgia Institute of Technology A theorem is just a statement of fact A proof of the theorem is a logical explanation of why the theorem is true Many theorems have this

More information

7: The CRR Market Model

7: The CRR Market Model Ben Goldys and Marek Rutkowski School of Mathematics and Statistics University of Sydney MATH3075/3975 Financial Mathematics Semester 2, 2015 Outline We will examine the following issues: 1 The Cox-Ross-Rubinstein

More information

Introduction to Probability

Introduction to Probability Introduction to Probability EE 179, Lecture 15, Handout #24 Probability theory gives a mathematical characterization for experiments with random outcomes. coin toss life of lightbulb binary data sequence

More information

5.1 Radical Notation and Rational Exponents

5.1 Radical Notation and Rational Exponents Section 5.1 Radical Notation and Rational Exponents 1 5.1 Radical Notation and Rational Exponents We now review how exponents can be used to describe not only powers (such as 5 2 and 2 3 ), but also roots

More information

Exponential Distribution

Exponential Distribution Exponential Distribution Definition: Exponential distribution with parameter λ: { λe λx x 0 f(x) = 0 x < 0 The cdf: F(x) = x Mean E(X) = 1/λ. f(x)dx = Moment generating function: φ(t) = E[e tx ] = { 1

More information

Chicago Booth BUSINESS STATISTICS 41000 Final Exam Fall 2011

Chicago Booth BUSINESS STATISTICS 41000 Final Exam Fall 2011 Chicago Booth BUSINESS STATISTICS 41000 Final Exam Fall 2011 Name: Section: I pledge my honor that I have not violated the Honor Code Signature: This exam has 34 pages. You have 3 hours to complete this

More information

Probabilistic Strategies: Solutions

Probabilistic Strategies: Solutions Probability Victor Xu Probabilistic Strategies: Solutions Western PA ARML Practice April 3, 2016 1 Problems 1. You roll two 6-sided dice. What s the probability of rolling at least one 6? There is a 1

More information

Linear Algebra I. Ronald van Luijk, 2012

Linear Algebra I. Ronald van Luijk, 2012 Linear Algebra I Ronald van Luijk, 2012 With many parts from Linear Algebra I by Michael Stoll, 2007 Contents 1. Vector spaces 3 1.1. Examples 3 1.2. Fields 4 1.3. The field of complex numbers. 6 1.4.

More information

Review for Test 2. Chapters 4, 5 and 6

Review for Test 2. Chapters 4, 5 and 6 Review for Test 2 Chapters 4, 5 and 6 1. You roll a fair six-sided die. Find the probability of each event: a. Event A: rolling a 3 1/6 b. Event B: rolling a 7 0 c. Event C: rolling a number less than

More information

The last three chapters introduced three major proof techniques: direct,

The last three chapters introduced three major proof techniques: direct, CHAPTER 7 Proving Non-Conditional Statements The last three chapters introduced three major proof techniques: direct, contrapositive and contradiction. These three techniques are used to prove statements

More information

A Few Basics of Probability

A Few Basics of Probability A Few Basics of Probability Philosophy 57 Spring, 2004 1 Introduction This handout distinguishes between inductive and deductive logic, and then introduces probability, a concept essential to the study

More information

e.g. arrival of a customer to a service station or breakdown of a component in some system.

e.g. arrival of a customer to a service station or breakdown of a component in some system. Poisson process Events occur at random instants of time at an average rate of λ events per second. e.g. arrival of a customer to a service station or breakdown of a component in some system. Let N(t) be

More information

Notes on Determinant

Notes on Determinant ENGG2012B Advanced Engineering Mathematics Notes on Determinant Lecturer: Kenneth Shum Lecture 9-18/02/2013 The determinant of a system of linear equations determines whether the solution is unique, without

More information

Lecture 5 Principal Minors and the Hessian

Lecture 5 Principal Minors and the Hessian Lecture 5 Principal Minors and the Hessian Eivind Eriksen BI Norwegian School of Management Department of Economics October 01, 2010 Eivind Eriksen (BI Dept of Economics) Lecture 5 Principal Minors and

More information

Lecture 13 - Basic Number Theory.

Lecture 13 - Basic Number Theory. Lecture 13 - Basic Number Theory. Boaz Barak March 22, 2010 Divisibility and primes Unless mentioned otherwise throughout this lecture all numbers are non-negative integers. We say that A divides B, denoted

More information

Discrete Mathematics and Probability Theory Fall 2009 Satish Rao, David Tse Note 10

Discrete Mathematics and Probability Theory Fall 2009 Satish Rao, David Tse Note 10 CS 70 Discrete Mathematics and Probability Theory Fall 2009 Satish Rao, David Tse Note 10 Introduction to Discrete Probability Probability theory has its origins in gambling analyzing card games, dice,

More information

What is Statistics? Lecture 1. Introduction and probability review. Idea of parametric inference

What is Statistics? Lecture 1. Introduction and probability review. Idea of parametric inference 0. 1. Introduction and probability review 1.1. What is Statistics? What is Statistics? Lecture 1. Introduction and probability review There are many definitions: I will use A set of principle and procedures

More information

Lecture 6: Discrete & Continuous Probability and Random Variables

Lecture 6: Discrete & Continuous Probability and Random Variables Lecture 6: Discrete & Continuous Probability and Random Variables D. Alex Hughes Math Camp September 17, 2015 D. Alex Hughes (Math Camp) Lecture 6: Discrete & Continuous Probability and Random September

More information

The Prime Numbers. Definition. A prime number is a positive integer with exactly two positive divisors.

The Prime Numbers. Definition. A prime number is a positive integer with exactly two positive divisors. The Prime Numbers Before starting our study of primes, we record the following important lemma. Recall that integers a, b are said to be relatively prime if gcd(a, b) = 1. Lemma (Euclid s Lemma). If gcd(a,

More information

MATH 10034 Fundamental Mathematics IV

MATH 10034 Fundamental Mathematics IV MATH 0034 Fundamental Mathematics IV http://www.math.kent.edu/ebooks/0034/funmath4.pdf Department of Mathematical Sciences Kent State University January 2, 2009 ii Contents To the Instructor v Polynomials.

More information

Lecture 2 Binomial and Poisson Probability Distributions

Lecture 2 Binomial and Poisson Probability Distributions Lecture 2 Binomial and Poisson Probability Distributions Binomial Probability Distribution l Consider a situation where there are only two possible outcomes (a Bernoulli trial) H Example: u flipping a

More information

Mathematical Induction

Mathematical Induction Mathematical Induction (Handout March 8, 01) The Principle of Mathematical Induction provides a means to prove infinitely many statements all at once The principle is logical rather than strictly mathematical,

More information

Discrete Mathematics and Probability Theory Fall 2009 Satish Rao, David Tse Note 2

Discrete Mathematics and Probability Theory Fall 2009 Satish Rao, David Tse Note 2 CS 70 Discrete Mathematics and Probability Theory Fall 2009 Satish Rao, David Tse Note 2 Proofs Intuitively, the concept of proof should already be familiar We all like to assert things, and few of us

More information

Overview of Monte Carlo Simulation, Probability Review and Introduction to Matlab

Overview of Monte Carlo Simulation, Probability Review and Introduction to Matlab Monte Carlo Simulation: IEOR E4703 Fall 2004 c 2004 by Martin Haugh Overview of Monte Carlo Simulation, Probability Review and Introduction to Matlab 1 Overview of Monte Carlo Simulation 1.1 Why use simulation?

More information

Probability: Terminology and Examples Class 2, 18.05, Spring 2014 Jeremy Orloff and Jonathan Bloom

Probability: Terminology and Examples Class 2, 18.05, Spring 2014 Jeremy Orloff and Jonathan Bloom Probability: Terminology and Examples Class 2, 18.05, Spring 2014 Jeremy Orloff and Jonathan Bloom 1 Learning Goals 1. Know the definitions of sample space, event and probability function. 2. Be able to

More information

Solutions for Practice problems on proofs

Solutions for Practice problems on proofs Solutions for Practice problems on proofs Definition: (even) An integer n Z is even if and only if n = 2m for some number m Z. Definition: (odd) An integer n Z is odd if and only if n = 2m + 1 for some

More information

MATH 140 Lab 4: Probability and the Standard Normal Distribution

MATH 140 Lab 4: Probability and the Standard Normal Distribution MATH 140 Lab 4: Probability and the Standard Normal Distribution Problem 1. Flipping a Coin Problem In this problem, we want to simualte the process of flipping a fair coin 1000 times. Note that the outcomes

More information

n k=1 k=0 1/k! = e. Example 6.4. The series 1/k 2 converges in R. Indeed, if s n = n then k=1 1/k, then s 2n s n = 1 n + 1 +...

n k=1 k=0 1/k! = e. Example 6.4. The series 1/k 2 converges in R. Indeed, if s n = n then k=1 1/k, then s 2n s n = 1 n + 1 +... 6 Series We call a normed space (X, ) a Banach space provided that every Cauchy sequence (x n ) in X converges. For example, R with the norm = is an example of Banach space. Now let (x n ) be a sequence

More information

SECTION 10-2 Mathematical Induction

SECTION 10-2 Mathematical Induction 73 0 Sequences and Series 6. Approximate e 0. using the first five terms of the series. Compare this approximation with your calculator evaluation of e 0.. 6. Approximate e 0.5 using the first five terms

More information

Statistics 100A Homework 7 Solutions

Statistics 100A Homework 7 Solutions Chapter 6 Statistics A Homework 7 Solutions Ryan Rosario. A television store owner figures that 45 percent of the customers entering his store will purchase an ordinary television set, 5 percent will purchase

More information

For a partition B 1,..., B n, where B i B j = for i. A = (A B 1 ) (A B 2 ),..., (A B n ) and thus. P (A) = P (A B i ) = P (A B i )P (B i )

For a partition B 1,..., B n, where B i B j = for i. A = (A B 1 ) (A B 2 ),..., (A B n ) and thus. P (A) = P (A B i ) = P (A B i )P (B i ) Probability Review 15.075 Cynthia Rudin A probability space, defined by Kolmogorov (1903-1987) consists of: A set of outcomes S, e.g., for the roll of a die, S = {1, 2, 3, 4, 5, 6}, 1 1 2 1 6 for the roll

More information

Hypothesis Testing for Beginners

Hypothesis Testing for Beginners Hypothesis Testing for Beginners Michele Piffer LSE August, 2011 Michele Piffer (LSE) Hypothesis Testing for Beginners August, 2011 1 / 53 One year ago a friend asked me to put down some easy-to-read notes

More information

Matrix Algebra. Some Basic Matrix Laws. Before reading the text or the following notes glance at the following list of basic matrix algebra laws.

Matrix Algebra. Some Basic Matrix Laws. Before reading the text or the following notes glance at the following list of basic matrix algebra laws. Matrix Algebra A. Doerr Before reading the text or the following notes glance at the following list of basic matrix algebra laws. Some Basic Matrix Laws Assume the orders of the matrices are such that

More information

Financial Mathematics and Simulation MATH 6740 1 Spring 2011 Homework 2

Financial Mathematics and Simulation MATH 6740 1 Spring 2011 Homework 2 Financial Mathematics and Simulation MATH 6740 1 Spring 2011 Homework 2 Due Date: Friday, March 11 at 5:00 PM This homework has 170 points plus 20 bonus points available but, as always, homeworks are graded

More information

Stationary random graphs on Z with prescribed iid degrees and finite mean connections

Stationary random graphs on Z with prescribed iid degrees and finite mean connections Stationary random graphs on Z with prescribed iid degrees and finite mean connections Maria Deijfen Johan Jonasson February 2006 Abstract Let F be a probability distribution with support on the non-negative

More information

ECE302 Spring 2006 HW4 Solutions February 6, 2006 1

ECE302 Spring 2006 HW4 Solutions February 6, 2006 1 ECE302 Spring 2006 HW4 Solutions February 6, 2006 1 Solutions to HW4 Note: Most of these solutions were generated by R. D. Yates and D. J. Goodman, the authors of our textbook. I have added comments in

More information

Discrete Mathematics and Probability Theory Fall 2009 Satish Rao, David Tse Note 13. Random Variables: Distribution and Expectation

Discrete Mathematics and Probability Theory Fall 2009 Satish Rao, David Tse Note 13. Random Variables: Distribution and Expectation CS 70 Discrete Mathematics and Probability Theory Fall 2009 Satish Rao, David Tse Note 3 Random Variables: Distribution and Expectation Random Variables Question: The homeworks of 20 students are collected

More information

Notes on Continuous Random Variables

Notes on Continuous Random Variables Notes on Continuous Random Variables Continuous random variables are random quantities that are measured on a continuous scale. They can usually take on any value over some interval, which distinguishes

More information