Chapter 6: Brownian Motion

6.1 Normal Distribution

Definition 6.1.1. A r.v. $X$ has a normal distribution with mean $\mu$ and variance $\sigma^2$, where $\mu \in \mathbb{R}$ and $\sigma > 0$, if its density is
\[ f(x) = \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-\frac{(x-\mu)^2}{2\sigma^2}}. \]

The previous definition makes sense because $f$ is a nonnegative function and
\[ \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-\frac{(x-\mu)^2}{2\sigma^2}}\,dx = 1. \]
Note that by the change of variables $\frac{x-\mu}{\sigma} = t$,
\[ \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-\frac{(x-\mu)^2}{2\sigma^2}}\,dx = \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}}\, e^{-t^2/2}\,dt. \]
By a change to polar coordinates,
\[ \left( \int_{-\infty}^{\infty} e^{-x^2/2}\,dx \right)^{2} = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} e^{-(x^2+y^2)/2}\,dx\,dy = \int_0^{2\pi}\!\!\int_0^{\infty} e^{-r^2/2}\, r\,dr\,d\theta = 2\pi \left[ -e^{-r^2/2} \right]_0^{\infty} = 2\pi. \]

Theorem 6.1. If $X$ has a normal distribution with parameters $\mu$ and $\sigma^2$, then $E[X] = \mu$ and $\mathrm{Var}(X) = \sigma^2$.

Proof. By the change of variables $\frac{x-\mu}{\sigma} = t$,
\[ E[X] = \int_{-\infty}^{\infty} x\, \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-\frac{(x-\mu)^2}{2\sigma^2}}\,dx = \int_{-\infty}^{\infty} (\mu + t\sigma)\, \frac{1}{\sqrt{2\pi}}\, e^{-t^2/2}\,dt = \mu + \sigma \int_{-\infty}^{\infty} t\, \frac{1}{\sqrt{2\pi}}\, e^{-t^2/2}\,dt = \mu, \]
because the function $t \mapsto t\,\frac{1}{\sqrt{2\pi}} e^{-t^2/2}$ is odd. By the change of variables $\frac{x-\mu}{\sigma} = t$ and integration by parts,
\[ E[(X-\mu)^2] = \int_{-\infty}^{\infty} (x-\mu)^2\, \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-\frac{(x-\mu)^2}{2\sigma^2}}\,dx = \sigma^2 \int_{-\infty}^{\infty} t^2\, \frac{1}{\sqrt{2\pi}}\, e^{-t^2/2}\,dt = \sigma^2 \int_{-\infty}^{\infty} t\, d\!\left( -\frac{1}{\sqrt{2\pi}}\, e^{-t^2/2} \right) = \sigma^2 \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}}\, e^{-t^2/2}\,dt = \sigma^2. \]
Q.E.D.
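As a quick numerical companion to Definition 6.1.1 and Theorem 6.1 (not part of the original notes), the density can be integrated on a fine grid to confirm total mass 1, mean $\mu$ and variance $\sigma^2$; the values $\mu = 1$, $\sigma = 2$ below are illustrative assumptions:

```python
# Midpoint-rule check that the normal density integrates to 1 with mean mu
# and variance sigma^2. mu = 1, sigma = 2 are arbitrary illustrative values.
import math

def normal_pdf(x, mu, sigma):
    """Density of N(mu, sigma^2) from Definition 6.1.1."""
    return math.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (math.sqrt(2 * math.pi) * sigma)

def check_moments(mu=1.0, sigma=2.0, lo=-20.0, hi=22.0, n=200000):
    h = (hi - lo) / n
    mass = mean = var = 0.0
    for i in range(n):
        x = lo + (i + 0.5) * h              # midpoint rule
        p = normal_pdf(x, mu, sigma) * h
        mass += p                           # total probability
        mean += x * p                       # E[X]
        var += (x - mu) ** 2 * p            # E[(X - mu)^2]
    return mass, mean, var

mass, mean, var = check_moments()
print(round(mass, 6), round(mean, 6), round(var, 6))  # ≈ 1.0, 1.0, 4.0
```

The truncation at about ten standard deviations on each side leaves a negligible tail, so the three estimates agree with the theorem to many digits.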

The following formulas could be useful:
\[ \int_{-\infty}^{\infty} e^{-x^2/2}\,dx = \sqrt{2\pi}, \qquad \int_{-\infty}^{\infty} x\, e^{-x^2/2}\,dx = 0, \qquad \int_{-\infty}^{\infty} x^2\, e^{-x^2/2}\,dx = \sqrt{2\pi}. \]

A normal r.v. with mean zero and variance one is called a standard normal r.v. Probabilities for a standard normal r.v. can be found in the table at the back of the book. Given a standard normal r.v. $Z$, to find $P[Z \le -a]$, where $a > 0$, we use
\[ P[Z \le -a] = P[Z \ge a] = 1 - P[Z \le a]. \]
To find $P[|Z| \le a]$, where $a > 0$, we use
\[ P[|Z| \le a] = P[-a \le Z \le a] = P[Z \le a] - P[Z \le -a] = P[Z \le a] - (1 - P[Z \le a]) = 2P[Z \le a] - 1. \]
We usually denote the density of a standard normal r.v. by $\varphi(x) = \frac{1}{\sqrt{2\pi}}\, e^{-x^2/2}$, and the cumulative distribution function of a standard normal r.v. by $\Phi(x) = \int_{-\infty}^{x} \frac{1}{\sqrt{2\pi}}\, e^{-t^2/2}\,dt$.

Theorem 6.2. The moment generating function of a r.v. $X$ with a normal distribution with parameters $\mu$ and $\sigma^2$ is
\[ M_X(t) = e^{t\mu + \frac{t^2\sigma^2}{2}}. \]
Proof. By the change of variables $\frac{x-\mu}{\sigma} = y$,
\[ M_X(t) = \int_{-\infty}^{\infty} e^{tx}\, \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-\frac{(x-\mu)^2}{2\sigma^2}}\,dx = \int_{-\infty}^{\infty} e^{t\mu + t\sigma y}\, \frac{1}{\sqrt{2\pi}}\, e^{-y^2/2}\,dy = e^{t\mu + \frac{t^2\sigma^2}{2}} \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}}\, e^{-\frac{(y - \sigma t)^2}{2}}\,dy = e^{t\mu + \frac{t^2\sigma^2}{2}}. \]
Q.E.D.

A standard normal r.v. $Z$ has moment generating function $M_Z(t) = e^{t^2/2}$.

Theorem 6.3. If $X$ has a normal distribution with parameters $\mu$ and $\sigma^2$, then $Z = \frac{X-\mu}{\sigma}$ is a standard normal r.v.
Proof. The moment generating function of $Z = \frac{X-\mu}{\sigma}$ is
\[ M_Z(t) = E\!\left[e^{t\frac{X-\mu}{\sigma}}\right] = e^{-\frac{t\mu}{\sigma}}\, E\!\left[e^{\frac{t}{\sigma}X}\right] = e^{-\frac{t\mu}{\sigma}}\, e^{\frac{t}{\sigma}\mu + \frac{t^2}{2\sigma^2}\sigma^2} = e^{t^2/2}, \]
which is the moment generating function of a standard normal r.v. Q.E.D.

Theorem 6.4. Let $X_1, \dots, X_n$ be independent r.v.'s with a normal distribution. Then, $\sum_{i=1}^n X_i$ has a normal distribution.
Proof. Let $\mu_i = E[X_i]$ and let $\sigma_i^2 = \mathrm{Var}(X_i)$. Then, $E[e^{tX_i}] = \exp\!\left( t\mu_i + \frac{t^2\sigma_i^2}{2} \right)$. So, the moment generating function of $\sum_{i=1}^n X_i$ is
\[ M_{\sum_{i=1}^n X_i}(t) = \prod_{i=1}^n M_{X_i}(t) = \prod_{i=1}^n \exp\!\left( t\mu_i + \frac{t^2\sigma_i^2}{2} \right) = \exp\!\left( t\sum_{i=1}^n \mu_i + \frac{t^2}{2}\sum_{i=1}^n \sigma_i^2 \right). \]

Now, $\exp\!\left( t\sum_{i=1}^n \mu_i + \frac{t^2}{2}\sum_{i=1}^n \sigma_i^2 \right)$ is the moment generating function of a normal distribution with mean $\sum_{i=1}^n \mu_i$ and variance $\sum_{i=1}^n \sigma_i^2$. Since the moment generating function determines the distribution, we conclude that $\sum_{i=1}^n X_i$ has a normal distribution with mean $\sum_{i=1}^n \mu_i$ and variance $\sum_{i=1}^n \sigma_i^2$. Q.E.D.

Example 6.1. If $X$ is a normal random variable with parameters $\mu = 1$ and $\sigma = 4$, compute $P[5 \ge X > 3]$.
Solution: We standardize the variable: $Z = \frac{X-1}{4}$ has a standard normal distribution. So,
\[ P[5 \ge X > 3] = P[3 < X \le 5] = P\!\left[ \tfrac{3-1}{4} < Z \le \tfrac{5-1}{4} \right] = P\!\left[ \tfrac12 < Z \le 1 \right] = P[Z \le 1] - P[Z \le \tfrac12] = 0.8413 - 0.6915 = 0.1498. \]

Example 6.2. A manufacturer produces an item which has a label weight of 20.4 grams. Let $X$ denote the weight of a single mint selected at random from the production line. $X$ has a normal distribution with mean 21.37 and variance 0.16. Find $P[X < 20.4]$.
Solution: $Z = \frac{X-\mu}{\sigma}$ has a standard normal distribution. So,
\[ P[X < 20.4] = P\!\left[ Z < \tfrac{20.4 - 21.37}{0.4} \right] = P[Z < -2.425] \approx 0.0077. \]

Example 6.3. Let $Z$ be a standard normal r.v. Find $c$ such that $P[|Z| \le c] = 0.95$.
Solution: Using the previous formula,
\[ 0.95 = P[|Z| \le c] = 2P[Z \le c] - 1. \]
So, $P[Z \le c] = \frac{1.95}{2} = 0.975$. From the normal table, $c = 1.96$.

Exercise 6.1. The height, $X$, that a college high jumper will clear each time she jumps is a normal random variable with mean 6 feet and variance … inches. What is the probability the jumper will clear 6 feet 4 inches on a single jump?

Exercise 6.2. The length of time required to complete a college achievement test is found to be normally distributed with mean 70 minutes and standard deviation 12 minutes. When should the test be terminated if we wish to allow sufficient time for 90% of the students to complete the test?

Exercise 6.3. Suppose that $X$ is a normal random variable with parameters $\mu = 1$ and $\sigma^2 = 4$. Find $P[X > 1]$.

Example 6.4. If $X$ and $Y$ are independent identically distributed standard normal random variables, compute $\Pr\{-1 \le X + Y \le 3\}$.
Solution: $X + Y$ has a normal distribution with mean zero and variance 2. So,
\[ P[-1 \le X + Y \le 3] = P\!\left[ -\tfrac{1}{\sqrt{2}} \le N(0,1) \le \tfrac{3}{\sqrt{2}} \right] = P[-0.70 \le Z \le 2.12] = \Phi(2.12) - \Phi(-0.70) = 0.9830 - 0.2420 = 0.741. \]
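All of the standardization steps in Examples 6.1–6.4 reduce to evaluating $\Phi$. A small sketch (not part of the original notes) using the standard identity $\Phi(x) = \frac{1}{2}\left(1 + \operatorname{erf}(x/\sqrt{2})\right)$; the value $a = 1.96$ matches Example 6.3:

```python
# Standard normal CDF via the error function, plus the two reduction rules
# P[Z <= -a] = 1 - Phi(a) and P[|Z| <= a] = 2 Phi(a) - 1.
import math

def Phi(x):
    """Cumulative distribution function of a standard normal r.v."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

a = 1.96
print(round(Phi(-a), 4))         # P[Z <= -a] ≈ 0.025
print(round(2 * Phi(a) - 1, 4))  # P[|Z| <= a] ≈ 0.95, as in Example 6.3
```

This replaces table lookups "at the back of the book" with a closed-form evaluation accurate to machine precision.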

Example 6.5. If $X$ and $Y$ are independent identically distributed standard normal random variables, compute the density of $U = 1 + X + Y$.
Solution: By the previous theorem, $U$ has a normal distribution. $E[U] = 1$ and $\mathrm{Var}(U) = \mathrm{Var}(X) + \mathrm{Var}(Y) = 2$. So, the density of $U$ is
\[ f_U(u) = \frac{1}{\sqrt{2\pi \cdot 2}}\, e^{-\frac{(u-1)^2}{4}} = \frac{1}{2\sqrt{\pi}}\, e^{-\frac{(u-1)^2}{4}}. \]

6.2 Central Limit Theorem

If $Y$ is a normal r.v. with mean $\mu$ and variance $\sigma^2$, then $\frac{Y-\mu}{\sigma}$ has a standard normal distribution. The central limit theorem says that we can do something like that for sums of i.i.d. r.v.'s. Let $X_1, \dots, X_n$ be i.i.d. r.v.'s with mean $\mu$ and variance $\sigma^2$. Then,
\[ E\!\left[ \sum_{i=1}^n X_i \right] = \sum_{i=1}^n E[X_i] = n\mu \quad \text{and} \quad \mathrm{Var}\!\left( \sum_{i=1}^n X_i \right) = \sum_{i=1}^n \mathrm{Var}(X_i) = n\sigma^2. \]
The central limit theorem says that for $n$ large enough
\[ \frac{\sum_{i=1}^n X_i - E\!\left[\sum_{i=1}^n X_i\right]}{\sqrt{\mathrm{Var}\!\left(\sum_{i=1}^n X_i\right)}} = \frac{\sum_{i=1}^n (X_i - \mu)}{\sigma\sqrt{n}} \]
has approximately a standard normal distribution. Precisely, the central limit theorem says that:

Theorem 6.5. Let $\{X_i\}_{i=1}^{\infty}$ be a sequence of i.i.d. r.v.'s with finite second moment. Then, for each $a \in \mathbb{R}$,
\[ \lim_{n\to\infty} P\!\left[ \frac{\sum_{i=1}^n (X_i - E[X_i])}{\sqrt{n}\,\sigma} \le a \right] = \Phi(a), \]
where $\Phi(a) = P[N(0,1) \le a]$.

The central limit theorem is also true for open intervals and bounded intervals:
\[ \lim_{n\to\infty} P\!\left[ \frac{\sum_{i=1}^n (X_i - E[X_i])}{\sqrt{n}\,\sigma} < a \right] = P[Z < a] \]
and
\[ \lim_{n\to\infty} P\!\left[ a \le \frac{\sum_{i=1}^n (X_i - E[X_i])}{\sqrt{n}\,\sigma} \le b \right] = P[a \le Z \le b], \]
where $Z$ denotes a r.v. with a standard normal distribution. Instead of $\sum_{i=1}^n X_i$, we can use $\bar{x} = \frac{1}{n}\sum_{i=1}^n X_i$, and get that
\[ \lim_{n\to\infty} P\!\left[ \frac{\bar{x} - \mu}{\sigma/\sqrt{n}} \le a \right] = P[N(0,1) \le a]. \]
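The standardized sum in Theorem 6.5 can be illustrated by Monte Carlo; the sketch below (an illustrative choice of Uniform$(0,1)$ summands, with $\mu = 1/2$, $\sigma^2 = 1/12$, not taken from the notes) compares the empirical CDF of the standardized sum at $a = 1$ with $\Phi(1) \approx 0.8413$:

```python
# Monte Carlo illustration of the CLT: standardize the sum of n i.i.d.
# Uniform(0,1) r.v.'s and evaluate the empirical CDF at a = 1.
import math
import random

def clt_estimate(a=1.0, n=30, trials=20000, seed=0):
    rng = random.Random(seed)
    mu, sigma = 0.5, math.sqrt(1.0 / 12.0)   # mean and sd of Uniform(0,1)
    hits = 0
    for _ in range(trials):
        s = sum(rng.random() for _ in range(n))
        z = (s - n * mu) / (sigma * math.sqrt(n))  # standardized sum
        if z <= a:
            hits += 1
    return hits / trials

print(round(clt_estimate(), 3))  # close to Phi(1) ≈ 0.8413
```

Already at $n = 30$ the empirical probability sits within Monte Carlo error of the normal limit, which is the practical content of the rule of thumb stated next.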

For most distributions, the central limit theorem gives a good approximation to the probability if $n \ge 30$. For some distributions $n = 10$ is enough. A common application of the central limit theorem is the estimation of probabilities of a binomial distribution with large $n$. In this case:

Theorem 6.6. Let $S_n$ be a r.v. with a binomial distribution with parameters $n$ and $p$. Then,
\[ \lim_{n\to\infty} P\!\left[ \frac{S_n - np}{\sqrt{np(1-p)}} \le a \right] = \Phi(a). \]

When handling discrete distributions, a continuity correction can be made. If $a$ is an integer, then for each $0 \le t < 1$,
\[ P[S_n \le a] = P[S_n \le a + t] = P\!\left[ \frac{S_n - np}{\sqrt{np(1-p)}} \le \frac{a + t - np}{\sqrt{np(1-p)}} \right] \approx \Phi\!\left( \frac{a + t - np}{\sqrt{np(1-p)}} \right). \]
Since each $0 \le t < 1$ gives a different approximation, the average of these $t$'s is taken. So,
\[ P[S_n \le a] \approx \Phi\!\left( \frac{a + 0.5 - np}{\sqrt{np(1-p)}} \right), \qquad P[S_n < a] \approx \Phi\!\left( \frac{a - 0.5 - np}{\sqrt{np(1-p)}} \right), \]
\[ P[a \le S_n \le b] \approx \Phi\!\left( \frac{b + 0.5 - np}{\sqrt{np(1-p)}} \right) - \Phi\!\left( \frac{a - 0.5 - np}{\sqrt{np(1-p)}} \right), \]
\[ P[a < S_n < b] \approx \Phi\!\left( \frac{b - 0.5 - np}{\sqrt{np(1-p)}} \right) - \Phi\!\left( \frac{a + 0.5 - np}{\sqrt{np(1-p)}} \right). \]

Example 6.6. Let $S$ be the number of heads in 1000 independent tosses of a fair coin. Find the probability that $480 \le S \le 515$.
Solution: $S$ has a binomial distribution with parameters $n = 1000$ and $p = \frac12$. Then, $E[S] = np = 1000 \cdot \frac12 = 500$ and $\mathrm{Var}(S) = np(1-p) = 1000 \cdot \frac12 \cdot \frac12 = 250$. So,
\[ P[480 \le S \le 515] \approx P\!\left[ \frac{479.5 - 500}{\sqrt{250}} \le Z \le \frac{515.5 - 500}{\sqrt{250}} \right] = \Phi(0.9803) - \Phi(-1.2965) \approx 0.8365 - 0.0968 = 0.7397. \]

Exercise 6.4. Let $X$ be the number of heads in 10000 independent tosses of a fair coin. Find the probability that $4900 \le X \le 5150$.

Exercise 6.5. One thousand independent throws of a fair coin will be made. Let $X$ be the number of heads in these thousand independent throws. Find the smallest value of $a$ so that $P[500 - a \le X \le 500 + a] \ge 0.5$.

Exercise 6.6. Let $X$ be the number of heads in 10,000 independent tosses of a fair coin. Find $c$ so that the probability that $5000 - c \le X \le 5000 + c$ is 95%.

Exercise 6.7. One thousand independent rolls of a fair die are made. Let $X$ be the number of times that the number 6 appears. Find the probability that $150 \le X < 200$.
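The continuity-corrected approximation can be compared with the exact binomial CDF; a sketch with illustrative values $n = 1000$, $p = 1/2$, $a = 520$ (chosen for this demonstration, not taken from the examples above):

```python
# Exact binomial CDF vs. the continuity-corrected normal approximation.
import math

def Phi(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def binom_cdf(a, n, p):
    """Exact P[S_n <= a] for S_n ~ Binomial(n, p), via log-probabilities."""
    total = 0.0
    for k in range(a + 1):
        logpk = (math.lgamma(n + 1) - math.lgamma(k + 1) - math.lgamma(n - k + 1)
                 + k * math.log(p) + (n - k) * math.log(1 - p))
        total += math.exp(logpk)
    return total

n, p, a = 1000, 0.5, 520
exact = binom_cdf(a, n, p)
approx = Phi((a + 0.5 - n * p) / math.sqrt(n * p * (1 - p)))  # continuity correction
print(round(exact, 4), round(approx, 4))
```

For $n$ of this size the two numbers agree to about three decimal places, which is why the correction is the standard way to approximate binomial tail probabilities by hand.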

Exercise 6.8. One thousand independent rolls of a fair die will be made. Let $X$ be the number of sixes which will appear. Find the probability that $160 \le X \le 190$.

Exercise 6.9. One thousand independent rolls of a fair die will be made. Let $X$ be the sum of these thousand independent rolls. Find the value of $a$ so that $P[3500 - a \le X \le 3500 + a] = 0.9$.

Exercise 6.10. It is assumed that the probability that a student in college has a GPA in high school of A- or above is 25%. Suppose that 500 students from that college are chosen at random. What is the probability that 120 students or fewer of these 500 students have a GPA in high school of A- or above?

Exercise 6.11. Let $Y_1, \dots, Y_{100}$ be independent identically distributed random variables with an exponential distribution of parameter $\lambda = 1$. Find, approximately, $\Pr[Y_1 + \cdots + Y_{100} \le 110]$.

Exercise 6.12. One has 100 bulbs whose light times are independent exponentials with mean 5 hours. If the bulbs are used one at a time, with a failed bulb being immediately replaced by a new one, what is the probability that there is still a working bulb after 525 hours?

Exercise 6.13. From past experience a professor knows that the test score of a student taking her final examination is a random variable with mean 70 and variance 25. Assuming that the distribution of grades is approximately bell shaped, what can be said about the probability that a student's score is between 60 and 80?

Exercise 6.14. A manufacturer of booklets packages them in boxes of 100. It is known that, on average, the booklets weigh 1 ounce, with a standard deviation of 0.05 oz. The manufacturer is interested in calculating $P[\text{100 booklets weigh more than 100.4 oz}]$, a number that would help detect whether too many booklets are being put in a box. Estimate this probability.

Exercise 6.15. The scores of a reference population on the Wechsler Intelligence Scale are normally distributed with mean 100 and standard deviation 15. What score must a person obtain to be in the top 5% of the population?

Exercise 6.16. Suppose that the weight in ounces of a major league baseball is a random variable with mean $\mu = 5$ and standard deviation $\sigma = 2/5$. Find the probability that a carton of 144 baseballs has a total weight less than 725 ounces.

Exercise 6.17. A manufacturer packages some item in boxes of one hundred. The weight of each item has mean 1 oz. and standard deviation 0.2 oz. Approximate the probability that a box weighs more than 102 oz.

6.3 Brownian Motion

Definition. A stochastic process $\{B(t) : t \ge 0\}$ is said to be a Brownian motion process with variance parameter $\sigma^2 > 0$ if:
(i) $B(0) = 0$.
(ii) (independent increments) For each $0 \le t_1 < t_2 < \cdots < t_m$, the r.v.'s $B(t_1), B(t_2) - B(t_1), \dots, B(t_m) - B(t_{m-1})$ are independent.
(iii) (stationary increments) For each $0 \le s < t$, $B(t) - B(s)$ has a normal distribution with mean zero and variance $\sigma^2(t-s)$.

If $\sigma^2 = 1$, we say that $\{B(t) : t \ge 0\}$ is a standard Brownian motion. If $\{B(t) : t \ge 0\}$ is a Brownian motion process with variance parameter $\sigma^2 > 0$, then $\{\sigma^{-1} B(t) : t \ge 0\}$ is a standard Brownian motion. So, the study of Brownian motion reduces to the case of a standard Brownian motion. Unless it is said otherwise, $\{B(t) : t \ge 0\}$ will denote a standard Brownian motion.

Theorem 6.7. Let $\{B(t) : t \ge 0\}$ be a standard Brownian motion. Then, the probability density function of $B(t)$ is
\[ f_{B(t)}(x) = \frac{1}{\sqrt{2\pi t}}\, e^{-\frac{x^2}{2t}}. \]

Example 6.7. Let $\{B(t) : t \ge 0\}$ be a standard Brownian motion. Let $0 < s < t$. Let $a, b \in \mathbb{R}$. Show that:
(i) $\mathrm{Cov}(B(s), B(t)) = s$.
(ii) $\mathrm{Var}(B(t) - B(s)) = t - s$.
(iii) $\mathrm{Var}(aB(s) + bB(t)) = (a+b)^2 s + b^2 (t-s)$.
(iv) The distribution of $aB(s) + bB(t)$ is normal with mean zero.
Solution: (i) Since $B(s)$ and $B(t) - B(s)$ are independent r.v.'s,
\[ \mathrm{Cov}(B(s), B(t)) = \mathrm{Cov}(B(s), B(s) + B(t) - B(s)) = \mathrm{Cov}(B(s), B(s)) + \mathrm{Cov}(B(s), B(t) - B(s)) = \mathrm{Var}(B(s)) = s. \]
(ii) Since $B(t) - B(s)$ has a normal distribution with mean zero and variance $t - s$, $\mathrm{Var}(B(t) - B(s)) = t - s$.
(iii)
\[ \mathrm{Var}(aB(s) + bB(t)) = \mathrm{Var}\big(aB(s) + b(B(s) + B(t) - B(s))\big) = \mathrm{Var}\big((a+b)B(s) + b(B(t) - B(s))\big) \]
\[ = \mathrm{Var}\big((a+b)B(s)\big) + \mathrm{Var}\big(b(B(t) - B(s))\big) = (a+b)^2 s + b^2 (t-s). \]
(iv) Since $B(s)$ and $B(t) - B(s)$ are independent r.v.'s and they have a normal distribution, $(a+b)B(s) + b(B(t) - B(s)) = aB(s) + bB(t)$ has a normal distribution.
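Property (i) of Example 6.7, $\mathrm{Cov}(B(s), B(t)) = \min(s, t)$, can be checked by simulating the independent increments directly; a minimal sketch with $s = 1$, $t = 2$ (values chosen for illustration):

```python
# Estimate Cov(B(s), B(t)) by sampling B(s) and the independent increment
# B(t) - B(s). Since both means are zero, the covariance is E[B(s)B(t)].
import math
import random

def simulate_cov(s=1.0, t=2.0, paths=20000, seed=1):
    rng = random.Random(seed)
    acc = 0.0
    for _ in range(paths):
        bs = rng.gauss(0.0, math.sqrt(s))            # B(s) ~ N(0, s)
        bt = bs + rng.gauss(0.0, math.sqrt(t - s))   # add independent increment
        acc += bs * bt
    return acc / paths

print(round(simulate_cov(), 3))  # ≈ 1.0 = min(s, t)
```

The two-draw construction is exactly the increment decomposition used throughout this section, so the estimate converging to $s$ is a direct numerical restatement of part (i).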

Example 6.8. Suppose that $\{X(t) : t \ge 0\}$ is a Brownian motion process with variance parameter $\sigma^2 = 9$. Find:
(i) $P\{X(2) \le 12\}$.
(ii) $\mathrm{Var}(3X(2) - 2X(5))$.
(iii) $P\{X(2) - 2X(3) \ge 4\}$.
Solution: We have that $\{B(t) : t \ge 0\}$ is a standard Brownian motion, where $B(t) = \frac{X(t)}{3}$.
(i) We have that
\[ P\{X(2) \le 12\} = P\{B(2) \le 4\} = P\{N(0,2) \le 4\} = P\{Z \le 2.83\} \approx 0.9977. \]
(ii)
\[ \mathrm{Var}(3X(2) - 2X(5)) = \mathrm{Var}(9B(2) - 6B(5)) = \mathrm{Var}\big(3B(2) - 6(B(5) - B(2))\big) = 9 \cdot 2 + 36 \cdot 3 = 126. \]
(iii) $X(2) - 2X(3)$ has a normal distribution with mean zero and variance
\[ \mathrm{Var}(X(2) - 2X(3)) = \mathrm{Var}(3B(2) - 6B(3)) = \mathrm{Var}\big(-3B(2) - 6(B(3) - B(2))\big) = 9 \cdot 2 + 36 \cdot 1 = 54. \]
So, $P\{X(2) - 2X(3) \ge 4\} = P\{Z \ge \frac{4}{\sqrt{54}}\} = P\{Z \ge 0.54\} \approx 0.2946$.

Example 6.9. Let $\{B(t) : t \ge 0\}$ be a standard Brownian motion. Let $0 < s < t < u$. Show that $E[B(s)B(t)B(u)] = 0$.
Solution: Let $X = B(s)$, let $Y = B(t) - B(s)$ and let $Z = B(u) - B(t)$. Then, $X$, $Y$ and $Z$ are independent r.v.'s with mean zero, and $E[X^2] = s$, $E[Y^2] = t - s$ and $E[Z^2] = u - t$. So,
\[ E[B(s)B(t)B(u)] = E[X(X+Y)(X+Y+Z)] = E[X^3 + 2X^2 Y + XY^2 + X^2 Z + XYZ] \]
\[ = E[X^3] + 2E[X^2]E[Y] + E[X]E[Y^2] + E[X^2]E[Z] + E[X]E[Y]E[Z] = 0. \]

Exercise 6.18. Let $\{B(t) : t \ge 0\}$ be a standard Brownian motion. Let $0 < s < t < u$. Let $a, b, c \in \mathbb{R}$. Show that:
(i) $\mathrm{Var}(aB(s) + bB(t) + cB(u)) = (a+b+c)^2 s + (b+c)^2 (t-s) + c^2 (u-t)$.
(ii) The distribution of $aB(s) + bB(t) + cB(u)$ is normal with mean zero.

Exercise 6.19. Find the density function of $X = B(s) + B(t)$, where $0 \le s < t$.

Exercise 6.20. Find the density function of $X = B(1) - 3B(3) + 4B(5)$.

Exercise 6.21. Suppose that $\{X(t) : t \ge 0\}$ is a Brownian motion process with variance parameter $\sigma^2 = 8$. Find:
(i) $P\{|X(4) - X(2)| > 10\}$.
(ii) $\mathrm{Var}(3 + X(4) - X(2) + X(3))$.
(iii) $\mathrm{Cov}(3 + X(4) - X(2),\; 5 - X(3))$.
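The increment decomposition used in Example 6.8 is easy to check by simulation; a sketch for $\mathrm{Var}(3X(2) - 2X(5))$ with variance parameter $\sigma^2 = 9$, so $X(t) = 3B(t)$ (the target value 126 is the one computed in part (ii)):

```python
# Monte Carlo check: Var(3 X(2) - 2 X(5)) with X(t) = 3 B(t) should be
# 9*2 + 36*3 = 126 by the increment decomposition.
import math
import random

def estimate_var(paths=40000, seed=6):
    rng = random.Random(seed)
    total_sq = 0.0
    for _ in range(paths):
        b2 = rng.gauss(0.0, math.sqrt(2.0))          # B(2)
        b5 = b2 + rng.gauss(0.0, math.sqrt(3.0))     # B(5) = B(2) + increment
        v = 3 * (3 * b2) - 2 * (3 * b5)              # 3 X(2) - 2 X(5)
        total_sq += v * v                            # E[V] = 0, so Var = E[V^2]
    return total_sq / paths

print(round(estimate_var(), 1))  # ≈ 126
```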

Recall that if $(X_1, \dots, X_m)$ is a r.v. with joint density function $f_{X_1,\dots,X_m}(x_1,\dots,x_m)$, and $h$ is a smooth invertible transformation on the domain of $(X_1,\dots,X_m)$, then the joint density of $(Y_1,\dots,Y_m) = h(X_1,\dots,X_m)$ is
\[ f_{Y_1,\dots,Y_m}(y_1,\dots,y_m) = f_{X_1,\dots,X_m}\big(h^{-1}(y_1,\dots,y_m)\big)\, \left| J_{h^{-1}}(y_1,\dots,y_m) \right|, \]
where $J_{h^{-1}}(y_1,\dots,y_m)$ is the determinant of the matrix of partial derivatives of the function $h^{-1}$ (the Jacobian of $h^{-1}$).

Theorem 6.8. Let $\{B(t) : t \ge 0\}$ be a standard Brownian motion. Let $0 < t_1 < \cdots < t_m$. Then, the joint probability density function of $(B(t_1), \dots, B(t_m))$ is
\[ f_{B(t_1),\dots,B(t_m)}(x_1,\dots,x_m) = \frac{\exp\!\left( -\frac{x_1^2}{2t_1} - \frac{(x_2-x_1)^2}{2(t_2-t_1)} - \cdots - \frac{(x_m-x_{m-1})^2}{2(t_m-t_{m-1})} \right)}{(2\pi)^{m/2} \sqrt{t_1 (t_2-t_1) \cdots (t_m-t_{m-1})}}. \]
Proof. Let $X_1 = B(t_1)$, $X_2 = B(t_2) - B(t_1)$, ..., $X_m = B(t_m) - B(t_{m-1})$. $X_1, X_2, \dots, X_m$ are independent r.v.'s, and $X_i$ has a normal distribution with mean 0 and variance $t_i - t_{i-1}$ (with $t_0 = 0$). So, the density of $X_i$ is $f_{X_i}(x_i) = \frac{1}{\sqrt{2\pi (t_i - t_{i-1})}}\, e^{-\frac{x_i^2}{2(t_i - t_{i-1})}}$, and the joint density function of $(X_1, X_2, \dots, X_m)$ is
\[ f_{X_1,\dots,X_m}(x_1,\dots,x_m) = \prod_{i=1}^m f_{X_i}(x_i) = \frac{\exp\!\left( -\frac{x_1^2}{2t_1} - \frac{x_2^2}{2(t_2-t_1)} - \cdots - \frac{x_m^2}{2(t_m-t_{m-1})} \right)}{(2\pi)^{m/2} \sqrt{t_1 (t_2-t_1) \cdots (t_m-t_{m-1})}}. \]
Consider the transformation $(y_1, y_2, \dots, y_m) = h(x_1, x_2, \dots, x_m) = (x_1, x_1 + x_2, \dots, x_1 + \cdots + x_m)$. We have that $h^{-1}(y_1, y_2, \dots, y_m) = (y_1, y_2 - y_1, \dots, y_m - y_{m-1})$ and $J_{h^{-1}}(y_1,\dots,y_m) = 1$. Then, $(Y_1, Y_2, \dots, Y_m) = h(X_1, X_2, \dots, X_m) = (B(t_1), B(t_2), \dots, B(t_m))$. The joint density of $(Y_1, Y_2, \dots, Y_m)$ is
\[ f_{Y_1,\dots,Y_m}(y_1,\dots,y_m) = f_{X_1,\dots,X_m}\big(h^{-1}(y_1,\dots,y_m)\big)\, \left| J_{h^{-1}}(y_1,\dots,y_m) \right| = \frac{\exp\!\left( -\frac{y_1^2}{2t_1} - \frac{(y_2-y_1)^2}{2(t_2-t_1)} - \cdots - \frac{(y_m-y_{m-1})^2}{2(t_m-t_{m-1})} \right)}{(2\pi)^{m/2} \sqrt{t_1 (t_2-t_1) \cdots (t_m-t_{m-1})}}. \]
Q.E.D.

Markov property of the Brownian motion: Given $0 < s_1 < \cdots < s_k < t_1 < \cdots < t_m$, the distribution of $(B(t_1), \dots, B(t_m))$ given $(B(s_1), \dots, B(s_k))$ agrees with the distribution of $(B(t_1), \dots, B(t_m))$ given $B(s_k)$.

Theorem 6.9. Let $\{B(t) : t \ge 0\}$ be a standard Brownian motion. Given $0 < s_1 < \cdots < s_k < t_1 < \cdots < t_m$, the conditional density of $(B(t_1), \dots, B(t_m))$ given $(B(s_1), \dots, B(s_k))$ agrees with the conditional density of $(B(t_1), \dots, B(t_m))$ given $B(s_k)$.
Proof. Let $X_1 = B(s_1), \dots, X_k = B(s_k)$ and $Y_1 = B(t_1), \dots, Y_m = B(t_m)$.

The joint density of $(X_1,\dots,X_k,Y_1,\dots,Y_m)$ is
\[ f_{X_1,\dots,X_k,Y_1,\dots,Y_m}(x_1,\dots,x_k,y_1,\dots,y_m) = \frac{\exp\!\left( -\frac{x_1^2}{2s_1} - \frac{(x_2-x_1)^2}{2(s_2-s_1)} - \cdots - \frac{(x_k-x_{k-1})^2}{2(s_k-s_{k-1})} - \frac{(y_1-x_k)^2}{2(t_1-s_k)} - \frac{(y_2-y_1)^2}{2(t_2-t_1)} - \cdots - \frac{(y_m-y_{m-1})^2}{2(t_m-t_{m-1})} \right)}{(2\pi)^{(m+k)/2} \sqrt{s_1(s_2-s_1)\cdots(s_k-s_{k-1})(t_1-s_k)(t_2-t_1)\cdots(t_m-t_{m-1})}}. \]
The joint density of $(X_1,\dots,X_k)$ is
\[ f_{X_1,\dots,X_k}(x_1,\dots,x_k) = \frac{\exp\!\left( -\frac{x_1^2}{2s_1} - \frac{(x_2-x_1)^2}{2(s_2-s_1)} - \cdots - \frac{(x_k-x_{k-1})^2}{2(s_k-s_{k-1})} \right)}{(2\pi)^{k/2} \sqrt{s_1(s_2-s_1)\cdots(s_k-s_{k-1})}}. \]
So, the conditional density of $(Y_1,\dots,Y_m)$ given $(X_1,\dots,X_k)$ is
\[ f_{Y_1,\dots,Y_m \mid X_1,\dots,X_k}(y_1,\dots,y_m \mid x_1,\dots,x_k) = \frac{\exp\!\left( -\frac{(y_1-x_k)^2}{2(t_1-s_k)} - \frac{(y_2-y_1)^2}{2(t_2-t_1)} - \cdots - \frac{(y_m-y_{m-1})^2}{2(t_m-t_{m-1})} \right)}{(2\pi)^{m/2} \sqrt{(t_1-s_k)(t_2-t_1)\cdots(t_m-t_{m-1})}}. \]
The joint density of $(X_k, Y_1,\dots,Y_m)$ is
\[ f_{X_k,Y_1,\dots,Y_m}(x_k,y_1,\dots,y_m) = \frac{\exp\!\left( -\frac{x_k^2}{2s_k} - \frac{(y_1-x_k)^2}{2(t_1-s_k)} - \frac{(y_2-y_1)^2}{2(t_2-t_1)} - \cdots - \frac{(y_m-y_{m-1})^2}{2(t_m-t_{m-1})} \right)}{(2\pi)^{(1+m)/2} \sqrt{s_k(t_1-s_k)(t_2-t_1)\cdots(t_m-t_{m-1})}}. \]
The density of $X_k$ is $f_{X_k}(x_k) = \frac{1}{\sqrt{2\pi s_k}} \exp\!\left( -\frac{x_k^2}{2s_k} \right)$. So, the conditional density of $(Y_1,\dots,Y_m)$ given $X_k$ is
\[ f_{Y_1,\dots,Y_m \mid X_k}(y_1,\dots,y_m \mid x_k) = \frac{\exp\!\left( -\frac{(y_1-x_k)^2}{2(t_1-s_k)} - \frac{(y_2-y_1)^2}{2(t_2-t_1)} - \cdots - \frac{(y_m-y_{m-1})^2}{2(t_m-t_{m-1})} \right)}{(2\pi)^{m/2} \sqrt{(t_1-s_k)(t_2-t_1)\cdots(t_m-t_{m-1})}}. \]
Therefore, the two conditional densities $f_{Y_1,\dots,Y_m \mid X_1,\dots,X_k}(y_1,\dots,y_m \mid x_1,\dots,x_k)$ and $f_{Y_1,\dots,Y_m \mid X_k}(y_1,\dots,y_m \mid x_k)$ agree. Q.E.D.

Theorem 6.10. For $0 \le s < t$, the distribution of $B(t)$ given $B(s)$ is normal with mean $B(s)$ and variance $t-s$:
\[ B(t) \mid B(s) \;\stackrel{d}{=}\; N(B(s),\, t-s). \]
Proof. By the computations in the previous theorem, the conditional density of $Y = B(t)$ given $X = B(s)$ is
\[ f_{Y|X}(y \mid x) = \frac{1}{\sqrt{2\pi(t-s)}} \exp\!\left( -\frac{(y-x)^2}{2(t-s)} \right). \]
This is the density of a normal r.v. with mean $x$ and variance $t-s$. Q.E.D.

Example 6.10. Suppose that $\{X(t) : t \ge 0\}$ is a Brownian motion process with variance parameter $\sigma^2 = 4$. Find:
(i) $P\{X(3) \le 1.5 \mid X(2) = 1/2\}$.
(ii) $P\{X(4) - X(2) > 1 \mid X(2) = 1/2\}$.

Solution: We have that $\{B(t) : t \ge 0\}$ is a standard Brownian motion, where $B(t) = \frac{X(t)}{2}$.
(i)
\[ P\{X(3) \le 1.5 \mid X(2) = \tfrac12\} = P\{B(3) \le \tfrac34 \mid B(2) = \tfrac14\} = P\{B(3) - B(2) \le \tfrac12 \mid B(2) = \tfrac14\} = P\{B(3) - B(2) \le \tfrac12\} = P\{Z \le 0.5\} \approx 0.6915. \]
(ii) Since $X(2)$ and $X(4) - X(2)$ are independent r.v.'s,
\[ P\{X(4) - X(2) > 1 \mid X(2) = \tfrac12\} = P\{X(4) - X(2) > 1\} = P\{B(4) - B(2) > \tfrac12\} = P\!\left\{Z > \tfrac{1}{2\sqrt{2}}\right\} = P\{Z > 0.35\} \approx 0.3632. \]

Theorem 6.11. For $0 < s < t$, the distribution of $B(s)$ given $B(t)$ is normal with mean $\frac{s}{t} B(t)$ and variance $\frac{s(t-s)}{t}$:
\[ B(s) \mid B(t) \;\stackrel{d}{=}\; N\!\left( \frac{s}{t}\, B(t),\; \frac{s(t-s)}{t} \right). \]
Proof. Let $X = B(s)$ and let $Y = B(t)$. By the computations two theorems above, the joint density of $(X, Y)$ is
\[ f_{X,Y}(x,y) = \frac{1}{2\pi\sqrt{s(t-s)}} \exp\!\left( -\frac{x^2}{2s} - \frac{(y-x)^2}{2(t-s)} \right). \]
The marginal density of $Y$ is $f_Y(y) = \frac{1}{\sqrt{2\pi t}} \exp\!\left( -\frac{y^2}{2t} \right)$. The conditional density of $X$ given $Y = y$ is
\[ f_{X|Y}(x \mid y) = \frac{f_{X,Y}(x,y)}{f_Y(y)} = \sqrt{\frac{t}{2\pi s(t-s)}}\, \exp\!\left( -\frac{x^2}{2s} - \frac{(y-x)^2}{2(t-s)} + \frac{y^2}{2t} \right) = \sqrt{\frac{t}{2\pi s(t-s)}}\, \exp\!\left( -\frac{t\left( x - \frac{s}{t} y \right)^2}{2 s(t-s)} \right), \]
where the last step follows by completing the square in $x$. This is the density of a normal r.v. with mean $\frac{s}{t} y$ and variance $\frac{s(t-s)}{t}$. Q.E.D.

Example 6.11. Suppose that $\{X(t) : t \ge 0\}$ is a Brownian motion process with variance parameter $\sigma^2 = 3$. Find $P\{X(1) \le 4 \mid X(2) = 5\}$.
Solution: We have that
\[ P\{X(1) \le 4 \mid X(2) = 5\} = P\!\left\{ B(1) \le \tfrac{4}{\sqrt{3}} \;\middle|\; B(2) = \tfrac{5}{\sqrt{3}} \right\}, \]
where $B(t) = \frac{X(t)}{\sqrt{3}}$ is a standard Brownian motion. We have that
\[ E\!\left[ B(1) \mid B(2) = \tfrac{5}{\sqrt{3}} \right] = \tfrac12 \cdot \tfrac{5}{\sqrt{3}} = \tfrac{5}{2\sqrt{3}} \]

and
\[ \mathrm{Var}\!\left( B(1) \mid B(2) = \tfrac{5}{\sqrt{3}} \right) = \frac{1 \cdot (2-1)}{2} = \frac12. \]
So,
\[ P\!\left\{ B(1) \le \tfrac{4}{\sqrt{3}} \;\middle|\; B(2) = \tfrac{5}{\sqrt{3}} \right\} = P\!\left\{ Z \le \frac{\frac{4}{\sqrt{3}} - \frac{5}{2\sqrt{3}}}{\sqrt{1/2}} \right\} = P\{Z \le 1.22\} \approx 0.8888. \]

6.4 Hitting Time

Let $\{B(t) : t \ge 0\}$ be a standard Brownian motion. Let $a > 0$. Let $T_a$ denote the first time the Brownian motion process hits $a$. Then, for $t > 0$, by the symmetry of the increments after time $T_a$,
\[ P\{B(t) \ge a\} = \Pr\{T_a \le t\}\, \Pr\{B(t) \ge a \mid T_a \le t\} + \Pr\{T_a > t\}\, \Pr\{B(t) \ge a \mid T_a > t\} = \Pr\{T_a \le t\} \cdot \tfrac12 + \Pr\{T_a > t\} \cdot 0. \]
So,
\[ \Pr\{T_a \le t\} = 2 P\{B(t) \ge a\} = 2 P\{N(0,t) \ge a\} = 2\left( 1 - \Phi\!\left( \tfrac{a}{\sqrt{t}} \right) \right) = \frac{2}{\sqrt{2\pi}} \int_{a/\sqrt{t}}^{\infty} e^{-y^2/2}\,dy. \]
The density of $T_a$ is
\[ f_{T_a}(t) = \frac{d}{dt} \Pr\{T_a \le t\} = \frac{a}{\sqrt{2\pi}}\, t^{-3/2}\, e^{-\frac{a^2}{2t}} = \frac{a\, e^{-\frac{a^2}{2t}}}{\sqrt{2\pi}\, t^{3/2}}. \]
We also have that
\[ \Pr\{T_a \le t\} = 2 P\{B(t) \ge a\} = P\{|B(t)| \ge a\}. \]

Example 6.12. Let $T_a$ be the time until a standard Brownian motion process hits $a$. Show that $E[T_a] = \infty$.
Solution:
\[ E[T_a] = \int_0^{\infty} t\, f_{T_a}(t)\,dt = \int_0^{\infty} t\, \frac{a\, e^{-\frac{a^2}{2t}}}{\sqrt{2\pi}\, t^{3/2}}\,dt = \int_0^{\infty} \frac{a\, e^{-\frac{a^2}{2t}}}{\sqrt{2\pi t}}\,dt = \infty, \]
because
\[ \lim_{t\to\infty} \frac{a\, e^{-a^2/(2t)} / \sqrt{2\pi t}}{a / \sqrt{2\pi t}} = 1 \quad \text{and} \quad \int_0^{\infty} \frac{a}{\sqrt{2\pi t}}\,dt = \infty. \]

Exercise 6.22. Suppose that you own one share of a stock whose price changes according to a standard Brownian motion process. Suppose that you purchased the stock at a price $a$ and the present price is $b$, where $b < a$. You decide to sell the stock when it reaches the price $a$. What is the average time it takes the stock to recover to the original purchase price?

Example 6.13. Let $T_a$ be the time until a standard Brownian motion process hits $a$. Calculate the following:
(i) $P\{T_1 \le 8\}$.
(ii) The median of $T_1$.

Solution: (i)
\[ P\{T_1 \le 8\} = 2 P\{B(8) \ge 1\} = 2 P\!\left\{ Z \ge \tfrac{1}{\sqrt{8}} \right\} = 2\big(1 - \Phi(0.35)\big) \approx 0.7264. \]
(ii) Let $m$ be the median of $T_1$. Then,
\[ \tfrac12 = P\{T_1 \le m\} = 2 P\{B(m) \ge 1\}. \]
So, $0.75 = P\{B(m) < 1\} = P\{Z < \tfrac{1}{\sqrt{m}}\}$, $\tfrac{1}{\sqrt{m}} = 0.6745$, and $m = (0.6745)^{-2} \approx 2.198$.

Exercise 6.23. Let $T_a$ be the time until a standard Brownian motion process hits $a$. Calculate the following probabilities and percentiles:
(i) $P\{T_3 \le 8\}$.
(ii) The 99-th percentile of $T_3$.

Problem 6.1. (#5, November 2000.) You own one share of a stock. The price is 8. The stock price changes according to a standard Brownian motion process, with time measured in months. Calculate the probability that the stock reaches a price of 11 at some time within the next 4 months.
A …  B …  C …  D …  E …
Solution: Let $T_3$ be the hitting time of 3. We need to find
\[ \Pr\{T_3 \le 4\} = 2\left( 1 - \Phi\!\left( \tfrac{3}{2} \right) \right) \approx 0.1336. \]

Theorem 6.12. Given $a > 0 > b$,
\[ P\{B(t) \text{ hits } a \text{ before } b\} = \frac{-b}{a-b} = \frac{|b|}{a + |b|}. \]

Example 6.14. You own one share of a stock. The price is 8. The stock price changes according to a standard Brownian motion process, with time measured in months. Calculate the probability that the stock reaches a price of 11 before it reaches 2.
Solution: The price hits 11 before 2 if and only if the Brownian motion hits $a = 3$ before $b = -6$, so the probability is $\frac{6}{3+6} = \frac{6}{9}$.

Let $\{B(t) : t \ge 0\}$ be a standard Brownian motion. The maximum process over $[0, t]$ is
\[ M_t = \max_{0 \le s \le t} B(s). \]
For $a > 0$,
\[ \Pr\{M_t \ge a\} = \Pr\{T_a \le t\} = P\{|B(t)| \ge a\} \]
and
\[ \Pr\{M_t \le a\} = P\{|B(t)| \le a\} = \int_0^a \frac{2}{\sqrt{2\pi t}}\, e^{-\frac{x^2}{2t}}\,dx. \]
So, $M_t$ has the density of $|B(t)|$, and the density of $M_t$ is
\[ f_{M_t}(a) = \begin{cases} \dfrac{2}{\sqrt{2\pi t}}\, e^{-\frac{a^2}{2t}} & \text{if } a \ge 0, \\[4pt] 0 & \text{if } a < 0. \end{cases} \]
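The identity $P\{T_a \le t\} = P\{M_t \ge a\} = 2P\{B(t) \ge a\}$ can be evaluated with the erf-based $\Phi$ and sanity-checked against a discretized random walk; a sketch with illustrative values $a = 1$, $t = 1$ (the discrete walk slightly undercounts crossings between grid points, so its estimate is biased a little low):

```python
# Reflection-principle formula P{T_a <= t} = 2(1 - Phi(a / sqrt(t))),
# compared with a brute-force simulation of discretized Brownian paths.
import math
import random

def Phi(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def hit_prob_formula(a, t):
    return 2.0 * (1.0 - Phi(a / math.sqrt(t)))

def hit_prob_simulated(a=1.0, t=1.0, steps=1000, paths=2000, seed=3):
    rng = random.Random(seed)
    dt = t / steps
    hits = 0
    for _ in range(paths):
        b = 0.0
        for _ in range(steps):
            b += rng.gauss(0.0, math.sqrt(dt))  # Brownian increment
            if b >= a:
                hits += 1
                break
    return hits / paths

print(round(hit_prob_formula(1.0, 1.0), 4))  # 2(1 - Phi(1)) ≈ 0.3173
print(hit_prob_simulated())                  # close, slightly below
```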

Example 6.15. Show that $E[M_t] = \sqrt{\frac{2t}{\pi}}$ and $\mathrm{Var}(M_t) = \frac{(\pi - 2)t}{\pi}$.
Solution:
\[ E[M_t] = \int_0^{\infty} a\, \frac{2}{\sqrt{2\pi t}}\, e^{-\frac{a^2}{2t}}\,da = \frac{2}{\sqrt{2\pi t}} \left[ -t\, e^{-\frac{a^2}{2t}} \right]_{a=0}^{\infty} = \frac{2t}{\sqrt{2\pi t}} = \sqrt{\frac{2t}{\pi}}, \]
and, since $M_t$ has the density of $|B(t)|$, $E[M_t^2] = E[B(t)^2] = t$. So,
\[ \mathrm{Var}(M_t) = t - \frac{2t}{\pi} = \frac{(\pi - 2)t}{\pi}. \]

Example 6.16. Suppose that $\{X(t) : 0 \le t < \infty\}$ is a Brownian motion process with variance parameter 9. Calculate the following:
(i) $P\{\max_{t \in [0,2]} X(t) \le 4\}$.
(ii) The median value of $\max_{t \in [0,2]} X(t)$.
(iii) The mean value of $\max_{t \in [0,2]} X(t)$.
Solution: (i)
\[ P\Big\{\max_{t\in[0,2]} X(t) \le 4\Big\} = P\Big\{\max_{t\in[0,2]} B(t) \le \tfrac43\Big\} = P\{|B(2)| \le \tfrac43\} = P\Big\{ -\tfrac{4}{3\sqrt{2}} \le Z \le \tfrac{4}{3\sqrt{2}} \Big\} = 2\Phi(0.94) - 1 \approx 0.6528. \]
(ii) Let $m$ be the median value of $\max_{t\in[0,2]} X(t)$. Then,
\[ \tfrac12 = P\Big\{\max_{t\in[0,2]} X(t) \le m\Big\} = P\{|B(2)| \le \tfrac{m}{3}\} = 2\Phi\!\left( \tfrac{m}{3\sqrt{2}} \right) - 1. \]
So, $\Phi\!\left( \tfrac{m}{3\sqrt{2}} \right) = \tfrac34$, $\tfrac{m}{3\sqrt{2}} = 0.6745$, and $m = 3\sqrt{2}\,(0.6745) \approx 2.8618$.
(iii)
\[ E\Big[ \max_{t\in[0,2]} X(t) \Big] = 3\, E\Big[ \max_{t\in[0,2]} B(t) \Big] = 3\sqrt{\frac{2 \cdot 2}{\pi}} = \frac{6}{\sqrt{\pi}} \approx 3.385. \]

Continuity of the Brownian motion: As a function of $t$, a Brownian motion path is a continuous function. Brownian motion can therefore be used to assign probabilities to sets of continuous functions. Since the price of a stock is a continuous function of time, Brownian motion can be used to estimate probabilities related to the price of a stock over time.

6.5 Modeling Stock Prices

Definition. $X$ is said to have a lognormal distribution with parameters $\mu$ and $\sigma^2$ if $\ln X$ has a normal distribution with mean $\mu$ and variance $\sigma^2$. If $X$ has a lognormal distribution with parameters $\mu$ and $\sigma^2$, then $X$ has the distribution of $e^{\mu + \sigma Z}$, where $Z$ is a standard normal r.v.

Example 6.17. Let $X$ be a r.v. with lognormal distribution with parameters $\mu = 1$ and $\sigma = 5$. Find $\Pr\{X \le 3\}$.

Solution:
\[ \Pr\{X \le 3\} = \Pr\{\ln X \le \ln 3\} = \Pr\!\left\{ Z \le \tfrac{\ln 3 - 1}{5} \right\} = \Pr\{Z \le 0.02\} \approx 0.508. \]

Example 6.18. Let $X$ be a r.v. with lognormal distribution with parameters $\mu$ and $\sigma^2$. Show that:
(i) $E[X] = e^{\mu + \frac{\sigma^2}{2}}$.
(ii) $E[X^2] = e^{2\mu + 2\sigma^2}$.
(iii) $\mathrm{Var}(X) = e^{2\mu + \sigma^2}\left( e^{\sigma^2} - 1 \right)$.
(iv) For $t > 0$, $\Pr\{X \le t\} = \Phi\!\left( \frac{\ln t - \mu}{\sigma} \right)$, where $\Phi(t) = \Pr\{Z \le t\}$.
Solution: We have that $X$ has the distribution of $e^{\mu + \sigma Z}$. So,
\[ E[X] = E[e^{\mu + \sigma Z}] = e^{\mu} M_Z(\sigma) = e^{\mu + \frac{\sigma^2}{2}} \quad \text{and} \quad E[X^2] = E[e^{2\mu + 2\sigma Z}] = e^{2\mu} M_Z(2\sigma) = e^{2\mu + 2\sigma^2}. \]

Example 6.19. Let $X$ be a lognormal r.v. with $E[X] = e^5$ and $E[X^2] = e^{14}$. Find:
(i) $\Pr\{X \le e^7\}$.
(ii) The third quartile of $X$.
Solution: We have that $e^5 = E[X] = e^{\mu + \frac{\sigma^2}{2}}$ and $e^{14} = E[X^2] = e^{2\mu + 2\sigma^2}$. So,
\[ 5 = \mu + \frac{\sigma^2}{2}, \qquad 14 = 2\mu + 2\sigma^2. \]
Hence, $\mu = 3$ and $\sigma^2 = 4$. Thus,
\[ \Pr\{X \le e^7\} = \Pr\{\ln X \le 7\} = P\!\left\{ Z \le \tfrac{7-3}{2} \right\} = P\{Z \le 2\} \approx 0.9772. \]
Let $q$ be the third quartile of $X$. Then,
\[ 0.75 = \Pr\{X \le q\} = \Pr\{\ln X \le \ln q\} = P\!\left\{ Z \le \tfrac{\ln q - 3}{2} \right\}. \]
So, $\tfrac{\ln q - 3}{2} = 0.6745$, $\ln q = 4.349$ and $q = e^{4.349} \approx 77.4$.

Definition. We say that a stochastic process $\{X(t) : t \ge 0\}$ is a Brownian motion with drift coefficient $\mu$ and variance parameter $\sigma^2$ if $X(t) = \sigma B(t) + \mu t$, where $\{B(t) : t \ge 0\}$ is a standard Brownian motion. If $\{X(t) : t \ge 0\}$ is a Brownian motion with drift coefficient $\mu$ and variance parameter $\sigma^2$, then:
(i) $X(0) = 0$.
(ii) $\{X(t) : t \ge 0\}$ has stationary and independent increments.
(iii) $X(t)$ is normally distributed with mean $\mu t$ and variance $\sigma^2 t$.
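The moment formulas in Example 6.18 can be verified by Monte Carlo, writing $X = e^{\mu + \sigma Z}$; the parameters $\mu = 0.1$, $\sigma = 0.3$ below are illustrative assumptions:

```python
# Monte Carlo check of the lognormal moments E[X] = e^{mu + sigma^2/2}
# and E[X^2] = e^{2 mu + 2 sigma^2}.
import math
import random

def lognormal_moments_mc(mu=0.1, sigma=0.3, n=200000, seed=4):
    rng = random.Random(seed)
    m1 = m2 = 0.0
    for _ in range(n):
        x = math.exp(mu + sigma * rng.gauss(0.0, 1.0))  # X = e^{mu + sigma Z}
        m1 += x
        m2 += x * x
    return m1 / n, m2 / n

m1, m2 = lognormal_moments_mc()
print(round(m1, 3), round(math.exp(0.1 + 0.3 ** 2 / 2), 3))      # empirical vs e^{mu+sigma^2/2}
print(round(m2, 3), round(math.exp(2 * 0.1 + 2 * 0.3 ** 2), 3))  # empirical vs e^{2mu+2sigma^2}
```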

Definition. We say that a stochastic process $\{X(t) : t \ge 0\}$ is a geometric Brownian motion with drift coefficient $\mu$ and variance parameter $\sigma^2$ if $X(t) = e^{\sigma B(t) + \mu t}$, where $\{B(t) : t \ge 0\}$ is a standard Brownian motion.

Suppose that a share of a certain stock is currently (i.e. at $t = 0$) selling for $P_0$. Set
\[ P(t) = P_0\, Z(t) = P_0\, e^{\sigma B(t) + \mu t}, \]
where $Z(t)$ is a geometric Brownian motion with drift coefficient $\mu$ and variance parameter $\sigma^2$, and $B(t)$ is a standard Brownian motion. $\{P(t) : t \ge 0\}$ is called a price process model. The geometric Brownian motion is used to model prices of stock because it is assumed that the rates of interest earned over disjoint intervals of time are independent r.v.'s. The difference $P(t) - P(s)$ is the interest $I[s,t]$ earned over the period $[s,t]$. The effective rate of interest earned in the period $[s,t]$ is
\[ I[s,t] = \frac{P(t) - P(s)}{P(s)}, \]
i.e. this is the interest earned per unit of investment in that period of time. For the price process and $0 < t_1 < \cdots < t_m$,
\[ 1 + I[t_1, t_2] = \frac{P(t_2)}{P(t_1)} = e^{\sigma(B(t_2) - B(t_1)) + \mu(t_2 - t_1)}, \quad \dots, \quad 1 + I[t_{m-1}, t_m] = \frac{P(t_m)}{P(t_{m-1})} = e^{\sigma(B(t_m) - B(t_{m-1})) + \mu(t_m - t_{m-1})}. \]
So, $I[t_1, t_2], \dots, I[t_{m-1}, t_m]$ are independent r.v.'s. The average accumulation of $P(t)$ is
\[ A(t) = E[P(t)] = P_0\, e^{\mu t + \frac{\sigma^2 t}{2}}. \]
Given an accumulation function $A(t)$, the force of interest at time $t$ is
\[ \delta_t = \frac{A'(t)}{A(t)} = \frac{d}{dt} \ln A(t). \]
The force of interest of $A(t) = E[P(t)]$ is
\[ \delta_t = \frac{d}{dt} \ln A(t) = \mu + \frac{\sigma^2}{2}. \]

Exercise 6.24. To model the price of a share of stock with current price of 10, we use the model $P(t) = 10 X(t)$, where $X(t)$ is a geometric Brownian motion model with $\mu = 0$ and $\sigma^2 = \dots$
(i) Calculate the probability that the price hits $10e^{0.5}$ by time …
(ii) Calculate the median of the time required for the price to hit $10e^{0.5}$.
(iii) Calculate the mean of the time required for the price to hit $10e^{0.5}$.
(iv) Calculate the probability that the maximum price of the stock over the time interval $[0, \dots]$ exceeds $10e^{0.5}$.
(v) Calculate the median value of the maximum price of the stock over the time interval $[0, \dots]$.
(vi) Calculate the expected value and the variance of the price of the stock at time …
(vii) Calculate the probability that the stock increases by more than 10% in a two-year period.

Exercise 6.25. To model the price of a share of stock with current price of 10, we use the model $P(t) = 10 X(t)$, where $X(t)$ is a geometric Brownian motion model with $\mu = \dots$ and

$\sigma^2 = \dots$
(i) Calculate the probability that the price increases by more than 10% before it decreases by more than 10%.
(ii) Calculate the probability that the price increases by more than 25% before it decreases by more than 10%.

Exercise 6.26. Suppose that a stock currently sells for 100 and that the price process $\{P(t) : t \ge 0\}$ is modeled by a multiple of a geometric Brownian motion, where the drift parameter is $\mu = 0.01$ and the variance parameter is $\sigma^2 = 0.02$. At a force of interest of $\delta = 0.01$, the present value of the price at time $t$ is $e^{-\delta t} P(t)$. Calculate the following:
(i) $E[P(1)]$, $\mathrm{Var}(P(1))$ and $\Pr\{P(1) \ge 100e^{0.03}\}$.
(ii) The probability that the maximum present value of the price over $[0, 1]$ exceeds $100e^{0.05}$.
(iii) The probability that the present value of the price hits $100e^{0.4}$ before time 5.
(iv) The probability that the present value of the price hits $100e^{0.1}$ before it hits $100e^{-0.1}$.

Problem 6.2. (#7, Sample Test.) You are given: The logarithm of the price of a stock can be modeled by Brownian motion with drift coefficient $\mu = 0$ and variance parameter $\sigma^2 = 0.0016$. The price of the stock at time $t = 0$ is 10. Calculate the probability that the price of the stock will be 12 or greater at some time between $t = 0$ and $t = 12$.
A 0.03  B 0.18  C 0.73  D 0.36  E …
Solution: Let $P(t)$ be the price of the stock at time $t$. $P(t) = 10\, e^{0.04 B(t)}$, where $B(t)$ is a standard Brownian motion. We need to find
\[ \Pr\Big\{ \sup_{0 \le t \le 12} P(t) \ge 12 \Big\} = \Pr\Big\{ \sup_{0 \le t \le 12} 10\, e^{0.04 B(t)} \ge 12 \Big\} = \Pr\Big\{ \sup_{0 \le t \le 12} B(t) \ge 25 \ln(12/10) \Big\} = \Pr\{|N(0,12)| \ge 25 \ln(12/10)\} = 2\left( 1 - \Phi\!\left( \tfrac{4.558}{\sqrt{12}} \right) \right) \approx 0.19. \]

Problem 6.3. (#18, November 2001.) The value of currency in country M is currently the same as in country N (i.e., 1 unit in country M can be exchanged for 1 unit in country N). Let $C(t)$ denote the difference between the currency values in country M and N at any point in time (i.e., 1 unit in country M will exchange for $1 + C(t)$ units in country N at time $t$). $C(t)$ is modeled as a Brownian motion process with drift 0 and variance parameter 0.01. An investor in country M currently invests in a risk-free investment in country N that matures at 1.25 units in the currency of country N in 5 years.
After the first year, 1 unit in country M is worth 1.05 units in country N. Calculate the conditional probability, after the first year, that when the investment matures and the funds are exchanged back to country M, the investor will receive at least 1.25 in the currency of country M.
A 0.3  B 0.4  C 0.5  D 0.6  E 0.7
Solution: We have that $C(t) = \sqrt{0.01}\, B(t) = 0.1 B(t)$, where $B(t)$ is a standard Brownian motion. We have to find

\[ P\!\left\{ \frac{1.25}{1 + C(5)} \ge 1.25 \;\middle|\; C(1) = 0.05 \right\} = P\{0 \ge C(5) \mid C(1) = 0.05\} = P\{0 \ge B(5) \mid 0.1 B(1) = 0.05\} \]
\[ = P\{B(5) - B(1) \le -0.5 \mid B(1) = 0.5\} = P\{B(5) - B(1) \le -0.5\} = P\{N(0,4) \le -0.5\} = P\{Z \le -0.25\} \approx 0.4. \]

Problems

1. Let $\{B(t) : t \ge 0\}$ be a standard Brownian motion.
(i) Find the mean and the variance of $B(2) + B(5) - 3B(3)$.
(ii) Find $\Pr\{|B(2) + B(5) - 3B(3)| \le 3\}$.

2. Let $\{B(t) : t \ge 0\}$ be a standard Brownian motion. Show that for $0 < s < t < u < v$, the variance of $aB(s) + bB(t) + cB(u) + dB(v)$ is
\[ (a+b+c+d)^2 s + (b+c+d)^2 (t-s) + (c+d)^2 (u-t) + d^2 (v-u). \]

3. Let $\{X(t) : t \ge 0\}$ be a Brownian motion process with variance parameter $\sigma^2 = 3$. Find the following:
(i) $\Pr\{|X(2)| \le 5\}$.
(ii) $\Pr\{|X(3) - X(4)| \le 5\}$.
(iii) $\Pr\{X(1) - 2X(2) + 3X(3) \ge 0\}$.

4. Let $\{B(t) : t \ge 0\}$ be a standard Brownian motion process. Find the following:
(i) $\Pr\{B(2) \le 3 \mid B(1) = 1\}$.
(ii) $\Pr\{B(3) \le B(4) \mid B(1) = 3\}$.
(iii) $\Pr\{0 \le B(3) \le 4 \mid B(5) = 3\}$.

5. Let $\{X(t) : t \ge 0\}$ be a Brownian motion process with variance parameter $\sigma^2 = 9$. Find the following:
(i) $\Pr\{3 \le X(2) \le 9 \mid X(1) = 6\}$.
(ii) $\Pr\{X(2) \le X(5) \mid X(3) = 3\}$.

6. Let $\{B(t) : t \ge 0\}$ be a standard Brownian motion. Let $t > 0$ and let $M_t = \sup_{0 \le s \le t} B(s)$ be the maximum process over $[0, t]$. Find:
(i) $\Pr\{M_9 \le 5\}$.
(ii) The median of $M_9$.

7. Let $\{B(t) : t \ge 0\}$ be a standard Brownian motion. Let $t > 0$ and let $M_t = \sup_{0 \le s \le t} B(s)$ be the maximum process over $[0, t]$. Find:
(i) $\Pr\{M_4 \le 5\}$.
(ii) The density of $M_4$.
(iii) The median of $M_4$.
(iv) The first and the third quartile of $M_4$.
(v) The mean and the variance of $M_4$.

8. Let $\{B(t) : t \ge 0\}$ be a standard Brownian motion. Let $T_a$ be the first time that the Brownian motion hits $a$. Find the following:
(i) $\Pr\{T_1 \le 4\}$.
(ii) The density of $T_1$.
(iii) The median of $T_1$.
(iv) $\Pr\{T_3 \le 5\}$.
(v) The first and the third quartile (the 25% and 75% percentiles) of $T_3$.

9. Use a geometric Brownian motion model with $\mu = 0$ and $\sigma^2 = 0.09$ to model the ratio $P(t)/10$, where $P(t)$ is the price at time $t$ for a share of a stock currently selling at 10. What is the probability that the stock increases by more than 10% in a two-year period?

10. You own one share of a stock. The price is 12. The stock price $P(t)$ changes according to the model $P(t) = 12\, e^{0.5 B(t)}$, where $B(t)$, $t \ge 0$, is a standard Brownian motion and the time is measured in months.
(i) Calculate the probability that the stock reaches a price of 30 at some time within the next 4 months.
(ii) Calculate the median of the time that it takes the price to reach 30.

11. You own one share of a stock. The current price of the stock is 10. Let $P(t)$ be the stock price at time $t$. $\frac{P(t)}{P(0)}$ changes according to a geometric Brownian motion model with $\mu = 0.1$ and $\sigma^2 = 0.4$, with time measured in months. Find:
(i) The mean and the variance of the price of the stock one year from now.
(ii) The probability that the price of the stock 6 months from now is more than 30.

12. You own one share of stock currently worth 100. Assume that the change in value of this share over time follows a standard Brownian motion process, where time is measured in months. What is the probability that the price three months from now is greater than 105?

13. You own one share of a stock. The price is 50. The stock price $P(t)$ changes according to a geometric Brownian motion model with $\mu = 0$ and $\sigma^2 = 0.36$, with time measured in months. Let $S$ be the time $P(t)$ hits 70. Show that the distribution of $S$ is that of $T_a$, where $T_a$ is the time that a standard Brownian motion hits $a > 0$, for some $a$. Find:
(i) $a$.
(ii) $\Pr\{S \le 4\}$.
(iii) The median of $S$.
(iv) The first and the third quartile (the 25% and 75% percentiles) of $S$.

14. Using a geometric Brownian motion model with $\mu = 0$, $\sigma^2 = 0.09$ to model the ratio $P(t)/10$, where $P(t)$ is the price at time $t$ for a share of a stock currently selling at 10, calculate the probability that the price hits 15 by time 2.

15. You own one share of a stock. The price is 8. Let $P(t)$ be the stock price at time $t$. $\frac{P(t)}{P(0)}$ changes according to a geometric Brownian motion model with $\mu = 0.1$ and $\sigma^2 = 0.49$,

with time measured in months.
(i) Find the mean and the variance of the price of the stock in 6 months.
(ii) Find the probability that the price of the stock will be higher by 10% or more in 6 months.
(iii) The mean and the variance of $P(6)$.
(iv) The median of $P(6)$.
(v) The first and the third quartile of $P(6)$.
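Several of the problems above use the geometric Brownian motion price model of Section 6.5. As a closing sketch (with illustrative, assumed parameters $P_0 = 10$, $\mu = 0.01$, $\sigma = 0.2$, $t = 1$, not taken from any particular problem), the average-accumulation formula $A(t) = P_0\, e^{\mu t + \sigma^2 t/2}$ can be checked by simulating $P(t) = P_0\, e^{\sigma B(t) + \mu t}$:

```python
# Monte Carlo check of A(t) = E[P(t)] = P0 e^{mu t + sigma^2 t / 2}
# for the geometric Brownian motion price model.
import math
import random

def mean_price_mc(p0=10.0, mu=0.01, sigma=0.2, t=1.0, paths=200000, seed=5):
    rng = random.Random(seed)
    total = 0.0
    for _ in range(paths):
        bt = rng.gauss(0.0, math.sqrt(t))            # B(t) ~ N(0, t)
        total += p0 * math.exp(sigma * bt + mu * t)  # P(t) = P0 e^{sigma B(t) + mu t}
    return total / paths

print(round(mean_price_mc(), 3))
print(round(10.0 * math.exp(0.01 + 0.2 ** 2 / 2), 3))  # theoretical A(1)
```

The same sampling scheme, with the appropriate parameters substituted, can be used to spot-check answers to the price-process problems above.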


e.g. arrival of a customer to a service station or breakdown of a component in some system.

e.g. arrival of a customer to a service station or breakdown of a component in some system. Poisson process Events occur at random instants of time at an average rate of λ events per second. e.g. arrival of a customer to a service station or breakdown of a component in some system. Let N(t) be

More information

Overview of Monte Carlo Simulation, Probability Review and Introduction to Matlab

Overview of Monte Carlo Simulation, Probability Review and Introduction to Matlab Monte Carlo Simulation: IEOR E4703 Fall 2004 c 2004 by Martin Haugh Overview of Monte Carlo Simulation, Probability Review and Introduction to Matlab 1 Overview of Monte Carlo Simulation 1.1 Why use simulation?

More information

1 Geometric Brownian motion

1 Geometric Brownian motion Copyright c 006 by Karl Sigman Geometric Brownian motion Note that since BM can take on negative values, using it directly for modeling stock prices is questionable. There are other reasons too why BM

More information

Math 431 An Introduction to Probability. Final Exam Solutions

Math 431 An Introduction to Probability. Final Exam Solutions Math 43 An Introduction to Probability Final Eam Solutions. A continuous random variable X has cdf a for 0, F () = for 0 <

More information

Lecture 6: Discrete & Continuous Probability and Random Variables

Lecture 6: Discrete & Continuous Probability and Random Variables Lecture 6: Discrete & Continuous Probability and Random Variables D. Alex Hughes Math Camp September 17, 2015 D. Alex Hughes (Math Camp) Lecture 6: Discrete & Continuous Probability and Random September

More information

Notes on Continuous Random Variables

Notes on Continuous Random Variables Notes on Continuous Random Variables Continuous random variables are random quantities that are measured on a continuous scale. They can usually take on any value over some interval, which distinguishes

More information

Summary of Formulas and Concepts. Descriptive Statistics (Ch. 1-4)

Summary of Formulas and Concepts. Descriptive Statistics (Ch. 1-4) Summary of Formulas and Concepts Descriptive Statistics (Ch. 1-4) Definitions Population: The complete set of numerical information on a particular quantity in which an investigator is interested. We assume

More information

CHAPTER 6: Continuous Uniform Distribution: 6.1. Definition: The density function of the continuous random variable X on the interval [A, B] is.

CHAPTER 6: Continuous Uniform Distribution: 6.1. Definition: The density function of the continuous random variable X on the interval [A, B] is. Some Continuous Probability Distributions CHAPTER 6: Continuous Uniform Distribution: 6. Definition: The density function of the continuous random variable X on the interval [A, B] is B A A x B f(x; A,

More information

Discrete Mathematics and Probability Theory Fall 2009 Satish Rao, David Tse Note 18. A Brief Introduction to Continuous Probability

Discrete Mathematics and Probability Theory Fall 2009 Satish Rao, David Tse Note 18. A Brief Introduction to Continuous Probability CS 7 Discrete Mathematics and Probability Theory Fall 29 Satish Rao, David Tse Note 8 A Brief Introduction to Continuous Probability Up to now we have focused exclusively on discrete probability spaces

More information

4. Continuous Random Variables, the Pareto and Normal Distributions

4. Continuous Random Variables, the Pareto and Normal Distributions 4. Continuous Random Variables, the Pareto and Normal Distributions A continuous random variable X can take any value in a given range (e.g. height, weight, age). The distribution of a continuous random

More information

Chapter 2: Binomial Methods and the Black-Scholes Formula

Chapter 2: Binomial Methods and the Black-Scholes Formula Chapter 2: Binomial Methods and the Black-Scholes Formula 2.1 Binomial Trees We consider a financial market consisting of a bond B t = B(t), a stock S t = S(t), and a call-option C t = C(t), where the

More information

P (x) 0. Discrete random variables Expected value. The expected value, mean or average of a random variable x is: xp (x) = v i P (v i )

P (x) 0. Discrete random variables Expected value. The expected value, mean or average of a random variable x is: xp (x) = v i P (v i ) Discrete random variables Probability mass function Given a discrete random variable X taking values in X = {v 1,..., v m }, its probability mass function P : X [0, 1] is defined as: P (v i ) = Pr[X =

More information

Chapter 4 Expected Values

Chapter 4 Expected Values Chapter 4 Expected Values 4. The Expected Value of a Random Variables Definition. Let X be a random variable having a pdf f(x). Also, suppose the the following conditions are satisfied: x f(x) converges

More information

Statistics 100A Homework 8 Solutions

Statistics 100A Homework 8 Solutions Part : Chapter 7 Statistics A Homework 8 Solutions Ryan Rosario. A player throws a fair die and simultaneously flips a fair coin. If the coin lands heads, then she wins twice, and if tails, the one-half

More information

Lecture 13: Martingales

Lecture 13: Martingales Lecture 13: Martingales 1. Definition of a Martingale 1.1 Filtrations 1.2 Definition of a martingale and its basic properties 1.3 Sums of independent random variables and related models 1.4 Products of

More information

Normal distribution. ) 2 /2σ. 2π σ

Normal distribution. ) 2 /2σ. 2π σ Normal distribution The normal distribution is the most widely known and used of all distributions. Because the normal distribution approximates many natural phenomena so well, it has developed into a

More information

MULTIVARIATE PROBABILITY DISTRIBUTIONS

MULTIVARIATE PROBABILITY DISTRIBUTIONS MULTIVARIATE PROBABILITY DISTRIBUTIONS. PRELIMINARIES.. Example. Consider an experiment that consists of tossing a die and a coin at the same time. We can consider a number of random variables defined

More information

ST 371 (VIII): Theory of Joint Distributions

ST 371 (VIII): Theory of Joint Distributions ST 371 (VIII): Theory of Joint Distributions So far we have focused on probability distributions for single random variables. However, we are often interested in probability statements concerning two or

More information

Math 425 (Fall 08) Solutions Midterm 2 November 6, 2008

Math 425 (Fall 08) Solutions Midterm 2 November 6, 2008 Math 425 (Fall 8) Solutions Midterm 2 November 6, 28 (5 pts) Compute E[X] and Var[X] for i) X a random variable that takes the values, 2, 3 with probabilities.2,.5,.3; ii) X a random variable with the

More information

Lecture 8. Confidence intervals and the central limit theorem

Lecture 8. Confidence intervals and the central limit theorem Lecture 8. Confidence intervals and the central limit theorem Mathematical Statistics and Discrete Mathematics November 25th, 2015 1 / 15 Central limit theorem Let X 1, X 2,... X n be a random sample of

More information

Introduction to Probability

Introduction to Probability Introduction to Probability EE 179, Lecture 15, Handout #24 Probability theory gives a mathematical characterization for experiments with random outcomes. coin toss life of lightbulb binary data sequence

More information

2.6. Probability. In general the probability density of a random variable satisfies two conditions:

2.6. Probability. In general the probability density of a random variable satisfies two conditions: 2.6. PROBABILITY 66 2.6. Probability 2.6.. Continuous Random Variables. A random variable a real-valued function defined on some set of possible outcomes of a random experiment; e.g. the number of points

More information

3. Continuous Random Variables

3. Continuous Random Variables 3. Continuous Random Variables A continuous random variable is one which can take any value in an interval (or union of intervals) The values that can be taken by such a variable cannot be listed. Such

More information

UNIT I: RANDOM VARIABLES PART- A -TWO MARKS

UNIT I: RANDOM VARIABLES PART- A -TWO MARKS UNIT I: RANDOM VARIABLES PART- A -TWO MARKS 1. Given the probability density function of a continuous random variable X as follows f(x) = 6x (1-x) 0

More information

Probability Generating Functions

Probability Generating Functions page 39 Chapter 3 Probability Generating Functions 3 Preamble: Generating Functions Generating functions are widely used in mathematics, and play an important role in probability theory Consider a sequence

More information

Chapter 4 Lecture Notes

Chapter 4 Lecture Notes Chapter 4 Lecture Notes Random Variables October 27, 2015 1 Section 4.1 Random Variables A random variable is typically a real-valued function defined on the sample space of some experiment. For instance,

More information

Solution Using the geometric series a/(1 r) = x=1. x=1. Problem For each of the following distributions, compute

Solution Using the geometric series a/(1 r) = x=1. x=1. Problem For each of the following distributions, compute Math 472 Homework Assignment 1 Problem 1.9.2. Let p(x) 1/2 x, x 1, 2, 3,..., zero elsewhere, be the pmf of the random variable X. Find the mgf, the mean, and the variance of X. Solution 1.9.2. Using the

More information

SYSM 6304: Risk and Decision Analysis Lecture 3 Monte Carlo Simulation

SYSM 6304: Risk and Decision Analysis Lecture 3 Monte Carlo Simulation SYSM 6304: Risk and Decision Analysis Lecture 3 Monte Carlo Simulation M. Vidyasagar Cecil & Ida Green Chair The University of Texas at Dallas Email: M.Vidyasagar@utdallas.edu September 19, 2015 Outline

More information

Exponential Distribution

Exponential Distribution Exponential Distribution Definition: Exponential distribution with parameter λ: { λe λx x 0 f(x) = 0 x < 0 The cdf: F(x) = x Mean E(X) = 1/λ. f(x)dx = Moment generating function: φ(t) = E[e tx ] = { 1

More information

MATH 201. Final ANSWERS August 12, 2016

MATH 201. Final ANSWERS August 12, 2016 MATH 01 Final ANSWERS August 1, 016 Part A 1. 17 points) A bag contains three different types of dice: four 6-sided dice, five 8-sided dice, and six 0-sided dice. A die is drawn from the bag and then rolled.

More information

LogNormal stock-price models in Exams MFE/3 and C/4

LogNormal stock-price models in Exams MFE/3 and C/4 Making sense of... LogNormal stock-price models in Exams MFE/3 and C/4 James W. Daniel Austin Actuarial Seminars http://www.actuarialseminars.com June 26, 2008 c Copyright 2007 by James W. Daniel; reproduction

More information

Definition: Suppose that two random variables, either continuous or discrete, X and Y have joint density

Definition: Suppose that two random variables, either continuous or discrete, X and Y have joint density HW MATH 461/561 Lecture Notes 15 1 Definition: Suppose that two random variables, either continuous or discrete, X and Y have joint density and marginal densities f(x, y), (x, y) Λ X,Y f X (x), x Λ X,

More information

STAT 830 Convergence in Distribution

STAT 830 Convergence in Distribution STAT 830 Convergence in Distribution Richard Lockhart Simon Fraser University STAT 830 Fall 2011 Richard Lockhart (Simon Fraser University) STAT 830 Convergence in Distribution STAT 830 Fall 2011 1 / 31

More information

SF2940: Probability theory Lecture 8: Multivariate Normal Distribution

SF2940: Probability theory Lecture 8: Multivariate Normal Distribution SF2940: Probability theory Lecture 8: Multivariate Normal Distribution Timo Koski 24.09.2015 Timo Koski Matematisk statistik 24.09.2015 1 / 1 Learning outcomes Random vectors, mean vector, covariance matrix,

More information

4. Joint Distributions of Two Random Variables

4. Joint Distributions of Two Random Variables 4. Joint Distributions of Two Random Variables 4.1 Joint Distributions of Two Discrete Random Variables Suppose the discrete random variables X and Y have supports S X and S Y, respectively. The joint

More information

ECE302 Spring 2006 HW7 Solutions March 11, 2006 1

ECE302 Spring 2006 HW7 Solutions March 11, 2006 1 ECE32 Spring 26 HW7 Solutions March, 26 Solutions to HW7 Note: Most of these solutions were generated by R. D. Yates and D. J. Goodman, the authors of our textbook. I have added comments in italics where

More information

Math 141. Lecture 7: Variance, Covariance, and Sums. Albyn Jones 1. 1 Library 304. jones/courses/141

Math 141. Lecture 7: Variance, Covariance, and Sums. Albyn Jones 1. 1 Library 304.  jones/courses/141 Math 141 Lecture 7: Variance, Covariance, and Sums Albyn Jones 1 1 Library 304 jones@reed.edu www.people.reed.edu/ jones/courses/141 Last Time Variance: expected squared deviation from the mean: Standard

More information

Random variables P(X = 3) = P(X = 3) = 1 8, P(X = 1) = P(X = 1) = 3 8.

Random variables P(X = 3) = P(X = 3) = 1 8, P(X = 1) = P(X = 1) = 3 8. Random variables Remark on Notations 1. When X is a number chosen uniformly from a data set, What I call P(X = k) is called Freq[k, X] in the courseware. 2. When X is a random variable, what I call F ()

More information

Probability density function : An arbitrary continuous random variable X is similarly described by its probability density function f x = f X

Probability density function : An arbitrary continuous random variable X is similarly described by its probability density function f x = f X Week 6 notes : Continuous random variables and their probability densities WEEK 6 page 1 uniform, normal, gamma, exponential,chi-squared distributions, normal approx'n to the binomial Uniform [,1] random

More information

( ) = P Z > = P( Z > 1) = 1 Φ(1) = 1 0.8413 = 0.1587 P X > 17

( ) = P Z > = P( Z > 1) = 1 Φ(1) = 1 0.8413 = 0.1587 P X > 17 4.6 I company that manufactures and bottles of apple juice uses a machine that automatically fills 6 ounce bottles. There is some variation, however, in the amounts of liquid dispensed into the bottles

More information

Conditional expectation

Conditional expectation A Conditional expectation A.1 Review of conditional densities, expectations We start with the continuous case. This is sections 6.6 and 6.8 in the book. Let X, Y be continuous random variables. We defined

More information

For a partition B 1,..., B n, where B i B j = for i. A = (A B 1 ) (A B 2 ),..., (A B n ) and thus. P (A) = P (A B i ) = P (A B i )P (B i )

For a partition B 1,..., B n, where B i B j = for i. A = (A B 1 ) (A B 2 ),..., (A B n ) and thus. P (A) = P (A B i ) = P (A B i )P (B i ) Probability Review 15.075 Cynthia Rudin A probability space, defined by Kolmogorov (1903-1987) consists of: A set of outcomes S, e.g., for the roll of a die, S = {1, 2, 3, 4, 5, 6}, 1 1 2 1 6 for the roll

More information

Aggregate Loss Models

Aggregate Loss Models Aggregate Loss Models Chapter 9 Stat 477 - Loss Models Chapter 9 (Stat 477) Aggregate Loss Models Brian Hartman - BYU 1 / 22 Objectives Objectives Individual risk model Collective risk model Computing

More information

Continuous Random Variables

Continuous Random Variables Continuous Random Variables COMP 245 STATISTICS Dr N A Heard Contents 1 Continuous Random Variables 2 11 Introduction 2 12 Probability Density Functions 3 13 Transformations 5 2 Mean, Variance and Quantiles

More information

What is Statistics? Lecture 1. Introduction and probability review. Idea of parametric inference

What is Statistics? Lecture 1. Introduction and probability review. Idea of parametric inference 0. 1. Introduction and probability review 1.1. What is Statistics? What is Statistics? Lecture 1. Introduction and probability review There are many definitions: I will use A set of principle and procedures

More information

The sample space for a pair of die rolls is the set. The sample space for a random number between 0 and 1 is the interval [0, 1].

The sample space for a pair of die rolls is the set. The sample space for a random number between 0 and 1 is the interval [0, 1]. Probability Theory Probability Spaces and Events Consider a random experiment with several possible outcomes. For example, we might roll a pair of dice, flip a coin three times, or choose a random real

More information

An Introduction to Basic Statistics and Probability

An Introduction to Basic Statistics and Probability An Introduction to Basic Statistics and Probability Shenek Heyward NCSU An Introduction to Basic Statistics and Probability p. 1/4 Outline Basic probability concepts Conditional probability Discrete Random

More information

10.2 Series and Convergence

10.2 Series and Convergence 10.2 Series and Convergence Write sums using sigma notation Find the partial sums of series and determine convergence or divergence of infinite series Find the N th partial sums of geometric series and

More information

MATH 425, PRACTICE FINAL EXAM SOLUTIONS.

MATH 425, PRACTICE FINAL EXAM SOLUTIONS. MATH 45, PRACTICE FINAL EXAM SOLUTIONS. Exercise. a Is the operator L defined on smooth functions of x, y by L u := u xx + cosu linear? b Does the answer change if we replace the operator L by the operator

More information

Important Probability Distributions OPRE 6301

Important Probability Distributions OPRE 6301 Important Probability Distributions OPRE 6301 Important Distributions... Certain probability distributions occur with such regularity in real-life applications that they have been given their own names.

More information

ST 371 (IV): Discrete Random Variables

ST 371 (IV): Discrete Random Variables ST 371 (IV): Discrete Random Variables 1 Random Variables A random variable (rv) is a function that is defined on the sample space of the experiment and that assigns a numerical variable to each possible

More information

MAS108 Probability I

MAS108 Probability I 1 QUEEN MARY UNIVERSITY OF LONDON 2:30 pm, Thursday 3 May, 2007 Duration: 2 hours MAS108 Probability I Do not start reading the question paper until you are instructed to by the invigilators. The paper

More information

Math 461 Fall 2006 Test 2 Solutions

Math 461 Fall 2006 Test 2 Solutions Math 461 Fall 2006 Test 2 Solutions Total points: 100. Do all questions. Explain all answers. No notes, books, or electronic devices. 1. [105+5 points] Assume X Exponential(λ). Justify the following two

More information

Recitation 4. 24xy for 0 < x < 1, 0 < y < 1, x + y < 1 0 elsewhere

Recitation 4. 24xy for 0 < x < 1, 0 < y < 1, x + y < 1 0 elsewhere Recitation. Exercise 3.5: If the joint probability density of X and Y is given by xy for < x

More information

Lecture 8: Random Walk vs. Brownian Motion, Binomial Model vs. Log-Normal Distribution

Lecture 8: Random Walk vs. Brownian Motion, Binomial Model vs. Log-Normal Distribution Lecture 8: Random Walk vs. Brownian Motion, Binomial Model vs. Log-ormal Distribution October 4, 200 Limiting Distribution of the Scaled Random Walk Recall that we defined a scaled simple random walk last

More information

4.1 4.2 Probability Distribution for Discrete Random Variables

4.1 4.2 Probability Distribution for Discrete Random Variables 4.1 4.2 Probability Distribution for Discrete Random Variables Key concepts: discrete random variable, probability distribution, expected value, variance, and standard deviation of a discrete random variable.

More information

Practice problems for Homework 11 - Point Estimation

Practice problems for Homework 11 - Point Estimation Practice problems for Homework 11 - Point Estimation 1. (10 marks) Suppose we want to select a random sample of size 5 from the current CS 3341 students. Which of the following strategies is the best:

More information

IEOR 6711: Stochastic Models I Fall 2012, Professor Whitt, Tuesday, September 11 Normal Approximations and the Central Limit Theorem

IEOR 6711: Stochastic Models I Fall 2012, Professor Whitt, Tuesday, September 11 Normal Approximations and the Central Limit Theorem IEOR 6711: Stochastic Models I Fall 2012, Professor Whitt, Tuesday, September 11 Normal Approximations and the Central Limit Theorem Time on my hands: Coin tosses. Problem Formulation: Suppose that I have

More information

RANDOM VARIABLES AND PROBABILITY DISTRIBUTIONS

RANDOM VARIABLES AND PROBABILITY DISTRIBUTIONS RANDOM VARIABLES AND PROBABILITY DISTRIBUTIONS. DISCRETE RANDOM VARIABLES.. Definition of a Discrete Random Variable. A random variable X is said to be discrete if it can assume only a finite or countable

More information

Principle of Data Reduction

Principle of Data Reduction Chapter 6 Principle of Data Reduction 6.1 Introduction An experimenter uses the information in a sample X 1,..., X n to make inferences about an unknown parameter θ. If the sample size n is large, then

More information

Topic 4: Multivariate random variables. Multiple random variables

Topic 4: Multivariate random variables. Multiple random variables Topic 4: Multivariate random variables Joint, marginal, and conditional pmf Joint, marginal, and conditional pdf and cdf Independence Expectation, covariance, correlation Conditional expectation Two jointly

More information

P(a X b) = f X (x)dx. A p.d.f. must integrate to one: f X (x)dx = 1. Z b

P(a X b) = f X (x)dx. A p.d.f. must integrate to one: f X (x)dx = 1. Z b Continuous Random Variables The probability that a continuous random variable, X, has a value between a and b is computed by integrating its probability density function (p.d.f.) over the interval [a,b]:

More information

Joint Probability Distributions and Random Samples (Devore Chapter Five)

Joint Probability Distributions and Random Samples (Devore Chapter Five) Joint Probability Distributions and Random Samples (Devore Chapter Five) 1016-345-01 Probability and Statistics for Engineers Winter 2010-2011 Contents 1 Joint Probability Distributions 1 1.1 Two Discrete

More information

Examination 110 Probability and Statistics Examination

Examination 110 Probability and Statistics Examination Examination 0 Probability and Statistics Examination Sample Examination Questions The Probability and Statistics Examination consists of 5 multiple-choice test questions. The test is a three-hour examination

More information

The Normal Distribution. Alan T. Arnholt Department of Mathematical Sciences Appalachian State University

The Normal Distribution. Alan T. Arnholt Department of Mathematical Sciences Appalachian State University The Normal Distribution Alan T. Arnholt Department of Mathematical Sciences Appalachian State University arnholt@math.appstate.edu Spring 2006 R Notes 1 Copyright c 2006 Alan T. Arnholt 2 Continuous Random

More information

The Black-Scholes pricing formulas

The Black-Scholes pricing formulas The Black-Scholes pricing formulas Moty Katzman September 19, 2014 The Black-Scholes differential equation Aim: Find a formula for the price of European options on stock. Lemma 6.1: Assume that a stock

More information

SF2940: Probability theory Lecture 8: Multivariate Normal Distribution

SF2940: Probability theory Lecture 8: Multivariate Normal Distribution SF2940: Probability theory Lecture 8: Multivariate Normal Distribution Timo Koski 24.09.2014 Timo Koski () Mathematisk statistik 24.09.2014 1 / 75 Learning outcomes Random vectors, mean vector, covariance

More information

Chapters 5. Multivariate Probability Distributions

Chapters 5. Multivariate Probability Distributions Chapters 5. Multivariate Probability Distributions Random vectors are collection of random variables defined on the same sample space. Whenever a collection of random variables are mentioned, they are

More information

Statistics 100A Homework 7 Solutions

Statistics 100A Homework 7 Solutions Chapter 6 Statistics A Homework 7 Solutions Ryan Rosario. A television store owner figures that 45 percent of the customers entering his store will purchase an ordinary television set, 5 percent will purchase

More information

Errata and updates for ASM Exam C/Exam 4 Manual (Sixteenth Edition) sorted by page

Errata and updates for ASM Exam C/Exam 4 Manual (Sixteenth Edition) sorted by page Errata for ASM Exam C/4 Study Manual (Sixteenth Edition) Sorted by Page 1 Errata and updates for ASM Exam C/Exam 4 Manual (Sixteenth Edition) sorted by page Practice exam 1:9, 1:22, 1:29, 9:5, and 10:8

More information

6. Jointly Distributed Random Variables

6. Jointly Distributed Random Variables 6. Jointly Distributed Random Variables We are often interested in the relationship between two or more random variables. Example: A randomly chosen person may be a smoker and/or may get cancer. Definition.

More information

4. Joint Distributions

4. Joint Distributions Virtual Laboratories > 2. Distributions > 1 2 3 4 5 6 7 8 4. Joint Distributions Basic Theory As usual, we start with a random experiment with probability measure P on an underlying sample space. Suppose

More information

ECE302 Spring 2006 HW5 Solutions February 21, 2006 1

ECE302 Spring 2006 HW5 Solutions February 21, 2006 1 ECE3 Spring 6 HW5 Solutions February 1, 6 1 Solutions to HW5 Note: Most of these solutions were generated by R. D. Yates and D. J. Goodman, the authors of our textbook. I have added comments in italics

More information

Manual for SOA Exam MLC.

Manual for SOA Exam MLC. Chapter 4. Life Insurance. Extract from: Arcones Manual for the SOA Exam MLC. Fall 2009 Edition. available at http://www.actexmadriver.com/ 1/14 Level benefit insurance in the continuous case In this chapter,

More information

Black-Scholes Option Pricing Model

Black-Scholes Option Pricing Model Black-Scholes Option Pricing Model Nathan Coelen June 6, 22 1 Introduction Finance is one of the most rapidly changing and fastest growing areas in the corporate business world. Because of this rapid change,

More information

E3: PROBABILITY AND STATISTICS lecture notes

E3: PROBABILITY AND STATISTICS lecture notes E3: PROBABILITY AND STATISTICS lecture notes 2 Contents 1 PROBABILITY THEORY 7 1.1 Experiments and random events............................ 7 1.2 Certain event. Impossible event............................

More information

MASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.436J/15.085J Fall 2008 Lecture 14 10/27/2008 MOMENT GENERATING FUNCTIONS

MASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.436J/15.085J Fall 2008 Lecture 14 10/27/2008 MOMENT GENERATING FUNCTIONS MASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.436J/15.085J Fall 2008 Lecture 14 10/27/2008 MOMENT GENERATING FUNCTIONS Contents 1. Moment generating functions 2. Sum of a ranom number of ranom variables 3. Transforms

More information

Exercises with solutions (1)

Exercises with solutions (1) Exercises with solutions (). Investigate the relationship between independence and correlation. (a) Two random variables X and Y are said to be correlated if and only if their covariance C XY is not equal

More information

Chapter 3: DISCRETE RANDOM VARIABLES AND PROBABILITY DISTRIBUTIONS

Chapter 3: DISCRETE RANDOM VARIABLES AND PROBABILITY DISTRIBUTIONS Chapter 3: DISCRETE RANDOM VARIABLES AND PROBABILITY DISTRIBUTIONS Part 4: Geometric Distribution Negative Binomial Distribution Hypergeometric Distribution Sections 3-7, 3-8 The remaining discrete random

More information

Lectures on Stochastic Processes. William G. Faris

Lectures on Stochastic Processes. William G. Faris Lectures on Stochastic Processes William G. Faris November 8, 2001 2 Contents 1 Random walk 7 1.1 Symmetric simple random walk................... 7 1.2 Simple random walk......................... 9 1.3

More information

Probability and Statistics Prof. Dr. Somesh Kumar Department of Mathematics Indian Institute of Technology, Kharagpur

Probability and Statistics Prof. Dr. Somesh Kumar Department of Mathematics Indian Institute of Technology, Kharagpur Probability and Statistics Prof. Dr. Somesh Kumar Department of Mathematics Indian Institute of Technology, Kharagpur Module No. #01 Lecture No. #15 Special Distributions-VI Today, I am going to introduce

More information

Some stability results of parameter identification in a jump diffusion model
