Lecture 13: Martingales


1. Definition of a Martingale
1.1 Filtrations
1.2 Definition of a martingale and its basic properties
1.3 Sums of independent random variables and related models
1.4 Products of independent random variables and related models
1.5 An exponential martingale
1.6 Likelihood ratios
2. Square Integrable Martingales
2.1 Doob's martingale
2.2 Doob decomposition
2.3 Doob decomposition for square integrable martingales
2.4 Doob-Kolmogorov inequality
3. Convergence of Martingales
3.1 A.s. convergence of martingales
3.2 Law of large numbers for martingales
3.3 Central limit theorem for martingales
4. Stopping times
4.1 Definition and basic properties of stopping times
4.2 Stopped martingales

4.3 Optional stopping theorems
4.4 Wald equation
4.5 A fair game example

1. Definition of a Martingale

1.1 Filtrations

Let <Ω, F, P> be a probability space, S = {S_0, S_1, ...} a finite or infinite sequence of random variables defined on this probability space, and F = {F_0, F_1, ...} a finite or infinite sequence of σ-algebras such that F_0, F_1, ... ⊆ F.

(1) If F_N = {F_0, F_1, ..., F_N} is a finite sequence of σ-algebras, then the infinite sequence of σ-algebras F = {F_0, F_1, ..., F_N, F_N, ...} = {F_{min(n,N)}, n = 0, 1, ...} is called the natural continuation of the initial finite sequence of σ-algebras.

(2) If S_N = {S_0, S_1, ..., S_N} is a finite sequence of random variables, then the infinite sequence of random variables S = {S_0, S_1, ..., S_N, S_N, ...} = {S_{min(n,N)}, n = 0, 1, ...} is called the natural continuation of the initial finite sequence of random variables.

The continuation construction described above lets us restrict consideration to infinite sequences of random variables and σ-algebras.

Definition. A sequence of σ-algebras F = {F_0, F_1, ...} is a filtration if it is a nondecreasing sequence, i.e., F_0 ⊆ F_1 ⊆ ⋯

Examples.

(1) F_0 = F_1 = F_2 = ⋯. Two extreme cases are F_0 = {∅, Ω} and F_0 = F.

(2) Let S = {S_0, S_1, ...} be a sequence of random variables and F_n = σ(S_k, k = 0, ..., n), n = 0, 1, .... Then F = {F_0, F_1, ...} is a filtration, called the natural filtration generated by the sequence S. Note that in this case the random variable S_n is F_n-measurable for every n = 0, 1, ....

(3) Let S = {S_0, S_1, ...} and X = {X_0, X_1, ...} be, respectively, a sequence of random variables and a sequence of random vectors, and let F_n = σ(S_k, X_k, k = 0, ..., n), n = 0, 1, .... Then F = {F_0, F_1, ...} is a filtration. Note that in this case, too, the random variable S_n is F_n-measurable for every n = 0, 1, ....

(4) If F_0 ⊆ F_1 ⊆ ⋯ ⊆ F_N is a finite nondecreasing sequence of σ-algebras, then its natural continuation F = {F_{min(n,N)}, n = 0, 1, ...} is also a nondecreasing sequence of σ-algebras, i.e., it is a filtration.

(5) If F_n = σ(S_k, k = 0, ..., n), n = 0, 1, ..., N, is the finite sequence of σ-algebras generated by a finite sequence of random variables {S_0, ..., S_N}, then F = {F_{min(n,N)}, n = 0, 1, ...} is the natural filtration for the infinite sequence of random variables S = {S_{min(n,N)}, n = 0, 1, ...}.
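The natural-continuation construction used in examples (4) and (5) is just index clamping: after time N the sequence is frozen at its last value. A minimal illustrative sketch (the function and sequence names are my own, not from the notes):

```python
def natural_continuation(finite_seq):
    """Extend a finite sequence a_0, ..., a_N to the infinite sequence
    a_{min(n, N)}, n = 0, 1, ... (its 'natural continuation')."""
    N = len(finite_seq) - 1

    def term(n):
        # S_{min(n, N)}: for n >= N the sequence stays at its last value
        return finite_seq[min(n, N)]

    return term

S = natural_continuation([5, 2, 7])
# S(0), S(1), S(2) reproduce the finite sequence; S(n) = 7 for all n >= 2
```

The same clamping applied to the index of a σ-algebra sequence gives the natural continuation of a finite filtration.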

1.2 Definition of a martingale and its basic properties

Let <Ω, F, P> be a probability space, S = {S_0, S_1, ...} a sequence of random variables defined on this probability space, and F = {F_0, F_1, ...} a filtration on this probability space.

Definition. A sequence of random variables S = {S_0, S_1, ...} is adapted to a filtration F = {F_0, F_1, ...} (F-adapted) if the random variable S_n is F_n-measurable (i.e., the event {S_n ∈ B} ∈ F_n for every Borel set B ∈ B_1) for every n = 0, 1, ....

Definition. An F-adapted sequence of random variables S = {S_0, S_1, ...} is an F-martingale (a martingale with respect to the filtration F) if it satisfies the following conditions:
(a) E|S_n| < ∞, n = 0, 1, ...;
(b) E(S_{n+1} | F_n) = S_n, n = 0, 1, ....

(1) Condition (b) can be written in the equivalent form E(S_{n+1} − S_n | F_n) = 0, n = 0, 1, ....

(2) An F-adapted sequence of random variables S = {S_0, S_1, ...} is an F-submartingale or an F-supermartingale if (a) E|S_n| < ∞, n = 0, 1, ..., and, respectively, (b') E(S_{n+1} | F_n) ≥ S_n, n = 0, 1, ..., or (b'') E(S_{n+1} | F_n) ≤ S_n, n = 0, 1, ....

(3) If S = {S_0, S_1, ...} is a martingale with respect to a filtration F = {F_n = σ(X_0, ..., X_n), n = 0, 1, ...} generated by a sequence of random variables X = {X_0, X_1, ...}, then the notation E(S_{n+1} | X_0, ..., X_n) = E(S_{n+1} | F_n) is used, and one can speak of S as a martingale with respect to the sequence of random variables X.

(4) If S = {S_0, S_1, ...} is a martingale with respect to the natural filtration F = {F_n = σ(S_0, ..., S_n), n = 0, 1, ...}, then the notation E_n S_{n+m} = E(S_{n+m} | F_n) may be used, and one can refer to the sequence S as a martingale without specifying the corresponding filtration.

Martingales possess the following basic properties:

1. If S = {S_0, S_1, ...} is an F-martingale, then E(S_{n+m} | F_n) = S_n, 0 ≤ n < n + m < ∞.
Indeed, (a) E(S_{n+m} | F_n) = E(E(S_{n+m} | F_{n+m−1}) | F_n) = E(S_{n+m−1} | F_n); (b) iterating this relation yields the formula above.

2. If S = {S_0, S_1, ...} is an F-martingale, then ES_n = ES_0, n = 0, 1, ....
Indeed, ES_n = E(E(S_n | F_0)) = ES_0, n = 0, 1, ....

3. If S' = {S'_0, S'_1, ...} and S'' = {S''_0, S''_1, ...} are two F-martingales and a, b ∈ R_1, then the sequence S = {S_n = aS'_n + bS''_n, n = 0, 1, ...} is also an F-martingale.
Indeed, E(S_{n+1} | F_n) = E(aS'_{n+1} + bS''_{n+1} | F_n) = aE(S'_{n+1} | F_n) + bE(S''_{n+1} | F_n) = aS'_n + bS''_n = S_n, n = 0, 1, ....

4. If S = {S_0, S_1, ...} is an F-martingale and ES²_n < ∞, n = 0, 1, ..., then the sequence {ES²_n} is non-decreasing, i.e., ES²_n ≤ ES²_{n+1}, n = 0, 1, ....
Indeed, (a) ES_n(S_{n+1} − S_n) = E(E(S_n(S_{n+1} − S_n) | F_n)) = E(S_n E(S_{n+1} − S_n | F_n)) = E(S_n · 0) = 0; (b) 0 ≤ E(S_{n+1} − S_n)² = ES²_{n+1} − 2ES_{n+1}S_n + ES²_n = ES²_{n+1} − ES²_n − 2ES_n(S_{n+1} − S_n) = ES²_{n+1} − ES²_n.

1.3 Sums of independent random variables and related models

(1) Let X_1, X_2, ... be a sequence of independent random variables with E|X_n| < ∞, n = 1, 2, ..., and let S = {S_n, n = 0, 1, ...}, where S_n = S_0 + X_1 + ⋯ + X_n, n = 0, 1, ..., S_0 = const. Let also F = {F_n, n = 0, 1, ...}, where F_n = σ(X_1, ..., X_n) = σ(S_0, S_1, ..., S_n), n = 0, 1, ..., F_0 = {∅, Ω}. In this case, the sequence S is an F-martingale if and only if EX_n = 0, n = 1, 2, ....
Indeed, (a) the sequence S is F-adapted; (b) E(S_{n+1} − S_n | F_n) = E(X_{n+1} | F_n) = EX_{n+1}, n = 0, 1, ....

(2) Let X_n, n = 1, 2, ..., be a sequence of independent random variables taking values +1 and −1 with probabilities p_n and q_n = 1 − p_n, respectively. In this case, EX_n = p_n − q_n and, therefore, the following compensated sequence is an F-martingale:
S_n = S_0 + Σ_{k=1}^n X_k − A_n, n = 0, 1, ..., where A_n = Σ_{k=1}^n (p_k − q_k), n = 0, 1, ....

1.4 Products of independent random variables and related models

(1) Let X_1, X_2, ... be a sequence of independent random variables with E|X_n| < ∞, n = 1, 2, ..., and let S = {S_n, n = 0, 1, ...}, where S_n = S_0 ∏_{k=1}^n X_k, n = 0, 1, ..., S_0 = const. Let also F = {F_n, n = 0, 1, ...}, where F_n = σ(X_1, ..., X_n) = σ(S_0, S_1, ..., S_n), n = 0, 1, ..., F_0 = {∅, Ω}. In this case, the sequence S is an F-martingale if EX_n = 1, n = 1, 2, ....
(a) In this case, the sequence S is F-adapted;

(b) S_{n+1} − S_n = S_n(X_{n+1} − 1), n = 0, 1, ...;
(c) E(S_{n+1} − S_n | F_n) = E(S_n(X_{n+1} − 1) | F_n) = S_n E(X_{n+1} − 1 | F_n) = S_n(EX_{n+1} − 1), n = 0, 1, ....
(d) If the random variables X_n are a.s. positive, for example if X_n = e^{Y_n}, n = 1, 2, ..., and S_0 > 0, then a.s. S_n > 0, n = 0, 1, .... In this case the condition EX_n = 1, n = 1, 2, ..., is also necessary for the sequence S to be an F-martingale.

(2) Let S_n = S_0 exp{Σ_{k=1}^n Y_k}, n = 0, 1, ..., where S_0 = const and Y_k ~ N(µ_k, σ²_k), k = 1, 2, ..., are independent normal random variables. In this case, E exp{Y_n} = e^{µ_n + σ²_n/2}, n = 1, 2, ..., and, therefore, the following condition is necessary and sufficient for the sequence S = {S_n, n = 0, 1, ...} to be an F-martingale:
µ_n + σ²_n/2 = 0, n = 1, 2, ....

(3) Let S_n = S_0 (q/p)^{Σ_{k=1}^n Y_k}, n = 0, 1, ..., where S_0 = const and Y_1, Y_2, ... is a sequence of Bernoulli random variables taking values +1 and −1 with probabilities, respectively, p and q = 1 − p, where 0 < p < 1. In this case, E(q/p)^{Y_n} = (q/p)p + (q/p)^{−1}q = q + p = 1, n = 1, 2, ..., and, therefore, the sequence S = {S_n, n = 0, 1, ...} is an F-martingale.

1.5 An exponential martingale

Let X_1, X_2, ... be i.i.d. random variables such that ψ(t) = Ee^{tX_1} < ∞ for some t > 0. Let also Y_n = X_1 + ⋯ + X_n, n = 1, 2, ..., and
S_n = e^{tY_n}/ψ(t)^n = ∏_{k=1}^n e^{tX_k}/ψ(t), n = 1, 2, ..., S_0 = 1.
Let also F = {F_n, n = 0, 1, ...}, where F_n = σ(X_1, ..., X_n) = σ(S_0, S_1, ..., S_n), n = 0, 1, ..., F_0 = {∅, Ω}. In this case, S = {S_0, S_1, ...} is an F-martingale (known as an exponential martingale).
(a) S = {S_0, S_1, ...} is an F-adapted sequence;
(b) S_{n+1} = S_n e^{tX_{n+1}}/ψ(t), n = 0, 1, ...;
(c) E(S_{n+1} | F_n) = E(S_n e^{tX_{n+1}}/ψ(t) | F_n) = S_n E(e^{tX_{n+1}}/ψ(t)) = S_n · 1 = S_n.

1.6 Likelihood ratios

Let X_1, X_2, ... be i.i.d. random variables and let f_0(x) and f_1(x) be two different probability density functions. For simplicity, assume that f_0(x) > 0, x ∈ R_1. Define the so-called likelihood ratios,
S_n = f_1(X_1)f_1(X_2) ⋯ f_1(X_n) / (f_0(X_1)f_0(X_2) ⋯ f_0(X_n)), n = 1, 2, ..., S_0 = 1.
Let also the filtration F = {F_0, F_1, ...}, where F_n = σ(X_1, ..., X_n), n = 0, 1, ..., F_0 = {∅, Ω}.

(1) S = {S_0, S_1, ...} is an F-adapted sequence of random variables, since S_n is a non-random Borel function of the random variables X_1, ..., X_n.

(2) Due to the independence of the random variables X_n, we get
E(S_{n+1} | F_n) = E(S_n f_1(X_{n+1})/f_0(X_{n+1}) | F_n) = S_n E[f_1(X_{n+1})/f_0(X_{n+1})], n = 0, 1, ....

(3) Thus, the sequence S = {S_0, S_1, ...} is an F-martingale under the hypothesis that the common probability density function of the random variables X_n is f_0(x). Indeed, in this case,
E[f_1(X_{n+1})/f_0(X_{n+1})] = ∫ (f_1(x)/f_0(x)) f_0(x) dx = ∫ f_1(x) dx = 1, n = 0, 1, ....

2. Square Integrable Martingales

2.1 Doob's martingale

Let S be a random variable such that E|S| < ∞ and let F = {F_0, F_1, ...} be a filtration. Then the following sequence is an F-martingale (Doob's martingale):
S_n = E(S | F_n), n = 0, 1, ....
Indeed, E(S_{n+1} | F_n) = E(E(S | F_{n+1}) | F_n) = E(S | F_n) = S_n, n = 0, 1, ....

2.2 Doob decomposition

Definition. A sequence of random variables A = {A_n, n = 0, 1, ...} is predictable with respect to a filtration F = {F_0, F_1, ...} (an F-predictable sequence) if A_0 = 0 and the random variable A_n is F_{n−1}-measurable for every n = 1, 2, ....

Theorem 13.1 (Doob decomposition). Let S = {S_0, S_1, ...} be a sequence of random variables adapted to a filtration F = {F_0, F_1, ...} and such that E|S_n| < ∞, n = 0, 1, .... Then S can be uniquely decomposed into the sum of two sequences, M = {M_0, M_1, ...}, which is an F-martingale, and A = {A_0, A_1, ...}, which is an F-predictable sequence:
S_n = M_n + A_n, n = 0, 1, ....

Proof outline:
(a) Define M_0 = S_0, A_0 = 0 and
M_n = M_0 + Σ_{k=1}^n (S_k − E(S_k | F_{k−1})), A_n = S_n − M_n, n = 1, 2, ....
(b) M_n is F_n-measurable, since S_k, k = 0, 1, ..., n, and E(S_k | F_{k−1}), k = 1, ..., n, are F_n-measurable.
(c) E(M_{n+1} | F_n) = M_0 + Σ_{k=1}^{n+1} E((S_k − E(S_k | F_{k−1})) | F_n) = M_0 + Σ_{k=1}^n E((S_k − E(S_k | F_{k−1})) | F_n) + E((S_{n+1} − E(S_{n+1} | F_n)) | F_n) = M_n + 0 = M_n.
(d) A_n = S_n − S_0 − Σ_{k=1}^n (S_k − E(S_k | F_{k−1})) = Σ_{k=1}^n E(S_k | F_{k−1}) − Σ_{k=0}^{n−1} S_k is an F_{n−1}-measurable random variable.
(e) Note also that A_{n+1} − A_n = E(S_{n+1} | F_n) − S_n, n = 0, 1, ....
(f) Suppose that S_n = M'_n + A'_n, n = 0, 1, ..., is another decomposition into the sum of an F-martingale and an F-predictable sequence. Then A'_{n+1} − A'_n = E(A'_{n+1} − A'_n | F_n) = E((S_{n+1} − S_n) − (M'_{n+1} − M'_n) | F_n) = E(S_{n+1} | F_n) − S_n − (E(M'_{n+1} | F_n) − M'_n) = E(S_{n+1} | F_n) − S_n = A_{n+1} − A_n, n = 0, 1, ....
(g) Since A_0 = A'_0 = 0, we get, using (f), A_n = A_0 + Σ_{k=0}^{n−1} (A_{k+1} − A_k) = A'_0 + Σ_{k=0}^{n−1} (A'_{k+1} − A'_k) = A'_n, n = 0, 1, ....
(h) M'_n = S_n − A'_n = S_n − A_n = M_n, n = 0, 1, ....

(1) If S = {S_0, S_1, ...} is a submartingale, then A = {A_0, A_1, ...} is an a.s. nondecreasing sequence, i.e., P(0 = A_0 ≤ A_1 ≤ A_2 ≤ ⋯) = 1. Indeed, according to (e), A_{n+1} − A_n = E(S_{n+1} | F_n) − S_n ≥ 0, n = 0, 1, ....

Lemma. Let the sequence S be an F-martingale such that ES²_n < ∞, n = 0, 1, .... Then the sequence S² = {S²_0, S²_1, ...} is a submartingale.
Indeed, E(S²_{n+1} | F_n) = E((S_n + (S_{n+1} − S_n))² | F_n) = E(S²_n + 2S_n(S_{n+1} − S_n) + (S_{n+1} − S_n)² | F_n) = E(S²_n | F_n) + 2S_n E(S_{n+1} − S_n | F_n) + E((S_{n+1} − S_n)² | F_n) = S²_n + E((S_{n+1} − S_n)² | F_n) ≥ S²_n, n = 0, 1, ....

2.3 Doob decomposition for square integrable martingales

(1) According to the Doob decomposition theorem, the sequence S² can be uniquely decomposed into the sum of an F-martingale M = {M_0, M_1, ...} and an F-predictable sequence A = {A_0, A_1, ...}, where
S²_n = M_n + A_n, n = 0, 1, ..., M_n = S²_0 + Σ_{k=1}^n (S²_k − E(S²_k | F_{k−1})), A_n = S²_n − M_n.

(2) A_n = Σ_{k=1}^n E((S_k − S_{k−1})² | F_{k−1}), n = 0, 1, .... Indeed,
A_n = S²_n − S²_0 − Σ_{k=1}^n (S²_k − E(S²_k | F_{k−1})) = Σ_{k=1}^n (E(S²_k | F_{k−1}) − S²_{k−1}) = Σ_{k=1}^n (E(S²_{k−1} + 2S_{k−1}(S_k − S_{k−1}) + (S_k − S_{k−1})² | F_{k−1}) − S²_{k−1}) = Σ_{k=1}^n E((S_k − S_{k−1})² | F_{k−1}).

(3) The F-predictable sequence A is a.s. nonnegative and nondecreasing in this case, i.e., P(0 = A_0 ≤ A_1 ≤ ⋯) = 1.

(4) ES²_n = EM_n + EA_n = ES²_0 + EA_n, n = 0, 1, ....

(5) The sequence {ES²_0, ES²_1, ...} is nondecreasing, i.e., ES²_n ≤ ES²_{n+1}, n = 0, 1, ....

(a) EM_n = EM_0 = ES²_0, n = 0, 1, ...;
(b) 0 = EA_0 ≤ EA_1 ≤ EA_2 ≤ ⋯;
(c) ES²_n = EM_n + EA_n = ES²_0 + EA_n ≤ ES²_0 + EA_{n+1} = ES²_{n+1}, n = 0, 1, ....

Definition. In this case, the notation ⟨S⟩ = {⟨S⟩_0, ⟨S⟩_1, ...} is used for the F-predictable sequence A, and it is called the quadratic characteristic of the martingale S, i.e.,
⟨S⟩_n = Σ_{k=1}^n E((S_k − S_{k−1})² | F_{k−1}), n = 0, 1, ....

(6) Since ⟨S⟩ is a nondecreasing sequence, there exists with probability 1 a finite or infinite limit,
⟨S⟩_∞ = lim_{n→∞} ⟨S⟩_n.

Example. If the martingale S is a sum of independent random variables X_1, X_2, ..., i.e., S_n = X_1 + ⋯ + X_n, n = 0, 1, ..., such that EX²_k < ∞, EX_k = 0, k = 1, 2, ..., then the quadratic characteristic is
⟨S⟩_n = EX²_1 + ⋯ + EX²_n = Var X_1 + ⋯ + Var X_n, n = 0, 1, ....
In this case, the quadratic characteristic is a non-random sequence.

2.4 Doob-Kolmogorov inequality

Theorem 13.2 (Doob-Kolmogorov inequality). If S = {S_0, S_1, ...} is a square integrable F-martingale, then
P(max_{0≤k≤n} |S_k| ≥ ε) ≤ (1/ε²) ES²_n, ε > 0, n = 0, 1, ....

(a) Let A_i = {|S_j| < ε, j = 0, ..., i−1, |S_i| ≥ ε}, i = 0, 1, ..., n;
(b) ∪_{i=0}^n A_i = {max_{0≤k≤n} |S_k| ≥ ε};
(c) Ā = Ω \ (∪_{i=0}^n A_i);
(d) ES²_n = Σ_{i=0}^n ES²_n I_{A_i} + ES²_n I_{Ā} ≥ Σ_{i=0}^n ES²_n I_{A_i};
(e) E(S_n − S_i)S_i I_{A_i} = E(E((S_n − S_i)S_i I_{A_i} | F_i)) = E(S_i I_{A_i} E(S_n − S_i | F_i)) = 0;
(f) ES²_n I_{A_i} = E((S_n − S_i) + S_i)² I_{A_i} = E(S_n − S_i)² I_{A_i} + 2E(S_n − S_i)S_i I_{A_i} + ES²_i I_{A_i} ≥ ES²_i I_{A_i} ≥ ε² EI_{A_i} = ε² P(A_i);
(g) hence ES²_n ≥ Σ_{i=0}^n ES²_n I_{A_i} ≥ ε² Σ_{i=0}^n P(A_i) = ε² P(max_{0≤k≤n} |S_k| ≥ ε).

3. Convergence of Martingales

3.1 A.s. convergence of martingales

Theorem 13.3. Let the sequence S be an F-martingale such that ES²_n ≤ M < ∞, n = 0, 1, ..., where M = const. Then there exists a random variable S_∞ such that S_n → S_∞ a.s. as n → ∞.

Proof outline:
(a) Since ES²_n is bounded and non-decreasing in n, the limit exists; without loss of generality, take M = lim_{n→∞} ES²_n;
(b) define S_n^{(m)} = S_{m+n} − S_m, n = 0, 1, ..., for m = 0, 1, ...;
(c) F_n^{(m)} = σ(S_k^{(m)} = S_{m+k} − S_m, k = 0, ..., n), n = 0, 1, ...;
(d) F^{(m)} = {F_0^{(m)}, F_1^{(m)}, ...};
(e) F_n^{(m)} ⊆ F_{m+n}, n = 0, 1, ...;
(f) E(S_{n+1}^{(m)} | F_n^{(m)}) = E(E(S_{n+1}^{(m)} | F_{m+n}) | F_n^{(m)}) = E(S_n^{(m)} | F_n^{(m)}) = S_n^{(m)}, n = 0, 1, ...;
(g) thus {S_n^{(m)}, n = 0, 1, ...} is an F^{(m)}-martingale;
(h) E(S_{m+n} − S_m)² = ES²_{m+n} − 2ES_{m+n}S_m + ES²_m = ES²_{m+n} − ES²_m − 2E(S_{m+n} − S_m)S_m = ES²_{m+n} − ES²_m;
(i) by the Doob-Kolmogorov inequality, P(max_{m≤i≤m+n} |S_i − S_m| ≥ ε) ≤ (1/ε²)(ES²_{m+n} − ES²_m);
(j) P(max_{m≤i<∞} |S_i − S_m| ≥ ε) ≤ (1/ε²)(M − ES²_m);
(k) P(∩_{m=0}^∞ ∪_{i=m}^∞ {|S_i − S_m| ≥ ε}) = lim_{m→∞} P(max_{m≤i<∞} |S_i − S_m| ≥ ε) ≤ lim_{m→∞} (1/ε²)(M − ES²_m) = 0;
(l) P(|S_i − S_m| ≥ ε for infinitely many i, m) = 0, for any ε > 0;
(m) P(ω : lim_{m→∞} max_{i≥m} |S_i(ω) − S_m(ω)| = 0) = 1;
(n) a non-random sequence a_n converges to some limit a as n → ∞ if and only if max_{i≥m} |a_i − a_m| → 0 as m → ∞ (the Cauchy criterion);
(o) hence P(ω : lim_{n→∞} S_n(ω) = S_∞(ω) exists) = 1.

Theorem 13.4**. Let the sequence S be an F-martingale such that E|S_n| ≤ M < ∞, n = 0, 1, ..., where M = const. Then there exists a random variable S_∞ such that S_n → S_∞ a.s. as n → ∞.

Example. As is known, the harmonic series Σ_{i=1}^∞ 1/i diverges, while the alternating series Σ_{i=1}^∞ (−1)^i/i converges. Let X_1, X_2, ... be i.i.d. random variables taking values +1 and −1 with equal probabilities. Let also
S_n = Σ_{i=1}^n X_i/i, n = 1, 2, ....
This sequence is a martingale (with respect to the natural filtration generated by this sequence). Now, ES²_n = Var S_n = Σ_{i=1}^n 1/i² ≤ Σ_{i=1}^∞ 1/i² < ∞. Thus, by Theorem 13.3, there exists a random variable S_∞ such that S_n → S_∞ a.s. as n → ∞, i.e., the random harmonic series Σ_{i=1}^∞ X_i/i converges with probability 1.

3.2 Law of large numbers for martingales

Theorem 13.5. Let the sequence S be an F-martingale such that ES²_n < ∞, n = 0, 1, ..., and ⟨S⟩_n → ∞ a.s. Then, for any nondecreasing function f(x) such that f(x) ≥ 1 and ∫_0^∞ f(x)^{−2} dx < ∞,
S_n / f(⟨S⟩_n) → 0 a.s. as n → ∞.

Proof outline:
(a) Y_n = Σ_{i=1}^n (S_i − S_{i−1}) / f(⟨S⟩_i), n = 1, 2, ..., Y_0 = 0;
(b) since ⟨S⟩_{n+1} is F_n-measurable, E(Y_{n+1} − Y_n | F_n) = E((S_{n+1} − S_n)/f(⟨S⟩_{n+1}) | F_n) = (1/f(⟨S⟩_{n+1})) E(S_{n+1} − S_n | F_n) = 0, n = 0, 1, ...;
(c) EY_n = Σ_{i=1}^n E((S_i − S_{i−1})/f(⟨S⟩_i)) = Σ_{i=1}^n E(E((S_i − S_{i−1})/f(⟨S⟩_i) | F_{i−1})) = 0, n ≥ 1, so Y is an F-martingale;
(d) ⟨Y⟩_{n+1} − ⟨Y⟩_n = E((Y_{n+1} − Y_n)² | F_n) = E((S_{n+1} − S_n)²/f(⟨S⟩_{n+1})² | F_n) = (1/f(⟨S⟩_{n+1})²) E((S_{n+1} − S_n)² | F_n) = (⟨S⟩_{n+1} − ⟨S⟩_n)/f(⟨S⟩_{n+1})²;
(e) since f is nondecreasing, ⟨Y⟩_n = Σ_{k=0}^{n−1} (⟨S⟩_{k+1} − ⟨S⟩_k)/f(⟨S⟩_{k+1})² ≤ Σ_{k=0}^{n−1} ∫_{⟨S⟩_k}^{⟨S⟩_{k+1}} f(x)^{−2} dx ≤ ∫_0^∞ f(x)^{−2} dx = M < ∞ a.s., for n = 1, 2, ...;
(f) EY²_n = EY²_0 + E⟨Y⟩_n ≤ M, n = 0, 1, ...;
(g) by Theorem 13.3, Y_n = Σ_{i=1}^n (S_i − S_{i−1})/f(⟨S⟩_i) → Y_∞ a.s. as n → ∞;
(h) Lemma (Kronecker). If 0 < a_n ↑ ∞ as n → ∞ and Σ_{k=1}^n x_k/a_k → c as n → ∞, where |c| < ∞, then (1/a_n) Σ_{k=1}^n x_k → 0 as n → ∞;
(i) applying the lemma with a_n = f(⟨S⟩_n) (note that f(⟨S⟩_n) ↑ ∞ a.s., since ⟨S⟩_n → ∞ and the condition ∫_0^∞ f(x)^{−2} dx < ∞ forces f(x) → ∞): (1/f(⟨S⟩_n)) Σ_{i=1}^n (S_i − S_{i−1}) = (S_n − S_0)/f(⟨S⟩_n) → 0 a.s. as n → ∞;
(j) hence S_n/f(⟨S⟩_n) → 0 a.s. as n → ∞.

Example. Let X_1, X_2, ... be independent random variables such that σ²_n = Var X_n < ∞, EX_n = µ_n, n = 1, 2, .... Denote b_n = Σ_{k=1}^n σ²_k, a_n = Σ_{k=1}^n µ_k, and assume that b_n → ∞ as n → ∞. Let also
S_n = Σ_{k=1}^n (X_k − µ_k), n = 1, 2, ..., S_0 = 0.
This sequence is a martingale (with respect to the natural filtration generated by this sequence) and ES_n = 0, ⟨S⟩_n = b_n, n = 0, 1, .... Choose f(x) = max(x, 1). This function is non-decreasing and ∫_0^∞ f(x)^{−2} dx = 1 + ∫_1^∞ x^{−2} dx = 2 < ∞.

(1) By Theorem 13.5, S_n/b_n = (f(b_n)/b_n) · (S_n/f(b_n)) → 1 · 0 = 0 a.s. as n → ∞ (here f(b_n)/b_n → 1 since b_n → ∞).
(2) If a_n/b_n → c, then also (1/b_n) Σ_{k=1}^n X_k → c a.s. as n → ∞.
(3) If σ²_n = σ², EX_n = µ, n = 1, 2, ..., then b_n = nσ², a_n = µn, n = 1, 2, ..., c = µ/σ², and (1/(nσ²)) Σ_{k=1}^n X_k → c = µ/σ² a.s. as n → ∞, which is equivalent to the classical strong law of large numbers (1/n) Σ_{k=1}^n X_k → µ a.s.

3.3 Central limit theorem for martingales

Theorem 13.6**. Let the sequence S be an F-martingale such that ES²_n < ∞, n = 0, 1, ..., and let the following conditions hold:
(1) ⟨S⟩_n/n → σ² > 0 in probability as n → ∞;
(2) (1/n) Σ_{i=1}^n E((S_i − S_{i−1})² I(|S_i − S_{i−1}| ≥ ε√n)) → 0 as n → ∞, for every ε > 0.
Then, S_n/√(nσ²) → S in distribution as n → ∞, where S is a standard normal random variable with mean 0 and variance 1.

4. Stopping times

4.1 Definition and basic properties of stopping times

Let <Ω, F, P> be a probability space; F = {F_0, F_1, ...} a filtration, with F_0, F_1, ... ⊆ F; T = T(ω) a random variable defined on the probability space <Ω, F, P> and taking values in the set {0, 1, ..., +∞}; and S = {S_0, S_1, ...} an F-adapted sequence of random variables defined on the probability space <Ω, F, P>.

Definition. The random variable T is called a stopping time for the filtration F if
{T = n} ∈ F_n, n = 0, 1, ....

(1) An equivalent condition defining a stopping time T is the requirement that {T ≤ n} ∈ F_n, n = 0, 1, ..., or that {T > n} ∈ F_n, n = 0, 1, ....
(a) If {T = k} ∈ F_k ⊆ F_n, k = 0, 1, ..., n, then the event {T ≤ n} = ∪_{k=0}^n {T = k} ∈ F_n, and consequently {T > n} = Ω \ {T ≤ n} ∈ F_n;
(b) conversely, if {T > n−1} ∈ F_{n−1} ⊆ F_n and {T > n} ∈ F_n, then {T = n} = {T > n−1} \ {T > n} ∈ F_n.

(2) Let H_0, H_1, ... be a sequence of Borel subsets of the real line. The hitting time T = min(n ≥ 0 : S_n ∈ H_n) is an example of a stopping time. Indeed,
{T = n} = {S_0 ∉ H_0, ..., S_{n−1} ∉ H_{n−1}, S_n ∈ H_n} ∈ F_n, n = 0, 1, ....

(3) If T' and T'' are stopping times for a filtration F, then T' + T'', max(T', T''), and min(T', T'') are also stopping times:
(a) {T' + T'' = n} = ∪_{k=0}^n {T' = k, T'' = n − k} ∈ F_n;
(b) {max(T', T'') ≤ n} = {T' ≤ n, T'' ≤ n} ∈ F_n;
(c) {min(T', T'') > n} = {T' > n, T'' > n} ∈ F_n.

4.2 Stopped martingales

Theorem 13.7. If S = {S_n, n = 0, 1, ...} is an F-martingale and T is a stopping time for the filtration F, then the stopped sequence S' = {S'_n = S_{T∧n}, n = 0, 1, ...} is also an F-martingale.

Proof outline:
(a) S_{T∧n} = Σ_{k=0}^{n−1} S_k I(T = k) + S_n I(T > n−1);
(b) E(S'_{n+1} | F_n) = E(S_{T∧(n+1)} | F_n) = Σ_{k=0}^n E(S_k I(T = k) | F_n) + E(S_{n+1} I(T > n) | F_n) = Σ_{k=0}^n I(T = k) S_k + I(T > n) E(S_{n+1} | F_n) = Σ_{k=0}^{n−1} I(T = k) S_k + I(T = n) S_n + I(T > n) S_n = Σ_{k=0}^{n−1} I(T = k) S_k + S_n I(T > n−1) = S_{T∧n} = S'_n.

(1) In particular, ES'_n = ES_{T∧n} = ES'_0 = ES_0, n = 0, 1, ....

Theorem 13.8. If S = {S_n, n = 0, 1, ...} is an F-martingale and T is a stopping time for the filtration F such that P(T ≤ N) = 1 for some integer constant N ≥ 0, then
ES_T = ES_0.
Indeed, (a) S_T = S_{T∧N} = S'_N a.s.; (b) ES_T = ES'_N = ES'_0 = ES_0.
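The identity ES_{T∧n} = ES_0 from Theorem 13.7 can be checked by simulation. A minimal Monte Carlo sketch (my own illustrative setup: a symmetric ±1 walk stopped on first hitting −3 or 3, truncated at a fixed horizon so the stopping time is bounded):

```python
import random

def mean_stopped_walk(n_paths=20000, horizon=50, a=3, b=3, seed=1):
    """Estimate E[S_{min(T, horizon)}] for a symmetric random walk S and the
    hitting time T = min(n >= 1 : S_n = -a or S_n = b).  Since min(T, horizon)
    is a bounded stopping time, the estimate should stay near E[S_0] = 0."""
    rng = random.Random(seed)
    total = 0
    for _ in range(n_paths):
        s = 0
        for _ in range(horizon):
            s += rng.choice((-1, 1))
            if s == -a or s == b:
                break  # walk stopped: S_{min(T, n)} is frozen from here on
        total += s
    return total / n_paths

# The estimate fluctuates around 0 at Monte Carlo accuracy.
```

Replacing the fair steps by biased ones (e.g. P(+1) = 0.6) makes the estimate drift away from 0, since the stopped sequence is then no longer a martingale.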

4.3 Optional stopping theorems

Theorem 13.9. If S = {S_n, n = 0, 1, ...} is an F-martingale and T is a stopping time for the filtration F such that
(1) P(T < ∞) = 1;
(2) E|S_T| < ∞;
(3) E(S_n | T > n) P(T > n) = ES_n I(T > n) → 0 as n → ∞;
then ES_T = ES_0.

Proof outline:
(a) ES_T = ES_T I(T ≤ n) + ES_T I(T > n);
(b) E|S_T| = Σ_{k=0}^∞ E(|S_T| | T = k) P(T = k) < ∞;
(c) |ES_T I(T > n)| ≤ Σ_{k=n+1}^∞ E(|S_T| | T = k) P(T = k) → 0 as n → ∞;
(d) ES_T I(T ≤ n) = ES_T I(T ≤ n) + ES_n I(T > n) − ES_n I(T > n) = ES_{T∧n} − ES_n I(T > n);
(e) ES_n I(T > n) → 0 as n → ∞;
(f) by Theorem 13.7, ES_T = lim_{n→∞} ES_{T∧n} = ES_0.

Example. Consider the so-called symmetric random walk, S_n = Σ_{i=1}^n X_i, n = 0, 1, ..., where X_1, X_2, ... are i.i.d. Bernoulli random variables taking values +1 and −1 with equal probabilities. In this case, the sequence S = {S_n, n = 0, 1, ...} is a martingale with respect to the natural filtration F generated by the random variables X_1, X_2, ....

Let a, b be strictly positive integers and T = min(n ≥ 1 : S_n = −a or S_n = b). It is a hitting time and, therefore, a stopping time for the filtration F.

(1) P(S_T = −a) = b/(a+b).
(2) ET = ab.

Proof outline:
(a) S_T can take only the two values −a and b, with probabilities P_a = P(S_T = −a) and 1 − P_a, respectively;
(b) thus E|S_T| < ∞, i.e., condition (2) of Theorem 13.9 holds;
(c) let A = {X_k = 1, k = 1, ..., a+b} and B_c = {−a < c + S_k < b, k = 1, ..., a+b}, −a < c < b;
(d) P(A) = (1/2)^{a+b} = p > 0;
(e) on A the walk exits (−a, b) from any starting point c, so B_c ⊆ Ω \ A, −a < c < b;
(f) max_{−a<c<b} P(B_c) ≤ 1 − P(A) = 1 − p < 1;
(g) P(T > a+b) = P(B_0) ≤ 1 − p;
(h) by the independence of increments, P(T > 2(a+b)) = Σ_{−a<c<b} P(T > a+b, S_{a+b} = c) P(B_c) ≤ (1 − p) Σ_{−a<c<b} P(T > a+b, S_{a+b} = c) = (1 − p) P(T > a+b) ≤ (1 − p)²;
(i) similarly, P(T > k(a+b)) ≤ (1 − p)^k, k = 1, 2, ...;
(j) hence P(T > n) → 0 as n → ∞, i.e., P(T < ∞) = 1;
(k) thus condition (1) of Theorem 13.9 holds;
(l) |ES_n I(T > n)| ≤ (a + b) EI(T > n) → 0 as n → ∞;
(m) thus condition (3) of Theorem 13.9 also holds;
(n) ES_T = −a P_a + b(1 − P_a) = ES_0 = 0;
(o) P_a = b/(a+b).
(p) Consider the random sequence V_n = S²_n − n, n = 0, 1, .... The non-random sequence n is the quadratic characteristic of the submartingale S²_n, and V_n is a martingale with respect to the natural filtration F generated by the random variables X_1, X_2, ...;
(q) ET < ∞, which follows from (i);
(r) |V_T| ≤ |S_T|² + T ≤ max(a, b)² + T;
(s) E|V_T| ≤ max(a, b)² + ET < ∞;
(t) |EV_n I(T > n)| ≤ (max(a, b)² + n) P(T > n) → 0 as n → ∞, by the geometric bound in (i);
(u) by Theorem 13.9, EV_T = a² P_a + b²(1 − P_a) − ET = EV_0 = 0;
(v) ET = a² P_a + b²(1 − P_a) = a²b/(a+b) + ab²/(a+b) = ab.

Theorem 13.10. If S = {S_n, n = 0, 1, ...} is an F-martingale and T is a stopping time for the filtration F such that
(1) ET < ∞;
(2) E(|S_{n+1} − S_n| | F_n) ≤ K < ∞, n = 0, 1, ...;
then ES_T = ES_0.

Proof outline:
(a) Z_0 = S_0, Z_n = S_n − S_{n−1}, n = 1, 2, ...;

(b) W = |Z_0| + ⋯ + |Z_T|;
(c) |S_T| ≤ W and |S_{T∧n}| ≤ W;
(d) EW = Σ_{n=0}^∞ Σ_{k=0}^n E|Z_k| I(T = n) = Σ_{k=0}^∞ Σ_{n=k}^∞ E|Z_k| I(T = n) = Σ_{k=0}^∞ E|Z_k| I(T ≥ k) = E|S_0| + Σ_{k=1}^∞ E|Z_k| I(T ≥ k);
(e) I(T ≥ k) is F_{k−1}-measurable for k = 1, 2, ...;
(f) E|Z_k| I(T ≥ k) = E(E(|Z_k| I(T ≥ k) | F_{k−1})) = E(I(T ≥ k) E(|Z_k| | F_{k−1})) ≤ K P(T ≥ k);
(g) EW ≤ E|S_0| + Σ_{k=1}^∞ K P(T ≥ k) = E|S_0| + K · ET < ∞;
(h) |ES_{T∧n} − ES_T| ≤ E|S_{T∧n} − S_T| I(T > n) ≤ 2EW I(T > n);
(i) Σ_{k=0}^∞ EW I(T = k) = EW < ∞;
(j) EW I(T > n) = Σ_{k>n} EW I(T = k) → 0 as n → ∞;
(k) ES_{T∧n} = ES_0, by Theorem 13.7;
(l) it follows from (h)-(k) that ES_T = ES_0.

4.4 Wald equation

Theorem 13.11 (Wald equation). Let X_1, X_2, ... be i.i.d. random variables such that E|X_1| < ∞, EX_1 = µ. Let also Y_n = Σ_{k=1}^n X_k, n = 1, 2, .... Then the centered sequence S_n = Y_n − µn, n = 1, 2, ..., S_0 = 0, is a martingale with respect to the filtration F generated by the random variables X_1, X_2, .... Let also T be a stopping time with respect to the filtration F such that ET < ∞. Then
EY_T = µET.

Proof outline:
(a) E(|S_{n+1} − S_n| | F_n) = E|X_{n+1} − µ| = E|X_1 − µ| ≤ 2E|X_1| < ∞, n = 0, 1, ..., so condition (2) of Theorem 13.10 holds;
(b) S_T = Y_T − µT;
(c) Theorem 13.10 implies that ES_T = ES_0 = 0;
(d) EY_T − µET = ES_T = 0.

4.5 A fair game example

A gambler flips a fair coin and wins his bet if it comes up heads and loses his bet if it comes up tails. The so-called martingale betting strategy, which can be viewed as a way to "win" in a fair game, is to keep doubling the bet until the gambler eventually wins.

(1) Let X_n, n = 1, 2, ..., be independent random variables taking values 2^{n−1} and −2^{n−1} with equal probabilities. These random variables represent the gambler's gains on the successive flips of the coin (the stake on flip n is 2^{n−1}). Since EX_n = 0, n = 1, 2, ..., the sequence S_n = X_1 + ⋯ + X_n, n = 1, 2, ..., S_0 = 0, is a martingale with respect to the natural filtration generated by the random variables X_1, X_2, ...; S_n is the gain of the gambler after n flips.

(2) Let T be the number of the flip on which the gambler wins for the first time. By definition, P(T = n) = (1/2)^n, n = 1, 2, .... Then, according to the definition of T and the description of the martingale betting strategy,
S_T = 2^{T−1} − (1 + 2 + ⋯ + 2^{T−2}) = 2^{T−1} − (2^{T−1} − 1) = 1.

(3) This seems to contradict the optional stopping theorems, since T is a stopping time with ET < ∞, and so the equality ES_T = ES_0 = 0 might be expected, while according to (2), ES_T = 1.

(4) The explanation of this phenomenon is that the conditions of the optional stopping theorems do not hold:
(a) Theorem 13.8 cannot be applied, since T is not a bounded random variable;
(b) Theorem 13.9 cannot be applied, since E(S_n I(T > n)) = −(1 + 2 + ⋯ + 2^{n−1})(1/2)^n = −(2^n − 1)/2^n → −1 ≠ 0 as n → ∞;
(c) Theorem 13.10 cannot be applied, since E(|S_{n+1} − S_n| | F_n) = E|X_{n+1}| = 2^n → ∞ as n → ∞.

LN Problems

1. Characterize the sequences of random variables S = {S_0, S_1, ...} that are martingales with respect to the filtration F = {F_0, F_1, ...} in the case where F_0 = F_1 = F_2 = ⋯, in particular, where F_0 = {∅, Ω}.

2. Let X_1, X_2, ... be i.i.d. random variables, EX_1 = 0, Var X_1 = σ² < ∞. Let also S_0 = 0 and S_n = (Σ_{k=1}^n X_k)² − nσ². Prove that the sequence S = {S_0, S_1, ...} is a martingale with respect to the filtration F = {F_0, F_1, ...}, where F_n = σ(X_1, ..., X_n), n = 0, 1, ..., F_0 = {∅, Ω}.

3. Suppose that a sequence S = {S_0, S_1, ...} is a martingale and also a predictable sequence with respect to a filtration F = {F_0, F_1, ...}. Show that in this case P(S_n = S_0) = 1, n = 0, 1, ....

4. Suppose that a sequence S = {S_0, S_1, ...} is a square integrable martingale with respect to a filtration F = {F_0, F_1, ...}. Show that the sequence S' = {S'_0 = S²_0, S'_1 = S²_1, ...} is a submartingale with respect to the filtration F.

5. Suppose that the sequences S' = {S'_0, S'_1, ...} and S'' = {S''_0, S''_1, ...} are submartingales with respect to a filtration F = {F_0, F_1, ...}. Show that the sequence S = {S_0 = max(S'_0, S''_0), S_1 = max(S'_1, S''_1), ...} is also a submartingale with respect to the filtration F.

6. Let F = {F_0, F_1, ...} be a filtration and let F_∞ be the minimal σ-algebra containing all the σ-algebras F_n, n = 0, 1, .... Let also S be a random variable such that P(S ≥ 0) = 1 and ES < ∞. Define S_n = E(S | F_n), n = 0, 1, .... Prove that there exists a random variable S_∞ such that S_n → S_∞ a.s. as n → ∞.

7. Let F = {F_0, F_1, ...} be a filtration and let F_∞ be the minimal σ-algebra containing all the σ-algebras F_n, n = 0, 1, .... Let also S be a random variable such that E|S| < ∞. Prove that E(S | F_n) → E(S | F_∞) a.s. as n → ∞.

8. Let X_1, X_2, ... be a sequence of independent random variables such that EX²_n = b_n < ∞, EX_n = a_n ≠ 0, n = 1, 2, .... Define S_n = ∏_{k=1}^n X_k/a_k, n = 1, 2, ..., S_0 = 1. Prove that the sequence S = {S_0, S_1, ...} is a martingale with respect to the filtration generated by the random variables X_1, X_2, ..., and prove that the condition ∏_{k=1}^∞ b_k/a²_k < ∞ implies that there exists a random variable S_∞ such that S_n → S_∞ a.s. as n → ∞.

9. Let X_1, X_2, ... be a sequence of independent random variables such that EX²_n = b_n < ∞, EX_n = 0, n = 1, 2, .... Define S_n = Σ_{k=1}^n X_k, n = 1, 2, ..., S_0 = 0. Prove that the sequence S = {S_0, S_1, ...} is a martingale with respect to the filtration generated by the random variables X_1, X_2, ..., and that the condition Σ_{k=1}^∞ b_k < ∞ implies that there exists a random variable S_∞ such that S_n → S_∞ a.s. as n → ∞.

10. Re-formulate Theorem 13.10 for the case where the random variables S_n = X_1 + ⋯ + X_n, n = 1, 2, ..., are sums of independent random variables and F is the natural filtration generated by the random variables X_1, X_2, ....

11. Let X_1, X_2, ... be non-negative i.i.d. random variables such that EX_1 = µ > 0. Let also Y_n = Σ_{k=1}^n X_k, n = 1, 2, ..., and T_u = min(n ≥ 1 : Y_n ≥ u), u > 0. Prove that T_u is a stopping time such that ET_u < ∞, u > 0.

12. Let X_1, X_2, ... be non-negative i.i.d. random variables such that EX_1 = µ > 0. Let also Y_n = Σ_{k=1}^n X_k, n = 1, 2, ..., and T_u = min(n ≥ 1 : Y_n ≥ u), u > 0. Prove that EY_{T_u} = µET_u, u > 0.
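Problem 12 is Wald's equation for first-passage times, and it can be checked numerically. A minimal Monte Carlo sketch under illustrative assumptions (Uniform(0, 1) summands, so µ = 1/2; the function name and parameters are my own):

```python
import random

def wald_sides(u=5.0, n_paths=20000, seed=2):
    """Monte Carlo comparison of E[Y_T] and mu * E[T] for the first-passage
    time T_u = min(n >= 1 : Y_n >= u), Y_n = X_1 + ... + X_n, X_k ~ U(0, 1)."""
    rng = random.Random(seed)
    mu = 0.5
    sum_y = 0.0
    sum_t = 0
    for _ in range(n_paths):
        y, t = 0.0, 0
        while y < u:  # T_u is a.s. finite since mu > 0
            y += rng.random()
            t += 1
        sum_y += y
        sum_t += t
    return sum_y / n_paths, mu * sum_t / n_paths

left, right = wald_sides()
# Both sides agree up to Monte Carlo error; both sit slightly above u = 5,
# because Y_T overshoots the level u at the first passage.
```

Note that E[Y_T] is strictly larger than u (the overshoot), so Wald's equation pins down ET as (u + overshoot)/µ rather than the naive u/µ.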


6.042/18.062J Mathematics for Computer Science December 12, 2006 Tom Leighton and Ronitt Rubinfeld. Random Walks 6.042/8.062J Mathematics for Comuter Science December 2, 2006 Tom Leighton and Ronitt Rubinfeld Lecture Notes Random Walks Gambler s Ruin Today we re going to talk about one-dimensional random walks. In

More information

SF2940: Probability theory Lecture 8: Multivariate Normal Distribution

SF2940: Probability theory Lecture 8: Multivariate Normal Distribution SF2940: Probability theory Lecture 8: Multivariate Normal Distribution Timo Koski 24.09.2015 Timo Koski Matematisk statistik 24.09.2015 1 / 1 Learning outcomes Random vectors, mean vector, covariance matrix,

More information

Practice problems for Homework 11 - Point Estimation

Practice problems for Homework 11 - Point Estimation Practice problems for Homework 11 - Point Estimation 1. (10 marks) Suppose we want to select a random sample of size 5 from the current CS 3341 students. Which of the following strategies is the best:

More information

n k=1 k=0 1/k! = e. Example 6.4. The series 1/k 2 converges in R. Indeed, if s n = n then k=1 1/k, then s 2n s n = 1 n + 1 +...

n k=1 k=0 1/k! = e. Example 6.4. The series 1/k 2 converges in R. Indeed, if s n = n then k=1 1/k, then s 2n s n = 1 n + 1 +... 6 Series We call a normed space (X, ) a Banach space provided that every Cauchy sequence (x n ) in X converges. For example, R with the norm = is an example of Banach space. Now let (x n ) be a sequence

More information

10.2 Series and Convergence

10.2 Series and Convergence 10.2 Series and Convergence Write sums using sigma notation Find the partial sums of series and determine convergence or divergence of infinite series Find the N th partial sums of geometric series and

More information

WHERE DOES THE 10% CONDITION COME FROM?

WHERE DOES THE 10% CONDITION COME FROM? 1 WHERE DOES THE 10% CONDITION COME FROM? The text has mentioned The 10% Condition (at least) twice so far: p. 407 Bernoulli trials must be independent. If that assumption is violated, it is still okay

More information

CHAPTER II THE LIMIT OF A SEQUENCE OF NUMBERS DEFINITION OF THE NUMBER e.

CHAPTER II THE LIMIT OF A SEQUENCE OF NUMBERS DEFINITION OF THE NUMBER e. CHAPTER II THE LIMIT OF A SEQUENCE OF NUMBERS DEFINITION OF THE NUMBER e. This chapter contains the beginnings of the most important, and probably the most subtle, notion in mathematical analysis, i.e.,

More information

Probability Theory. Florian Herzog. A random variable is neither random nor variable. Gian-Carlo Rota, M.I.T..

Probability Theory. Florian Herzog. A random variable is neither random nor variable. Gian-Carlo Rota, M.I.T.. Probability Theory A random variable is neither random nor variable. Gian-Carlo Rota, M.I.T.. Florian Herzog 2013 Probability space Probability space A probability space W is a unique triple W = {Ω, F,

More information

HOMEWORK 5 SOLUTIONS. n!f n (1) lim. ln x n! + xn x. 1 = G n 1 (x). (2) k + 1 n. (n 1)!

HOMEWORK 5 SOLUTIONS. n!f n (1) lim. ln x n! + xn x. 1 = G n 1 (x). (2) k + 1 n. (n 1)! Math 7 Fall 205 HOMEWORK 5 SOLUTIONS Problem. 2008 B2 Let F 0 x = ln x. For n 0 and x > 0, let F n+ x = 0 F ntdt. Evaluate n!f n lim n ln n. By directly computing F n x for small n s, we obtain the following

More information

MATH4427 Notebook 2 Spring 2016. 2 MATH4427 Notebook 2 3. 2.1 Definitions and Examples... 3. 2.2 Performance Measures for Estimators...

MATH4427 Notebook 2 Spring 2016. 2 MATH4427 Notebook 2 3. 2.1 Definitions and Examples... 3. 2.2 Performance Measures for Estimators... MATH4427 Notebook 2 Spring 2016 prepared by Professor Jenny Baglivo c Copyright 2009-2016 by Jenny A. Baglivo. All Rights Reserved. Contents 2 MATH4427 Notebook 2 3 2.1 Definitions and Examples...................................

More information

Taylor and Maclaurin Series

Taylor and Maclaurin Series Taylor and Maclaurin Series In the preceding section we were able to find power series representations for a certain restricted class of functions. Here we investigate more general problems: Which functions

More information

RANDOM INTERVAL HOMEOMORPHISMS. MICHA L MISIUREWICZ Indiana University Purdue University Indianapolis

RANDOM INTERVAL HOMEOMORPHISMS. MICHA L MISIUREWICZ Indiana University Purdue University Indianapolis RANDOM INTERVAL HOMEOMORPHISMS MICHA L MISIUREWICZ Indiana University Purdue University Indianapolis This is a joint work with Lluís Alsedà Motivation: A talk by Yulij Ilyashenko. Two interval maps, applied

More information

For a partition B 1,..., B n, where B i B j = for i. A = (A B 1 ) (A B 2 ),..., (A B n ) and thus. P (A) = P (A B i ) = P (A B i )P (B i )

For a partition B 1,..., B n, where B i B j = for i. A = (A B 1 ) (A B 2 ),..., (A B n ) and thus. P (A) = P (A B i ) = P (A B i )P (B i ) Probability Review 15.075 Cynthia Rudin A probability space, defined by Kolmogorov (1903-1987) consists of: A set of outcomes S, e.g., for the roll of a die, S = {1, 2, 3, 4, 5, 6}, 1 1 2 1 6 for the roll

More information

A Uniform Asymptotic Estimate for Discounted Aggregate Claims with Subexponential Tails

A Uniform Asymptotic Estimate for Discounted Aggregate Claims with Subexponential Tails 12th International Congress on Insurance: Mathematics and Economics July 16-18, 2008 A Uniform Asymptotic Estimate for Discounted Aggregate Claims with Subexponential Tails XUEMIAO HAO (Based on a joint

More information

A Martingale System Theorem for Stock Investments

A Martingale System Theorem for Stock Investments A Martingale System Theorem for Stock Investments Robert J. Vanderbei April 26, 1999 DIMACS New Market Models Workshop 1 Beginning Middle End Controversial Remarks Outline DIMACS New Market Models Workshop

More information

Generating Functions

Generating Functions Chapter 10 Generating Functions 10.1 Generating Functions for Discrete Distributions So far we have considered in detail only the two most important attributes of a random variable, namely, the mean and

More information

What is Statistics? Lecture 1. Introduction and probability review. Idea of parametric inference

What is Statistics? Lecture 1. Introduction and probability review. Idea of parametric inference 0. 1. Introduction and probability review 1.1. What is Statistics? What is Statistics? Lecture 1. Introduction and probability review There are many definitions: I will use A set of principle and procedures

More information

A SURVEY ON CONTINUOUS ELLIPTICAL VECTOR DISTRIBUTIONS

A SURVEY ON CONTINUOUS ELLIPTICAL VECTOR DISTRIBUTIONS A SURVEY ON CONTINUOUS ELLIPTICAL VECTOR DISTRIBUTIONS Eusebio GÓMEZ, Miguel A. GÓMEZ-VILLEGAS and J. Miguel MARÍN Abstract In this paper it is taken up a revision and characterization of the class of

More information

BANACH AND HILBERT SPACE REVIEW

BANACH AND HILBERT SPACE REVIEW BANACH AND HILBET SPACE EVIEW CHISTOPHE HEIL These notes will briefly review some basic concepts related to the theory of Banach and Hilbert spaces. We are not trying to give a complete development, but

More information

Probability Generating Functions

Probability Generating Functions page 39 Chapter 3 Probability Generating Functions 3 Preamble: Generating Functions Generating functions are widely used in mathematics, and play an important role in probability theory Consider a sequence

More information

Lecture 6: Discrete & Continuous Probability and Random Variables

Lecture 6: Discrete & Continuous Probability and Random Variables Lecture 6: Discrete & Continuous Probability and Random Variables D. Alex Hughes Math Camp September 17, 2015 D. Alex Hughes (Math Camp) Lecture 6: Discrete & Continuous Probability and Random September

More information

Introduction to Probability

Introduction to Probability Introduction to Probability EE 179, Lecture 15, Handout #24 Probability theory gives a mathematical characterization for experiments with random outcomes. coin toss life of lightbulb binary data sequence

More information

Statistics 100A Homework 8 Solutions

Statistics 100A Homework 8 Solutions Part : Chapter 7 Statistics A Homework 8 Solutions Ryan Rosario. A player throws a fair die and simultaneously flips a fair coin. If the coin lands heads, then she wins twice, and if tails, the one-half

More information

Math 151. Rumbos Spring 2014 1. Solutions to Assignment #22

Math 151. Rumbos Spring 2014 1. Solutions to Assignment #22 Math 151. Rumbos Spring 2014 1 Solutions to Assignment #22 1. An experiment consists of rolling a die 81 times and computing the average of the numbers on the top face of the die. Estimate the probability

More information

Department of Mathematics, Indian Institute of Technology, Kharagpur Assignment 2-3, Probability and Statistics, March 2015. Due:-March 25, 2015.

Department of Mathematics, Indian Institute of Technology, Kharagpur Assignment 2-3, Probability and Statistics, March 2015. Due:-March 25, 2015. Department of Mathematics, Indian Institute of Technology, Kharagpur Assignment -3, Probability and Statistics, March 05. Due:-March 5, 05.. Show that the function 0 for x < x+ F (x) = 4 for x < for x

More information

Undergraduate Notes in Mathematics. Arkansas Tech University Department of Mathematics

Undergraduate Notes in Mathematics. Arkansas Tech University Department of Mathematics Undergraduate Notes in Mathematics Arkansas Tech University Department of Mathematics An Introductory Single Variable Real Analysis: A Learning Approach through Problem Solving Marcel B. Finan c All Rights

More information

An Introduction to Basic Statistics and Probability

An Introduction to Basic Statistics and Probability An Introduction to Basic Statistics and Probability Shenek Heyward NCSU An Introduction to Basic Statistics and Probability p. 1/4 Outline Basic probability concepts Conditional probability Discrete Random

More information

Charles M. Grinstead and J. Laurie Snell: Published by AMS. Solutions to the exercises SECTION 1.2

Charles M. Grinstead and J. Laurie Snell: Published by AMS. Solutions to the exercises SECTION 1.2 Charles M Grinstead and J Laurie Snell: INTRODUCTION to PROBABILITY Published by AMS Solutions to the exercises SECTION As n increases, the proportion of heads gets closer to /, but the difference between

More information

Math/Stats 342: Solutions to Homework

Math/Stats 342: Solutions to Homework Math/Stats 342: Solutions to Homework Steven Miller ([email protected]) November 17, 2011 Abstract Below are solutions / sketches of solutions to the homework problems from Math/Stats 342: Probability

More information

SF2940: Probability theory Lecture 8: Multivariate Normal Distribution

SF2940: Probability theory Lecture 8: Multivariate Normal Distribution SF2940: Probability theory Lecture 8: Multivariate Normal Distribution Timo Koski 24.09.2014 Timo Koski () Mathematisk statistik 24.09.2014 1 / 75 Learning outcomes Random vectors, mean vector, covariance

More information

Section 5.1 Continuous Random Variables: Introduction

Section 5.1 Continuous Random Variables: Introduction Section 5. Continuous Random Variables: Introduction Not all random variables are discrete. For example:. Waiting times for anything (train, arrival of customer, production of mrna molecule from gene,

More information

Math 55: Discrete Mathematics

Math 55: Discrete Mathematics Math 55: Discrete Mathematics UC Berkeley, Fall 2011 Homework # 5, due Wednesday, February 22 5.1.4 Let P (n) be the statement that 1 3 + 2 3 + + n 3 = (n(n + 1)/2) 2 for the positive integer n. a) What

More information

Manual for SOA Exam MLC.

Manual for SOA Exam MLC. Chapter 5 Life annuities Extract from: Arcones Manual for the SOA Exam MLC Fall 2009 Edition available at http://wwwactexmadrivercom/ 1/70 Due n year deferred annuity Definition 1 A due n year deferred

More information

Math 425 (Fall 08) Solutions Midterm 2 November 6, 2008

Math 425 (Fall 08) Solutions Midterm 2 November 6, 2008 Math 425 (Fall 8) Solutions Midterm 2 November 6, 28 (5 pts) Compute E[X] and Var[X] for i) X a random variable that takes the values, 2, 3 with probabilities.2,.5,.3; ii) X a random variable with the

More information

IEOR 6711: Stochastic Models, I Fall 2012, Professor Whitt, Final Exam SOLUTIONS

IEOR 6711: Stochastic Models, I Fall 2012, Professor Whitt, Final Exam SOLUTIONS IEOR 6711: Stochastic Models, I Fall 2012, Professor Whitt, Final Exam SOLUTIONS There are four questions, each with several parts. 1. Customers Coming to an Automatic Teller Machine (ATM) (30 points)

More information

The Exponential Distribution

The Exponential Distribution 21 The Exponential Distribution From Discrete-Time to Continuous-Time: In Chapter 6 of the text we will be considering Markov processes in continuous time. In a sense, we already have a very good understanding

More information

LECTURE 15: AMERICAN OPTIONS

LECTURE 15: AMERICAN OPTIONS LECTURE 15: AMERICAN OPTIONS 1. Introduction All of the options that we have considered thus far have been of the European variety: exercise is permitted only at the termination of the contract. These

More information

TAKE-AWAY GAMES. ALLEN J. SCHWENK California Institute of Technology, Pasadena, California INTRODUCTION

TAKE-AWAY GAMES. ALLEN J. SCHWENK California Institute of Technology, Pasadena, California INTRODUCTION TAKE-AWAY GAMES ALLEN J. SCHWENK California Institute of Technology, Pasadena, California L INTRODUCTION Several games of Tf take-away?f have become popular. The purpose of this paper is to determine the

More information

THE CENTRAL LIMIT THEOREM TORONTO

THE CENTRAL LIMIT THEOREM TORONTO THE CENTRAL LIMIT THEOREM DANIEL RÜDT UNIVERSITY OF TORONTO MARCH, 2010 Contents 1 Introduction 1 2 Mathematical Background 3 3 The Central Limit Theorem 4 4 Examples 4 4.1 Roulette......................................

More information

How to Gamble If You Must

How to Gamble If You Must How to Gamble If You Must Kyle Siegrist Department of Mathematical Sciences University of Alabama in Huntsville Abstract In red and black, a player bets, at even stakes, on a sequence of independent games

More information

Overview of Monte Carlo Simulation, Probability Review and Introduction to Matlab

Overview of Monte Carlo Simulation, Probability Review and Introduction to Matlab Monte Carlo Simulation: IEOR E4703 Fall 2004 c 2004 by Martin Haugh Overview of Monte Carlo Simulation, Probability Review and Introduction to Matlab 1 Overview of Monte Carlo Simulation 1.1 Why use simulation?

More information

Chapter 2: Binomial Methods and the Black-Scholes Formula

Chapter 2: Binomial Methods and the Black-Scholes Formula Chapter 2: Binomial Methods and the Black-Scholes Formula 2.1 Binomial Trees We consider a financial market consisting of a bond B t = B(t), a stock S t = S(t), and a call-option C t = C(t), where the

More information

INSURANCE RISK THEORY (Problems)

INSURANCE RISK THEORY (Problems) INSURANCE RISK THEORY (Problems) 1 Counting random variables 1. (Lack of memory property) Let X be a geometric distributed random variable with parameter p (, 1), (X Ge (p)). Show that for all n, m =,

More information

1 Gambler s Ruin Problem

1 Gambler s Ruin Problem Coyright c 2009 by Karl Sigman 1 Gambler s Ruin Problem Let N 2 be an integer and let 1 i N 1. Consider a gambler who starts with an initial fortune of $i and then on each successive gamble either wins

More information

ON FIBONACCI NUMBERS WITH FEW PRIME DIVISORS

ON FIBONACCI NUMBERS WITH FEW PRIME DIVISORS ON FIBONACCI NUMBERS WITH FEW PRIME DIVISORS YANN BUGEAUD, FLORIAN LUCA, MAURICE MIGNOTTE, SAMIR SIKSEK Abstract If n is a positive integer, write F n for the nth Fibonacci number, and ω(n) for the number

More information

Notes from Week 1: Algorithms for sequential prediction

Notes from Week 1: Algorithms for sequential prediction CS 683 Learning, Games, and Electronic Markets Spring 2007 Notes from Week 1: Algorithms for sequential prediction Instructor: Robert Kleinberg 22-26 Jan 2007 1 Introduction In this course we will be looking

More information

MASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.436J/15.085J Fall 2008 Lecture 5 9/17/2008 RANDOM VARIABLES

MASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.436J/15.085J Fall 2008 Lecture 5 9/17/2008 RANDOM VARIABLES MASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.436J/15.085J Fall 2008 Lecture 5 9/17/2008 RANDOM VARIABLES Contents 1. Random variables and measurable functions 2. Cumulative distribution functions 3. Discrete

More information

PUTNAM TRAINING POLYNOMIALS. Exercises 1. Find a polynomial with integral coefficients whose zeros include 2 + 5.

PUTNAM TRAINING POLYNOMIALS. Exercises 1. Find a polynomial with integral coefficients whose zeros include 2 + 5. PUTNAM TRAINING POLYNOMIALS (Last updated: November 17, 2015) Remark. This is a list of exercises on polynomials. Miguel A. Lerma Exercises 1. Find a polynomial with integral coefficients whose zeros include

More information

Some probability and statistics

Some probability and statistics Appendix A Some probability and statistics A Probabilities, random variables and their distribution We summarize a few of the basic concepts of random variables, usually denoted by capital letters, X,Y,

More information

Random access protocols for channel access. Markov chains and their stability. Laurent Massoulié.

Random access protocols for channel access. Markov chains and their stability. Laurent Massoulié. Random access protocols for channel access Markov chains and their stability [email protected] Aloha: the first random access protocol for channel access [Abramson, Hawaii 70] Goal: allow machines

More information

The Heat Equation. Lectures INF2320 p. 1/88

The Heat Equation. Lectures INF2320 p. 1/88 The Heat Equation Lectures INF232 p. 1/88 Lectures INF232 p. 2/88 The Heat Equation We study the heat equation: u t = u xx for x (,1), t >, (1) u(,t) = u(1,t) = for t >, (2) u(x,) = f(x) for x (,1), (3)

More information

Chapter 3: DISCRETE RANDOM VARIABLES AND PROBABILITY DISTRIBUTIONS. Part 3: Discrete Uniform Distribution Binomial Distribution

Chapter 3: DISCRETE RANDOM VARIABLES AND PROBABILITY DISTRIBUTIONS. Part 3: Discrete Uniform Distribution Binomial Distribution Chapter 3: DISCRETE RANDOM VARIABLES AND PROBABILITY DISTRIBUTIONS Part 3: Discrete Uniform Distribution Binomial Distribution Sections 3-5, 3-6 Special discrete random variable distributions we will cover

More information

Bayesian logistic betting strategy against probability forecasting. Akimichi Takemura, Univ. Tokyo. November 12, 2012

Bayesian logistic betting strategy against probability forecasting. Akimichi Takemura, Univ. Tokyo. November 12, 2012 Bayesian logistic betting strategy against probability forecasting Akimichi Takemura, Univ. Tokyo (joint with Masayuki Kumon, Jing Li and Kei Takeuchi) November 12, 2012 arxiv:1204.3496. To appear in Stochastic

More information

Normal distribution. ) 2 /2σ. 2π σ

Normal distribution. ) 2 /2σ. 2π σ Normal distribution The normal distribution is the most widely known and used of all distributions. Because the normal distribution approximates many natural phenomena so well, it has developed into a

More information

1.1 Introduction, and Review of Probability Theory... 3. 1.1.1 Random Variable, Range, Types of Random Variables... 3. 1.1.2 CDF, PDF, Quantiles...

1.1 Introduction, and Review of Probability Theory... 3. 1.1.1 Random Variable, Range, Types of Random Variables... 3. 1.1.2 CDF, PDF, Quantiles... MATH4427 Notebook 1 Spring 2016 prepared by Professor Jenny Baglivo c Copyright 2009-2016 by Jenny A. Baglivo. All Rights Reserved. Contents 1 MATH4427 Notebook 1 3 1.1 Introduction, and Review of Probability

More information

Linear Risk Management and Optimal Selection of a Limited Number

Linear Risk Management and Optimal Selection of a Limited Number How to build a probability-free casino Adam Chalcraft CCR La Jolla [email protected] Chris Freiling Cal State San Bernardino [email protected] Randall Dougherty CCR La Jolla [email protected] Jason

More information

Continued Fractions and the Euclidean Algorithm

Continued Fractions and the Euclidean Algorithm Continued Fractions and the Euclidean Algorithm Lecture notes prepared for MATH 326, Spring 997 Department of Mathematics and Statistics University at Albany William F Hammond Table of Contents Introduction

More information

Discrete Probability Distributions

Discrete Probability Distributions Chapter Discrete Probability Distributions. Simulation of Discrete Probabilities. As n increases, the proportion of heads gets closer to /2, but the difference between the number of heads and half the

More information

Random variables P(X = 3) = P(X = 3) = 1 8, P(X = 1) = P(X = 1) = 3 8.

Random variables P(X = 3) = P(X = 3) = 1 8, P(X = 1) = P(X = 1) = 3 8. Random variables Remark on Notations 1. When X is a number chosen uniformly from a data set, What I call P(X = k) is called Freq[k, X] in the courseware. 2. When X is a random variable, what I call F ()

More information

Modern Optimization Methods for Big Data Problems MATH11146 The University of Edinburgh

Modern Optimization Methods for Big Data Problems MATH11146 The University of Edinburgh Modern Optimization Methods for Big Data Problems MATH11146 The University of Edinburgh Peter Richtárik Week 3 Randomized Coordinate Descent With Arbitrary Sampling January 27, 2016 1 / 30 The Problem

More information

Mathematical Finance

Mathematical Finance Mathematical Finance Option Pricing under the Risk-Neutral Measure Cory Barnes Department of Mathematics University of Washington June 11, 2013 Outline 1 Probability Background 2 Black Scholes for European

More information

SOME ASPECTS OF GAMBLING WITH THE KELLY CRITERION. School of Mathematical Sciences. Monash University, Clayton, Victoria, Australia 3168

SOME ASPECTS OF GAMBLING WITH THE KELLY CRITERION. School of Mathematical Sciences. Monash University, Clayton, Victoria, Australia 3168 SOME ASPECTS OF GAMBLING WITH THE KELLY CRITERION Ravi PHATARFOD School of Mathematical Sciences Monash University, Clayton, Victoria, Australia 3168 In this paper we consider the problem of gambling with

More information

Joint Exam 1/P Sample Exam 1

Joint Exam 1/P Sample Exam 1 Joint Exam 1/P Sample Exam 1 Take this practice exam under strict exam conditions: Set a timer for 3 hours; Do not stop the timer for restroom breaks; Do not look at your notes. If you believe a question

More information

An example of a computable

An example of a computable An example of a computable absolutely normal number Verónica Becher Santiago Figueira Abstract The first example of an absolutely normal number was given by Sierpinski in 96, twenty years before the concept

More information

Modeling and Analysis of Information Technology Systems

Modeling and Analysis of Information Technology Systems Modeling and Analysis of Information Technology Systems Dr. János Sztrik University of Debrecen, Faculty of Informatics Reviewers: Dr. József Bíró Doctor of the Hungarian Academy of Sciences, Full Professor

More information

MATHEMATICAL METHODS OF STATISTICS

MATHEMATICAL METHODS OF STATISTICS MATHEMATICAL METHODS OF STATISTICS By HARALD CRAMER TROFESSOK IN THE UNIVERSITY OF STOCKHOLM Princeton PRINCETON UNIVERSITY PRESS 1946 TABLE OF CONTENTS. First Part. MATHEMATICAL INTRODUCTION. CHAPTERS

More information

Lecture Notes on Measure Theory and Functional Analysis

Lecture Notes on Measure Theory and Functional Analysis Lecture Notes on Measure Theory and Functional Analysis P. Cannarsa & T. D Aprile Dipartimento di Matematica Università di Roma Tor Vergata [email protected] [email protected] aa 2006/07 Contents

More information

Basics of Statistical Machine Learning

Basics of Statistical Machine Learning CS761 Spring 2013 Advanced Machine Learning Basics of Statistical Machine Learning Lecturer: Xiaojin Zhu [email protected] Modern machine learning is rooted in statistics. You will find many familiar

More information

Statistics 100A Homework 4 Solutions

Statistics 100A Homework 4 Solutions Problem 1 For a discrete random variable X, Statistics 100A Homework 4 Solutions Ryan Rosario Note that all of the problems below as you to prove the statement. We are proving the properties of epectation

More information

Introduction to General and Generalized Linear Models

Introduction to General and Generalized Linear Models Introduction to General and Generalized Linear Models General Linear Models - part I Henrik Madsen Poul Thyregod Informatics and Mathematical Modelling Technical University of Denmark DK-2800 Kgs. Lyngby

More information

Lecture Notes on Elasticity of Substitution

Lecture Notes on Elasticity of Substitution Lecture Notes on Elasticity of Substitution Ted Bergstrom, UCSB Economics 210A March 3, 2011 Today s featured guest is the elasticity of substitution. Elasticity of a function of a single variable Before

More information

Message-passing sequential detection of multiple change points in networks

Message-passing sequential detection of multiple change points in networks Message-passing sequential detection of multiple change points in networks Long Nguyen, Arash Amini Ram Rajagopal University of Michigan Stanford University ISIT, Boston, July 2012 Nguyen/Amini/Rajagopal

More information

Every Positive Integer is the Sum of Four Squares! (and other exciting problems)

Every Positive Integer is the Sum of Four Squares! (and other exciting problems) Every Positive Integer is the Sum of Four Squares! (and other exciting problems) Sophex University of Texas at Austin October 18th, 00 Matilde N. Lalín 1. Lagrange s Theorem Theorem 1 Every positive integer

More information

University of California, Los Angeles Department of Statistics. Random variables

University of California, Los Angeles Department of Statistics. Random variables University of California, Los Angeles Department of Statistics Statistics Instructor: Nicolas Christou Random variables Discrete random variables. Continuous random variables. Discrete random variables.

More information

ST 371 (IV): Discrete Random Variables

ST 371 (IV): Discrete Random Variables ST 371 (IV): Discrete Random Variables 1 Random Variables A random variable (rv) is a function that is defined on the sample space of the experiment and that assigns a numerical variable to each possible

More information

A simple criterion on degree sequences of graphs

A simple criterion on degree sequences of graphs Discrete Applied Mathematics 156 (2008) 3513 3517 Contents lists available at ScienceDirect Discrete Applied Mathematics journal homepage: www.elsevier.com/locate/dam Note A simple criterion on degree

More information

BROWNIAN MOTION 1. INTRODUCTION

BROWNIAN MOTION 1. INTRODUCTION BROWNIAN MOTION 1.1. Wiener Process: Definition. 1. INTRODUCTION Definition 1. A standard (one-dimensional) Wiener process (also called Brownian motion) is a stochastic process {W t } t 0+ indexed by nonnegative

More information

Universal Algorithm for Trading in Stock Market Based on the Method of Calibration

Universal Algorithm for Trading in Stock Market Based on the Method of Calibration Universal Algorithm for Trading in Stock Market Based on the Method of Calibration Vladimir V yugin Institute for Information Transmission Problems, Russian Academy of Sciences, Bol shoi Karetnyi per.

More information

Math 461 Fall 2006 Test 2 Solutions

Math 461 Fall 2006 Test 2 Solutions Math 461 Fall 2006 Test 2 Solutions Total points: 100. Do all questions. Explain all answers. No notes, books, or electronic devices. 1. [105+5 points] Assume X Exponential(λ). Justify the following two

More information

THE NUMBER OF GRAPHS AND A RANDOM GRAPH WITH A GIVEN DEGREE SEQUENCE. Alexander Barvinok

THE NUMBER OF GRAPHS AND A RANDOM GRAPH WITH A GIVEN DEGREE SEQUENCE. Alexander Barvinok THE NUMBER OF GRAPHS AND A RANDOM GRAPH WITH A GIVEN DEGREE SEQUENCE Alexer Barvinok Papers are available at http://www.math.lsa.umich.edu/ barvinok/papers.html This is a joint work with J.A. Hartigan

More information

VERTICES OF GIVEN DEGREE IN SERIES-PARALLEL GRAPHS

VERTICES OF GIVEN DEGREE IN SERIES-PARALLEL GRAPHS VERTICES OF GIVEN DEGREE IN SERIES-PARALLEL GRAPHS MICHAEL DRMOTA, OMER GIMENEZ, AND MARC NOY Abstract. We show that the number of vertices of a given degree k in several kinds of series-parallel labelled

More information

Section 6.1 Joint Distribution Functions

Section 6.1 Joint Distribution Functions Section 6.1 Joint Distribution Functions We often care about more than one random variable at a time. DEFINITION: For any two random variables X and Y the joint cumulative probability distribution function

More information

Triangle deletion. Ernie Croot. February 3, 2010

Triangle deletion. Ernie Croot. February 3, 2010 Triangle deletion Ernie Croot February 3, 2010 1 Introduction The purpose of this note is to give an intuitive outline of the triangle deletion theorem of Ruzsa and Szemerédi, which says that if G = (V,

More information

MASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.436J/15.085J Fall 2008 Lecture 14 10/27/2008 MOMENT GENERATING FUNCTIONS

MASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.436J/15.085J Fall 2008 Lecture 14 10/27/2008 MOMENT GENERATING FUNCTIONS MASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.436J/15.085J Fall 2008 Lecture 14 10/27/2008 MOMENT GENERATING FUNCTIONS Contents 1. Moment generating functions 2. Sum of a ranom number of ranom variables 3. Transforms

More information

Math 526: Brownian Motion Notes

Math 526: Brownian Motion Notes Math 526: Brownian Motion Notes Definition. Mike Ludkovski, 27, all rights reserved. A stochastic process (X t ) is called Brownian motion if:. The map t X t (ω) is continuous for every ω. 2. (X t X t

More information