Lecture 13: Martingales


 Christopher Small
Contents

1. Definition of a Martingale
   1.1 Filtrations
   1.2 Definition of a martingale and its basic properties
   1.3 Sums of independent random variables and related models
   1.4 Products of independent random variables and related models
   1.5 An exponential martingale
   1.6 Likelihood ratios
2. Square Integrable Martingales
   2.1 Doob's martingale
   2.2 Doob decomposition
   2.3 Doob decomposition for square integrable martingales
   2.4 Doob-Kolmogorov inequality
3. Convergence of Martingales
   3.1 A.s. convergence of martingales
   3.2 Law of large numbers for martingales
   3.3 Central limit theorem for martingales
4. Stopping times
   4.1 Definition and basic properties of stopping times
   4.2 Stopped martingales
   4.3 Optional stopping theorems
   4.4 Wald equation
   4.5 A fair game example

1. Definition of a Martingale

1.1 Filtrations

Let <Ω, F, P> be a probability space; S = {S_0, S_1, ...} a finite or infinite sequence of random variables defined on this probability space; F = {F_0, F_1, ...} a finite or infinite sequence of σ-algebras such that F_0, F_1, ... ⊆ F.

(1) If F_N = {F_0, F_1, ..., F_N} is a finite sequence of σ-algebras, then the infinite sequence of σ-algebras F = {F_0, F_1, ..., F_N, F_N, ...} = {F_{n∧N}, n = 0, 1, ...} is called the natural continuation of the initial finite sequence of σ-algebras.

(2) If S_N = {S_0, S_1, ..., S_N} is a finite sequence of random variables, then the infinite sequence of random variables S = {S_0, S_1, ..., S_N, S_N, ...} = {S_{n∧N}, n = 0, 1, ...} is called the natural continuation of the initial finite sequence of random variables.

The continuation construction described above lets us restrict attention to infinite sequences of random variables and σ-algebras.

Definition. A sequence of σ-algebras F = {F_0, F_1, ...} is a filtration if it is nondecreasing, i.e., F_0 ⊆ F_1 ⊆ ⋯.

Examples

(1) F_0 = F_1 = F_2 = ⋯. Two extreme cases are those where F_0 = {∅, Ω} and where F_0 = F.

(2) S = {S_0, S_1, ...} is a sequence of random variables and F_n = σ(S_k, k = 0, ..., n), n = 0, 1, .... Then F = {F_0, F_1, ...} is a filtration, called the natural filtration generated by the sequence S. Note that in this case the random variable S_n is F_n-measurable for every n = 0, 1, ....

(3) S = {S_0, S_1, ...} and X = {X_0, X_1, ...} are, respectively, a sequence of random variables and a sequence of random vectors, and F_n = σ(S_k, X_k, k = 0, ..., n), n = 0, 1, .... Then F = {F_0, F_1, ...} is a filtration. Note that in this case the random variable S_n is again F_n-measurable for every n = 0, 1, ....

(4) If F_0 ⊆ F_1 ⊆ ⋯ ⊆ F_N is a finite nondecreasing sequence of σ-algebras, then its natural continuation F = {F_{n∧N}, n = 0, 1, ...} is also a nondecreasing sequence of σ-algebras, i.e., it is a filtration.

(5) If F_n = σ(S_k, k = 0, ..., n), n = 0, 1, ..., N is the finite sequence of σ-algebras generated by a finite sequence of random variables {S_0, ..., S_N}, then F = {F_{n∧N}, n = 0, 1, ...} is the natural filtration for the infinite sequence of random variables S = {S_{n∧N}, n = 0, 1, ...}.
1.2 Definition of a martingale and its basic properties

Let <Ω, F, P> be a probability space, S = {S_0, S_1, ...} a sequence of random variables defined on this probability space, and F = {F_0, F_1, ...} a filtration on this probability space.

Definition. A sequence of random variables S = {S_0, S_1, ...} is adapted to the filtration F = {F_0, F_1, ...} (F-adapted) if the random variable S_n is F_n-measurable (i.e., the event {S_n ∈ B} ∈ F_n for every Borel set B ∈ B_1) for every n = 0, 1, ....

Definition. An F-adapted sequence of random variables S = {S_0, S_1, ...} is an F-martingale (a martingale with respect to the filtration F) if it satisfies the following conditions:
(a) E|S_n| < ∞, n = 0, 1, ...;
(b) E(S_{n+1} | F_n) = S_n, n = 0, 1, ....

(1) Condition (b) can be written in the equivalent form E(S_{n+1} − S_n | F_n) = 0, n = 0, 1, ....

(2) An F-adapted sequence of random variables S = {S_0, S_1, ...} is an F-submartingale or an F-supermartingale if (a) E|S_n| < ∞, n = 0, 1, ... and, respectively, (b') E(S_{n+1} | F_n) ≥ S_n, n = 0, 1, ... or (b'') E(S_{n+1} | F_n) ≤ S_n, n = 0, 1, ....

(3) If S = {S_0, S_1, ...} is a martingale with respect to the filtration F = {F_n = σ(X_0, ..., X_n), n = 0, 1, ...} generated by a sequence of random variables X = {X_0, X_1, ...}, then the notation E(S_{n+1} | X_0, ..., X_n) = E(S_{n+1} | F_n) is used, and one can speak about S as a martingale with respect to the sequence of random variables X.

(4) If S = {S_0, S_1, ...} is a martingale with respect to the natural filtration F = {F_n = σ(S_0, ..., S_n), n = 0, 1, ...}, then the notation E_n S_{n+m} = E(S_{n+m} | F_n) may be used, and one can refer to the sequence S as a martingale without specifying the corresponding filtration.

Martingales possess the following basic properties:

1. If S = {S_0, S_1, ...} is an F-martingale, then E(S_{n+m} | F_n) = S_n, 0 ≤ n < n + m < ∞.

(a) E(S_{n+m} | F_n) = E(E(S_{n+m} | F_{n+m−1}) | F_n) = E(S_{n+m−1} | F_n);
(b) iterating this relation yields the above formula.

2. If S = {S_0, S_1, ...} is an F-martingale, then ES_n = ES_0, n = 0, 1, ....

Indeed, ES_n = E(E(S_n | F_0)) = ES_0, n = 0, 1, ....

3. If S′ = {S′_0, S′_1, ...} and S″ = {S″_0, S″_1, ...} are two F-martingales and a, b ∈ R_1, then the sequence S = {S_0 = aS′_0 + bS″_0, S_1 = aS′_1 + bS″_1, ...} is also an F-martingale.

E(S_{n+1} | F_n) = E(aS′_{n+1} + bS″_{n+1} | F_n) = aE(S′_{n+1} | F_n) + bE(S″_{n+1} | F_n) = aS′_n + bS″_n = S_n, n = 0, 1, ....

4. If S = {S_0, S_1, ...} is an F-martingale and ES_n^2 < ∞, n = 0, 1, ..., then the sequence of second moments is nondecreasing, i.e., ES_n^2 ≤ ES_{n+1}^2, n = 0, 1, ....

(a) ES_n(S_{n+1} − S_n) = E(E(S_n(S_{n+1} − S_n) | F_n)) = E(S_n E(S_{n+1} − S_n | F_n)) = E(S_n · 0) = 0;
(b) 0 ≤ E(S_{n+1} − S_n)^2 = ES_{n+1}^2 − 2ES_{n+1}S_n + ES_n^2 = ES_{n+1}^2 − ES_n^2 − 2ES_n(S_{n+1} − S_n) = ES_{n+1}^2 − ES_n^2.

1.3 Sums of independent random variables and related models

(1) Let X_1, X_2, ... be a sequence of independent random variables and S = {S_n, n = 0, 1, ...}, where S_n = S_0 + X_1 + ⋯ + X_n, n = 0, 1, ..., S_0 = const. Let also F = {F_n, n = 0, 1, ...}, where F_n = σ(X_1, ..., X_n) = σ(S_0, S_1, ..., S_n), n = 0, 1, ..., F_0 = {∅, Ω}. In this case, the sequence S is an F-martingale if and only if EX_n = 0, n = 1, 2, ....

(a) In this case, the sequence S is F-adapted;
(b) E(S_{n+1} − S_n | F_n) = E(X_{n+1} | F_n) = EX_{n+1}, n = 0, 1, ....

(2) Let X_n, n = 1, 2, ... be a sequence of independent random variables taking values +1 and −1 with probabilities p_n and q_n = 1 − p_n, respectively. In this case, EX_n = p_n − q_n and, therefore, the following compensated sequence is an F-martingale:

S_n = S_0 + Σ_{k=1}^n X_k − A_n, n = 0, 1, ..., where A_n = Σ_{k=1}^n (p_k − q_k), n = 0, 1, ....

1.4 Products of independent random variables and related models

(1) Let X_1, X_2, ... be a sequence of independent random variables and S = {S_n, n = 0, 1, ...}, where S_n = S_0 ∏_{k=1}^n X_k, n = 0, 1, ..., S_0 = const. Let also F = {F_n, n = 0, 1, ...}, where F_n = σ(X_1, ..., X_n) = σ(S_0, S_1, ..., S_n), n = 0, 1, ..., F_0 = {∅, Ω}. In this case, the sequence S is an F-martingale if EX_n = 1, n = 1, 2, ....

(a) In this case, the sequence S is F-adapted;
(b) S_{n+1} − S_n = S_n(X_{n+1} − 1), n = 0, 1, ...;
(c) E(S_{n+1} − S_n | F_n) = E(S_n(X_{n+1} − 1) | F_n) = S_n E(X_{n+1} − 1 | F_n) = S_n(EX_{n+1} − 1), n = 0, 1, ....
(d) If the random variables X_n are a.s. positive, for example if X_n = e^{Y_n}, n = 1, 2, ..., and S_0 > 0, then a.s. S_n > 0, n = 0, 1, .... In this case the condition EX_n = 1, n = 1, 2, ... is also a necessary condition for the sequence S to be an F-martingale.

(2) S_n = S_0 exp{Σ_{k=1}^n Y_k}, n = 0, 1, ..., where S_0 = const and Y_k ~ N(μ_k, σ_k^2), k = 1, 2, ... are independent normal random variables. In this case, E exp Y_n = e^{μ_n + σ_n^2/2}, n = 1, 2, ..., and therefore the following condition is a necessary and sufficient condition for the sequence S = {S_n, n = 0, 1, ...} to be an F-martingale:

μ_n + σ_n^2/2 = 0, n = 1, 2, ....

(3) S_n = S_0 (q/p)^{Σ_{k=1}^n Y_k}, n = 0, 1, ..., where S_0 = const and Y_1, Y_2, ... is a sequence of Bernoulli random variables taking values +1 and −1 with probabilities, respectively, p and q = 1 − p, where 0 < p < 1. In this case, E(q/p)^{Y_n} = (q/p)·p + (q/p)^{−1}·q = q + p = 1, n = 1, 2, ..., and therefore the sequence S = {S_n, n = 0, 1, ...} is an F-martingale.

1.5 An exponential martingale

Let X_1, X_2, ... be i.i.d. random variables such that ψ(t) = Ee^{tX_1} < ∞ for some t > 0. Let also Y_n = X_1 + ⋯ + X_n, n = 1, 2, ..., and

S_n = e^{tY_n}/ψ(t)^n = ∏_{k=1}^n e^{tX_k}/ψ(t), n = 1, 2, ..., S_0 = 1.

Let also F = {F_n, n = 0, 1, ...}, where F_n = σ(X_1, ..., X_n) = σ(S_0, S_1, ..., S_n), n = 0, 1, ..., F_0 = {∅, Ω}. In this case, S = {S_0, S_1, ...} is an F-martingale (known as an exponential martingale).

(a) S = {S_0, S_1, ...} is an F-adapted sequence;
(b) S_{n+1} = S_n e^{tX_{n+1}}/ψ(t), n = 0, 1, ...;
(c) E(S_{n+1} | F_n) = E(S_n e^{tX_{n+1}}/ψ(t) | F_n) = S_n E(e^{tX_{n+1}}/ψ(t) | F_n) = S_n · 1 = S_n.

1.6 Likelihood ratios

Let X_1, X_2, ... be i.i.d. random variables and let f_0(x) and f_1(x) be different probability density functions. For simplicity assume that f_0(x) > 0, x ∈ R_1. Let us define the so-called likelihood ratios,

S_n = f_1(X_1)f_1(X_2) ⋯ f_1(X_n) / (f_0(X_1)f_0(X_2) ⋯ f_0(X_n)), n = 1, 2, ..., S_0 = 1.

Let also the filtration F = {F_0, F_1, ...}, where F_n = σ(X_1, ..., X_n), n = 0, 1, ..., F_0 = {∅, Ω}.

(1) S = {S_0, S_1, ...} is an F-adapted sequence of random variables, since S_n is a nonrandom Borel function of the random variables X_1, ..., X_n.
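The martingale property of the likelihood ratio, established next, can also be seen in a simulation. The sketch below is only an illustration, not part of the notes: it makes the hypothetical choice f_0 = N(0, 1) and f_1 = N(0.5, 1), draws the data under f_0, and checks that the sample mean of S_n stays near ES_n = S_0 = 1.

```python
import math
import random

random.seed(1)

def phi(x, mu):
    """Normal(mu, 1) probability density."""
    return math.exp(-0.5 * (x - mu) ** 2) / math.sqrt(2 * math.pi)

def likelihood_ratio(n, mu0=0.0, mu1=0.5):
    """One sample of S_n = prod_{k<=n} f_1(X_k)/f_0(X_k), with X_k drawn under f_0."""
    s = 1.0
    for _ in range(n):
        x = random.gauss(mu0, 1.0)   # data generated under the f_0 hypothesis
        s *= phi(x, mu1) / phi(x, mu0)
    return s

n_paths = 100_000
mean_S5 = sum(likelihood_ratio(5) for _ in range(n_paths)) / n_paths
print(mean_S5)  # ES_5 = 1 exactly; the sample mean should be close to 1
```

Note that each factor has mean ∫ (f_1/f_0) f_0 dx = 1 under f_0, which is exactly the computation in item (3) below.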
(2) Due to the independence of the random variables X_n, we get E(S_{n+1} | F_n) = E(S_n f_1(X_{n+1})/f_0(X_{n+1}) | F_n) = S_n E[f_1(X_{n+1})/f_0(X_{n+1})], n = 0, 1, ....

(3) Thus, the sequence S = {S_0, S_1, ...} is an F-martingale under the hypothesis that the common probability density function of the random variables X_n is f_0(x). Indeed,

E[f_1(X_{n+1})/f_0(X_{n+1})] = ∫ (f_1(x)/f_0(x)) f_0(x) dx = ∫ f_1(x) dx = 1, n = 0, 1, ....

2. Square Integrable Martingales

2.1 Doob's martingale

Definition. Let S be a random variable such that E|S| < ∞ and let F = {F_0, F_1, ...} be a filtration. In this case the following sequence is an F-martingale (Doob's martingale):

S_n = E(S | F_n), n = 0, 1, ....

Indeed, E(S_{n+1} | F_n) = E(E(S | F_{n+1}) | F_n) = E(S | F_n) = S_n, n = 0, 1, ....

2.2 Doob decomposition

Definition. A sequence of random variables A = {A_n, n = 0, 1, ...} is predictable with respect to a filtration F = {F_0, F_1, ...} (an F-predictable sequence) if A_0 = 0 and the random variable A_n is F_{n−1}-measurable for every n = 1, 2, ....

Theorem 13.1 (Doob decomposition). Let S = {S_0, S_1, ...} be a sequence of random variables adapted to a filtration F = {F_0, F_1, ...} and such that E|S_n| < ∞, n = 0, 1, .... Then S can be uniquely decomposed into the sum of two sequences, M = {M_0, M_1, ...}, which is an F-martingale, and A = {A_0, A_1, ...}, which is an F-predictable sequence:

S_n = M_n + A_n, n = 0, 1, ....

(a) Define M_0 = S_0, A_0 = 0 and M_n = M_0 + Σ_{k=1}^n (S_k − E(S_k | F_{k−1})), A_n = S_n − M_n, n = 1, 2, ....
(b) M_n is F_n-measurable, since S_k, k = 0, 1, ..., n and E(S_k | F_{k−1}), k = 1, ..., n are F_n-measurable.
(c) E(M_{n+1} | F_n) = M_0 + Σ_{k=1}^{n+1} E(S_k − E(S_k | F_{k−1}) | F_n) = M_0 + Σ_{k=1}^n E(S_k − E(S_k | F_{k−1}) | F_n) + E(S_{n+1} − E(S_{n+1} | F_n) | F_n) = M_n + 0 = M_n.
(d) A_n = S_n − S_0 − Σ_{k=1}^n (S_k − E(S_k | F_{k−1})) = Σ_{k=1}^n E(S_k | F_{k−1}) − Σ_{k=0}^{n−1} S_k is an F_{n−1}-measurable random variable.
(e) Note also that A_{n+1} − A_n = E(S_{n+1} | F_n) − S_n, n = 0, 1, ....
(f) Suppose that S_n = M′_n + A′_n, n = 0, 1, ... is another decomposition into the sum of an F-martingale and an F-predictable sequence. Then A′_{n+1} − A′_n = E(A′_{n+1} − A′_n | F_n) = E((S_{n+1} − S_n) − (M′_{n+1} − M′_n) | F_n) = E(S_{n+1} | F_n) − S_n − (M′_n − M′_n) = E(S_{n+1} | F_n) − S_n = A_{n+1} − A_n, n = 0, 1, ....
(g) Since A_0 = A′_0 = 0, we get, using (f), A′_n = A′_0 + Σ_{k=0}^{n−1} (A′_{k+1} − A′_k) = A_0 + Σ_{k=0}^{n−1} (A_{k+1} − A_k) = A_n, n = 0, 1, ....
(h) M′_n = S_n − A′_n = S_n − A_n = M_n, n = 0, 1, ....

(1) If S = {S_0, S_1, ...} is a submartingale, then A = {A_0, A_1, ...} is an a.s. nondecreasing sequence, i.e., P(0 = A_0 ≤ A_1 ≤ A_2 ≤ ⋯) = 1. Indeed, according to (e), A_{n+1} − A_n = E(S_{n+1} | F_n) − S_n ≥ 0, n = 0, 1, ....

Let <Ω, F, P> be a probability space, F = {F_0, F_1, ...} a filtration, and S = {S_0, S_1, ...} an F-adapted sequence of random variables defined on this probability space.

Lemma. Let the sequence S be an F-martingale such that E|S_n|^2 < ∞, n = 0, 1, .... Then the sequence S^2 = {S_0^2, S_1^2, ...} is a submartingale.

Indeed,

E(S_{n+1}^2 | F_n) = E((S_n + (S_{n+1} − S_n))^2 | F_n)
= E(S_n^2 + 2S_n(S_{n+1} − S_n) + (S_{n+1} − S_n)^2 | F_n)
= E(S_n^2 | F_n) + 2S_n E(S_{n+1} − S_n | F_n) + E((S_{n+1} − S_n)^2 | F_n)
= S_n^2 + E((S_{n+1} − S_n)^2 | F_n) ≥ S_n^2, n = 0, 1, ....

2.3 Doob decomposition for square integrable martingales

(1) According to the Doob decomposition theorem, the sequence S^2 can be uniquely decomposed into the sum of an F-martingale M = {M_0, M_1, ...} and an F-predictable sequence A = {A_0, A_1, ...}, where

S_n^2 = M_n + A_n, n = 0, 1, ..., M_n = S_0^2 + Σ_{k=1}^n (S_k^2 − E(S_k^2 | F_{k−1})), A_n = S_n^2 − M_n.

(2) A_n = Σ_{k=1}^n E((S_k − S_{k−1})^2 | F_{k−1}), n = 0, 1, .... Indeed,

A_n = S_n^2 − S_0^2 − Σ_{k=1}^n (S_k^2 − E(S_k^2 | F_{k−1}))
= Σ_{k=1}^n (E(S_k^2 | F_{k−1}) − S_{k−1}^2)
= Σ_{k=1}^n (E(S_{k−1}^2 + 2S_{k−1}(S_k − S_{k−1}) + (S_k − S_{k−1})^2 | F_{k−1}) − S_{k−1}^2)
= Σ_{k=1}^n E((S_k − S_{k−1})^2 | F_{k−1}).

(3) The F-predictable sequence A is a.s. nonnegative and nondecreasing in this case, i.e., P(0 = A_0 ≤ A_1 ≤ ⋯) = 1.

(4) ES_n^2 = EM_n + EA_n = ES_0^2 + EA_n, n = 0, 1, ....

(5) The sequence {ES_0^2, ES_1^2, ...} is nondecreasing, i.e., ES_n^2 ≤ ES_{n+1}^2, n = 0, 1, ....

(a) EM_n = EM_0 = ES_0^2, n = 0, 1, ...;
(b) 0 = EA_0 ≤ EA_1 ≤ EA_2 ≤ ⋯;
(c) ES_n^2 = EM_n + EA_n = ES_0^2 + EA_n ≤ ES_{n+1}^2, n = 0, 1, ....

Definition. In this case, the notation ⟨S⟩ = {⟨S⟩_0, ⟨S⟩_1, ...} is used for the F-predictable sequence A, which is called the quadratic characteristic of the martingale S, i.e.,

⟨S⟩_n = Σ_{k=1}^n E((S_k − S_{k−1})^2 | F_{k−1}), n = 0, 1, ....

(6) Since ⟨S⟩ is a nondecreasing sequence, there exists with probability 1 a finite or infinite limit

⟨S⟩_∞ = lim_{n→∞} ⟨S⟩_n.

Example. If the martingale S is a sum of independent random variables X_1, X_2, ..., i.e., S_n = X_1 + ⋯ + X_n, n = 0, 1, ..., such that E|X_k|^2 < ∞, EX_k = 0, k = 1, 2, ..., then the quadratic characteristic ⟨S⟩_n = EX_1^2 + ⋯ + EX_n^2 = VarX_1 + ⋯ + VarX_n, n = 0, 1, .... The quadratic characteristic is a nonrandom sequence in this case.

2.4 Doob-Kolmogorov inequality

Theorem 13.2 (Doob-Kolmogorov inequality). If S = {S_0, S_1, ...} is a square integrable F-martingale, then

P(max_{0≤k≤n} |S_k| ≥ ε) ≤ ES_n^2/ε^2, ε > 0, n = 0, 1, ....
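Before turning to the proof, the inequality can be checked by simulation. The sketch below is an illustration only: it uses the symmetric ±1 random walk, for which ES_n^2 = n, and the demo parameters n = 50, ε = 10.

```python
import random

random.seed(2)

n, eps, n_paths = 50, 10.0, 20_000
exceed = 0
for _ in range(n_paths):
    s, running_max = 0, 0
    for _ in range(n):
        s += random.choice((-1, 1))
        running_max = max(running_max, abs(s))  # max_{0<=k<=n} |S_k|
    if running_max >= eps:
        exceed += 1

p_hat = exceed / n_paths
bound = n / eps**2  # ES_n^2 / eps^2 = n / eps^2 for the +-1 walk
print(p_hat, bound)  # the empirical probability should not exceed the bound
```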
(a) Let A_i = {|S_j| < ε, j = 0, ..., i−1, |S_i| ≥ ε}, i = 0, 1, ..., n;
(b) ∪_{i=0}^n A_i = {max_{0≤k≤n} |S_k| ≥ ε};
(c) Ā = Ω \ (∪_{i=0}^n A_i);
(d) ES_n^2 = Σ_{i=0}^n ES_n^2 I_{A_i} + ES_n^2 I_Ā ≥ Σ_{i=0}^n ES_n^2 I_{A_i};
(e) E(S_n − S_i)S_i I_{A_i} = E(E((S_n − S_i)S_i I_{A_i} | F_i)) = E(S_i I_{A_i} E(S_n − S_i | F_i)) = 0;
(f) ES_n^2 I_{A_i} = E((S_n − S_i) + S_i)^2 I_{A_i} = E(S_n − S_i)^2 I_{A_i} + 2E(S_n − S_i)S_i I_{A_i} + ES_i^2 I_{A_i} ≥ ES_i^2 I_{A_i} ≥ ε^2 EI_{A_i} = ε^2 P(A_i);
(g) ES_n^2 ≥ Σ_{i=0}^n ES_n^2 I_{A_i} ≥ Σ_{i=0}^n ε^2 P(A_i) = ε^2 P(max_{0≤k≤n} |S_k| ≥ ε).

3. Convergence of Martingales

3.1 A.s. convergence of martingales

Theorem 13.3. Let the sequence S be an F-martingale such that ES_n^2 ≤ M, n = 0, 1, ..., where M = const < ∞. Then there exists a random variable S_∞ such that S_n → S_∞ a.s. as n → ∞.

(a) Since ES_n^2 is bounded and nondecreasing in n, we can choose M = lim_{n→∞} ES_n^2;
(b) S_n^(m) = S_{m+n} − S_m, n = 0, 1, ..., for m = 0, 1, ...;
(c) F_n^(m) = σ(S_k^(m) = S_{m+k} − S_m, k = 0, ..., n), n = 0, 1, ...;
(d) F^(m) = {F_0^(m), F_1^(m), ...};
(e) F_n^(m) ⊆ F_{m+n}, n = 0, 1, ...;
(f) E(S_{n+1}^(m) | F_n^(m)) = E(E(S_{m+n+1} − S_m | F_{m+n}) | F_n^(m)) = E(S_{m+n} − S_m | F_n^(m)) = S_n^(m), n = 0, 1, ...;
(g) {S_n^(m), n = 0, 1, ...} is an F^(m)-martingale;
(h) E(S_{m+n} − S_m)^2 = ES_{m+n}^2 − 2ES_{m+n}S_m + ES_m^2 = ES_{m+n}^2 − ES_m^2 − 2E(S_{m+n} − S_m)S_m = ES_{m+n}^2 − ES_m^2;
(i) by the Doob-Kolmogorov inequality, P(max_{m≤i≤m+n} |S_i − S_m| ≥ ε) ≤ (ES_{m+n}^2 − ES_m^2)/ε^2;
(j) P(max_{m≤i<∞} |S_i − S_m| ≥ ε) ≤ (M − ES_m^2)/ε^2;
(k) P(∩_{m=0}^∞ ∪_{i=m}^∞ {|S_i − S_m| ≥ ε}) = lim_{m→∞} P(max_{m≤i<∞} |S_i − S_m| ≥ ε) ≤ lim_{m→∞} (M − ES_m^2)/ε^2 = 0;
(l) P(|S_i − S_m| ≥ ε for infinitely many i, m) = 0, for any ε > 0;
(m) P(ω : lim_{m→∞} max_{i≥m} |S_i(ω) − S_m(ω)| = 0) = 1;
(n) a nonrandom sequence a_n → a as n → ∞ if and only if max_{i≥m} |a_i − a_m| → 0 as m → ∞;
(o) P(ω : lim_{n→∞} S_n(ω) = S_∞(ω)) = 1.

Theorem 13.4**. Let the sequence S be an F-martingale such that E|S_n| ≤ M, n = 0, 1, ..., where M = const < ∞. Then there exists a random variable S_∞ such that S_n → S_∞ a.s. as n → ∞.

Example. As is known, the harmonic series Σ_{i=1}^∞ 1/i diverges while the alternating series Σ_{i=1}^∞ (−1)^i/i converges. Let X_1, X_2, ... be i.i.d. random variables taking values +1 and −1 with equal probabilities. Let also

S_n = Σ_{i=1}^n X_i/i, n = 1, 2, ....

This sequence is a martingale (with respect to the natural filtration generated by this sequence). Now, ES_n^2 = VarS_n = Σ_{i=1}^n 1/i^2 ≤ Σ_{i=1}^∞ 1/i^2 < ∞. Thus, there exists a.s. a random variable S_∞ such that S_n → S_∞ as n → ∞, i.e., the random harmonic series Σ_{i=1}^∞ X_i/i converges with probability 1.

3.2 Law of large numbers for martingales

Theorem 13.5. Let the sequence S be an F-martingale such that ES_n^2 < ∞, n = 0, 1, ..., and ⟨S⟩_n → ∞ a.s. Then, for any nondecreasing function f(x) such that f(x) ≥ 1 and ∫_0^∞ f(x)^{−2} dx < ∞,

S_n/f(⟨S⟩_n) → 0 a.s. as n → ∞.

(a) Y_n = Σ_{i=1}^n (S_i − S_{i−1})/f(⟨S⟩_i), n = 1, 2, ..., Y_0 = 0;
(b) E(Y_{n+1} − Y_n | F_n) = E((S_{n+1} − S_n)/f(⟨S⟩_{n+1}) | F_n) = (1/f(⟨S⟩_{n+1})) E(S_{n+1} − S_n | F_n) = 0, n = 0, 1, ... (note that ⟨S⟩_{n+1} is F_n-measurable);
(c) EY_n = Σ_{i=1}^n E((S_i − S_{i−1})/f(⟨S⟩_i)) = Σ_{i=1}^n E(E((S_i − S_{i−1})/f(⟨S⟩_i) | F_{i−1})) = 0, n ≥ 1;
(d) ⟨Y⟩_{n+1} − ⟨Y⟩_n = E((Y_{n+1} − Y_n)^2 | F_n) = E((S_{n+1} − S_n)^2/f(⟨S⟩_{n+1})^2 | F_n) = (1/f(⟨S⟩_{n+1})^2) E((S_{n+1} − S_n)^2 | F_n) = (⟨S⟩_{n+1} − ⟨S⟩_n)/f(⟨S⟩_{n+1})^2;
(e) ⟨Y⟩_n = Σ_{k=0}^{n−1} (⟨S⟩_{k+1} − ⟨S⟩_k)/f(⟨S⟩_{k+1})^2 ≤ Σ_{k=0}^{n−1} ∫_{⟨S⟩_k}^{⟨S⟩_{k+1}} f(x)^{−2} dx ≤ ∫_0^∞ f(x)^{−2} dx = M < ∞ a.s., for n = 1, 2, ...;
(f) EY_n^2 = EY_0^2 + E⟨Y⟩_n ≤ M, n = 0, 1, ...;
(g) by Theorem 13.3, Y_n = Σ_{i=1}^n (S_i − S_{i−1})/f(⟨S⟩_i) → Y_∞ a.s. as n → ∞;
(h) Lemma (Kronecker). If a_n → ∞ as n → ∞ and Σ_{k=1}^n x_k/a_k → c as n → ∞, where |c| < ∞, then (1/a_n) Σ_{k=1}^n x_k → 0 as n → ∞;
(i) (1/f(⟨S⟩_n)) Σ_{i=1}^n (S_i − S_{i−1}) = (1/f(⟨S⟩_n))(S_n − S_0) → 0 a.s. as n → ∞;
(j) S_n/f(⟨S⟩_n) → 0 a.s. as n → ∞.

Example. Let X_1, X_2, ... be independent random variables such that σ_n^2 = VarX_n < ∞, EX_n = μ_n, n = 1, 2, .... Denote b_n = Σ_{k=1}^n σ_k^2, a_n = Σ_{k=1}^n μ_k. Assume that b_n → ∞ as n → ∞. Let also

S_n = Σ_{k=1}^n (X_k − μ_k), n = 1, 2, ..., S_0 = 0.

This sequence is a martingale (with respect to the natural filtration generated by this sequence) and ES_n = 0, ⟨S⟩_n = b_n, n = 0, 1, .... Choose f(x) = max(x, 1). This function is nondecreasing and ∫_0^∞ f(x)^{−2} dx = 1 + ∫_1^∞ x^{−2} dx < ∞.
(1) By Theorem 13.5, S_n/b_n = (f(b_n)/b_n) · (S_n/f(b_n)) → 1 · 0 = 0 a.s. as n → ∞.

(2) If a_n/b_n → c, then also (1/b_n) Σ_{k=1}^n X_k → c a.s. as n → ∞.

(3) If σ_n^2 = σ^2, EX_n = μ, n = 1, 2, ..., then b_n = nσ^2, a_n = μn, n = 1, 2, ..., c = μ/σ^2 and (1/(nσ^2)) Σ_{k=1}^n X_k → c a.s. as n → ∞.

3.3 Central limit theorem for martingales

Theorem 13.6**. Let the sequence S be an F-martingale such that ES_n^2 < ∞, n = 0, 1, ..., and let the following conditions hold:

(1) ⟨S⟩_n/n → σ^2 > 0 in probability as n → ∞;
(2) (1/n) Σ_{i=1}^n E((S_i − S_{i−1})^2 I(|S_i − S_{i−1}| ≥ ε√n)) → 0 as n → ∞, for every ε > 0.

Then

S_n/(σ√n) → S in distribution as n → ∞,

where S is a standard normal random variable, with mean 0 and variance 1.

4. Stopping times
4.1 Definition and basic properties of stopping times

Let <Ω, F, P> be a probability space; F = {F_0, F_1, ...} a filtration; T = T(ω) a random variable defined on the probability space <Ω, F, P> and taking values in the set {0, 1, ..., +∞}; S = {S_0, S_1, ...} an F-adapted sequence of random variables defined on the probability space <Ω, F, P>.

Definition. The random variable T is called a stopping time for the filtration F if

{T = n} ∈ F_n, n = 0, 1, ....

(1) An equivalent condition defining a stopping time T is to require that {T ≤ n} ∈ F_n, n = 0, 1, ..., or that {T > n} ∈ F_n, n = 0, 1, ....

(a) The events {T = k} ∈ F_k ⊆ F_n, k = 0, 1, ..., n; thus the event {T ≤ n} = ∪_{k=0}^n {T = k} ∈ F_n and, consequently, {T > n} ∈ F_n;
(b) conversely, the events {T > n−1} ∈ F_{n−1} ⊆ F_n; thus {T = n} = {T > n−1} \ {T > n} ∈ F_n.

(2) Let H_0, H_1, ... be a sequence of Borel subsets of the real line. The hitting time T = min(n ≥ 0 : S_n ∈ H_n) is an example of a stopping time. Indeed,

{T = n} = {S_0 ∉ H_0, ..., S_{n−1} ∉ H_{n−1}, S_n ∈ H_n} ∈ F_n, n = 0, 1, ....

(3) If T′ and T″ are stopping times for a filtration F, then T′ + T″, max(T′, T″) and min(T′, T″) are stopping times.

(a) {T′ + T″ = n} = ∪_{k=0}^n {T′ = k, T″ = n−k} ∈ F_n;
(b) {max(T′, T″) ≤ n} = {T′ ≤ n, T″ ≤ n} ∈ F_n;
(c) {min(T′, T″) > n} = {T′ > n, T″ > n} ∈ F_n.
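A hitting time can be computed online from the path, deciding {T = n} using only S_0, ..., S_n, which is exactly why it is a stopping time. The sketch below is an illustration with hypothetical choices (a symmetric ±1 walk and the fixed set H_n = (−∞, −5] ∪ [5, ∞) for all n); it also checks empirically that every simulated path leaves the strip, i.e., T is finite on each of them.

```python
import random

random.seed(3)

def hitting_time(barrier=5, cap=100_000):
    """T = min(n >= 0 : |S_n| >= barrier) for a symmetric +-1 walk started at 0.

    The loop inspects only the path up to the current step; cap guards against
    an unbounded loop and returns None if the path is censored."""
    s = 0
    for n in range(1, cap + 1):
        s += random.choice((-1, 1))
        if abs(s) >= barrier:
            return n
    return None

samples = [hitting_time() for _ in range(2_000)]
all_finite = all(t is not None for t in samples)
mean_T = sum(samples) / len(samples)
print(all_finite, mean_T)
```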
4.2 Stopped martingales

Theorem 13.7. If S = {S_n, n = 0, 1, ...} is an F-martingale and T is a stopping time for the filtration F, then the sequence S′ = {S′_n = S_{T∧n}, n = 0, 1, ...} is also an F-martingale.

(a) S_{T∧n} = Σ_{k=0}^{n−1} S_k I(T = k) + S_n I(T > n−1);
(b) E(S′_{n+1} | F_n) = E(S_{T∧(n+1)} | F_n) = Σ_{k=0}^n E(S_k I(T = k) | F_n) + E(S_{n+1} I(T > n) | F_n) = Σ_{k=0}^n I(T = k)S_k + I(T > n)E(S_{n+1} | F_n) = Σ_{k=0}^{n−1} I(T = k)S_k + I(T = n)S_n + I(T > n)S_n = Σ_{k=0}^{n−1} I(T = k)S_k + S_n I(T > n−1) = S_{T∧n} = S′_n.

(1) ES′_n = ES_0, n = 0, 1, .... Indeed, ES′_n = ES′_0 = ES_{T∧0} = ES_0.

Theorem 13.8. If S = {S_n, n = 0, 1, ...} is an F-martingale and T is a stopping time for the filtration F such that P(T ≤ N) = 1 for some integer constant N ≥ 0, then

ES_T = ES_0.

(a) S_T = S_{T∧N} = S′_N a.s.;
(b) ES_T = ES′_N = ES_0.
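The conclusion ES_{T∧n} = ES_0 of Theorem 13.7 can be illustrated numerically. The sketch below uses illustrative choices only: a symmetric ±1 walk with S_0 = 0 and T the exit time of the interval (−3, 4); the sample mean of S_{T∧n} stays near 0 for several values of n.

```python
import random

random.seed(4)

def stopped_value(n, a=3, b=4):
    """One sample of S_{T and n}, where T = min(k >= 1 : S_k = -a or S_k = b)."""
    s = 0
    for _ in range(n):
        s += random.choice((-1, 1))
        if s == -a or s == b:
            break  # the walk is stopped: S_{T and n} = S_T
    return s

n_paths = 50_000
means = {n: sum(stopped_value(n) for _ in range(n_paths)) / n_paths
         for n in (5, 20, 100)}
print(means)  # each sample mean should be close to E S_0 = 0
```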
4.3 Optional stopping theorems

Theorem 13.9. If S = {S_n, n = 0, 1, ...} is an F-martingale and T is a stopping time for the filtration F such that

(1) P(T < ∞) = 1;
(2) E|S_T| < ∞;
(3) E(S_n | T > n)P(T > n) = ES_n I(T > n) → 0 as n → ∞,

then ES_T = ES_0.

(a) ES_T = ES_T I(T ≤ n) + ES_T I(T > n);
(b) E|S_T| = Σ_{k=0}^∞ E(|S_T| | T = k)P(T = k) < ∞;
(c) |ES_T I(T > n)| ≤ Σ_{k=n+1}^∞ E(|S_T| | T = k)P(T = k) → 0 as n → ∞;
(d) ES_T I(T ≤ n) = ES_{T∧n} − ES_n I(T > n), since S_{T∧n} = S_T I(T ≤ n) + S_n I(T > n);
(e) ES_n I(T > n) → 0 as n → ∞;
(f) ES_T = lim_{n→∞} ES_{T∧n} = ES_0.

Example. Let us consider the so-called symmetric random walk, S_n = Σ_{i=1}^n X_i, n = 0, 1, ..., where X_1, X_2, ... are i.i.d. Bernoulli random variables taking values +1 and −1 with equal probabilities.
In this case, the sequence S = {S_n, n = 0, 1, ...} is a martingale with respect to the natural filtration F generated by the random variables X_1, X_2, .... Let a, b be strictly positive integers and T = min(n ≥ 1 : S_n = −a or b). It is a hitting time and, therefore, a stopping time for the filtration F.

(1) P(S_T = −a) = b/(a+b).
(2) ET = ab.

(a) S_T can take only two values, −a or b, with probabilities P_a = P(S_T = −a) and 1 − P_a, respectively;
(b) thus E|S_T| < ∞, i.e., condition (2) of Theorem 13.9 holds;
(c) let A = {X_k = −1, k = 1, ..., a+b} and B_c = {−a < c + S_k < b, k = 1, ..., a+b}, −a < c < b (here S_k denotes the walk built from the next a + b steps);
(d) P(A) = (1/2)^{a+b} = p > 0;
(e) B_c ⊆ Ā = Ω \ A, −a < c < b (on A, the walk started from c > −a drops below −a and so leaves the strip);
(f) max_{−a<c<b} P(B_c) ≤ P(Ā) = 1 − p < 1;
(g) P(T > a+b) = P(B_0) ≤ 1 − p;
(h) P(T > 2(a+b)) = Σ_{−a<c<b} P(T > a+b, S_{a+b} = c)P(B_c) ≤ (1−p) Σ_{−a<c<b} P(T > a+b, S_{a+b} = c) = (1−p)P(T > a+b) ≤ (1−p)^2;
(i) P(T > k(a+b)) ≤ (1−p)^k, k = 1, 2, ...;
(j) P(T > n) → 0 as n → ∞, i.e., P(T < ∞) = 1;
(k) thus condition (1) of Theorem 13.9 holds;
(l) |ES_n I(T > n)| ≤ (a+b)EI(T > n) → 0 as n → ∞;
(m) thus condition (3) of Theorem 13.9 also holds;
(n) ES_T = −aP_a + b(1 − P_a) = ES_0 = 0;
(o) P_a = b/(a+b).
(p) Consider the random sequence V_n = S_n^2 − n, n = 0, 1, .... The nonrandom sequence n is the quadratic characteristic of the submartingale S_n^2, and V_n is a martingale with respect to the natural filtration F generated by the random variables X_1, X_2, ...;
(q) ET < ∞, which follows from (i);
(r) |V_T| ≤ S_T^2 + T ≤ max(a, b)^2 + T;
(s) E|V_T| ≤ max(a, b)^2 + ET < ∞;
(t) |EV_n I(T > n)| ≤ (max(a, b)^2 + n)P(T > n) → 0 as n → ∞;
(u) by Theorem 13.9, EV_T = a^2 P_a + b^2 (1 − P_a) − ET = EV_0 = 0;
(w) ET = a^2 · b/(a+b) + b^2 · a/(a+b) = ab.

Theorem 13.10. If S = {S_n, n = 0, 1, ...} is an F-martingale and T is a stopping time for the filtration F such that

(1) ET < ∞;
(2) E(|S_{n+1} − S_n| | F_n) ≤ K < ∞, n = 0, 1, ...,

then ES_T = ES_0.

(a) Z_0 = |S_0|, Z_n = |S_n − S_{n−1}|, n = 1, 2, ...;
(b) W = Z_0 + ⋯ + Z_T;
(c) |S_T| ≤ W;
(d) EW = Σ_{n=0}^∞ Σ_{k=0}^n EZ_k I(T = n) = Σ_{k=0}^∞ Σ_{n=k}^∞ EZ_k I(T = n) = Σ_{k=0}^∞ EZ_k I(T ≥ k) = E|S_0| + Σ_{k=1}^∞ EZ_k I(T ≥ k);
(e) I(T ≥ k) is F_{k−1}-measurable, for k = 1, 2, ...;
(f) EZ_k I(T ≥ k) = E(E(Z_k I(T ≥ k) | F_{k−1})) = E(I(T ≥ k)E(Z_k | F_{k−1})) ≤ KP(T ≥ k);
(g) EW ≤ E|S_0| + Σ_{k=1}^∞ KP(T ≥ k) = E|S_0| + K·ET < ∞;
(h) |ES_{T∧n} − ES_T| ≤ E|S_{T∧n} − S_T| I(T > n) ≤ 2EW I(T > n);
(i) Σ_{k=0}^∞ EW I(T = k) = EW < ∞;
(j) EW I(T > n) = Σ_{k>n} EW I(T = k) → 0 as n → ∞;
(k) ES_{T∧n} = ES_0;
(l) it follows from (h)-(k) that ES_T = ES_0.

4.4 Wald equation

Theorem 13.11 (Wald equation). Let X_1, X_2, ... be i.i.d. random variables such that E|X_1| < ∞, EX_1 = μ. Let also Y_n = Σ_{k=1}^n X_k, n = 1, 2, .... Then the centered sequence S_n = Y_n − μn, n = 1, 2, ..., S_0 = 0 is a martingale with respect to the filtration F generated by the random variables X_1, X_2, .... Let also T be a stopping time with respect to the filtration F such that ET < ∞. Then

EY_T = μET.
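The Wald equation can be checked by simulation. The choices below are illustrative only: X_k uniform on {1, 2, 3}, so μ = 2, and T = min(n ≥ 1 : Y_n ≥ 10); the sample averages of Y_T and μT should nearly coincide.

```python
import random

random.seed(5)

mu, level = 2.0, 10

def one_path():
    """Return (Y_T, T) for T = min(n >= 1 : Y_n >= level)."""
    y, n = 0, 0
    while y < level:
        y += random.choice((1, 2, 3))
        n += 1
    return y, n

n_paths = 50_000
tot_y = tot_t = 0
for _ in range(n_paths):
    y, t = one_path()
    tot_y += y
    tot_t += t

mean_Y_T = tot_y / n_paths
mean_T = tot_t / n_paths
print(mean_Y_T, mu * mean_T)  # E Y_T = mu * E T by the Wald equation
```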
(a) E(|S_{n+1} − S_n| | F_n) = E|X_{n+1} − μ| ≤ E|X_1| + |μ| < ∞, n = 0, 1, ...;
(b) S_T = Y_T − μT;
(c) Theorem 13.10 implies that ES_T = ES_0 = 0;
(d) EY_T − μET = ES_T = 0.

4.5 A fair game example

A gambler flips a fair coin and wins his bet if it comes up heads and loses his bet if it comes up tails. The so-called martingale betting strategy can be viewed as a way to win in a fair game: the gambler keeps doubling the bet until he eventually wins.

(1) Let X_n, n = 1, 2, ... be independent random variables taking values 2^{n−1} and −2^{n−1} with equal probabilities. These random variables represent the outcomes of the sequential flips of the coin, so that S_n is the gain of the gambler after n flips. Since EX_n = 0, n = 1, 2, ..., the sequence S_n = X_1 + ⋯ + X_n, n = 1, 2, ..., S_0 = 0 is a martingale with respect to the natural filtration generated by the random variables X_1, X_2, ....

(2) Let T be the number of the flip at which the gambler wins for the first time. By definition, P(T = n) = (1/2)^n, n = 1, 2, .... Then, according to the definition of T and the description of the martingale betting strategy,

S_T = 2^{T−1} − (2^{T−1} − 1) = 1.

(3) This seems to contradict the optional stopping theorem, since T is a stopping time with ET < ∞, and so the equality ES_T = ES_0 = 0 may be expected, while according to (2), ES_T = 1.

(4) The explanation of this phenomenon is that the conditions of the optional stopping theorems do not hold.

(a) Theorem 13.8 cannot be applied, since T is not a bounded random variable;
(b) Theorem 13.9 cannot be applied, since E(S_n I(T > n)) = −(2^n − 1) · (1/2)^n → −1 ≠ 0 as n → ∞;
(c) Theorem 13.10 cannot be applied, since E(|S_{n+1} − S_n| | F_n) = E|X_{n+1}| = 2^n → ∞ as n → ∞.

LN Problems

1. Please characterize the sequences of random variables S = {S_0, S_1, ...} which are martingales with respect to the filtration F = {F_0, F_1, ...} for the case where F_0 = F_1 = F_2 = ⋯, in particular, if F_0 = {∅, Ω}.

2. Let X_1, X_2, ... be i.i.d. random variables, EX_1 = 0, VarX_1 = σ^2 < ∞. Let also S_0 = 0 and S_n = (Σ_{k=1}^n X_k)^2 − nσ^2. Prove that the sequence S = {S_0, S_1, ...} is a martingale with respect to the filtration F = {F_0, F_1, ...}, where F_n = σ(X_1, ..., X_n), n = 0, 1, ..., F_0 = {∅, Ω}.
3. Suppose that a sequence S = {S_0, S_1, ...} is a martingale and also a predictable sequence with respect to a filtration F = {F_0, F_1, ...}. Show that in this case P(S_n = S_0) = 1, n = 0, 1, ....

4. Suppose that a sequence S = {S_0, S_1, ...} is a martingale with respect to a filtration F = {F_0, F_1, ...}. Show that the sequence S′ = {S′_0 = S_0^2, S′_1 = S_1^2, ...} is a submartingale with respect to the filtration F.

5. Suppose that sequences S′ = {S′_0, S′_1, ...} and S″ = {S″_0, S″_1, ...} are submartingales with respect to a filtration F = {F_0, F_1, ...}. Show that the sequence S = {S_0 = max(S′_0, S″_0), S_1 = max(S′_1, S″_1), ...} is also a submartingale with respect to the filtration F.

6. Let F = {F_0, F_1, ...} be a filtration and let F_∞ be the minimal σ-algebra which contains all the σ-algebras F_n, n = 0, 1, .... Let also S be a random variable such that P(S ≥ 0) = 1 and ES < ∞. Define S_n = E(S | F_n), n = 0, 1, .... Prove that there exists a random variable S_∞ such that S_n → S_∞ a.s. as n → ∞.

7. Let F = {F_0, F_1, ...} be a filtration and let F_∞ be the minimal σ-algebra which contains all the σ-algebras F_n, n = 0, 1, .... Let also S be a random variable such that E|S| < ∞. Prove that E(S | F_n) → E(S | F_∞) a.s. as n → ∞.

8. Let X_1, X_2, ... be a sequence of independent random variables such that EX_n^2 = b_n < ∞, EX_n = a_n ≠ 0, n = 1, 2, .... Define the random variables S_n = ∏_{k=1}^n X_k/a_k, n = 1, 2, ..., S_0 = 1. Prove that the sequence S = {S_0, S_1, ...} is a martingale with respect to the filtration generated by the random variables X_1, X_2, ..., and prove that the condition Σ_{k=1}^∞ b_k/m_k^2 < ∞ implies that there exists a random variable S_∞ such that S_n → S_∞ a.s. as n → ∞.

9. Let X_1, X_2, ... be a sequence of independent random variables such that EX_n^2 = b_n < ∞, EX_n = 0, n = 1, 2, .... Define S_n = Σ_{k=1}^n X_k, n = 1, 2, ..., S_0 = 0. Prove that the sequence S = {S_0, S_1, ...} is a martingale with respect to the filtration generated by the random variables X_1, X_2, ..., and that the condition Σ_{k=1}^∞ b_k < ∞ implies that there exists a random variable S_∞ such that S_n → S_∞ a.s. as n → ∞.

10. Please reformulate Theorem 13.10 for the case where the random variables S_n = X_1 + ⋯ + X_n, n = 1, 2, ... are sums of independent random variables and F is the natural filtration generated by the random variables X_1, X_2, ....

11. Let X_1, X_2, ... be nonnegative i.i.d. random variables such that EX_1 = μ > 0. Let also Y_n = Σ_{k=1}^n X_k, n = 1, 2, ..., and T_u = min(n ≥ 1 : Y_n ≥ u), u > 0. Prove that T_u is a stopping time such that ET_u < ∞, u > 0.

12. Let X_1, X_2, ... be nonnegative i.i.d. random variables such that EX_1 = μ > 0. Let also Y_n = Σ_{k=1}^n X_k, n = 1, 2, ..., and T_u = min(n ≥ 1 : Y_n ≥ u), u > 0. Prove that EY_{T_u} = μET_u, u > 0.
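Returning to the fair-game example of Section 4.5: the doubling strategy can be simulated directly. Every play ends with a net gain of exactly 1, while the stake on the winning flip, 2^{T−1}, is unbounded over repeated plays, which is why the optional stopping theorems do not apply.

```python
import random

random.seed(6)

def play():
    """Double the bet until the first head; return (net gain, winning stake)."""
    stake, losses = 1, 0
    while True:
        if random.random() < 0.5:   # heads: win the current stake
            return stake - losses, stake
        losses += stake             # tails: lose the stake ...
        stake *= 2                  # ... and double it for the next flip

results = [play() for _ in range(10_000)]
gains = {g for g, _ in results}
max_stake = max(s for _, s in results)
print(gains, max_stake)  # the set of gains is always {1}
```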
The Real Numbers Here we show one way to explicitly construct the real numbers R. First we need a definition. Definitions/Notation: A sequence of rational numbers is a funtion f : N Q. Rather than write
More informationLectures on Stochastic Processes. William G. Faris
Lectures on Stochastic Processes William G. Faris November 8, 2001 2 Contents 1 Random walk 7 1.1 Symmetric simple random walk................... 7 1.2 Simple random walk......................... 9 1.3
More informationChapter 4 Lecture Notes
Chapter 4 Lecture Notes Random Variables October 27, 2015 1 Section 4.1 Random Variables A random variable is typically a realvalued function defined on the sample space of some experiment. For instance,
More informationChapter 3: DISCRETE RANDOM VARIABLES AND PROBABILITY DISTRIBUTIONS
Chapter 3: DISCRETE RANDOM VARIABLES AND PROBABILITY DISTRIBUTIONS Part 4: Geometric Distribution Negative Binomial Distribution Hypergeometric Distribution Sections 37, 38 The remaining discrete random
More informationTopic 2: Scalar random variables. Definition of random variables
Topic 2: Scalar random variables Discrete and continuous random variables Probability distribution and densities (cdf, pmf, pdf) Important random variables Expectation, mean, variance, moments Markov and
More informationMartingale Ideas in Elementary Probability. Lecture course Higher Mathematics College, Independent University of Moscow Spring 1996
Martingale Ideas in Elementary Probability Lecture course Higher Mathematics College, Independent University of Moscow Spring 1996 William Faris University of Arizona Fulbright Lecturer, IUM, 1995 1996
More informationSF2940: Probability theory Lecture 8: Multivariate Normal Distribution
SF2940: Probability theory Lecture 8: Multivariate Normal Distribution Timo Koski 24.09.2015 Timo Koski Matematisk statistik 24.09.2015 1 / 1 Learning outcomes Random vectors, mean vector, covariance matrix,
More information6.042/18.062J Mathematics for Computer Science December 12, 2006 Tom Leighton and Ronitt Rubinfeld. Random Walks
6.042/8.062J Mathematics for Comuter Science December 2, 2006 Tom Leighton and Ronitt Rubinfeld Lecture Notes Random Walks Gambler s Ruin Today we re going to talk about onedimensional random walks. In
More information10.2 Series and Convergence
10.2 Series and Convergence Write sums using sigma notation Find the partial sums of series and determine convergence or divergence of infinite series Find the N th partial sums of geometric series and
More informationConditional expectation
A Conditional expectation A.1 Review of conditional densities, expectations We start with the continuous case. This is sections 6.6 and 6.8 in the book. Let X, Y be continuous random variables. We defined
More informationGambling Systems and MultiplicationInvariant Measures
Gambling Systems and MultiplicationInvariant Measures by Jeffrey S. Rosenthal* and Peter O. Schwartz** (May 28, 997.. Introduction. This short paper describes a surprising connection between two previously
More information61. REARRANGEMENTS 119
61. REARRANGEMENTS 119 61. Rearrangements Here the difference between conditionally and absolutely convergent series is further refined through the concept of rearrangement. Definition 15. (Rearrangement)
More informationReflection principle and Ocone martingales
Stochastic Processes and their Applications 119 29 3816 3833 www.elsevier.com/locate/spa Reflection principle and Ocone martingales L. Chaumont, L. Vostrikova LAREMA, Département de Mathématiques, Université
More informationPractice problems for Homework 11  Point Estimation
Practice problems for Homework 11  Point Estimation 1. (10 marks) Suppose we want to select a random sample of size 5 from the current CS 3341 students. Which of the following strategies is the best:
More informationLecture 4: Random Variables
Lecture 4: Random Variables 1. Definition of Random variables 1.1 Measurable functions and random variables 1.2 Reduction of the measurability condition 1.3 Transformation of random variables 1.4 σalgebra
More informationExpected Value 10/11/2005
Expected Value 10/11/2005 Definition Let X be a numericallyvalued discrete random variable with sample space Ω and distribution function m(x). The expected value E(X) is defined by E(X) = x Ω xm(x), provided
More informationFinite Markov Chains and Algorithmic Applications. Matematisk statistik, Chalmers tekniska högskola och Göteborgs universitet
Finite Markov Chains and Algorithmic Applications Olle Häggström Matematisk statistik, Chalmers tekniska högskola och Göteborgs universitet PUBLISHED BY THE PRESS SYNDICATE OF THE UNIVERSITY OF CAMBRIDGE
More informationIndependence of Four Projective Criteria for the Weak Invariance Principle
Alea 5, 21 27 (2009) Independence of Four Projective Criteria for the Weak Invariance Principle Olivier Durieu Laboratoire de Mathematiques Raphaël Salem, UMR 6085 CNRSUniversité de Rouen URL: http://www.univrouen.fr/lmrs/persopage/durieu/
More informationWHERE DOES THE 10% CONDITION COME FROM?
1 WHERE DOES THE 10% CONDITION COME FROM? The text has mentioned The 10% Condition (at least) twice so far: p. 407 Bernoulli trials must be independent. If that assumption is violated, it is still okay
More informationCHAPTER II THE LIMIT OF A SEQUENCE OF NUMBERS DEFINITION OF THE NUMBER e.
CHAPTER II THE LIMIT OF A SEQUENCE OF NUMBERS DEFINITION OF THE NUMBER e. This chapter contains the beginnings of the most important, and probably the most subtle, notion in mathematical analysis, i.e.,
More informationn k=1 k=0 1/k! = e. Example 6.4. The series 1/k 2 converges in R. Indeed, if s n = n then k=1 1/k, then s 2n s n = 1 n + 1 +...
6 Series We call a normed space (X, ) a Banach space provided that every Cauchy sequence (x n ) in X converges. For example, R with the norm = is an example of Banach space. Now let (x n ) be a sequence
More informationMATH4427 Notebook 2 Spring 2016. 2 MATH4427 Notebook 2 3. 2.1 Definitions and Examples... 3. 2.2 Performance Measures for Estimators...
MATH4427 Notebook 2 Spring 2016 prepared by Professor Jenny Baglivo c Copyright 20092016 by Jenny A. Baglivo. All Rights Reserved. Contents 2 MATH4427 Notebook 2 3 2.1 Definitions and Examples...................................
More informationTaylor and Maclaurin Series
Taylor and Maclaurin Series In the preceding section we were able to find power series representations for a certain restricted class of functions. Here we investigate more general problems: Which functions
More informationMATH 201. Final ANSWERS August 12, 2016
MATH 01 Final ANSWERS August 1, 016 Part A 1. 17 points) A bag contains three different types of dice: four 6sided dice, five 8sided dice, and six 0sided dice. A die is drawn from the bag and then rolled.
More informationx if x 0, x if x < 0.
Chapter 3 Sequences In this chapter, we discuss sequences. We say what it means for a sequence to converge, and define the limit of a convergent sequence. We begin with some preliminary results about the
More informationLecture 8: Random Walk vs. Brownian Motion, Binomial Model vs. LogNormal Distribution
Lecture 8: Random Walk vs. Brownian Motion, Binomial Model vs. Logormal Distribution October 4, 200 Limiting Distribution of the Scaled Random Walk Recall that we defined a scaled simple random walk last
More informationHOMEWORK 5 SOLUTIONS. n!f n (1) lim. ln x n! + xn x. 1 = G n 1 (x). (2) k + 1 n. (n 1)!
Math 7 Fall 205 HOMEWORK 5 SOLUTIONS Problem. 2008 B2 Let F 0 x = ln x. For n 0 and x > 0, let F n+ x = 0 F ntdt. Evaluate n!f n lim n ln n. By directly computing F n x for small n s, we obtain the following
More information3.1. Sequences and Their Limits Definition (3.1.1). A sequence of real numbers (or a sequence in R) is a function from N into R.
CHAPTER 3 Sequences and Series 3.. Sequences and Their Limits Definition (3..). A sequence of real numbers (or a sequence in R) is a function from N into R. Notation. () The values of X : N! R are denoted
More informationProbability Theory. Florian Herzog. A random variable is neither random nor variable. GianCarlo Rota, M.I.T..
Probability Theory A random variable is neither random nor variable. GianCarlo Rota, M.I.T.. Florian Herzog 2013 Probability space Probability space A probability space W is a unique triple W = {Ω, F,
More informationRANDOM INTERVAL HOMEOMORPHISMS. MICHA L MISIUREWICZ Indiana University Purdue University Indianapolis
RANDOM INTERVAL HOMEOMORPHISMS MICHA L MISIUREWICZ Indiana University Purdue University Indianapolis This is a joint work with Lluís Alsedà Motivation: A talk by Yulij Ilyashenko. Two interval maps, applied
More informationUndergraduate Notes in Mathematics. Arkansas Tech University Department of Mathematics
Undergraduate Notes in Mathematics Arkansas Tech University Department of Mathematics An Introductory Single Variable Real Analysis: A Learning Approach through Problem Solving Marcel B. Finan c All Rights
More informationFor a partition B 1,..., B n, where B i B j = for i. A = (A B 1 ) (A B 2 ),..., (A B n ) and thus. P (A) = P (A B i ) = P (A B i )P (B i )
Probability Review 15.075 Cynthia Rudin A probability space, defined by Kolmogorov (19031987) consists of: A set of outcomes S, e.g., for the roll of a die, S = {1, 2, 3, 4, 5, 6}, 1 1 2 1 6 for the roll
More informationWorked examples Random Processes
Worked examples Random Processes Example 1 Consider patients coming to a doctor s office at random points in time. Let X n denote the time (in hrs) that the n th patient has to wait before being admitted
More informationWhat is Statistics? Lecture 1. Introduction and probability review. Idea of parametric inference
0. 1. Introduction and probability review 1.1. What is Statistics? What is Statistics? Lecture 1. Introduction and probability review There are many definitions: I will use A set of principle and procedures
More informationJANUARY TERM 2012 FUN and GAMES WITH DISCRETE MATHEMATICS Module #9 (Geometric Series and Probability)
JANUARY TERM 2012 FUN and GAMES WITH DISCRETE MATHEMATICS Module #9 (Geometric Series and Probability) Author: Daniel Cooney Reviewer: Rachel Zax Last modified: January 4, 2012 Reading from Meyer, Mathematics
More informationA SURVEY ON CONTINUOUS ELLIPTICAL VECTOR DISTRIBUTIONS
A SURVEY ON CONTINUOUS ELLIPTICAL VECTOR DISTRIBUTIONS Eusebio GÓMEZ, Miguel A. GÓMEZVILLEGAS and J. Miguel MARÍN Abstract In this paper it is taken up a revision and characterization of the class of
More informationGenerating Functions
Chapter 10 Generating Functions 10.1 Generating Functions for Discrete Distributions So far we have considered in detail only the two most important attributes of a random variable, namely, the mean and
More informationStatistics 100A Homework 8 Solutions
Part : Chapter 7 Statistics A Homework 8 Solutions Ryan Rosario. A player throws a fair die and simultaneously flips a fair coin. If the coin lands heads, then she wins twice, and if tails, the onehalf
More informationGood luck! Problem 1. (b) The random variables X 1 and X 2 are independent and N(0, 1)distributed. Show that the random variables X 1
Avd. Matematisk statistik TETAME I SF2940 SAOLIKHETSTEORI/EXAM I SF2940 PROBABILITY THE ORY, WEDESDAY OCTOBER 26, 2016, 08.0013.00. Examinator : Boualem Djehiche, tel. 087907875, email: boualem@kth.se
More informationBANACH AND HILBERT SPACE REVIEW
BANACH AND HILBET SPACE EVIEW CHISTOPHE HEIL These notes will briefly review some basic concepts related to the theory of Banach and Hilbert spaces. We are not trying to give a complete development, but
More informationChapter 3: Discrete Random Variable and Probability Distribution. January 28, 2014
STAT511 Spring 2014 Lecture Notes 1 Chapter 3: Discrete Random Variable and Probability Distribution January 28, 2014 3 Discrete Random Variables Chapter Overview Random Variable (r.v. Definition Discrete
More informationProbability Generating Functions
page 39 Chapter 3 Probability Generating Functions 3 Preamble: Generating Functions Generating functions are widely used in mathematics, and play an important role in probability theory Consider a sequence
More informationTiers, Preference Similarity, and the Limits on Stable Partners
Tiers, Preference Similarity, and the Limits on Stable Partners KANDORI, Michihiro, KOJIMA, Fuhito, and YASUDA, Yosuke February 7, 2010 Preliminary and incomplete. Do not circulate. Abstract We consider
More informationP (x) 0. Discrete random variables Expected value. The expected value, mean or average of a random variable x is: xp (x) = v i P (v i )
Discrete random variables Probability mass function Given a discrete random variable X taking values in X = {v 1,..., v m }, its probability mass function P : X [0, 1] is defined as: P (v i ) = Pr[X =
More informationProbability II (MATH 2647)
Probability II (MATH 2647) Lecturer Dr. O. Hryniv email Ostap.Hryniv@durham.ac.uk office CM309 http://maths.dur.ac.uk/stats/courses/probmc2h/probability2h.html or via DUO This term we shall consider: Review
More information2 1 2 s 3 = = s 4 = =
Infinite Series  EPM c CNMiKnO PG  The word series as used in mathematics is different from the way it is used in everyday speech. Webster s Third New International Dictionary includes the following
More informationA Uniform Asymptotic Estimate for Discounted Aggregate Claims with Subexponential Tails
12th International Congress on Insurance: Mathematics and Economics July 1618, 2008 A Uniform Asymptotic Estimate for Discounted Aggregate Claims with Subexponential Tails XUEMIAO HAO (Based on a joint
More informationLecture 6: Discrete & Continuous Probability and Random Variables
Lecture 6: Discrete & Continuous Probability and Random Variables D. Alex Hughes Math Camp September 17, 2015 D. Alex Hughes (Math Camp) Lecture 6: Discrete & Continuous Probability and Random September
More informationStochastic Processes and Advanced Mathematical Finance. Laws of Large Numbers
Steven R. Dunbar Department of Mathematics 203 Avery Hall University of NebraskaLincoln Lincoln, NE 685880130 http://www.math.unl.edu Voice: 4024723731 Fax: 4024728466 Stochastic Processes and Advanced
More informationMath 151. Rumbos Spring 2014 1. Solutions to Assignment #22
Math 151. Rumbos Spring 2014 1 Solutions to Assignment #22 1. An experiment consists of rolling a die 81 times and computing the average of the numbers on the top face of the die. Estimate the probability
More informationMath 425 (Fall 08) Solutions Midterm 2 November 6, 2008
Math 425 (Fall 8) Solutions Midterm 2 November 6, 28 (5 pts) Compute E[X] and Var[X] for i) X a random variable that takes the values, 2, 3 with probabilities.2,.5,.3; ii) X a random variable with the
More informationIntroduction to Probability
Introduction to Probability EE 179, Lecture 15, Handout #24 Probability theory gives a mathematical characterization for experiments with random outcomes. coin toss life of lightbulb binary data sequence
More informationNotes for STA 437/1005 Methods for Multivariate Data
Notes for STA 437/1005 Methods for Multivariate Data Radford M. Neal, 26 November 2010 Random Vectors Notation: Let X be a random vector with p elements, so that X = [X 1,..., X p ], where denotes transpose.
More informationUsing pivots to construct confidence intervals. In Example 41 we used the fact that
Using pivots to construct confidence intervals In Example 41 we used the fact that Q( X, µ) = X µ σ/ n N(0, 1) for all µ. We then said Q( X, µ) z α/2 with probability 1 α, and converted this into a statement
More informationCONDITIONAL EXPECTATION AND MARTINGALES
CONDITIONAL EXPECTATION AND MARTINGALES 1. INTRODUCTION Martingales play a role in stochastic processes roughly similar to that played by conserved quantities in dynamical systems. Unlike a conserved quantity
More informationSeries Convergence Tests Math 122 Calculus III D Joyce, Fall 2012
Some series converge, some diverge. Series Convergence Tests Math 22 Calculus III D Joyce, Fall 202 Geometric series. We ve already looked at these. We know when a geometric series converges and what it
More informationDepartment of Mathematics, Indian Institute of Technology, Kharagpur Assignment 23, Probability and Statistics, March 2015. Due:March 25, 2015.
Department of Mathematics, Indian Institute of Technology, Kharagpur Assignment 3, Probability and Statistics, March 05. Due:March 5, 05.. Show that the function 0 for x < x+ F (x) = 4 for x < for x
More informationProbability and Statistics
CHAPTER 2: RANDOM VARIABLES AND ASSOCIATED FUNCTIONS 2b  0 Probability and Statistics Kristel Van Steen, PhD 2 Montefiore Institute  Systems and Modeling GIGA  Bioinformatics ULg kristel.vansteen@ulg.ac.be
More informationANALYTICAL MATHEMATICS FOR APPLICATIONS 2016 LECTURE NOTES Series
ANALYTICAL MATHEMATICS FOR APPLICATIONS 206 LECTURE NOTES 8 ISSUED 24 APRIL 206 A series is a formal sum. Series a + a 2 + a 3 + + + where { } is a sequence of real numbers. Here formal means that we don
More informationAn Introduction to Basic Statistics and Probability
An Introduction to Basic Statistics and Probability Shenek Heyward NCSU An Introduction to Basic Statistics and Probability p. 1/4 Outline Basic probability concepts Conditional probability Discrete Random
More informationMath/Stats 342: Solutions to Homework
Math/Stats 342: Solutions to Homework Steven Miller (sjm1@williams.edu) November 17, 2011 Abstract Below are solutions / sketches of solutions to the homework problems from Math/Stats 342: Probability
More informationA Martingale System Theorem for Stock Investments
A Martingale System Theorem for Stock Investments Robert J. Vanderbei April 26, 1999 DIMACS New Market Models Workshop 1 Beginning Middle End Controversial Remarks Outline DIMACS New Market Models Workshop
More informationSolution Using the geometric series a/(1 r) = x=1. x=1. Problem For each of the following distributions, compute
Math 472 Homework Assignment 1 Problem 1.9.2. Let p(x) 1/2 x, x 1, 2, 3,..., zero elsewhere, be the pmf of the random variable X. Find the mgf, the mean, and the variance of X. Solution 1.9.2. Using the
More informationREAL ANALYSIS LECTURE NOTES: 1.4 OUTER MEASURE
REAL ANALYSIS LECTURE NOTES: 1.4 OUTER MEASURE CHRISTOPHER HEIL 1.4.1 Introduction We will expand on Section 1.4 of Folland s text, which covers abstract outer measures also called exterior measures).
More informationManual for SOA Exam MLC.
Chapter 5 Life annuities Extract from: Arcones Manual for the SOA Exam MLC Fall 2009 Edition available at http://wwwactexmadrivercom/ 1/70 Due n year deferred annuity Definition 1 A due n year deferred
More informationSF2940: Probability theory Lecture 8: Multivariate Normal Distribution
SF2940: Probability theory Lecture 8: Multivariate Normal Distribution Timo Koski 24.09.2014 Timo Koski () Mathematisk statistik 24.09.2014 1 / 75 Learning outcomes Random vectors, mean vector, covariance
More informationGROUPS SUBGROUPS. Definition 1: An operation on a set G is a function : G G G.
Definition 1: GROUPS An operation on a set G is a function : G G G. Definition 2: A group is a set G which is equipped with an operation and a special element e G, called the identity, such that (i) the
More informationTAKEAWAY GAMES. ALLEN J. SCHWENK California Institute of Technology, Pasadena, California INTRODUCTION
TAKEAWAY GAMES ALLEN J. SCHWENK California Institute of Technology, Pasadena, California L INTRODUCTION Several games of Tf takeaway?f have become popular. The purpose of this paper is to determine the
More informationDiscrete Probability Distributions
Chapter Discrete Probability Distributions. Simulation of Discrete Probabilities. As n increases, the proportion of heads gets closer to /2, but the difference between the number of heads and half the
More informationINTRODUCTION TO THE CONVERGENCE OF SEQUENCES
INTRODUCTION TO THE CONVERGENCE OF SEQUENCES BECKY LYTLE Abstract. In this paper, we discuss the basic ideas involved in sequences and convergence. We start by defining sequences and follow by explaining
More informationSection 5.1 Continuous Random Variables: Introduction
Section 5. Continuous Random Variables: Introduction Not all random variables are discrete. For example:. Waiting times for anything (train, arrival of customer, production of mrna molecule from gene,
More informationNotes on the second moment method, Erdős multiplication tables
Notes on the second moment method, Erdős multiplication tables January 25, 20 Erdős multiplication table theorem Suppose we form the N N multiplication table, containing all the N 2 products ab, where
More informationThe Limit of a Sequence of Numbers: Infinite Series
Connexions module: m36135 1 The Limit of a Sequence of Numbers: Infinite Series Lawrence Baggett This work is produced by The Connexions Project and licensed under the Creative Commons Attribution License
More informationMath 55: Discrete Mathematics
Math 55: Discrete Mathematics UC Berkeley, Fall 2011 Homework # 5, due Wednesday, February 22 5.1.4 Let P (n) be the statement that 1 3 + 2 3 + + n 3 = (n(n + 1)/2) 2 for the positive integer n. a) What
More informationSequences of Functions
Sequences of Functions Uniform convergence 9. Assume that f n f uniformly on S and that each f n is bounded on S. Prove that {f n } is uniformly bounded on S. Proof: Since f n f uniformly on S, then given
More informationLECTURE 15: AMERICAN OPTIONS
LECTURE 15: AMERICAN OPTIONS 1. Introduction All of the options that we have considered thus far have been of the European variety: exercise is permitted only at the termination of the contract. These
More informationIEOR 6711: Stochastic Models, I Fall 2012, Professor Whitt, Final Exam SOLUTIONS
IEOR 6711: Stochastic Models, I Fall 2012, Professor Whitt, Final Exam SOLUTIONS There are four questions, each with several parts. 1. Customers Coming to an Automatic Teller Machine (ATM) (30 points)
More informationThe Exponential Distribution
21 The Exponential Distribution From DiscreteTime to ContinuousTime: In Chapter 6 of the text we will be considering Markov processes in continuous time. In a sense, we already have a very good understanding
More informationAdvanced Probability
Advanced Probability Perla Sousi October 13, 2013 Contents 1 Conditional expectation 3 1.1 Discrete case................................... 4 1.2 Existence and uniqueness............................. 5
More informationIntroduction to Series and Sequences Math 121 Calculus II D Joyce, Spring 2013
Introduction to Series and Sequences Math Calculus II D Joyce, Spring 03 The goal. The main purpose of our study of series and sequences is to understand power series. A power series is like a polynomial
More informationProbabilities and Random Variables
Probabilities and Random Variables This is an elementary overview of the basic concepts of probability theory. 1 The Probability Space The purpose of probability theory is to model random experiments so
More informationMila Stojaković 1. Finite pure exchange economy, Equilibrium,
Novi Sad J. Math. Vol. 35, No. 1, 2005, 103112 FUZZY RANDOM VARIABLE IN MATHEMATICAL ECONOMICS Mila Stojaković 1 Abstract. This paper deals with a new concept introducing notion of fuzzy set to one mathematical
More information