THE CENTRAL LIMIT THEOREM

DANIEL RÜDT
UNIVERSITY OF TORONTO

MARCH 2010

Contents

1 Introduction
2 Mathematical Background
3 The Central Limit Theorem
4 Examples
   4.1 Roulette
   4.2 Cauchy Distribution
5 Historical Background
6 Proof
   6.1 Outline of the Proof
   6.2 Lévy Continuity Theorem
   6.3 Lemmas
   6.4 Proof of the Central Limit Theorem
7 References

1 Introduction

The Central Limit Theorem (CLT) is one of the most remarkable results in probability theory because it is not only very easy to phrase, but also has very useful applications. The CLT can tell us about the distribution of large sums of random variables even if the distribution of the individual random variables is almost unknown. With this result we are able to approximate how likely it is that the arithmetic mean deviates from its expected value. I give such an example in Chapter 4. The CLT also provides answers to many statistical problems: using the CLT we can verify hypotheses by making statistical decisions, because we are able to determine the asymptotic distribution of certain test statistics.

As a warm-up, we attempt to understand what happens to the distribution of random variables if we sum them. Suppose X and Y are continuous and independent random variables with densities f_X and f_Y. If we define 1_A(x) to be the indicator function of a set A, then we recall that for independent random variables

P(X \in A, Y \in B) = \int_R \int_R 1_A(x) 1_B(y) f_X(x) f_Y(y) \, dx \, dy.

Now we can see that the density of X + Y is given by the convolution of f_X and f_Y:

P(X + Y \le z) = \int_R \int_R 1_{\{x + y \le z\}}(x, y) f_X(x) f_Y(y) \, dx \, dy
               = \int_R \Big[ \int_{-\infty}^{z - y} f_X(x) \, dx \Big] f_Y(y) \, dy
               = \int_R \Big[ \int_{-\infty}^{z} f_X(x - y) \, dx \Big] f_Y(y) \, dy
    (Fubini)   = \int_{-\infty}^{z} \Big[ \int_R f_X(x - y) f_Y(y) \, dy \Big] dx
               = \int_{-\infty}^{z} (f_X * f_Y)(x) \, dx.

In order to visualize this result I did some calculations: I determined the density of the sum of independent, uniformly distributed random variables. The following pictures show the density of \sum_{i=1}^{n} X_i for X_i \sim U[-0.5, 0.5].

[Figure: densities of \sum_{i=1}^{n} X_i for n = 1, n = 2, n = 3 and n = 10.]

If we compare these graphs to the density of a standard normally distributed random variable, we can see remarkable similarities even for small n.

[Figure: density of a standard normally distributed random variable.]

This result leads us to suspect that sums of random variables somehow behave normally. The CLT makes this observation precise.
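The convolution computation behind these pictures is easy to reproduce numerically. The following sketch (an illustration added for this write-up, not code from the original paper) convolves the U[-0.5, 0.5] density with itself on a grid and compares the value of the resulting density at its peak with the N(0, n/12) density at 0; the two values get closer as n grows, in line with the figures above.

import numpy as np

# Discretize the density of U[-0.5, 0.5] on a fine grid.
dx = 0.001
grid = np.arange(-0.5, 0.5 + dx, dx)
f_uniform = np.ones_like(grid)               # the density equals 1 on [-0.5, 0.5]

def density_of_sum(n):
    """Density of X_1 + ... + X_n via repeated numerical convolution."""
    f = f_uniform.copy()
    for _ in range(n - 1):
        f = np.convolve(f, f_uniform) * dx   # discrete approximation of the convolution integral
    return f

for n in [1, 2, 3, 10]:
    peak = density_of_sum(n).max()           # the density of the sum is maximal at 0
    sigma2 = n / 12.0                        # Var(X_i) = 1/12 for U[-0.5, 0.5]
    normal_peak = 1.0 / np.sqrt(2 * np.pi * sigma2)
    print(f"n = {n:2d}: peak of convolved density = {peak:.3f}, "
          f"N(0, n/12) density at 0 = {normal_peak:.3f}")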

2 Mathematical Background

In this section I just want to recall some of the basic definitions and theorems used in this paper. Elementary definitions of probability theory are assumed to be well known. To keep things simple, we only consider the sample space \Omega = R.

Definition. A random variable X is called normally distributed with parameters \mu and \sigma (X \sim N(\mu, \sigma)) if the density of the random variable is given by

\phi(x) = \frac{1}{\sqrt{2\pi\sigma^2}} \, e^{-\frac{(x - \mu)^2}{2\sigma^2}}.

Definition. If a random variable X with probability measure P_X has density f, then we define the distribution function F_X as follows:

F_X(x) = P(X \le x) = P_X((-\infty, x]) = \int_{-\infty}^{x} dP_X(t) = \int_{-\infty}^{x} f(t) \, dt.

Definition. A sequence of random variables X_1, X_2, ... converges in distribution to a random variable X if

\lim_{n \to \infty} F_{X_n}(x) = F_X(x)

for every x \in R at which F_X is continuous. We write this as X_n \to_d X.

The following theorem is also used to define convergence in distribution and will come in handy in proving the Central Limit Theorem:

Theorem 2.1. Suppose X_1, X_2, ... is a sequence of random variables. Then X_n \to_d X if and only if

\lim_{n \to \infty} E[f(X_n)] = E[f(X)]

for every bounded and continuous function f.

Definition. The characteristic function of a random variable X is defined by

\varphi(t) = E[e^{itX}] = \int_R e^{itx} \, dP_X(x).
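As a concrete illustration of this definition (an added sketch, not from the paper; the choice of U[-0.5, 0.5] is just an example), the characteristic function can be estimated by a Monte Carlo average of e^{itX}. For X \sim U[-0.5, 0.5] the closed form is \varphi(t) = \sin(t/2)/(t/2), and the estimate should match it up to simulation error.

import numpy as np

rng = np.random.default_rng(0)
samples = rng.uniform(-0.5, 0.5, size=200_000)        # draws of X ~ U[-0.5, 0.5]

for t in [0.5, 1.0, 2.0, 5.0]:
    estimate = np.mean(np.exp(1j * t * samples))      # Monte Carlo estimate of E[e^{itX}]
    exact = np.sin(t / 2) / (t / 2)                   # closed-form characteristic function
    print(f"t = {t}: estimate = {estimate.real:+.4f} {estimate.imag:+.4f}i, exact = {exact:+.4f}")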

Proposition. Every characteristic function \varphi is continuous, \varphi(0) = 1 and |\varphi(t)| \le 1.

Theorem 2.2. If X, Y are random variables and \varphi_X(t) = \varphi_Y(t) for all t \in R, then

X =_d Y,

i.e. X and Y have the same distribution.

Theorem 2.3. Suppose X and Y are independent random variables. Then

\varphi_{X+Y}(t) = \varphi_X(t) \cdot \varphi_Y(t) \quad \text{for all } t \in R.

3 The Central Limit Theorem

The Central Limit Theorem. If {X_n} is a sequence of independent and identically distributed random variables, each having finite expectation \mu and finite positive variance \sigma^2, then

\frac{X_1 + X_2 + \cdots + X_n - n\mu}{\sigma \sqrt{n}} \to_d N(0, 1),

i.e. a centered and normalized sum of independent and identically distributed (i.i.d.) random variables converges in distribution to a standard normal random variable as n goes to infinity.
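Before turning to the examples, the statement can be checked empirically. The following sketch (added for illustration; the choice of Exp(1) summands is arbitrary) simulates the centered and normalized sums for increasing n and compares P(Z_n \le 1) with the corresponding probability under N(0, 1).

import math
import numpy as np

rng = np.random.default_rng(1)

def standardized_sums(n, reps=50_000):
    """Simulate (X_1 + ... + X_n - n*mu) / (sigma * sqrt(n)) for Exp(1) summands."""
    mu, sigma = 1.0, 1.0                                  # mean and standard deviation of Exp(1)
    sums = rng.exponential(scale=1.0, size=(reps, n)).sum(axis=1)
    return (sums - n * mu) / (sigma * math.sqrt(n))

def normal_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2)))

for n in [1, 5, 30, 100]:
    z = standardized_sums(n)
    print(f"n = {n:3d}: P(Z_n <= 1) ~ {np.mean(z <= 1.0):.4f}   (N(0,1): {normal_cdf(1.0):.4f})")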

4 Examples

4.1 Roulette

It is nothing new that on average you should lose when playing roulette. Despite this, it is still interesting to examine the chances of winning, and the CLT gives an answer to this question.

A roulette wheel has 37 numbers in total: 18 are black, 18 are red and 1 is green. Players are allowed to bet on black or red. Assume a player is always betting $1 on black. We define X_i to be the winnings of the i-th spin. X_1, X_2, ... are clearly independent and

P(X_i = 1) = 18/37, \quad P(X_i = -1) = 19/37,
E(X_i) = -1/37, \quad Var(X_i) = E(X_i^2) - [E(X_i)]^2 \approx 1.

We want to approximate the probability that S_n = X_1 + \cdots + X_n is bigger than 0:

P(S_n > 0) = P\Big( \frac{S_n - n\mu}{\sqrt{n}\,\sigma} > \frac{-n\mu}{\sqrt{n}\,\sigma} \Big).

Let us say we want to play n = 100 times; then

\frac{-n\mu}{\sqrt{n}\,\sigma} \approx \frac{100 \cdot (1/37)}{10} \approx 0.27.

Now the CLT states that

P(S_n > 0) \approx P(X > 0.27)

for a standard normally distributed random variable X. Since P(X > 0.27) \approx 0.39, we can conclude that the chance to win money by playing roulette 100 times is about 39%.
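A quick simulation (an added sketch, not part of the paper) can be placed next to this approximation. Note that S_100 only takes even values, so the simulated probability comes out somewhat below the continuous normal approximation; a continuity correction would close most of the gap.

import math
import numpy as np

rng = np.random.default_rng(2)

n, reps = 100, 100_000
# Winnings per spin: +1 with probability 18/37, -1 with probability 19/37.
spins = rng.choice([1, -1], p=[18/37, 19/37], size=(reps, n))
s_n = spins.sum(axis=1)
simulated = np.mean(s_n > 0)

mu, sigma = -1/37, math.sqrt(1 - (1/37)**2)
z = -n * mu / (math.sqrt(n) * sigma)                       # the 0.27 from the text
clt_approx = 1 - 0.5 * (1 + math.erf(z / math.sqrt(2)))    # P(X > z) for X ~ N(0, 1)

print(f"simulated P(S_100 > 0) = {simulated:.3f}, CLT approximation = {clt_approx:.3f}")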

4.2 Cauchy Distribution

The Cauchy distribution shows that the conditions of finite variance and finite expectation cannot be dropped.

Definition. A random variable X is called Cauchy distributed if the density of X is given by

f(x) = \frac{1}{\pi (1 + x^2)}.

Proposition. If X is a Cauchy distributed random variable, then E[X] and Var[X] do not exist.

Lemma. If X is Cauchy distributed, then \varphi_X(t) = e^{-|t|}.

Proposition. If {X_n} is a sequence of independent Cauchy distributed random variables, then Y_n = \frac{1}{n} \sum_{i=1}^{n} X_i has a Cauchy distribution.

Proof. To prove this statement we compute the characteristic function of Y_n and compare it to the characteristic function of a Cauchy distributed random variable. If they are the same, then the claim follows by Theorem 2.2.

\varphi_{Y_n}(t) = \prod_{i=1}^{n} \varphi_{X_i}\Big( \frac{t}{n} \Big) = \Big( \varphi_{X_1}\Big( \frac{t}{n} \Big) \Big)^{n} = \big( e^{-|t|/n} \big)^{n} = e^{-|t|}.

The first equality holds by the theorem about the characteristic function of a sum of independent random variables (Theorem 2.3), together with \varphi_{aX}(t) = \varphi_X(at). The second equality holds because all the random variables are identically distributed and therefore have the same characteristic function (cf. Theorem 2.2).

So, as we can see, the arithmetic mean of Cauchy distributed random variables is always Cauchy distributed, and therefore the CLT does not hold here.
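A simulation makes this failure visible (an added sketch, not from the paper): no matter how large n is, the sample mean of Cauchy variables does not concentrate, and the tail probability below stays near P(|C| > 2) \approx 0.295 for a standard Cauchy C instead of shrinking.

import numpy as np

rng = np.random.default_rng(3)

for n in [10, 100, 10_000]:
    # Y_n = (X_1 + ... + X_n) / n for standard Cauchy X_i, repeated 1000 times.
    means = rng.standard_cauchy(size=(1_000, n)).mean(axis=1)
    # If the CLT applied, this probability would shrink roughly like a normal tail;
    # since Y_n is again standard Cauchy, it stays near 1 - (2/pi) * arctan(2) ~ 0.295.
    print(f"n = {n:6d}: P(|Y_n| > 2) ~ {np.mean(np.abs(means) > 2):.3f}")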

5 Historical Background

The CLT has a long and vivid history. It developed over time, and there are many different versions and proofs of the CLT.

1st Chapter

In 1776 Laplace published a paper about the inclination angles of meteors. In this paper he tried to calculate the probability that the actual data collected differed from the theoretical mean he had calculated. This was the first attempt to study summed random variables, and it makes clear that the CLT was motivated by statistics. His work was continued by Poisson, who published two papers on the subject, the first in 1824. In these papers he tried to generalize the work of Laplace and also to make it more rigorous. At this time probability theory was still not considered a branch of real mathematics. For most mathematicians it was sufficient that the CLT worked in practice, so they did not make a great effort to give real proofs.

2nd Chapter

This mindset changed during the 19th century. Bessel, Dirichlet and especially Cauchy turned probability theory into a respected branch of pure mathematics. They succeeded in giving rigorous proofs, but there were still some issues: they had problems dealing with distributions with infinite support and with quantifying the rate of convergence. Moreover, the conditions for the CLT were not satisfactory. Between 1870 and 1913 the famous Russian mathematicians Markov, Chebyshev and Lyapunov did a lot of research on the CLT and are considered to be its most important contributors. To prove the CLT they worked in two different directions: Markov and Chebyshev attempted to prove the CLT using the method of moments, whereas Lyapunov used characteristic functions.

3rd Chapter

During this period Lindeberg, Feller and Lévy studied the CLT. Lindeberg was able to apply the CLT to random vectors and he quantified the rate of convergence. His proof was a big step, since he was able to give all sufficient conditions of the CLT. Later, Feller and Lévy also succeeded in giving all necessary conditions, which could be proven by the work of Cramér. The CLT, as it is known today, was born.

The CLT today

People have continued to improve the CLT. There has been a variety of research on theorems for dependent random variables, but the basic principles of Lindeberg, Feller and Lévy are still up to date.

6 Proof

6.1 Outline of the Proof

The idea of the proof is to use nice properties of characteristic functions. The Lévy continuity theorem states that the distribution of the limit of a sequence of random variables is uniquely determined by the limit of the corresponding characteristic functions. So all we have to do is understand the limit of the characteristic functions of our summed random variables. We will see that the characteristic function of a sum of i.i.d. random variables behaves very well. The first step is to understand the Lévy continuity theorem. The second step deals with the evaluation of the characteristic function of summed i.i.d. random variables. The final step puts everything together in a very short and smooth proof.

6.2 Lévy Continuity Theorem

The actual proof of the CLT is straightforward. The difficulty is to understand all the contributing theorems and lemmas. Since the most important one is the Lévy continuity theorem, I want to have a close look at this result.

Lévy Continuity Theorem. Suppose X_1, X_2, ... and X are random variables, and let \varphi_1(t), \varphi_2(t), ... and \varphi_X(t) be the corresponding characteristic functions. Then

X_n \to_d X \iff \lim_{n \to \infty} \varphi_n(t) = \varphi_X(t) \quad \text{for all } t \in R.

To understand how the proof works we need some more tools:

Bounded Convergence Theorem. If X, X_1, X_2, ... are random variables with X_n \to_d X, and there is a C \in R such that |X_n| \le C for all n \in N, then

\lim_{n \to \infty} E[X_n] = E[X].

Definition. A sequence of random variables {X_n} is called tight if for every \varepsilon > 0 there exists an M \in R such that P(|X_n| > M) \le \varepsilon for all n \in N.

Lemma 6.2.1. If {X_n} is tight, then there exist a subsequence {X_{n_k}} and a random variable X such that X_{n_k} \to_d X.

Lemma 6.2.2. If {X_n} is tight and if each subsequence of {X_n} that converges at all converges to a random variable Y, then X_n \to_d Y.

Proof of the Lévy Continuity Theorem.

"\Rightarrow": Since cos(tx) and sin(tx) are continuous and bounded functions of x, we see by Theorem 2.1 that

\varphi_n(t) = E[e^{itX_n}] = E[\cos(tX_n)] + i E[\sin(tX_n)] \to E[\cos(tX)] + i E[\sin(tX)] = \varphi_X(t).

"\Leftarrow": This part of the proof proceeds in two steps. First we show that pointwise convergence of the characteristic functions implies tightness. After this, we are able to use the nice properties of tight sequences of random variables to prove the claim.

We will show tightness by estimating the following term, which will turn out to be a nice upper bound for the probability that |X_n| is big. For arbitrary \delta > 0,

\delta^{-1} \int_{-\delta}^{\delta} (1 - \varphi_n(t)) \, dt
  = \delta^{-1} \int_{-\delta}^{\delta} \big( 1 - E[e^{itX_n}] \big) \, dt
  = \delta^{-1} \int_{-\delta}^{\delta} E[1 - e^{itX_n}] \, dt
  = \delta^{-1} \int_{-\delta}^{\delta} \Big[ \int_R (1 - e^{itx}) \, dP_n(x) \Big] dt
    (Fubini)
  = \delta^{-1} \int_R \Big[ \int_{-\delta}^{\delta} (1 - e^{itx}) \, dt \Big] dP_n(x)
  = \delta^{-1} \int_R \Big[ 2\delta - \int_{-\delta}^{\delta} \big( \cos(tx) + i \sin(tx) \big) \, dt \Big] dP_n(x)
  = \delta^{-1} \int_R \Big[ 2\delta - \int_{-\delta}^{\delta} \cos(tx) \, dt \Big] dP_n(x)
  = \delta^{-1} \int_R \Big[ 2\delta - \frac{2 \sin(\delta x)}{x} \Big] dP_n(x)
  = 2 \int_R \Big( 1 - \frac{\sin(\delta x)}{\delta x} \Big) dP_n(x).

Now that this term has a nice shape we want to find a lower bound for it. We know that 1 - \frac{\sin(ux)}{ux} \ge 0; this is true because |\sin x| = \big| \int_0^x \cos(y) \, dy \big| \le |x|. So the integral only gets smaller if we discard an interval:

2 \int_R \Big( 1 - \frac{\sin(\delta x)}{\delta x} \Big) dP_n(x)
  \ge 2 \int_{|x| \ge 2/\delta} \Big( 1 - \frac{\sin(\delta x)}{\delta x} \Big) dP_n(x)
  \ge 2 \int_{|x| \ge 2/\delta} \Big( 1 - \frac{1}{\delta |x|} \Big) dP_n(x)
  \ge 2 \int_{|x| \ge 2/\delta} \frac{1}{2} \, dP_n(x)
  = P_n(\{x : |x| \ge 2/\delta\}) = P(|X_n| \ge 2/\delta),

where we used that \frac{\sin(\delta x)}{\delta x} \le \frac{1}{\delta |x|} \le \frac{1}{2} for |x| \ge 2/\delta.

Pick \varepsilon > 0. Because \varphi is continuous at 0 and \varphi(0) = 1, we can find a \delta > 0 such that

|1 - \varphi(t)| \le \frac{\varepsilon}{4} \quad \text{for all } |t| \le \delta.

We can use this to estimate the following term:

\delta^{-1} \int_{-\delta}^{\delta} |1 - \varphi(t)| \, dt \le \delta^{-1} \cdot 2\delta \cdot \frac{\varepsilon}{4} = \frac{\varepsilon}{2}.

Since |\varphi_n(t)| \le 1, the bounded convergence theorem implies

\int_{-\delta}^{\delta} (1 - \varphi_n(t)) \, dt \to \int_{-\delta}^{\delta} (1 - \varphi(t)) \, dt \quad \text{as } n \to \infty.

Because of that there exists an integer N such that for all n > N

\Big| \delta^{-1} \int_{-\delta}^{\delta} (1 - \varphi_n(t)) \, dt - \delta^{-1} \int_{-\delta}^{\delta} (1 - \varphi(t)) \, dt \Big| \le \frac{\varepsilon}{2}.

If we put the three bounds together, we get for all n > N

P(|X_n| \ge 2/\delta)
  \le \delta^{-1} \int_{-\delta}^{\delta} (1 - \varphi_n(t)) \, dt
  \le \Big| \delta^{-1} \int_{-\delta}^{\delta} (1 - \varphi_n(t)) \, dt - \delta^{-1} \int_{-\delta}^{\delta} (1 - \varphi(t)) \, dt \Big| + \delta^{-1} \int_{-\delta}^{\delta} |1 - \varphi(t)| \, dt
  \le \frac{\varepsilon}{2} + \frac{\varepsilon}{2} = \varepsilon.

We have just proved that the point 2/\delta has the property that the probability of |X_n| exceeding this value is at most \varepsilon for all n > N. To show tightness we just need to find a bound for the finitely many remaining cases. Because P_n([-m, m]) \to 1 as m \to \infty, we know that for each n \in \{1, ..., N\} there exists an m_n \in R such that

P(|X_n| \ge m_n) \le \varepsilon.

Now define M = \max\{m_1, ..., m_N, 2/\delta\}. Because of the monotonicity of distribution functions we have just proved that

P(|X_n| > M) \le \varepsilon \quad \text{for all } n,

i.e. {X_n} is tight.

Lemma 6.2.1 tells us that {X_n} has a convergent subsequence. Because of Lemma 6.2.2 we just need to show that every converging subsequence converges to X. So suppose X_{n_k} converges in distribution to some random variable Y. Then, by the "\Rightarrow" part proved above, \varphi_{n_k}(t) \to \varphi_Y(t); since by assumption \varphi_{n_k}(t) \to \varphi_X(t), it follows that Y has characteristic function \varphi_X(t). Now Y =_d X by Theorem 2.2. Since this holds for every converging subsequence, we have shown X_n \to_d X.
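The key quantitative step in this proof is the truncation bound P(|X_n| \ge 2/\delta) \le \delta^{-1} \int_{-\delta}^{\delta} (1 - \varphi_n(t)) \, dt. The following sketch (a numerical illustration added here, not from the paper) checks it for a standard normal random variable, whose characteristic function e^{-t^2/2} is known in closed form.

import math
import numpy as np

def normal_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2)))

delta = 1.0
dt = 1e-4
t = np.arange(-delta, delta, dt)
# delta^{-1} * integral over [-delta, delta] of (1 - e^{-t^2/2}) dt, via a Riemann sum
bound = np.sum(1.0 - np.exp(-t**2 / 2.0)) * dt / delta

# P(|X| >= 2/delta) for X ~ N(0, 1)
tail = 2.0 * (1.0 - normal_cdf(2.0 / delta))

print(f"delta^-1 * integral = {bound:.4f}")   # ~ 0.289
print(f"P(|X| >= 2/delta)   = {tail:.4f}")    # ~ 0.046, indeed below the bound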

6.3 Lemmas

To apply the Lévy continuity theorem to the characteristic function of summed i.i.d. random variables we need two more lemmas.

Lemma 6.3.1. Suppose X is a random variable with E[X^2] < \infty. Then \varphi_X(t) can be written as the following Taylor expansion:

\varphi_X(t) = 1 + it E[X] - \frac{t^2}{2} E[X^2] + o(t^2).

Recall that o(t^2) means that o(t^2)/t^2 \to 0 as t \to 0. The lemma can be proven by using the following estimate for the error term:

|o(t^2)| \le E\Big[ \min\Big( \frac{|tX|^3}{3!}, \frac{2|tX|^2}{2!} \Big) \Big].

Lemma 6.3.2. Suppose c_n is a sequence of complex numbers with c_n \to c as n \to \infty. Then

\lim_{n \to \infty} \Big( 1 + \frac{c_n}{n} \Big)^n = e^c.
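Lemma 6.3.2 is easy to check numerically (an added sketch; a real-valued sequence is used here for simplicity), for instance with c_n = -t^2/2 + 1/n, mimicking the shape that appears in the proof below.

import math

t = 1.5
c = -t**2 / 2                        # the limit of the sequence c_n
for n in [10, 100, 10_000, 1_000_000]:
    c_n = c + 1.0 / n                # c_n -> c as n -> infinity
    print(f"n = {n:8d}: (1 + c_n/n)^n = {(1 + c_n / n)**n:.6f}, e^c = {math.exp(c):.6f}")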

6.4 Proof of the Central Limit Theorem

Without loss of generality we can assume that \mu = 0 and \sigma^2 = 1, because

E\Big[ \frac{X_n - \mu}{\sigma} \Big] = 0 \quad \text{and} \quad Var\Big[ \frac{X_n - \mu}{\sigma} \Big] = 1.

The Lévy continuity theorem tells us that it is sufficient to show that the characteristic function of our normalized sum converges to the characteristic function of a standard normally distributed random variable, i.e.

\varphi_{S_n/\sqrt{n}}(t) = \varphi_{S_n}\Big( \frac{t}{\sqrt{n}} \Big) \to e^{-t^2/2} \quad \text{as } n \to \infty,

where S_n = X_1 + X_2 + \cdots + X_n.

Now we want to use independence by applying Theorem 2.3 to the sum. Fix t \in R:

\varphi_{S_n}\Big( \frac{t}{\sqrt{n}} \Big) = \prod_{k=1}^{n} \varphi_{X_k}\Big( \frac{t}{\sqrt{n}} \Big) = \Big( \varphi_{X_1}\Big( \frac{t}{\sqrt{n}} \Big) \Big)^n.

The second equality is true because all random variables are identically distributed and therefore have the same characteristic function (cf. Theorem 2.2). Lemma 6.3.1 and the basic fact that Var[Y] = E[Y^2] - E[Y]^2 for any random variable Y yield

\Big( \varphi_{X_1}\Big( \frac{t}{\sqrt{n}} \Big) \Big)^n
  = \Big( 1 + i \frac{t}{\sqrt{n}} E[X_1] - \frac{t^2}{2n} E[X_1^2] + o\Big( \frac{t^2}{n} \Big) \Big)^n
  = \Big( 1 - \frac{t^2}{2n} + o\Big( \frac{1}{n} \Big) \Big)^n
  = \Big( 1 + \frac{-t^2/2 + n \cdot o(1/n)}{n} \Big)^n,

using E[X_1] = 0 and E[X_1^2] = Var[X_1] = 1. By Lemma 6.3.2, and because o(1/n)/(1/n) \to 0 as n \to \infty, we have identified the limit

\lim_{n \to \infty} \varphi_{S_n}\Big( \frac{t}{\sqrt{n}} \Big) = e^{-t^2/2}.

This completes the proof.
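As a final numerical sanity check (an added sketch, not part of the paper), one can estimate \varphi_{S_n/\sqrt{n}}(t) by simulation for i.i.d. summands with mean 0 and variance 1 and watch it approach e^{-t^2/2}; here the summands are rescaled U[-0.5, 0.5] variables.

import math
import numpy as np

rng = np.random.default_rng(4)

t = 2.0
target = math.exp(-t**2 / 2)                      # characteristic function of N(0, 1) at t

for n in [1, 5, 30, 200]:
    # Normalized sums of i.i.d. U[-0.5, 0.5] variables (mean 0, variance 1/12).
    x = rng.uniform(-0.5, 0.5, size=(50_000, n))
    z = x.sum(axis=1) / (math.sqrt(1.0 / 12.0) * math.sqrt(n))
    phi = np.mean(np.exp(1j * t * z))             # Monte Carlo estimate of E[e^{itZ_n}]
    print(f"n = {n:3d}: phi(t) ~ {phi.real:+.4f} {phi.imag:+.4f}i, e^(-t^2/2) = {target:.4f}")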

7 References

P. Billingsley (1986). Probability and Measure. John Wiley and Sons, New York.
R. Durrett (2010). Probability: Theory and Examples.
M. Mether (2003). The History of the Central Limit Theorem.
