7 Random variables

A random variable is a real-valued function defined on some sample space. That is, it associates to each elementary outcome in the sample space a numerical value.

Example 1. Consider tossing a coin n times. If X is the number of heads obtained, X is a random variable.

Example 2. Consider a stock price which moves each day either up one unit or down one unit, and suppose its initial value is 10 dollars. Let T be the first time the value of the stock hits either 0 dollars or 20 dollars. Then T is a random variable.

Example 3. The lifetime T of a lightbulb is a random variable.

In the last example, if we can measure time with infinite precision, then the possible values of T are the non-negative real numbers $[0, \infty)$. This is an uncountable set: there is no way to enumerate $[0, \infty)$ in a sequence. We will have to treat random variables of this type separately from the random variables which take values in a countable set. While in practice time can only be measured up to finite precision, and consequently the possible values of T will in fact be countable, it is still more convenient mathematically to make the idealization that all values in $[0, \infty)$ are possible, and we will do so.

7.1 Distribution functions

For a random variable X, we can associate the distribution function $F_X(\cdot)$, sometimes called the cumulative distribution function, defined by
$$F_X(t) = P(X \le t). \qquad (1)$$
Notice that $F_X(\cdot)$ is defined on all real numbers. The distribution function determines the probability that X falls in an interval:
$$P(a < X \le b) = P(X \le b) - P(X \le a) = F_X(b) - F_X(a).$$

Example 4. Suppose a coin with probability p of landing heads is tossed until the first time a heads appears. Let T be the number of tosses required. For a real number t, let $[t]$ denote the integer part of t. We have
$$P(T > t) = P(T > [t]) = P(\text{first } [t] \text{ tosses are tails}) = (1-p)^{[t]}.$$
Consequently,
$$F_T(t) = P(T \le t) = \begin{cases} 1 - (1-p)^{[t]} & \text{if } t \ge 0, \\ 0 & \text{if } t < 0. \end{cases}$$
We can use this to compute
$$P(2 < T \le 5) = 1 - (1-p)^5 - \left[ 1 - (1-p)^2 \right] = (1-p)^2 - (1-p)^5.$$
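
For readers who want to check Example 4 numerically, here is a minimal Python sketch (not part of the original notes): it simulates T, compares the empirical tail probability with $(1-p)^{[t]}$, and compares the empirical value of $P(2 < T \le 5)$ with $F_T(5) - F_T(2)$. The value p = 0.3 and the sample size are arbitrary choices.

```python
import random
import math

def sample_T(p, rng):
    """Number of tosses of a p-coin needed to see the first head."""
    tosses = 1
    while rng.random() >= p:      # a tail occurs with probability 1 - p
        tosses += 1
    return tosses

def F_T(t, p):
    """Distribution function of T from Example 4."""
    return 0.0 if t < 0 else 1.0 - (1.0 - p) ** math.floor(t)

p = 0.3                           # arbitrary head probability
rng = random.Random(0)
samples = [sample_T(p, rng) for _ in range(200_000)]

# Tail probability P(T > 3.7) versus (1 - p)^[t]
t = 3.7
empirical_tail = sum(x > t for x in samples) / len(samples)
print(empirical_tail, (1 - p) ** math.floor(t))

# Interval probability P(2 < T <= 5) versus F_T(5) - F_T(2)
empirical_interval = sum(2 < x <= 5 for x in samples) / len(samples)
print(empirical_interval, F_T(5, p) - F_T(2, p))
```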

We may want to find the probability that X falls in a closed interval. To this end, we need the following:

Proposition 1. Let X be a random variable with distribution function F. Then
$$P(X < t) = \lim_{s \uparrow t} F(s).$$

The value $\lim_{s \uparrow t} F(s)$ is called the left limit of F at t, and is sometimes denoted by $F(t-)$. Part of the conclusion of Proposition 1 is that a distribution function has left limits everywhere.

To prove this, we first need to show that a probability $P(\cdot)$ obeys a certain kind of continuity.

Lemma 2.
(i) Let $A_1 \subset A_2 \subset A_3 \subset \cdots$ be a non-decreasing sequence of events. Then
$$\lim_{n \to \infty} P(A_n) = P\Big( \bigcup_{k=1}^{\infty} A_k \Big).$$
(ii) Let $A_1 \supset A_2 \supset A_3 \supset \cdots$ be a non-increasing sequence of events. Then
$$\lim_{n \to \infty} P(A_n) = P\Big( \bigcap_{k=1}^{\infty} A_k \Big).$$

Proof. We prove (i); the proof of (ii) is obtained by looking at complements and using (i), and is left to the reader as an exercise. Define $B_k = A_k \setminus A_{k-1}$ (with $A_0 = \emptyset$). The events $\{B_k\}$ are disjoint, $A_n = \bigcup_{k=1}^{n} B_k$, and $\bigcup_{k=1}^{\infty} A_k = \bigcup_{k=1}^{\infty} B_k$. Thus,
$$P\Big( \bigcup_{k=1}^{\infty} A_k \Big) = P\Big( \bigcup_{k=1}^{\infty} B_k \Big) = \sum_{k=1}^{\infty} P(B_k) = \lim_{n \to \infty} \sum_{k=1}^{n} P(B_k) = \lim_{n \to \infty} P\Big( \bigcup_{k=1}^{n} B_k \Big) = \lim_{n \to \infty} P(A_n).$$

Proof of Proposition 1. Notice that $\{X < t\} = \bigcup_{k=1}^{\infty} \{X \le t - \tfrac{1}{k}\}$. Then applying Lemma 2 gives
$$P(X < t) = \lim_{n \to \infty} P\big(X \le t - \tfrac{1}{n}\big) = \lim_{n \to \infty} F\big(t - \tfrac{1}{n}\big) = \lim_{s \uparrow t} F(s).$$

Thus, we can also use the distribution function of X to calculate other probabilities involving X:
$$P(a \le X \le b) = P(X \le b) - P(X < a) = F(b) - F(a-),$$
$$P(X = a) = P(X \le a) - P(X < a) = F(a) - F(a-). \qquad (2)$$

A random variable X is proper if $P(-\infty < X < \infty) = 1$. Almost all random variables we will encounter will be proper, but it is worth noting that there do exist random variables which are not proper.

Example 5. Suppose a particle moves on the integers $\{\ldots, -3, -2, -1, 0, 1, 2, 3, \ldots\}$ as follows: at each move, it moves up one integer with probability 2/3, and moves down one integer with probability 1/3. The particle starts at 0. Let T be the first time that the particle is at $-1$. The event that the particle never hits $-1$ is $\{T = \infty\}$. We will see later that $P(T < \infty) < 1$, so that T is not a proper random variable.

Writing $\{X < \infty\} = \bigcup_{n=1}^{\infty} \{X \le n\}$, if X is a proper random variable then applying Lemma 2 again shows
$$1 = P(X < \infty) = \lim_{n \to \infty} P(X \le n) = \lim_{n \to \infty} F_X(n).$$
Also, since $\{X = -\infty\} = \bigcap_{n=1}^{\infty} \{X \le -n\}$ and X is proper,
$$0 = P(X = -\infty) = \lim_{n \to \infty} P(X \le -n) = \lim_{n \to \infty} F_X(-n).$$
Finally, since $\{X \le t\} = \bigcap_{k=1}^{\infty} \{X \le t + \tfrac{1}{k}\}$, using part (ii) of Lemma 2 shows that $\lim_{s \downarrow t} F(s) = F(t)$, and so F is right-continuous everywhere.

We summarize the properties of the distribution function of a random variable X as follows:

Proposition 3. Let X be a proper random variable with distribution function F. Then
(i) F is right-continuous: $\lim_{s \downarrow t} F(s) = F(t)$,
(ii) the left limits of F exist everywhere,
(iii) $\lim_{t \to \infty} F(t) = 1$,
(iv) $\lim_{t \to -\infty} F(t) = 0$.

The first two properties imply that the worst behavior possible for a distribution function is that it jumps.
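
The following short Python sketch (an addition, not from the notes) illustrates equation (2) for the random variable T of Example 4: the jump $F_T(a) - F_T(a-)$ at an integer a equals $P(T = a) = (1-p)^{a-1}p$, while at a non-integer point the jump is 0. The left limit is approximated numerically by evaluating $F_T$ just below the point.

```python
import math

def F_T(t, p):
    """Distribution function of T from Example 4."""
    return 0.0 if t < 0 else 1.0 - (1.0 - p) ** math.floor(t)

p = 0.25                             # arbitrary head probability
a = 4                                # check the jump of F_T at the integer a

left_limit = F_T(a - 1e-9, p)        # numerical stand-in for F(a-)
jump = F_T(a, p) - left_limit        # F(a) - F(a-)
pmf_value = (1 - p) ** (a - 1) * p   # P(T = a) computed directly

print(jump, pmf_value)               # the two numbers agree (up to rounding)

# At a non-integer point F_T is continuous, so the jump is 0 and P(T = b) = 0.
b = 4.5
print(F_T(b, p) - F_T(b - 1e-9, p))
```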

8 Discrete random variables

We call a random variable which can take on only countably many values a discrete random variable.

8.1 Probability mass functions

Let X be a discrete random variable which takes values in the set $A = \{a_0, a_1, a_2, \ldots\}$. Associated to X is the function $p_X(\cdot)$, defined by
$$p_X(a) = P(X = a). \qquad (3)$$
The function $p_X(\cdot)$ is defined for all real numbers, although it will be strictly positive only for a in the set A. A function $p(\cdot)$ satisfying
(i) $p(a) \ge 0$ for all a,
(ii) $\sum_a p(a) = 1$,
is called a probability mass function, or pmf for short. It is easily checked that $p_X(\cdot)$ satisfies these conditions, and we call it the pmf of X. We write $X \sim p(\cdot)$ to indicate that X has pmf $p(\cdot)$. Notice that from (2) we have
$$p_X(a) = P(X = a) = F_X(a) - F_X(a-), \qquad (4)$$
so the pmf of X can be determined if the distribution function of X is known.

Example 6. Suppose n independent experiments are performed, each of which can result in either success or failure, and suppose that the probability of success on each experiment is p. Such a sequence of experiments is called Bernoulli trials. Let X be the number of successes in these n experiments.

The event $\{X = k\}$ contains all outcomes containing exactly k successes and $n - k$ failures. There are $\binom{n}{k}$ such outcomes, each having probability $p^k (1-p)^{n-k}$ (by independence). Thus
$$p_X(k) = P(X = k) = \binom{n}{k} p^k (1-p)^{n-k}. \qquad (5)$$
A random variable X having the pmf in (5) is called a Binomial(n, p) random variable, and we write $X \sim \mathrm{Binomial}(n, p)$.

8.2 The distribution of a discrete random variable

An event determined by X is an event of the form $\{X \in A\}$, where A is a subset of the real numbers. We can find the probability of any event determined by X using only the pmf of X:
$$P(X \in A) = \sum_{a \in A} p_X(a). \qquad (6)$$
Applying (6) to the set $A = (-\infty, t]$ gives
$$F_X(t) = P(X \le t) = \sum_{a \le t} p_X(a). \qquad (7)$$
Thus the distribution function of X can be computed from the pmf of X. To summarize, we record the following:

Proposition 4. Let X be a discrete random variable. Each of the following can be computed using any of the others:
(i) the probabilities of all events determined by X, that is, the collection of probabilities $\{P(X \in A) : A \subset \mathbb{R}\}$,
(ii) the pmf $p_X(\cdot)$ of X,
(iii) the distribution function $F_X(\cdot)$ of X.

Proof. This is the content of the combination of equations (4), (6), and (7).

The collection of probabilities $\{P(X \in A) : A \subset \mathbb{R}\}$ is called simply the distribution of X. It contains all the probabilistic information about the random variable X. Proposition 4 says that for a discrete random variable, it is enough to specify either the pmf or the distribution function to specify the distribution. Thus, if one is asked to determine the distribution of X, it is sufficient to provide either the pmf or the distribution function.
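
As a quick check of (5), (7), and Proposition 4 (not part of the notes), the sketch below builds the Binomial(n, p) pmf, verifies that it sums to 1, and recovers the distribution function from the pmf; the values of n and p are arbitrary.

```python
from math import comb

def binomial_pmf(n, p):
    """pmf of a Binomial(n, p) random variable, equation (5)."""
    return {k: comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)}

def cdf_from_pmf(pmf, t):
    """Distribution function recovered from the pmf, equation (7)."""
    return sum(prob for a, prob in pmf.items() if a <= t)

n, p = 10, 0.4                          # arbitrary parameters
pmf = binomial_pmf(n, p)

print(sum(pmf.values()))                # should be 1 (up to rounding): condition (ii) of a pmf
print(cdf_from_pmf(pmf, 3))             # F_X(3) = P(X <= 3)
print(sum(pmf[k] for k in range(4)))    # the same number, summed directly
```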

9 Continuous random variables

A probability density function (abbreviated pdf, or sometimes simply density) is a real-valued function f defined on the real numbers satisfying
(i) $f(t) \ge 0$ for all real numbers t,
(ii) $\int_{-\infty}^{\infty} f(t)\,dt = 1$.

A continuous random variable is a random variable X for which there exists a pdf $f_X$ so that
$$P(a < X \le b) = \int_a^b f_X(t)\,dt \quad \text{for all } a < b. \qquad (8)$$
In fact, if (8) holds, then for any subset of real numbers A such that $\int_A f_X(t)\,dt$ is defined, the identity
$$P(X \in A) = \int_A f_X(t)\,dt \qquad (9)$$
is valid. Applying (9) to the set $(-\infty, t]$ shows that
$$F_X(t) = P(X \le t) = \int_{-\infty}^{t} f_X(s)\,ds, \qquad (10)$$
and so the distribution function of X can be determined from the density function of X. Note that a consequence of (10) is that $F_X$ is a continuous function for a continuous random variable, and in particular
$$P(X = a) = F_X(a) - F_X(a-) = 0.$$
Applying the Fundamental Theorem of Calculus to (10) shows that
$$\frac{d}{dt} F_X(t) = f_X(t) \qquad (11)$$
at all points t where $f_X$ is continuous. Thus if $f_X$ is piecewise continuous, as will be the case in this course, then it can be determined from the distribution function via (11). The following summarizes the situation for continuous random variables with piecewise continuous densities:

Proposition 5. Let X be a continuous random variable with a piecewise continuous density. Each of the following can be computed using any of the others:
(i) the probabilities of all events determined by X, that is, the collection of probabilities $\{P(X \in A) : A \subset \mathbb{R} \text{ such that } \int_A f_X(t)\,dt \text{ is defined}\}$,
(ii) the pdf $f_X(\cdot)$ of X,
(iii) the distribution function $F_X(\cdot)$ of X.

Proof. This is what equations (9), (10), and (11) say.

9.1 Interpretation of the density function

What is the interpretation of the density function? Suppose that X has a density f which is continuous at the point a. For $\Delta > 0$ we have
$$\frac{P(a \le X \le a + \Delta)}{\Delta} = \frac{F(a + \Delta) - F(a)}{\Delta}.$$
The right-hand side tends to $F'(a) = f(a)$ as $\Delta \to 0$. Thus we can write
$$\frac{P(a \le X \le a + \Delta)}{\Delta} = f(a) + \varepsilon_0(\Delta),$$
where $\varepsilon_0(\Delta) \to 0$ as $\Delta \to 0$. Multiplying both sides by $\Delta$, we have
$$P(a \le X \le a + \Delta) = f(a)\Delta + \underbrace{\varepsilon_0(\Delta)\Delta}_{\varepsilon(\Delta)}.$$
If $\varepsilon(\Delta) = \varepsilon_0(\Delta)\Delta$, then $\varepsilon(\Delta)/\Delta \to 0$ as $\Delta \to 0$. Thus we can write
$$P(a \le X \le a + \Delta) \approx f(a)\Delta, \qquad (12)$$
where the error in the approximation is $\varepsilon(\Delta)$ and satisfies $\varepsilon(\Delta)/\Delta \to 0$ as $\Delta \to 0$. Equation (12) is useful in interpreting the meaning of a probability density function: the probability of X falling in a very small interval near a is approximated by $f(a)\Delta$, where $\Delta$ is the length of the interval.
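
Here is a numerical illustration of (12), added for this transcription rather than taken from the notes. It uses an assumed density $f(t) = 2t$ on $[0, 1]$, whose distribution function is $F(t) = t^2$ there, and shows that the error in the approximation $P(a \le X \le a + \Delta) \approx f(a)\Delta$ vanishes faster than $\Delta$.

```python
def F(t):
    """Distribution function of the assumed density f(t) = 2t on [0, 1]."""
    if t < 0:
        return 0.0
    if t > 1:
        return 1.0
    return t * t

def f(t):
    """The assumed density itself."""
    return 2 * t if 0 <= t <= 1 else 0.0

a = 0.3
for delta in [0.1, 0.01, 0.001]:
    exact = F(a + delta) - F(a)      # P(a <= X <= a + delta)
    approx = f(a) * delta            # the approximation in (12)
    error = exact - approx
    # error / delta should tend to 0 as delta shrinks
    print(delta, exact, approx, error / delta)
```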

10 Expected Value

Let X be a discrete random variable with the following pmf:
$$p_X(a) = \begin{cases} \tfrac{2}{5} & \text{if } a = -1, \\ \tfrac{2}{5} & \text{if } a = 0, \\ \tfrac{1}{5} & \text{if } a = 1, \\ 0 & \text{if } a \notin \{-1, 0, 1\}. \end{cases}$$
How should the average value of X be defined? A first attempt might be to say that the average value should be 0, since 0 is in the center of the three possible values $\{-1, 0, 1\}$. But this does not take into account that X does not assume these values with equal probability. The average should account for not just the values taken on by X, but also the probabilities associated to each of these values. This leads to the definition of the expectation of X, which is a weighted average of the values of X, with the weights determined by the pmf or pdf. Precisely, we define
$$E(X) = \begin{cases} \sum_a a\, p_X(a) & \text{if X is discrete,} \\ \int_{-\infty}^{\infty} t\, f_X(t)\,dt & \text{if X is continuous.} \end{cases} \qquad (13)$$
E(X) is only defined when the sum or integral in (13) converges absolutely; that is, we need
$$\sum_a |a|\, p_X(a) < \infty \text{ if X is discrete}, \qquad \int_{-\infty}^{\infty} |t|\, f_X(t)\,dt < \infty \text{ if X is continuous}.$$
In the example above,
$$E(X) = (-1)\cdot\tfrac{2}{5} + 0\cdot\tfrac{2}{5} + 1\cdot\tfrac{1}{5} = -\tfrac{1}{5}.$$
We use the terms expected value, mean, and moment all to refer to the expectation of X.
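
A short Python check of this computation (not from the notes): the expectation of the three-point pmf above is computed as the weighted sum in (13) and then re-estimated as a long-run average of simulated draws.

```python
import random

pmf = {-1: 2/5, 0: 2/5, 1: 1/5}       # the pmf from the example above

# Expectation as the weighted average in (13)
mean = sum(a * p for a, p in pmf.items())
print(mean)                            # -0.2, i.e. -1/5

# The same number as a long-run average of independent draws from the pmf
rng = random.Random(0)
values, weights = zip(*pmf.items())
samples = rng.choices(values, weights=weights, k=200_000)
print(sum(samples) / len(samples))     # close to -0.2
```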

Example 7. Let X be a Binomial(n, p) random variable. This means that X has a pmf given by
$$p_X(k) = \binom{n}{k} p^k (1-p)^{n-k}$$
for $k = 0, 1, \ldots, n$; the pmf is 0 for any other values. Then
$$E(X) = \sum_{k=0}^{n} k \binom{n}{k} p^k (1-p)^{n-k} = \sum_{k=1}^{n} k\, \frac{n!}{k!\,(n-k)!}\, p^k (1-p)^{n-k} = \sum_{k=1}^{n} \frac{n!}{(k-1)!\,(n-k)!}\, p^k (1-p)^{n-k}$$
$$= np \sum_{k=1}^{n} \frac{(n-1)!}{(k-1)!\,\big(n-1-(k-1)\big)!}\, p^{k-1} (1-p)^{n-1-(k-1)} = np \sum_{k=0}^{n-1} \binom{n-1}{k} p^{k} (1-p)^{n-1-k} = np,$$
since the last sum adds up the pmf of a Binomial(n-1, p) random variable and therefore equals 1.

Example 8. We say that X is an Exponential random variable with parameter $\lambda$ if it has the pdf
$$f(t) = \begin{cases} \frac{1}{\lambda} e^{-t/\lambda} & \text{if } t \ge 0, \\ 0 & \text{if } t < 0. \end{cases}$$
X has the property that $P(X > t + s \mid X > t) = P(X > s)$. (The reader should check this!) The expected value is the integral
$$E(X) = \int_0^{\infty} t\, \frac{1}{\lambda} e^{-t/\lambda}\,dt.$$
We can evaluate this by integration by parts: set
$$u = t, \quad du = dt, \quad v = -e^{-t/\lambda}, \quad dv = \tfrac{1}{\lambda} e^{-t/\lambda}\,dt,$$
so that
$$\int_0^{\infty} t\, \frac{1}{\lambda} e^{-t/\lambda}\,dt = \Big[-t e^{-t/\lambda}\Big]_0^{\infty} + \int_0^{\infty} e^{-t/\lambda}\,dt = \Big[-\lambda e^{-t/\lambda}\Big]_0^{\infty} = \lambda.$$
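
The two means just derived can be checked by simulation; the sketch below is an addition to the notes. Note that in the parametrization above, λ is the mean of the Exponential random variable, while Python's random.expovariate takes the rate, so it is called with 1/λ. The parameter values are arbitrary.

```python
import random

rng = random.Random(0)
N = 200_000

# Binomial(n, p): the sample mean should be close to n * p
n, p = 12, 0.3                            # arbitrary parameters
binom_samples = [sum(rng.random() < p for _ in range(n)) for _ in range(N)]
print(sum(binom_samples) / N, n * p)

# Exponential with parameter lam (mean lam in the notes' parametrization)
lam = 2.5                                 # arbitrary parameter
expo_samples = [rng.expovariate(1 / lam) for _ in range(N)]
print(sum(expo_samples) / N, lam)
```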

Functions of random variables

If $g : \mathbb{R} \to \mathbb{R}$ is a function and X is a random variable, then $Y = g(X)$ is a new random variable. To calculate E(Y) according to the definition (13), we would need the pmf or pdf of Y. Fortunately, the following proposition tells us how to compute E(Y) without finding its pmf or density.

Proposition 6. Let X be a random variable, and g a real-valued function. Then
$$E(g(X)) = \begin{cases} \sum_a g(a)\, p_X(a) & \text{if X is discrete with pmf } p_X, \\ \int_{-\infty}^{\infty} g(t)\, f_X(t)\,dt & \text{if X is continuous with pdf } f_X. \end{cases}$$

Proof. We prove the case where X is discrete:
$$E(g(X)) = \sum_b b\, P(g(X) = b) = \sum_b b \sum_{a \,:\, g(a) = b} P(X = a) = \sum_b \sum_{a \,:\, g(a) = b} b\, p_X(a) = \sum_b \sum_{a \,:\, g(a) = b} g(a)\, p_X(a) = \sum_a g(a)\, p_X(a).$$

An immediate corollary is the following:

Corollary 7. Let X be a random variable, and let $\alpha$ and $\beta$ be constants. Then
$$E(\alpha X + \beta) = \alpha E(X) + \beta.$$

Proof. We write out the continuous case; the discrete case is similar. Applying Proposition 6 to $g(x) = \alpha x + \beta$ gives
$$E(\alpha X + \beta) = \int_{-\infty}^{\infty} (\alpha t + \beta)\, f_X(t)\,dt = \alpha \int_{-\infty}^{\infty} t\, f_X(t)\,dt + \beta \int_{-\infty}^{\infty} f_X(t)\,dt = \alpha E(X) + \beta.$$
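
A small sanity check of Proposition 6 and Corollary 7, added here rather than taken from the notes: for the three-point pmf from Section 10, $E(X^2)$ is computed once via Proposition 6 and once by first building the pmf of $Y = X^2$, and the linearity relation $E(2X + 3) = 2E(X) + 3$ is verified.

```python
from collections import defaultdict

p_X = {-1: 2/5, 0: 2/5, 1: 1/5}              # pmf of X from Section 10

def expect(g, pmf):
    """E(g(X)) computed directly from the pmf of X, as in Proposition 6."""
    return sum(g(a) * p for a, p in pmf.items())

# Route 1: Proposition 6 applied to g(x) = x^2
lhs = expect(lambda x: x * x, p_X)

# Route 2: first build the pmf of Y = X^2, then apply definition (13) to Y
p_Y = defaultdict(float)
for a, p in p_X.items():
    p_Y[a * a] += p
rhs = sum(b * p for b, p in p_Y.items())

print(lhs, rhs)                               # both equal 3/5

# Corollary 7: E(2X + 3) = 2 E(X) + 3
print(expect(lambda x: 2 * x + 3, p_X), 2 * expect(lambda x: x, p_X) + 3)
```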

11 Variance

Expectation measures the center of mass of a density or pmf. Variance is a measure of how spread out the density or pmf of X is. The random variable $Y = (X - E(X))^2$ gives the squared distance of X to its mean value; this measures how far X is from its center of mass. Taking the expectation of Y gives the variance of X:
$$V(X) = E\big[(X - E(X))^2\big] = \begin{cases} \sum_a (a - E(X))^2\, p_X(a) & \text{if X is discrete with pmf } p_X, \\ \int_{-\infty}^{\infty} (t - E(X))^2\, f_X(t)\,dt & \text{if X is continuous with pdf } f_X. \end{cases}$$

The following is a useful way to compute the variance.

Proposition 8. For a random variable X,
$$V(X) = E(X^2) - [E(X)]^2.$$

Proof. The proof is similar in the continuous and discrete cases; we show here the discrete case:

$$V(X) = \sum_a (a - E(X))^2\, p_X(a) = \sum_a \big(a^2 - 2a\,E(X) + [E(X)]^2\big)\, p_X(a)$$
$$= \sum_a a^2\, p_X(a) - 2E(X) \sum_a a\, p_X(a) + [E(X)]^2 \sum_a p_X(a) = E(X^2) - 2E(X)\,E(X) + [E(X)]^2 = E(X^2) - [E(X)]^2.$$
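
Finally, a brief numerical check of Proposition 8 (not part of the notes), again using the three-point pmf from Section 10: the variance computed from the definition agrees with $E(X^2) - [E(X)]^2$.

```python
p_X = {-1: 2/5, 0: 2/5, 1: 1/5}        # the pmf used in Section 10

mean = sum(a * p for a, p in p_X.items())                       # E(X) = -1/5

# Variance from the definition: E[(X - E(X))^2]
var_def = sum((a - mean) ** 2 * p for a, p in p_X.items())

# Variance from Proposition 8: E(X^2) - [E(X)]^2
second_moment = sum(a * a * p for a, p in p_X.items())          # E(X^2) = 3/5
var_prop8 = second_moment - mean ** 2

print(var_def, var_prop8)              # both equal 3/5 - 1/25 = 14/25 = 0.56
```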
