M266 - Probability and Statistics - Exam 2 Review

Markov's and Chebyshev's Inequalities

Markov's Inequality: If X is a random variable that takes on only nonnegative values, then for any value a > 0,
$$P\{X \ge a\} \le \frac{E[X]}{a}$$

Chebyshev's Inequality: If X is a random variable with finite mean $\mu$ and variance $\sigma^2$, then for any value k > 0,
$$P\{|X - \mu| \ge k\} \le \frac{\sigma^2}{k^2}$$

Moment-Generating Function

Moment-Generating Function: Let X be a random variable. Then the moment-generating function $M_X(t)$ of X is defined by
$$M_X(t) := E[e^{tX}]$$

Computing the n-th moment of a random variable using the moment-generating function:
$$E[X^n] = M_X^{(n)}(0)$$

Theorem 4.10: If a and b are constants, then
(a) $M_{X+a}(t) = e^{at} M_X(t)$
(b) $M_{bX}(t) = M_X(bt)$
(c) $M_{\frac{X+a}{b}}(t) = e^{\frac{a}{b}t} M_X\left(\frac{t}{b}\right)$

Uniqueness of Moment-Generating Functions: Let X and Y be two random variables with moment-generating functions $M_X(t)$ and $M_Y(t)$. If for some $\delta > 0$, $M_X(t) = M_Y(t)$ for $t \in (-\delta, \delta)$, then X and Y have the same distribution.

Discrete Uniform Distribution

Description: A random variable X that takes on k different values with equal probability is called a discrete uniform random variable.

Probability Mass Function: If X is a discrete uniform random variable, then its probability distribution is given by
$$f(x) = \frac{1}{k} \quad \text{for } x = x_1, x_2, \ldots, x_k$$

Mean and Variance: Let X be a discrete uniform random variable over the set $\{1, 2, \ldots, k\}$. Then the mean and variance of X are
$$\mu = \frac{k+1}{2} \quad \text{and} \quad \sigma^2 = \frac{k^2 - 1}{12}$$
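As a quick sanity check of the discrete uniform formulas and of Chebyshev's inequality, the following Python sketch (illustrative only, not part of the course material) computes the mean and variance of X uniform on {1, ..., k} directly from the pmf, using exact rational arithmetic, and verifies a tail bound:

```python
from fractions import Fraction

def discrete_uniform_moments(k):
    # Mean and variance of X uniform on {1, ..., k}, straight from the pmf f(x) = 1/k.
    xs = range(1, k + 1)
    mean = Fraction(sum(xs), k)
    var = sum((Fraction(x) - mean) ** 2 for x in xs) / k
    return mean, var

mu, var = discrete_uniform_moments(10)
# Closed forms from the sheet: mu = (k+1)/2 = 11/2, sigma^2 = (k^2-1)/12 = 99/12
assert mu == Fraction(11, 2)
assert var == Fraction(99, 12)

# Chebyshev with k = 3: P(|X - mu| >= 3) must be at most sigma^2 / 9
p_tail = Fraction(sum(1 for x in range(1, 11) if abs(Fraction(x) - mu) >= 3), 10)
assert p_tail <= var / 9
```

The exact tail probability here is 2/5, comfortably below the Chebyshev bound of 11/12.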
Moment-Generating Function: Let X be a discrete uniform random variable over the set $\{1, 2, \ldots, k\}$. Then the moment-generating function of X is
$$M_X(t) = \frac{e^t(1 - e^{kt})}{k(1 - e^t)}$$

Bernoulli Distribution

Description: Suppose a random variable X has only two possible outcomes, where we call the first outcome a success and the second a failure. If we let X = 1 when the outcome is a success and X = 0 when the outcome is a failure, with $P(X = 1) = \theta$ and $P(X = 0) = 1 - \theta$, where $\theta$, $0 \le \theta \le 1$, is the probability of success, then X is called a Bernoulli random variable with parameter $\theta$.

Probability Mass Function: Let X be a Bernoulli random variable with parameter $\theta$. Then its probability distribution is given by
$$f(x; \theta) = \theta^x (1 - \theta)^{1 - x} \quad \text{for } x = 0, 1$$

Expectation and Variance: Let X be a Bernoulli random variable with parameter $\theta$. Then its expectation and variance are given by
$$E[X] = \theta \quad \text{and} \quad \mathrm{var}(X) = \theta(1 - \theta)$$

Moment-Generating Function: Let X be a Bernoulli random variable with parameter $\theta$. Then its moment-generating function is given by
$$M_X(t) = 1 + \theta(e^t - 1)$$

Binomial Distribution

Description: Suppose you perform n independent Bernoulli trials with parameter $\theta$. If X represents the number of successes that occur in the n trials, then X is called a binomial random variable with parameters $(n, \theta)$.

Probability Mass Function: Let X be a binomial random variable with parameters $(n, \theta)$. Then the probability distribution of X is
$$f(x; n, \theta) = P(X = x) = \binom{n}{x} \theta^x (1 - \theta)^{n - x} \quad x = 0, 1, \ldots, n$$

Binomial Theorem:
$$(x + y)^n = \sum_{k=0}^{n} \binom{n}{k} x^k y^{n - k}$$

Expectation and Variance: If X is a binomial random variable with parameters $(n, \theta)$, then
$$E[X] = n\theta \quad \text{and} \quad \mathrm{var}(X) = n\theta(1 - \theta)$$
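The binomial moment formulas can be confirmed numerically by summing against the pmf. This Python sketch (an illustration with arbitrarily chosen n and θ, not course code) computes $E[X]$ and $\mathrm{var}(X)$ from the definition and compares them to the closed forms:

```python
from math import comb

def binom_pmf(x, n, theta):
    # f(x; n, theta) = C(n, x) * theta^x * (1 - theta)^(n - x)
    return comb(n, x) * theta**x * (1 - theta) ** (n - x)

n, theta = 12, 0.3
mean = sum(x * binom_pmf(x, n, theta) for x in range(n + 1))
second = sum(x * x * binom_pmf(x, n, theta) for x in range(n + 1))
var = second - mean**2

# Closed forms: E[X] = n*theta and var(X) = n*theta*(1 - theta)
assert abs(mean - n * theta) < 1e-12
assert abs(var - n * theta * (1 - theta)) < 1e-12
```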
Moment-Generating Function: If X is a binomial random variable with parameters $(n, \theta)$, then
$$M_X(t) = [1 + \theta(e^t - 1)]^n$$

Negative Binomial Distribution

Description: Suppose you perform a sequence of independent Bernoulli trials with probability of success $\theta$. If X represents the number of trials it takes to attain k successes, then X is called a negative binomial random variable with parameters $\theta$ and k.

Probability Mass Function: Let X be a negative binomial random variable with parameters $\theta$ and k. Then the probability distribution of X is given by
$$f(x; k, \theta) = \binom{x - 1}{k - 1} \theta^k (1 - \theta)^{x - k} \quad x = k, k + 1, k + 2, \ldots$$

Expectation and Variance: Let X be a negative binomial random variable with parameters $\theta$ and k. Then the expectation and variance of X are
$$E[X] = \frac{k}{\theta} \quad \text{and} \quad \mathrm{var}(X) = \frac{k}{\theta}\left(\frac{1}{\theta} - 1\right)$$

Moment-Generating Function: Let X be a negative binomial random variable with parameters $\theta$ and k. Then the moment-generating function of X is
$$M_X(t) = \left[\frac{\theta e^t}{1 - (1 - \theta)e^t}\right]^k$$

Geometric Distribution: A negative binomial random variable with k = 1 (i.e. the number of trials until the first success occurs) is called a geometric random variable.

Poisson Distribution

Description: Let X be a binomial random variable with parameters $(n, \theta)$ such that n is large and $n\theta$ is moderate. Then X can be approximated by a Poisson random variable with parameter $\lambda = n\theta = E[X]$.

Probability Mass Function: Let X be a Poisson random variable with parameter $\lambda$. Then its probability distribution is given by
$$f(x; \lambda) = e^{-\lambda} \frac{\lambda^x}{x!} \quad x = 0, 1, 2, \ldots$$

Expectation and Variance: Let X be a Poisson random variable with parameter $\lambda$. Then
$$E[X] = \lambda \quad \text{and} \quad \mathrm{var}(X) = \lambda$$

Moment-Generating Function: Let X be a Poisson random variable with parameter $\lambda$. Then
$$M_X(t) = e^{\lambda(e^t - 1)}$$
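The Poisson approximation to the binomial can be seen numerically. In this Python sketch (the particular n, θ, and error tolerance are illustrative choices, not from the course), n is large and $n\theta = 4$ is moderate, and the two pmfs agree closely point by point:

```python
from math import comb, exp, factorial

def binom_pmf(x, n, theta):
    return comb(n, x) * theta**x * (1 - theta) ** (n - x)

def poisson_pmf(x, lam):
    # f(x; lambda) = e^{-lambda} * lambda^x / x!
    return exp(-lam) * lam**x / factorial(x)

# Large n, moderate n*theta: Poisson(lambda = n*theta) approximates Binomial(n, theta)
n, theta = 1000, 0.004          # lambda = 4
lam = n * theta
worst = max(abs(binom_pmf(x, n, theta) - poisson_pmf(x, lam)) for x in range(20))
assert worst < 5e-3             # pmfs match to within a few thousandths everywhere
```

The approximation sharpens as n grows with $\lambda = n\theta$ held fixed.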
The Uniform Distribution

Probability Density: If X is a uniform random variable over the interval $[\alpha, \beta]$, then its probability density is given by
$$f(x; \alpha, \beta) = \frac{1}{\beta - \alpha} \quad \alpha < x < \beta$$

Mean and Variance: Let X be a uniform random variable over the interval $[\alpha, \beta]$. Then the mean and variance of X are
$$E[X] = \frac{\beta + \alpha}{2} \quad \text{and} \quad \mathrm{var}(X) = \frac{(\beta - \alpha)^2}{12}$$

Moment-Generating Function: Let X be a uniform random variable over the interval $[\alpha, \beta]$. Then the moment-generating function of X is
$$M_X(t) = \frac{e^{t\beta} - e^{t\alpha}}{t(\beta - \alpha)}$$

The Gamma Distribution

Probability Density: If X is a gamma random variable with parameters $\alpha$ and $\beta$, then its probability density is given by
$$f(x; \alpha, \beta) = \frac{1}{\beta^\alpha \Gamma(\alpha)} x^{\alpha - 1} e^{-x/\beta} \quad \text{for } x > 0$$
where $\alpha > 0$ and $\beta > 0$, and $\Gamma$ is the gamma function defined by
$$\Gamma(\alpha) = \int_0^\infty y^{\alpha - 1} e^{-y}\, dy \quad \text{for } \alpha > 0$$

Expectation and Variance: Let X be a gamma random variable with parameters $\alpha$ and $\beta$. Then its expectation and variance are given by
$$E[X] = \alpha\beta \quad \text{and} \quad \mathrm{var}(X) = \alpha\beta^2$$

Moment-Generating Function: Let X be a gamma random variable with parameters $\alpha$ and $\beta$. Then its moment-generating function is given by
$$M_X(t) = (1 - \beta t)^{-\alpha}$$

The Exponential Distribution (the gamma distribution with $\alpha = 1$ and $\beta = \theta$)

Probability Density: If X is an exponential random variable with parameter $\theta$, then its probability density is given by
$$f(x; \theta) = \frac{1}{\theta} e^{-x/\theta} \quad \text{for } x > 0$$
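The gamma moment formulas can be checked by integrating against the density. This Python sketch (a crude Riemann sum with illustrative parameters, not a course-supplied method) verifies $E[X] = \alpha\beta$ and $\mathrm{var}(X) = \alpha\beta^2$ for one choice of parameters:

```python
from math import gamma as Gamma, exp

def gamma_pdf(x, alpha, beta):
    # f(x; alpha, beta) = x^{alpha-1} e^{-x/beta} / (beta^alpha * Gamma(alpha))
    return x ** (alpha - 1) * exp(-x / beta) / (beta**alpha * Gamma(alpha))

# Crude numerical check of E[X] = alpha*beta and var(X) = alpha*beta^2
alpha, beta = 3.0, 2.0
dx = 0.001
xs = [i * dx for i in range(1, 60000)]          # integrate over (0, 60]; the tail beyond is negligible
mean = sum(x * gamma_pdf(x, alpha, beta) * dx for x in xs)
second = sum(x * x * gamma_pdf(x, alpha, beta) * dx for x in xs)
assert abs(mean - alpha * beta) < 1e-2                      # E[X] = 6
assert abs(second - mean**2 - alpha * beta**2) < 1e-2       # var(X) = 12
```

Setting $\alpha = 1$ and $\beta = \theta$ reduces the same density to the exponential case below.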
Expectation and Variance: Let X be an exponential random variable with parameter $\theta$. Then its expectation and variance are given by
$$E[X] = \theta \quad \text{and} \quad \mathrm{var}(X) = \theta^2$$

Moment-Generating Function: Let X be an exponential random variable with parameter $\theta$. Then its moment-generating function is given by
$$M_X(t) = (1 - \theta t)^{-1}$$

The Chi-Square Distribution (the gamma distribution with $\alpha = \nu/2$ and $\beta = 2$)

Probability Density: If X is a chi-square random variable with $\nu$ degrees of freedom, then its probability density is given by
$$f(x; \nu) = \frac{1}{2^{\nu/2} \Gamma(\nu/2)} x^{\frac{\nu - 2}{2}} e^{-x/2} \quad \text{for } x > 0$$

Expectation and Variance: Let X be a chi-square random variable with $\nu$ degrees of freedom. Then its expectation and variance are given by
$$E[X] = \nu \quad \text{and} \quad \mathrm{var}(X) = 2\nu$$

Moment-Generating Function: Let X be a chi-square random variable with $\nu$ degrees of freedom. Then its moment-generating function is given by
$$M_X(t) = (1 - 2t)^{-\nu/2}$$

The Normal Distribution

Probability Density: If X is a normal random variable with parameters $\mu$ and $\sigma^2$, then its probability density is given by
$$f(x; \mu, \sigma) = \frac{1}{\sqrt{2\pi}\,\sigma} e^{-(x - \mu)^2/2\sigma^2} \quad -\infty < x < \infty$$

Expectation and Variance: Let X be a normal random variable with parameters $\mu$ and $\sigma^2$. Then its expectation and variance are given by
$$E[X] = \mu \quad \text{and} \quad \mathrm{var}(X) = \sigma^2$$

Moment-Generating Function: Let X be a normal random variable with parameters $\mu$ and $\sigma^2$. Then its moment-generating function is given by
$$M_X(t) = e^{\mu t + \frac{1}{2}\sigma^2 t^2}$$

Standard Normal Random Variable: A normal random variable with $\mu = 0$ and $\sigma^2 = 1$.

Standardizing the Normal Random Variable: If X is a normal random variable with parameters $\mu$ and $\sigma^2$, then
$$Z := \frac{X - \mu}{\sigma}$$
is a standard normal random variable.
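Standardizing is how normal probabilities are computed in practice: convert to Z, then use the standard normal cdf $\Phi$. A short Python sketch (the values $\mu = 100$, $\sigma = 15$ are arbitrary examples) shows this, with $\Phi$ expressed through the error function:

```python
from math import erf, sqrt

def std_normal_cdf(z):
    # Phi(z) for the standard normal, via the error function
    return 0.5 * (1 + erf(z / sqrt(2)))

# If X ~ N(mu, sigma^2), standardize Z = (X - mu)/sigma before looking up Phi
mu, sigma = 100.0, 15.0
z_lo = (85.0 - mu) / sigma      # -1
z_hi = (115.0 - mu) / sigma     # +1
p = std_normal_cdf(z_hi) - std_normal_cdf(z_lo)
assert abs(p - 0.6827) < 1e-3   # P(mu - sigma < X < mu + sigma), the familiar 68% rule
```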
Random Samples, Sample Mean, Sample Variance

Random Sample from an Infinite Population: If $X_1, X_2, \ldots, X_n$ are independent and identically distributed (i.i.d.) random variables, we say that they constitute a random sample from the infinite population given by their common distribution.

Sample Mean and Variance: If $X_1, X_2, \ldots, X_n$ constitute a random sample, then the sample mean and the sample variance are defined by
$$\bar{X} := \frac{1}{n}\sum_{i=1}^{n} X_i \quad \text{and} \quad S^2 := \frac{1}{n - 1}\sum_{i=1}^{n} (X_i - \bar{X})^2$$

Distribution of the Sample Mean

Expectation and Variance: If $X_1, X_2, \ldots, X_n$ constitute a random sample from an infinite population with mean $\mu$ and variance $\sigma^2$, then
$$E[\bar{X}] = \mu \quad \text{and} \quad \mathrm{var}(\bar{X}) = \frac{\sigma^2}{n}$$

Law of Large Numbers: For any positive constant c, the probability that $\bar{X}$ will take on a value between $\mu - c$ and $\mu + c$ is at least
$$1 - \frac{\sigma^2}{nc^2}$$
As $n \to \infty$, this probability approaches 1.

Central Limit Theorem: If $X_1, X_2, \ldots, X_n$ constitute a random sample from an infinite population with mean $\mu$ and variance $\sigma^2$, then the limiting distribution of
$$Z = \frac{\bar{X} - \mu}{\sigma/\sqrt{n}}$$
as $n \to \infty$ is the standard normal distribution.

Sampling from a Normal Population: If $X_1, X_2, \ldots, X_n$ constitute a random sample from a normal population with mean $\mu$ and variance $\sigma^2$, then the sample mean $\bar{X}$ has a normal distribution with mean $\mu$ and variance $\sigma^2/n$.

Distribution of the Sample Variance: If $\bar{X}$ and $S^2$ are the mean and the variance of a random sample of size n from a normal population with mean $\mu$ and standard deviation $\sigma$, then
(a) $\bar{X}$ and $S^2$ are independent, and
(b) the random variable $\frac{(n - 1)S^2}{\sigma^2}$ has a chi-square distribution with $n - 1$ degrees of freedom.
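A Monte Carlo sketch in Python (population parameters and replication counts are arbitrary choices for illustration) makes $E[\bar{X}] = \mu$ and $\mathrm{var}(\bar{X}) = \sigma^2/n$ concrete: draw many samples of size n from a normal population and look at the spread of the resulting sample means.

```python
import random
import statistics

random.seed(0)
mu, sigma, n = 5.0, 2.0, 25
# Draw 20000 random samples of size n from N(mu, sigma^2) and record each sample mean
means = [statistics.fmean(random.gauss(mu, sigma) for _ in range(n))
         for _ in range(20000)]

# E[Xbar] = mu and var(Xbar) = sigma^2 / n = 4/25 = 0.16
assert abs(statistics.fmean(means) - mu) < 0.05
assert abs(statistics.variance(means) - sigma**2 / n) < 0.02
```

Note that `statistics.variance` divides by $n - 1$, matching the definition of $S^2$ above.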
Distribution of the Sample Mean with Unknown Variance: If $\bar{X}$ and $S^2$ are the mean and variance of a random sample of size n from a normal population with mean $\mu$ and variance $\sigma^2$, then
$$T = \frac{\bar{X} - \mu}{S/\sqrt{n}}$$
has the t distribution with $n - 1$ degrees of freedom.

Distribution of the Ratio of Sample Variances: If $S_1^2$ and $S_2^2$ are the variances of independent random samples of sizes $n_1$ and $n_2$ from normal populations with variances $\sigma_1^2$ and $\sigma_2^2$, then
$$F = \frac{S_1^2/\sigma_1^2}{S_2^2/\sigma_2^2} = \frac{\sigma_2^2 S_1^2}{\sigma_1^2 S_2^2}$$
is a random variable having an F distribution with $n_1 - 1$ and $n_2 - 1$ degrees of freedom.
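The T statistic can be checked by simulation. In this Python sketch (sample size, replication count, and the tabulated critical value 2.776 for 4 degrees of freedom are illustrative inputs, not derived in the course notes), about 5% of simulated T values should exceed the two-sided 5% critical point:

```python
import random
import statistics
from math import sqrt

random.seed(1)
mu, sigma, n = 0.0, 1.0, 5
reps = 40000
hits = 0
for _ in range(reps):
    sample = [random.gauss(mu, sigma) for _ in range(n)]
    xbar = statistics.fmean(sample)
    s = statistics.stdev(sample)        # uses the n-1 divisor, matching S
    t = (xbar - mu) / (s / sqrt(n))     # T = (Xbar - mu) / (S / sqrt(n))
    if abs(t) > 2.776:                  # tabulated t_{0.025} with n-1 = 4 degrees of freedom
        hits += 1
frac = hits / reps
assert abs(frac - 0.05) < 0.01         # P(|T| > 2.776) is approximately 0.05
```

Using the normal critical value 1.96 instead would reject far more than 5% of the time here, which is exactly why the heavier-tailed t distribution is needed when $\sigma$ is replaced by S.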