M266 - Probability and Statistics - Exam 2 Review


Markov and Chebyshev's Inequalities

Markov's Inequality: If X is a random variable that takes on only nonnegative values, then for any value a > 0,

$$P\{X \ge a\} \le \frac{E[X]}{a}$$

Chebyshev's Inequality: If X is a random variable with finite mean $\mu$ and variance $\sigma^2$, then for any value k > 0,

$$P\{|X - \mu| \ge k\} \le \frac{\sigma^2}{k^2}$$

Moment-Generating Function

Moment-Generating Function: Let X be a random variable. Then the moment-generating function $M_X(t)$ of X is defined by

$$M_X(t) := E[e^{tX}]$$

Computing the n-th moment of a random variable using the moment-generating function:

$$E[X^n] = M_X^{(n)}(0)$$

Theorem 4.10: If a and b are constants, then

(a) $M_{X+a}(t) = e^{at} M_X(t)$

(b) $M_{bX}(t) = M_X(bt)$

(c) $M_{\frac{X+a}{b}}(t) = e^{\frac{a}{b}t} M_X\!\left(\frac{t}{b}\right)$

Uniqueness of Moment-Generating Functions: Let X and Y be two random variables with moment-generating functions $M_X(t)$ and $M_Y(t)$. If for some $\delta > 0$, $M_X(t) = M_Y(t)$ for all $t \in (-\delta, \delta)$, then X and Y have the same distribution.

Discrete Uniform Distribution

Description: A random variable X that takes on k different values with equal probability is called a discrete uniform random variable.

Probability Mass Function: If X is a discrete uniform random variable, then its probability distribution is given by

$$f(x) = \frac{1}{k} \quad \text{for } x = x_1, x_2, \ldots, x_k$$

Mean and Variance: Let X be a discrete uniform random variable over the set {1, 2, ..., k}. Then the mean and variance of X are

$$\mu = \frac{k+1}{2} \quad \text{and} \quad \sigma^2 = \frac{k^2 - 1}{12}$$
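As a quick sanity check of the moment relation $E[X^n] = M_X^{(n)}(0)$ and of the mean and variance formulas just stated, here is a minimal sketch using sympy (an assumption: any computer algebra system would do) for the discrete uniform distribution on {1, ..., 6}:

    import sympy as sp

    t = sp.symbols('t')
    k = 6  # hypothetical choice: a fair die
    # M_X(t) = E[e^{tX}] for X uniform on {1, ..., k}
    M = sp.Rational(1, k) * sum(sp.exp(t * x) for x in range(1, k + 1))

    m1 = sp.diff(M, t, 1).subs(t, 0)  # E[X]   = M'(0)
    m2 = sp.diff(M, t, 2).subs(t, 0)  # E[X^2] = M''(0)

    print(m1, sp.Rational(k + 1, 2))              # 7/2 and 7/2
    print(m2 - m1**2, sp.Rational(k**2 - 1, 12))  # 35/12 and 35/12

Both outputs agree with $\mu = (k+1)/2$ and $\sigma^2 = (k^2-1)/12$.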

Moment-Generating Function: Let X be a discrete uniform random variable over the set {1, 2, ..., k}. Then the moment-generating function of X is

$$M_X(t) = \frac{e^t(1 - e^{kt})}{k(1 - e^t)}$$

Bernoulli Distribution

Description: Suppose a random variable X has only two possible outcomes, where we call the first outcome a success and the second a failure. If we let X = 1 when the outcome is a success and X = 0 when the outcome is a failure, with $P(X = 1) = \theta$ and $P(X = 0) = 1 - \theta$, where $\theta$, $0 \le \theta \le 1$, is the probability of success, then X is called a Bernoulli random variable with parameter $\theta$.

Probability Mass Function: Let X be a Bernoulli random variable with parameter $\theta$. Then its probability distribution is given by

$$f(x; \theta) = \theta^x (1 - \theta)^{1-x} \quad \text{for } x = 0, 1$$

Expectation and Variance: Let X be a Bernoulli random variable with parameter $\theta$. Then its expectation and variance are given by

$$E[X] = \theta \quad \text{and} \quad \text{var}(X) = \theta(1 - \theta)$$

Moment-Generating Function: Let X be a Bernoulli random variable with parameter $\theta$. Then its moment-generating function is given by

$$M_X(t) = 1 + \theta(e^t - 1)$$

Binomial Distribution

Description: Suppose you perform n independent Bernoulli trials with parameter $\theta$. If X represents the number of successes that occur in the n trials, then X is called a binomial random variable with parameters $(n, \theta)$.

Probability Mass Function: Let X be a binomial random variable with parameters $(n, \theta)$. Then the probability distribution of X is

$$f(x; n, \theta) = P(X = x) = \binom{n}{x} \theta^x (1 - \theta)^{n-x} \quad \text{for } x = 0, 1, \ldots, n$$

Binomial Theorem:

$$(x + y)^n = \sum_{k=0}^{n} \binom{n}{k} x^k y^{n-k}$$

Expectation and Variance: If X is a binomial random variable with parameters $(n, \theta)$, then

$$E[X] = n\theta \quad \text{and} \quad \text{var}(X) = n\theta(1 - \theta)$$
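To connect the Bernoulli and binomial entries above, a short simulation sketch (assuming numpy and scipy are available; the values n = 10, $\theta$ = 0.3 are arbitrary) checks that the sum of n independent Bernoulli($\theta$) trials behaves like a Binomial(n, $\theta$) variable:

    import numpy as np
    from scipy import stats

    n, theta = 10, 0.3  # hypothetical parameters
    rng = np.random.default_rng(0)
    # each row: n Bernoulli(theta) trials; row sums are Binomial(n, theta)
    sums = rng.binomial(1, theta, size=(100_000, n)).sum(axis=1)

    print(sums.mean(), n * theta)               # ~3.0 vs 3.0
    print(sums.var(), n * theta * (1 - theta))  # ~2.1 vs 2.1
    print(np.mean(sums == 4), stats.binom.pmf(4, n, theta))  # empirical vs exact P(X = 4)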

Moment-Generating Function: If X is a binomial random variable with parameters $(n, \theta)$, then

$$M_X(t) = [1 + \theta(e^t - 1)]^n$$

Negative Binomial Distribution

Description: Suppose you perform a sequence of independent Bernoulli trials with probability of success $\theta$. If X represents the number of trials it takes to attain k successes, then X is called a negative binomial random variable with parameters $\theta$ and k.

Probability Mass Function: Let X be a negative binomial random variable with parameters $\theta$ and k. Then the probability distribution of X is given by

$$f(x; k, \theta) = \binom{x-1}{k-1} \theta^k (1 - \theta)^{x-k} \quad \text{for } x = k, k+1, k+2, \ldots$$

Expectation and Variance: Let X be a negative binomial random variable with parameters $\theta$ and k. Then the expectation and variance of X are

$$E[X] = \frac{k}{\theta} \quad \text{and} \quad \text{var}(X) = \frac{k}{\theta}\left(\frac{1}{\theta} - 1\right)$$

Moment-Generating Function: Let X be a negative binomial random variable with parameters $\theta$ and k. Then the moment-generating function of X is

$$M_X(t) = \left[\frac{\theta e^t}{1 - (1 - \theta)e^t}\right]^k$$

Geometric Distribution: A negative binomial random variable with k = 1 (i.e., the number of trials until the first success occurs) is called a geometric random variable.

Poisson Distribution

Description: Let X be a binomial random variable with parameters $(n, \theta)$ such that n is large and $n\theta$ is moderate. Then X can be approximated by a Poisson random variable with parameter $\lambda = n\theta = E[X]$.

Probability Mass Function: Let X be a Poisson random variable with parameter $\lambda$. Then its probability distribution is given by

$$f(x; \lambda) = e^{-\lambda} \frac{\lambda^x}{x!} \quad \text{for } x = 0, 1, 2, \ldots$$

Expectation and Variance: Let X be a Poisson random variable with parameter $\lambda$. Then

$$E[X] = \lambda \quad \text{and} \quad \text{var}(X) = \lambda$$

Moment-Generating Function: Let X be a Poisson random variable with parameter $\lambda$. Then

$$M_X(t) = e^{\lambda(e^t - 1)}$$
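The Poisson description above is an approximation statement, and it is easy to see it numerically. The sketch below (scipy assumed; the hypothetical choice n = 1000, $\theta$ = 0.005 gives $\lambda = n\theta = 5$) compares the two probability mass functions directly:

    from scipy import stats

    n, theta = 1000, 0.005  # hypothetical: large n, moderate n*theta
    lam = n * theta         # Poisson parameter lambda = 5
    for x in range(10):
        # binomial pmf vs its Poisson approximation at each x
        print(x, stats.binom.pmf(x, n, theta), stats.poisson.pmf(x, lam))

The two columns agree to about three decimal places, which is the point of the approximation.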

The Uniform Distribution

Probability Density: If X is a uniform random variable over the interval $[\alpha, \beta]$, then its probability density is given by

$$f(x; \alpha, \beta) = \frac{1}{\beta - \alpha} \quad \text{for } \alpha < x < \beta$$

Mean and Variance: Let X be a uniform random variable over the interval $[\alpha, \beta]$. Then the mean and variance of X are

$$E[X] = \frac{\beta + \alpha}{2} \quad \text{and} \quad \text{var}(X) = \frac{(\beta - \alpha)^2}{12}$$

Moment-Generating Function: Let X be a uniform random variable over the interval $[\alpha, \beta]$. Then the moment-generating function of X is

$$M_X(t) = \frac{e^{t\beta} - e^{t\alpha}}{t(\beta - \alpha)}$$

The Gamma Distribution

Probability Density: If X is a gamma random variable with parameters $\alpha$ and $\beta$, then its probability density is given by

$$f(x; \alpha, \beta) = \frac{1}{\beta^\alpha \Gamma(\alpha)} x^{\alpha - 1} e^{-x/\beta} \quad \text{for } x > 0$$

where $\alpha > 0$ and $\beta > 0$, and $\Gamma$ is the gamma function defined by

$$\Gamma(\alpha) = \int_0^\infty y^{\alpha - 1} e^{-y}\, dy \quad \text{for } \alpha > 0$$

Expectation and Variance: Let X be a gamma random variable with parameters $\alpha$ and $\beta$. Then its expectation and variance are given by

$$E[X] = \alpha\beta \quad \text{and} \quad \text{var}(X) = \alpha\beta^2$$

Moment-Generating Function: Let X be a gamma random variable with parameters $\alpha$ and $\beta$. Then its moment-generating function is given by

$$M_X(t) = (1 - \beta t)^{-\alpha}$$

The Exponential Distribution (gamma distribution with $\alpha = 1$ and $\beta = \theta$)

Probability Density: If X is an exponential random variable with parameter $\theta$, then its probability density is given by

$$f(x; \theta) = \frac{1}{\theta} e^{-x/\theta} \quad \text{for } x > 0$$
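One practical caveat: software parameterizations of the gamma family differ, so it is worth checking which convention a library uses. In scipy, the density above corresponds to shape a = $\alpha$ and scale = $\beta$. A small check of $E[X] = \alpha\beta$ and $\text{var}(X) = \alpha\beta^2$ (with the arbitrary choice $\alpha$ = 3, $\beta$ = 2):

    from scipy import stats

    alpha, beta = 3.0, 2.0            # hypothetical parameters
    X = stats.gamma(a=alpha, scale=beta)  # scipy: shape `a`, scale = beta
    print(X.mean(), alpha * beta)     # 6.0 and 6.0
    print(X.var(), alpha * beta**2)   # 12.0 and 12.0
    print(X.pdf(1.0))                 # density at x = 1

The same check with $\alpha$ = 1 and scale = $\theta$ covers the exponential special case above.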

Expectation and Variance: Let X be an exponential random variable with parameter $\theta$. Then its expectation and variance are given by

$$E[X] = \theta \quad \text{and} \quad \text{var}(X) = \theta^2$$

Moment-Generating Function: Let X be an exponential random variable with parameter $\theta$. Then its moment-generating function is given by

$$M_X(t) = (1 - \theta t)^{-1}$$

The Chi-Square Distribution (gamma distribution with $\alpha = \nu/2$ and $\beta = 2$)

Probability Density: If X is a chi-square random variable with $\nu$ degrees of freedom, then its probability density is given by

$$f(x; \nu) = \frac{1}{2^{\nu/2} \Gamma(\nu/2)} x^{\frac{\nu - 2}{2}} e^{-x/2} \quad \text{for } x > 0$$

Expectation and Variance: Let X be a chi-square random variable with $\nu$ degrees of freedom. Then its expectation and variance are given by

$$E[X] = \nu \quad \text{and} \quad \text{var}(X) = 2\nu$$

Moment-Generating Function: Let X be a chi-square random variable with $\nu$ degrees of freedom. Then its moment-generating function is given by

$$M_X(t) = (1 - 2t)^{-\nu/2}$$

The Normal Distribution

Probability Density: If X is a normal random variable with parameters $\mu$ and $\sigma^2$, then its probability density is given by

$$f(x; \mu, \sigma) = \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-(x - \mu)^2 / 2\sigma^2} \quad \text{for } -\infty < x < \infty$$

Expectation and Variance: Let X be a normal random variable with parameters $\mu$ and $\sigma^2$. Then its expectation and variance are given by

$$E[X] = \mu \quad \text{and} \quad \text{var}(X) = \sigma^2$$

Moment-Generating Function: Let X be a normal random variable with parameters $\mu$ and $\sigma^2$. Then its moment-generating function is given by

$$M_X(t) = e^{\mu t + \frac{1}{2}\sigma^2 t^2}$$

Standard Normal Random Variable: A normal random variable with $\mu = 0$ and $\sigma^2 = 1$.

Standardizing the Normal Random Variable: If X is a normal random variable with parameters $\mu$ and $\sigma^2$, then

$$Z := \frac{X - \mu}{\sigma}$$

is a standard normal random variable.
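Standardization is how normal probabilities are computed in practice: $P(X \le x) = \Phi\!\left(\frac{x - \mu}{\sigma}\right)$, where $\Phi$ is the standard normal CDF. A minimal check (scipy assumed; the values $\mu$ = 10, $\sigma$ = 2, x = 13 are arbitrary):

    from scipy import stats

    mu, sigma, x = 10.0, 2.0, 13.0  # hypothetical parameters
    z = (x - mu) / sigma            # standardize: Z = (X - mu)/sigma
    print(stats.norm.cdf(x, loc=mu, scale=sigma))  # P(X <= 13), about 0.9332
    print(stats.norm.cdf(z))                       # Phi(1.5), the same value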

Random Samples, Sample Mean, Sample Variance

Random Sample from an Infinite Population: If $X_1, X_2, \ldots, X_n$ are independent and identically distributed (i.i.d.) random variables, we say that they constitute a random sample from the infinite population given by their common distribution.

Sample Mean and Variance: If $X_1, X_2, \ldots, X_n$ constitute a random sample, then the sample mean and the sample variance are defined by

$$\bar{X} := \frac{\sum_{i=1}^{n} X_i}{n} \quad \text{and} \quad S^2 := \frac{\sum_{i=1}^{n} (X_i - \bar{X})^2}{n - 1}$$

Distribution of the Sample Mean

Expectation and Variance: If $X_1, X_2, \ldots, X_n$ constitute a random sample from an infinite population with mean $\mu$ and variance $\sigma^2$, then

$$E[\bar{X}] = \mu \quad \text{and} \quad \text{var}(\bar{X}) = \frac{\sigma^2}{n}$$

Law of Large Numbers: For any positive constant c, the probability that $\bar{X}$ will take on a value between $\mu - c$ and $\mu + c$ is at least

$$1 - \frac{\sigma^2}{nc^2}$$

As $n \to \infty$, this probability approaches 1.

Central Limit Theorem: If $X_1, X_2, \ldots, X_n$ constitute a random sample from an infinite population with mean $\mu$ and variance $\sigma^2$, then the limiting distribution of

$$Z = \frac{\bar{X} - \mu}{\sigma/\sqrt{n}}$$

as $n \to \infty$ is the standard normal distribution.

Sampling from a Normal Population: If $X_1, X_2, \ldots, X_n$ constitute a random sample from a normal population with mean $\mu$ and variance $\sigma^2$, then the sample mean $\bar{X}$ has a normal distribution with mean $\mu$ and variance $\sigma^2/n$.

Distribution of the Sample Variance

If $\bar{X}$ and $S^2$ are the mean and the variance of a random sample of size n from a normal population with mean $\mu$ and standard deviation $\sigma$, then

(a) $\bar{X}$ and $S^2$ are independent, and

(b) the random variable $\frac{(n-1)S^2}{\sigma^2}$ has a chi-square distribution with $n - 1$ degrees of freedom.
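The facts above are easy to confirm by simulation. The sketch below (numpy assumed; the values $\mu$ = 5, $\sigma$ = 2, n = 10 are arbitrary) checks $E[\bar{X}] = \mu$, $\text{var}(\bar{X}) = \sigma^2/n$, and that $(n-1)S^2/\sigma^2$ averages to $n - 1$, the mean of a chi-square variable with $n - 1$ degrees of freedom:

    import numpy as np

    mu, sigma, n, reps = 5.0, 2.0, 10, 100_000  # hypothetical parameters
    rng = np.random.default_rng(1)
    samples = rng.normal(mu, sigma, size=(reps, n))  # reps samples of size n

    xbar = samples.mean(axis=1)
    s2 = samples.var(axis=1, ddof=1)  # sample variance with n-1 in the denominator

    print(xbar.mean(), mu)            # ~5.0 vs 5.0
    print(xbar.var(), sigma**2 / n)   # ~0.4 vs 0.4
    print(((n - 1) * s2 / sigma**2).mean(), n - 1)  # ~9 vs 9 (chi-square mean = df)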

Distribution of the Sample Mean with Unknown Variance

If $\bar{X}$ and $S^2$ are the mean and variance of a random sample of size n from a normal population with mean $\mu$ and variance $\sigma^2$, then

$$T = \frac{\bar{X} - \mu}{S/\sqrt{n}}$$

has the t distribution with $n - 1$ degrees of freedom.

Distribution of the Ratio of Sample Variances

If $S_1^2$ and $S_2^2$ are the variances of independent random samples of sizes $n_1$ and $n_2$ from normal populations with variances $\sigma_1^2$ and $\sigma_2^2$, then

$$F = \frac{S_1^2/\sigma_1^2}{S_2^2/\sigma_2^2} = \frac{\sigma_2^2 S_1^2}{\sigma_1^2 S_2^2}$$

is a random variable having an F distribution with $n_1 - 1$ and $n_2 - 1$ degrees of freedom.
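As a final check, simulated values of $T = \frac{\bar{X} - \mu}{S/\sqrt{n}}$ should follow the t distribution with $n - 1$ degrees of freedom. A sketch under arbitrary assumptions (standard normal population, n = 8), with scipy supplying the exact reference quantile:

    import numpy as np
    from scipy import stats

    mu, sigma, n, reps = 0.0, 1.0, 8, 100_000  # hypothetical parameters
    rng = np.random.default_rng(2)
    samples = rng.normal(mu, sigma, size=(reps, n))

    # the T statistic from each sample, using the sample standard deviation S
    T = (samples.mean(axis=1) - mu) / (samples.std(axis=1, ddof=1) / np.sqrt(n))
    print(np.quantile(T, 0.95))          # empirical 95th percentile
    print(stats.t.ppf(0.95, df=n - 1))   # exact t quantile, about 1.895

The analogous check for the F statistic uses the ratio of sample variances from two independent normal samples, compared against stats.f with $n_1 - 1$ and $n_2 - 1$ degrees of freedom.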