Random variables




Remark on notation

1. When X is a number chosen uniformly from a data set, what I call P(X = k) is called Freq[k, X] in the courseware.
2. When X is a random variable, what I call F(x) or F_X(x) is called cumdist[x, X] in the courseware.
3. What I call E[X] is called Expect[X] in the courseware.

Introduction

Often when an experiment is performed, we are mainly interested in some function of the outcome rather than in the actual outcome itself. For instance, in tossing two dice we are often interested in the sum of the two dice and are not really concerned with the separate values of each die. A real-valued function defined on a sample space is known as a random variable. Because the value of a random variable is determined by the outcome of the experiment, we can find the probability that the random variable takes on each value.

Example 1 Suppose that you toss 3 fair coins. For each Heads, you win $1, and for each Tails, you lose $1. Let X be your net winnings. Then X is a random variable that can take the values −3, −1, 1, 3, and we have

P(X = 3) = P(X = −3) = 1/8,  P(X = 1) = P(X = −1) = 3/8.

Example 2 Suppose that we are given a coin having probability p ∈ (0, 1) of coming up Heads. This coin is flipped until a Heads occurs. Let X be the number of flips needed. Then X is a random variable that can take on the values 1, 2, ..., and we have

P(X = n) = (1 − p)^(n−1) p,  n = 1, 2, ....

By the way, this is a Geometric(p) random variable.

For any random variable X, the function F defined by

F(x) = P(X ≤ x),  −∞ < x < ∞,

is called the cumulative distribution function, or simply the distribution function, of X. Thus the distribution function specifies, for every real x, the probability that the random variable X is less than or equal to x. It is clear that the distribution function F is an increasing function of x. We will discuss other properties of distribution functions later.
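The two examples above are easy to check numerically. The following sketch (my own illustration, not part of the notes) tabulates the pmf of the coin-tossing variable in Example 1 by enumerating the 8 equally likely outcomes, and verifies that the Geometric(p) probabilities sum to (essentially) 1:

```python
from fractions import Fraction
from itertools import product

# Example 1: toss 3 fair coins; X = (# Heads) - (# Tails) in dollars.
pmf = {}
for o in product("HT", repeat=3):                  # 8 equally likely outcomes
    x = o.count("H") - o.count("T")                # net winnings
    pmf[x] = pmf.get(x, Fraction(0)) + Fraction(1, 8)

print(pmf)   # {3: 1/8, 1: 3/8, -1: 3/8, -3: 1/8}, up to ordering

# Example 2: Geometric(p), P(X = n) = (1-p)^(n-1) * p.
p = Fraction(1, 3)                                 # an arbitrary choice of p
geom = [(1 - p) ** (n - 1) * p for n in range(1, 200)]
print(float(sum(geom)))   # very close to 1: the pmf sums to 1
```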

A random variable that can take on at most countably many values is called a discrete random variable. For a discrete random variable X, we define the probability mass function p of X by

p(x) = P(X = x),  −∞ < x < ∞.

The probability mass function p is positive for at most a countable number of values of x. That is, if X must assume one of the values x_1, x_2, ..., then

p(x_i) ≥ 0, i = 1, 2, ...,  and  p(x) = 0 for all other values of x.

Since X must assume one of the values x_i, we have

Σ_{i=1}^∞ p(x_i) = 1.

Conversely, given any nonnegative function p(x) defined on R such that p(x) is positive for at most countably many values of x and Σ_x p(x) = 1, there is a random variable X with p(x) as its probability mass function.

The (cumulative) distribution function F of a discrete random variable X can be expressed in terms of the mass function p of X:

F(x) = Σ_{t : t ≤ x} p(t),  −∞ < x < ∞.

It is often instructive to present the probability mass function in graphical form by plotting p(x_i) on the y-axis against x_i on the x-axis. You have seen plenty of examples of this sort in the courseware.

If X is a discrete random variable whose possible values are x_1, x_2, ..., where x_1 < x_2 < ⋯, then its distribution function F is a step function; that is, the value of F is constant on each interval [x_{i−1}, x_i) and takes a step (or jump) of size p(x_i) at x_i. Thus in this case,

p(x_i) = F(x_i) − F(x_i−),  i = 1, 2, ....

Example 3 If X is a discrete random variable with probability mass function given by

p(1) = 1/4,  p(2) = 1/2,  p(3) = 1/8,  p(4) = 1/8,

then its distribution function F is given by

F(x) =
  0,    x < 1,
  1/4,  1 ≤ x < 2,
  3/4,  2 ≤ x < 3,
  7/8,  3 ≤ x < 4,
  1,    4 ≤ x.
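The step-function F of Example 3 can be built directly from the mass function by summing p(t) over t ≤ x. Here is a small sketch (my own, not from the notes) that does exactly that and checks the jump relation p(x_i) = F(x_i) − F(x_i−):

```python
from fractions import Fraction

# pmf of Example 3
p = {1: Fraction(1, 4), 2: Fraction(1, 2), 3: Fraction(1, 8), 4: Fraction(1, 8)}

def F(x):
    """Cumulative distribution function: F(x) = sum of p(t) over t <= x."""
    return sum(prob for t, prob in p.items() if t <= x)

# F is constant between the jump points x_i and jumps by p(x_i) at x_i:
print(F(0.5), F(1), F(2.7), F(3), F(10))   # 0, 1/4, 3/4, 7/8, 1

# The size of the jump at x_i recovers the pmf: p(x_i) = F(x_i) - F(x_i-)
assert F(3) - F(3 - 1e-9) == p[3]
```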

In the courseware there are examples of the graphs of distribution functions.

Expectations of discrete random variables

One of the most important concepts in probability theory is that of the expectation of a random variable. If X is a discrete random variable with probability mass function p, the expectation, or expected value, of X, denoted by E[X], is defined by

E[X] = Σ_{x : p(x) > 0} x p(x).

Example 4 Let X be the random variable in Example 1. Then

E[X] = (−3)(1/8) + (−1)(3/8) + (1)(3/8) + (3)(1/8) = 0.

Example 5 Let X be the random variable in Example 2. Then, putting q = 1 − p,

E[X] = Σ_{n=1}^∞ n q^(n−1) p = p Σ_{n=0}^∞ d/dq (q^n) = p d/dq ( Σ_{n=0}^∞ q^n ) = p d/dq ( 1/(1 − q) ) = p/(1 − q)^2 = 1/p.

Expectation of a function of a discrete random variable

Suppose that we are given a discrete random variable along with its probability mass function, and that we want to compute the expectation of some function of X, say g(X). How can we accomplish this? One way is as follows: since g(X) is itself a discrete random variable, it has a probability mass function, which can be determined from the probability mass function of X. Once we have the probability mass function of g(X), we can find its expectation easily. However, the following result gives a more convenient method.

Proposition 6 If X is a discrete random variable with mass function p, then for any real-valued function g,

E[g(X)] = Σ_x g(x) p(x).

As a consequence of this proposition, we immediately get the following:

Corollary 7 If X is a discrete random variable, and a, b are constants, then

E[aX + b] = a E[X] + b.

This appears in the courseware, in a slightly different form.

Example 8 Let X be a random variable that takes on the values −1, 0, 1 with respective probabilities

P(X = −1) = .2,  P(X = 0) = .5,  P(X = 1) = .3.

Find E[X^2].

Solution 1 Letting Y = X^2, the probability mass function of Y is given by

P(Y = 1) = .5,  P(Y = 0) = .5.

Hence E[X^2] = E[Y] = 1(.5) + 0(.5) = .5.

Solution 2 Applying the above proposition,

E[X^2] = (−1)^2(.2) + 0^2(.5) + 1^2(.3) = .5.

Variances of discrete random variables

Given a random variable X, it would be extremely useful if we were able to summarize the essential statistical properties of X by certain suitably defined measures. One such measure is E[X], the expectation (mean) of X. However, E[X] does not tell us anything about how spread out the random variable is from its mean. For example, the two random variables Y and Z with probability mass functions given by

P(Y = 1) = P(Y = −1) = 1/2,  P(Z = 100) = P(Z = −100) = 1/2,

both have mean 0, but the second variable is spread out much more than the first. We measure the spread of a random variable about its mean by the following quantity. If X is a random variable with mean µ, then the variance of X, denoted by Var(X), is defined by

Var(X) = E[(X − µ)^2].

An alternative formula for Var(X) is derived as follows (in the discrete case):

Var(X) = E[(X − µ)^2]
       = Σ_x (x − µ)^2 p(x)
       = Σ_x (x^2 − 2µx + µ^2) p(x)
       = Σ_x x^2 p(x) − 2µ Σ_x x p(x) + µ^2 Σ_x p(x)
       = E[X^2] − 2µ^2 + µ^2
       = E[X^2] − µ^2.

That is,

Var(X) = E[X^2] − (E[X])^2.

The following result follows immediately from Proposition 6.

Proposition 9 If X is a discrete random variable and a, b are constants, then

Var(aX + b) = a^2 Var(X).

Example 10 Let X be the outcome when a fair die is rolled. Find Var(X). We easily find E[X] = 7/2. Also,

E[X^2] = 1^2(1/6) + 2^2(1/6) + 3^2(1/6) + 4^2(1/6) + 5^2(1/6) + 6^2(1/6) = 91/6.

Hence

Var(X) = 91/6 − (7/2)^2 = 35/12.
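The alternative formula Var(X) = E[X^2] − (E[X])^2 is easy to verify numerically. This sketch (my own illustration, not part of the notes) reproduces Example 10 for a fair die, computing the variance both from the definition and from the alternative formula:

```python
from fractions import Fraction

# Fair die: outcomes 1..6, each with probability 1/6
pmf = {k: Fraction(1, 6) for k in range(1, 7)}

mu = sum(x * p for x, p in pmf.items())                   # E[X] = 7/2
var_def = sum((x - mu) ** 2 * p for x, p in pmf.items())  # E[(X - mu)^2]
ex2 = sum(x ** 2 * p for x, p in pmf.items())             # E[X^2] = 91/6
var_alt = ex2 - mu ** 2                                   # E[X^2] - (E[X])^2

print(mu, ex2, var_def, var_alt)   # 7/2 91/6 35/12 35/12
assert var_def == var_alt == Fraction(35, 12)
```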

Properties of distribution functions

If F is the (cumulative) distribution function of a random variable X, then it follows from the definition of the distribution function and the properties of probability measures that F satisfies the following:

1. F is a nondecreasing function; that is, if x < y, then F(x) ≤ F(y).
2. lim_{x→∞} F(x) = 1.
3. lim_{x→−∞} F(x) = 0.
4. F is right continuous; that is, for any x we have F(x+) = F(x), where F(x+) = lim_{y↓x} F(y) is the right limit at x.

All probability questions about X can be answered in terms of F. For instance, for any real numbers a, b with a < b,

P(X < a) = F(a−),
P(X = a) = F(a) − F(a−),
P(X > a) = 1 − F(a),
P(a < X ≤ b) = F(b) − F(a),
P(a ≤ X ≤ b) = F(b) − F(a−),
P(a < X < b) = F(b−) − F(a),
P(a ≤ X < b) = F(b−) − F(a−).

Conversely, if F is any function on R satisfying conditions 1 through 4 above, then it is the distribution function of some random variable.

Here is an example of finding probabilities using the distribution function.

Example 11 The distribution function of a random variable X is given by

F(x) =
  0,      x < 0,
  x/2,    0 ≤ x < 1,
  2/3,    1 ≤ x < 2,
  11/12,  2 ≤ x < 3,
  1,      3 ≤ x.

Find (a) P(X < 3), (b) P(X = 1), (c) P(X > 1/2), and (d) P(2 < X ≤ 4).
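The four probabilities in Example 11 can be read off directly from F using the identities listed above. The sketch below (my own, not from the notes) encodes the piecewise F and approximates each left limit F(a−) by evaluating F just below a:

```python
def F(x):
    """Distribution function of Example 11 (piecewise definition)."""
    if x < 0:
        return 0.0
    if x < 1:
        return x / 2
    if x < 2:
        return 2 / 3
    if x < 3:
        return 11 / 12
    return 1.0

EPS = 1e-9   # evaluate just below a point to approximate the left limit F(a-)

# (a) P(X < 3)      = F(3-)        = 11/12
# (b) P(X = 1)      = F(1) - F(1-) = 2/3 - 1/2 = 1/6
# (c) P(X > 1/2)    = 1 - F(1/2)   = 1 - 1/4 = 3/4
# (d) P(2 < X <= 4) = F(4) - F(2)  = 1 - 11/12 = 1/12
print(F(3 - EPS))          # P(X < 3)
print(F(1) - F(1 - EPS))   # P(X = 1)
print(1 - F(0.5))          # P(X > 1/2)
print(F(4) - F(2))         # P(2 < X <= 4)
```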