5. Continuous Random Variables




Continuous random variables can take any value in an interval. They are used to model physical characteristics such as time, length, position, etc.

Examples
(i) Let X be the length of a randomly selected telephone call.
(ii) Let X be the volume of coke in a can marketed as 12 oz.

Remarks
A continuous variable has infinite precision, hence P(X = x) = 0 for any x. In this case, the p.m.f. will provide no useful information.

Definition. X is a continuous random variable if there is a function f(x) so that for any constants a and b, with a ≤ b,

    P(a ≤ X ≤ b) = ∫_a^b f(x) dx.    (1)

For δ small, P(a ≤ X ≤ a + δ) ≈ f(a)·δ. The function f(x) is called the probability density function (p.d.f.). For any a,

    P(X = a) = P(a ≤ X ≤ a) = ∫_a^a f(x) dx = 0.

A discrete random variable does not have a density function, since if a is a possible value of a discrete RV X, we have P(X = a) > 0. Random variables can be partly continuous and partly discrete.

The following properties follow from the axioms:

    ∫_{−∞}^{∞} f(x) dx = 1,  and  f(x) ≥ 0.

Example. For some constant c, the random variable X has probability density function

    f(x) = c·x^n for 0 < x < 1,  and  f(x) = 0 otherwise.

Find (a) c and (b) P(X > x) for 0 < x < 1.
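As a sanity check, normalization pins down c: ∫_0^1 c·x^n dx = c/(n+1) = 1 forces c = n + 1, and then P(X > x) = 1 − x^{n+1}. A minimal numerical sketch, assuming the illustrative value n = 2 (the slide leaves n general):

```python
n = 2                # assumed value for illustration; the slide leaves n general
c = n + 1            # from the normalization  ∫_0^1 c·x^n dx = c/(n+1) = 1

def pdf(x):
    return c * x**n if 0 < x < 1 else 0.0

def integrate(f, a, b, steps=100_000):
    # midpoint-rule numerical integration, good enough for a sanity check
    h = (b - a) / steps
    return sum(f(a + (i + 0.5) * h) for i in range(steps)) * h

total = integrate(pdf, 0, 1)      # should be close to 1
tail = integrate(pdf, 0.5, 1)     # P(X > 0.5) = 1 - 0.5**(n + 1)
print(round(total, 4), round(tail, 4))
```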

Discussion problem. Let X be the duration of a telephone call in minutes and suppose X has p.d.f. f(x) = c·e^{−x/10} for x ≥ 0. Find c, and also find the chance that the call lasts less than 5 minutes.
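Normalization again gives the constant: ∫_0^∞ c·e^{−x/10} dx = 10c = 1, so c = 1/10, and then P(X < 5) = 1 − e^{−5/10}. A quick check of this computation:

```python
from math import exp

c = 1 / 10                      # since ∫_0^∞ c·e^(-x/10) dx = 10c must equal 1
p_under_5 = 1 - exp(-5 / 10)    # P(X < 5) for this density
print(c, round(p_under_5, 4))   # p_under_5 ≈ 0.3935
```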

Cumulative Distribution Function (c.d.f.)

The c.d.f. of a continuous RV is defined exactly the same as for discrete RVs:

    F(x) = P(X ≤ x) = P(X ∈ (−∞, x]).

Hence F(x) = ∫_{−∞}^x f(t) dt, and differentiating both sides we get dF/dx = f(x).

Example. Suppose the lifetime, X, of a car battery has P(X > x) = 2^{−x}. Find the p.d.f. of X.

Expectation of Continuous RVs

Definition. For X a continuous RV with p.d.f. f(x),

    E(X) = ∫_{−∞}^{∞} x·f(x) dx.

Intuitively, this comes from the discrete case by replacing Σ with ∫ and p(x_i) with f(x) dx. We can think of a continuous distribution as being approximated by a discrete distribution on a lattice ..., −2δ, −δ, 0, δ, 2δ, ... for small δ.

Exercise. Let X be a continuous RV and let X_δ be a discrete RV approximating X on the lattice ..., −2δ, −δ, 0, δ, 2δ, ... for small δ. Sketch a p.d.f. f(x) for X and the corresponding p.m.f. p(x) for X_δ, paying attention to the scaling on the y-axis.
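The lattice picture can be made concrete in code: put mass p(x_i) = f(x_i)·δ at each lattice point and compare Σ x_i p(x_i) with the exact E(X). A sketch, assuming the illustrative density f(x) = 2x on [0, 1], which has E(X) = 2/3:

```python
delta = 1e-4
f = lambda x: 2 * x                       # assumed pdf on [0, 1]; E(X) = 2/3
grid = [i * delta for i in range(int(1 / delta))]
pmf = [f(x) * delta for x in grid]        # p(x_i) ≈ f(x_i)·δ
total_mass = sum(pmf)                     # should be close to 1
e_discrete = sum(x * p for x, p in zip(grid, pmf))
print(round(total_mass, 3), round(e_discrete, 3))   # e_discrete close to 2/3
```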

Example. Suppose X has p.d.f.

    f(x) = (3/4)(1 − x²) for −1 ≤ x ≤ 1,  and  f(x) = 0 otherwise.

Find the expected value of X.

Solution.

Note. Similarly to discrete RVs, the expected value is the balancing point of the graph of the p.d.f., and so if the p.d.f. is symmetric then the expected value is the point of symmetry. A sketch of the p.d.f. quickly determines the expected value in this case.
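A numerical check of the symmetry claim for this density (midpoint-rule integration; nothing here beyond the example's own f):

```python
def integrate(f, a, b, steps=100_000):
    # simple midpoint rule
    h = (b - a) / steps
    return sum(f(a + (i + 0.5) * h) for i in range(steps)) * h

f = lambda x: 0.75 * (1 - x * x)               # the density from the example
total = integrate(f, -1, 1)                    # should be 1
mean = integrate(lambda x: x * f(x), -1, 1)    # E(X): 0 by symmetry
print(round(total, 4), round(mean, 4))
```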

Example. The density function of X is

    f(x) = a + b·x² for 0 ≤ x ≤ 1,  and  f(x) = 0 otherwise.

If E(X) = 3/5, find a and b.

Solution.

Proposition 1. For X a non-negative continuous RV, with p.d.f. f and c.d.f. F,

    E(X) = ∫_0^∞ P(X > x) dx = ∫_0^∞ (1 − F(x)) dx.

Proof.

Proposition 2. For X a continuous RV with p.d.f. f and any real-valued function g,

    E[g(X)] = ∫_{−∞}^{∞} g(x)·f(x) dx.

Proof.

Example. A stick of length 1 is split at a point U that is uniformly distributed over (0, 1). Determine the expected length of the piece that contains the point p, for 0 ≤ p ≤ 1.

Solution.

Variance of Continuous RVs

The definition is the same as for discrete RVs:

    Var(X) = E[(X − E(X))²].

The basic properties are also the same:

    Var(X) = E(X²) − (E(X))²,
    E(aX + b) = a·E(X) + b,
    Var(aX + b) = a²·Var(X).

Example. If E(X) = 1 and Var(X) = 5, find (a) E[(2 + X)²]; (b) Var(4 + 3X).
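The worked example can be checked by simulation: take any X with E(X) = 1 and Var(X) = 5 (a normal with those moments is assumed here purely for convenience). The properties give E[(2 + X)²] = 4 + 4·E(X) + E(X²) = 4 + 4 + 6 = 14, and Var(4 + 3X) = 9·5 = 45.

```python
import random
import statistics

random.seed(3)
# X with E(X) = 1, Var(X) = 5; the normal shape is an assumption for the demo
xs = [random.gauss(1, 5 ** 0.5) for _ in range(400_000)]
m = statistics.mean((2 + x) ** 2 for x in xs)      # target: 14
v = statistics.variance(4 + 3 * x for x in xs)     # target: 45
print(round(m, 1), round(v, 1))
```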

The Uniform Distribution

Definition. X has the uniform distribution on [0, 1] (and we write X ~ Uniform[0, 1] or just X ~ U[0, 1]) if X has p.d.f.

    f(x) = 1 for 0 ≤ x ≤ 1,  and  f(x) = 0 otherwise.

This is what people usually mean when they talk of a random number between 0 and 1. Each interval of length δ in [0, 1] has equal probability:

    ∫_x^{x+δ} f(y) dy = δ.

The chance of X falling into an interval is equal to the length of the interval.

Generalization: X has the uniform distribution on [a, b] (i.e. X ~ U[a, b]) if X has p.d.f.

    f(x) = 1/(b − a) for a ≤ x ≤ b,  and  f(x) = 0 otherwise.

Proposition. If Z ~ U[0, 1], then (b − a)Z + a ~ U[a, b].

Proof.

Example. Let X ~ U[a, b]. Show that E(X) = (a + b)/2 and Var(X) = (b − a)²/12, (a) by standardizing (i.e., using the previous proposition); (b) directly (by brute force).
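A Monte Carlo check of these two formulas, using the proposition above to generate U[a, b] samples from U[0, 1] ones (the endpoints a = 5, b = 10 are assumed for the demo):

```python
import random
import statistics

random.seed(0)
a, b = 5, 10                    # assumed endpoints for the check
# (b - a)·Z + a with Z ~ U[0, 1], per the proposition
xs = [a + (b - a) * random.random() for _ in range(200_000)]
print(round(statistics.mean(xs), 2), round(statistics.variance(xs), 2))
# targets: (a + b)/2 = 7.5 and (b - a)²/12 ≈ 2.083
```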

Example. Your company must make a sealed bid for a construction project. If you succeed in winning the contract (by having the lowest bid), then you plan to pay another firm $100,000 to do the work. If you believe that the maximum bid (in thousands of dollars) of the other participating companies can be modeled as being the value of a random variable that is uniformly distributed on (70, 140), how much should you bid to maximize your expected profit?

Solution.
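A numeric sketch of the optimization, assuming (as the problem intends) that you win when your bid x is below the competing bid B ~ U(70, 140), so the expected profit in thousands is (x − 100)·(140 − x)/70 for 70 ≤ x ≤ 140:

```python
def expected_profit(x):
    # win with probability (140 - x)/70, then earn x - 100 (in thousands)
    return (x - 100) * (140 - x) / 70 if 70 <= x <= 140 else 0.0

# scan bids on a fine grid and pick the maximizer
grid = [70 + i * 0.01 for i in range(7001)]
best_bid = max(grid, key=expected_profit)
print(round(best_bid, 2), round(expected_profit(best_bid), 3))
```

The quadratic is maximized at x = 120 (bid $120,000), for an expected profit of 400/70 ≈ 5.71 thousand dollars.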

Discussion problem. Suppose X ~ U[5, 10]. Find the probability that X² − 5X − 6 is greater than zero.

Solution.

The Standard Normal Distribution

Definition: Z has the standard normal distribution if it has p.d.f.

    f(x) = (1/√(2π)) e^{−x²/2}.

f(x) is symmetric about x = 0, so E(Z) = 0. Also Var(Z) = 1. Check this, using integration by parts:

The Normal Distribution

Definition. If X has density

    f(x) = (1/√(2πσ²)) exp{ −(x − µ)²/(2σ²) },

then X has the normal distribution, and we write X ~ N(µ, σ²). E(X) = µ and Var(X) = σ². If Z ~ N(0, 1) then Z is standard normal.

Proposition. Let Z ~ N(0, 1) and set X = µ + σZ for constants µ and σ. Then X ~ N(µ, σ²).

Proof.

Exercise. Check that the standard normal density integrates to 1.

Trick. By changing to polar coordinates, show that

    ( ∫_{−∞}^{∞} exp{−x²/2} dx )² = 2π.

Solution.

Calculating with the Normal Distribution

There is no closed-form solution to the integral ∫_a^b (1/√(2π)) e^{−x²/2} dx, so we rely upon computers (or tables). The c.d.f. of the standard normal distribution is

    Φ(z) = ∫_{−∞}^z (1/√(2π)) e^{−x²/2} dx.

This is tabulated on page 201 of Ross.

Example. Find ∫_1^2 (1/√(2π)) e^{−x²/2} dx.

Solution.
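In code, Φ can be evaluated through the error function via the standard identity Φ(z) = (1 + erf(z/√2))/2 (math.erf is in the Python standard library), so the example's integral is Φ(2) − Φ(1):

```python
from math import erf, sqrt

def Phi(z):
    # standard normal c.d.f. via the error function
    return 0.5 * (1 + erf(z / sqrt(2)))

answer = Phi(2) - Phi(1)
print(round(answer, 4))   # ≈ 0.1359
```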

Standardization

If we wish to find P(a ≤ X ≤ b) where X ~ N(µ, σ²), we write X = µ + σZ. Then

    P(a ≤ X ≤ b) = P( (a − µ)/σ ≤ Z ≤ (b − µ)/σ ) = Φ((b − µ)/σ) − Φ((a − µ)/σ).

Example. Suppose the weight of a new-born baby averages 8 lbs, with a SD of 1.5 lbs. If weights are normally distributed, what fraction of babies are between 7 and 10 pounds?

Solution.
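By standardization, the fraction in the babies example is Φ((10 − 8)/1.5) − Φ((7 − 8)/1.5). A quick computation (Φ via math.erf, as before):

```python
from math import erf, sqrt

def Phi(z):
    return 0.5 * (1 + erf(z / sqrt(2)))

mu, sd = 8, 1.5
frac = Phi((10 - mu) / sd) - Phi((7 - mu) / sd)
print(round(frac, 4))   # roughly 0.66
```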

Example continued. For what value of x does the interval [8 − x, 8 + x] include 95% of birthweights?

Solution.

Example. Suppose that the travel time from your home to your office is normally distributed with mean 40 minutes and standard deviation 7 minutes. If you want to be 95% certain that you will not be late for an office appointment at 1 p.m., what is the latest time that you should leave home?

Solution.

The Normal Approximation to the Binomial Distribution

If X ~ Binomial(n, p) and n is large enough, then X is approximately N(np, np(1 − p)). Rule of thumb: this approximation is reasonably good for np(1 − p) > 10. With the continuity correction,

    P(X = k) ≈ P(k − 1/2 < Y < k + 1/2),  where Y ~ N(np, np(1 − p)).

Note: P(X ≤ k) is usually approximated by P(Y < k + 1/2).

Example. I toss 1000 coins. Find the chance (approximately) that the number of heads is between 475 and 525, inclusive.

Solution.
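The coin example can be worked both ways: the continuity-corrected normal approximation, and the exact binomial sum (math.comb does exact integer arithmetic):

```python
from math import comb, erf, sqrt

def Phi(z):
    return 0.5 * (1 + erf(z / sqrt(2)))

n, p = 1000, 0.5
mu, sd = n * p, sqrt(n * p * (1 - p))

# P(475 <= X <= 525) with the continuity correction
approx = Phi((525.5 - mu) / sd) - Phi((474.5 - mu) / sd)
exact = sum(comb(n, k) for k in range(475, 526)) * 0.5 ** n
print(round(approx, 4), round(exact, 4))   # the two agree closely
```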

Example. The ideal size of a first-year class at a particular college is 150 students. The college, knowing from past experience that on the average only 30 percent of those accepted for admission will actually attend, uses a policy of approving the applications of 450 students. Compute the probability that more than 150 first-year students attend this college.

Solution.
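The same recipe applies here: X ~ Binomial(450, 0.3), and P(X > 150) ≈ P(Y > 150.5) with Y ~ N(135, 94.5). An exact-sum cross-check:

```python
from math import comb, erf, sqrt

def Phi(z):
    return 0.5 * (1 + erf(z / sqrt(2)))

n, p = 450, 0.3
mu, sd = n * p, sqrt(n * p * (1 - p))

approx = 1 - Phi((150.5 - mu) / sd)    # continuity-corrected approximation
exact = sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(151, n + 1))
print(round(approx, 4), round(exact, 4))
```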

The Exponential Distribution

Definition. X has the exponential distribution with parameter λ if it has density

    f(x) = λ·e^{−λx} for x ≥ 0,  and  f(x) = 0 for x < 0.

We write X ~ Exponential(λ). E(X) = 1/λ and Var(X) = 1/λ². The c.d.f. is F(x) = 1 − e^{−λx} for x ≥ 0.

Example. The amount of time, in hours, that a computer functions before breaking down is a continuous random variable with probability density function

    f(x) = λ·e^{−x/100} for x ≥ 0,  and  f(x) = 0 for x < 0.

Find the probability that (a) the computer will break down within the first 100 hours; (b) given that it is still working after 100 hours, it breaks down within the next 100 hours.

Solution.
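Normalization of the stated density forces λ = 1/100, after which both parts reduce to 1 − e^{−1}. In code:

```python
from math import exp

lam = 1 / 100                      # forced by  ∫_0^∞ λ·e^(-x/100) dx = 1
F = lambda x: 1 - exp(-lam * x)    # exponential c.d.f.

p_a = F(100)                               # (a) P(X < 100)
p_b = (F(200) - F(100)) / (1 - F(100))     # (b) P(X < 200 | X > 100)
print(round(p_a, 4), round(p_b, 4))        # both equal 1 - e^(-1) ≈ 0.6321
```

That the two answers coincide is exactly the memoryless property discussed next.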

The Memoryless Property of Exponential RVs

The exponential distribution is the continuous analogue of the geometric distribution (one has an exponentially decaying p.m.f., the other an exponentially decaying p.d.f.). Suppose that X ~ Exponential(λ). Then

    P(X > t + s | X > t) = e^{−λs} = P(X > s).

Check this:

This is an analogue for continuous random variables of the memoryless property that we saw for the geometric distribution.
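A simulation sketch of the memoryless property (the values of λ, t, s below are assumed for the check; exponentials are generated as −ln(U)/λ with U ~ U(0, 1]):

```python
import math
import random

random.seed(2)
lam, t, s = 0.2, 3.0, 4.0        # assumed parameters for the check
# -ln(U)/λ is Exponential(λ); 1 - random() keeps U strictly positive
draws = [-math.log(1.0 - random.random()) / lam for _ in range(400_000)]
survivors = [x for x in draws if x > t]
cond = sum(1 for x in survivors if x > t + s) / len(survivors)
print(round(cond, 3), round(math.exp(-lam * s), 3))   # should be close
```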

Example. At a certain bank, the amount of time that a customer spends being served by a teller is an exponential random variable with mean 5 minutes. If there is a customer in service when you enter the bank, what is the probability that he or she will still be with the teller after an additional 4 minutes?

Solution.

Example. Suppose X ~ U[0, 1] and Y = −ln(X) (so Y > 0). Find the p.d.f. of Y.

Solution.

Note. This gives a convenient way to simulate exponential random variables.
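This is the inverse-transform method in practice: −ln(U) is Exponential(1), and more generally −ln(U)/λ is Exponential(λ). A simulation check with an assumed λ = 2 (so the target mean is 1/λ = 0.5 and the target variance is 1/λ² = 0.25):

```python
import math
import random
import statistics

random.seed(1)
lam = 2
# 1 - random() lies in (0, 1], so the log is always defined
ys = [-math.log(1.0 - random.random()) / lam for _ in range(200_000)]
print(round(statistics.mean(ys), 3), round(statistics.variance(ys), 3))
# targets: E(Y) = 1/λ = 0.5 and Var(Y) = 1/λ² = 0.25
```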

Calculations similar to the previous example are required whenever we want to find the distribution of a function of a random variable.

Example. Suppose X has p.d.f. f_X(x) and Y = aX + b for constants a and b. Find the p.d.f. f_Y(y) of Y.

Solution.

Example. A stick of length 1 is split at a point U that is uniformly distributed over (0, 1). Determine the p.d.f. of the longer piece.

Solution.