STAT 430/510 Probability Lecture 14: Joint Probability Distribution, Continuous Case




Pengyuan (Penelope) Wang, June 20, 2011

Joint density function of continuous random variables

When X and Y are two continuous random variables, their joint density function f(x, y) is a function defined for each pair of numbers (x, y). Since the total probability over all possible pairs (x, y) is 1, the joint density function must satisfy

    \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} f(x, y)\, dx\, dy = 1.

Also, f(x, y) \ge 0 for all (x, y).

Marginal pdfs of continuous random variables

The marginal pdfs of X and Y, denoted by f_X(x) and f_Y(y) respectively, are given by

    f_X(x) = \int_{-\infty}^{\infty} f(x, y)\, dy,    f_Y(y) = \int_{-\infty}^{\infty} f(x, y)\, dx.

Example

The joint density function of X and Y is given by

    f(x, y) = 2 e^{-x} e^{-2y},  0 < x < \infty, 0 < y < \infty,

and f(x, y) = 0 otherwise. Check that f(x, y) is a joint density function.

Check 1: f(x, y) \ge 0 for all (x, y).
Check 2: \int \int f(x, y)\, dx\, dy = 1.
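
Check 1 is immediate from the formula; Check 2 can also be confirmed numerically. The following is a minimal sketch (assuming NumPy and SciPy are available; the variable names are illustrative) that integrates the density with scipy.integrate.dblquad, which expects the integrand as f(y, x) with the inner variable first.

    import numpy as np
    from scipy.integrate import dblquad

    # Joint density from the slide; dblquad passes the inner variable (y) first.
    f = lambda y, x: 2.0 * np.exp(-x) * np.exp(-2.0 * y)

    total, err = dblquad(f, 0, np.inf, lambda x: 0, lambda x: np.inf)
    print(total)  # approximately 1.0, so the density integrates to 1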

Example, continued

The joint density function of X and Y is given by

    f(x, y) = 2 e^{-x} e^{-2y},  0 < x < \infty, 0 < y < \infty,

and f(x, y) = 0 otherwise. Compute the marginal densities of X and Y.

    f_X(x) = \int_0^{\infty} 2 e^{-x} e^{-2y}\, dy = e^{-x},  x > 0
    f_Y(y) = \int_0^{\infty} 2 e^{-x} e^{-2y}\, dx = 2 e^{-2y},  y > 0
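
As a quick numerical cross-check (again a sketch assuming SciPy; names are illustrative), integrating out one variable at a few sample points should reproduce the closed forms e^{-x} and 2 e^{-2y}:

    import numpy as np
    from scipy.integrate import quad

    f = lambda x, y: 2.0 * np.exp(-x) * np.exp(-2.0 * y)

    for x in (0.5, 1.0, 2.0):
        fx, _ = quad(lambda y: f(x, y), 0, np.inf)  # integrate out y
        print(fx, np.exp(-x))                       # the two values should agree

    for y in (0.5, 1.0, 2.0):
        fy, _ = quad(lambda x: f(x, y), 0, np.inf)  # integrate out x
        print(fy, 2.0 * np.exp(-2.0 * y))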

Usage 1: Compute probabilities

If X and Y are two continuous random variables, then for any region A,

    P[(X, Y) \in A] = \int\int_{(x, y) \in A} f(x, y)\, dx\, dy.

In the last example, compute (a) P(X > 1, Y < 1) and (b) P(X < Y).

    P(X > 1, Y < 1) = \int_0^1 \int_1^{\infty} 2 e^{-x} e^{-2y}\, dx\, dy = e^{-1} - e^{-3}
    P(X < Y) = \int_0^{\infty} \int_0^{y} 2 e^{-x} e^{-2y}\, dx\, dy = 1/3
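
Both answers can be reproduced numerically. This sketch (assumptions as before: SciPy available, illustrative names) evaluates the same two integrals with dblquad; the region {X < Y} is encoded by letting the inner y-limits run from x to infinity.

    import numpy as np
    from scipy.integrate import dblquad

    f = lambda y, x: 2.0 * np.exp(-x) * np.exp(-2.0 * y)  # inner variable y first

    # (a) P(X > 1, Y < 1): x over (1, inf), y over (0, 1)
    p_a, _ = dblquad(f, 1, np.inf, lambda x: 0, lambda x: 1)
    print(p_a, np.exp(-1) - np.exp(-3))

    # (b) P(X < Y): x over (0, inf), y over (x, inf)
    p_b, _ = dblquad(f, 0, np.inf, lambda x: x, lambda x: np.inf)
    print(p_b, 1.0 / 3.0)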

Usage 2: Compute marginal expected values

If X and Y are two continuous random variables with marginal densities f_X(x) and f_Y(y), then

    E[X] = \int_x x f_X(x)\, dx,    E[Y] = \int_y y f_Y(y)\, dy.

In the last example, what is the expected value of X?

    E[X] = \int_0^{\infty} x e^{-x}\, dx = 1.

Usage 3: Compute the expected value of a function of X and Y

If X and Y have a joint probability mass function p(x, y), then

    E[g(X, Y)] = \sum_y \sum_x g(x, y)\, p(x, y).

If X and Y are two continuous random variables, then

    E[g(X, Y)] = \int\int_{x, y} g(x, y)\, f(x, y)\, dx\, dy.

In the last example, what is the expected value of e^{X/2 + Y}?

    E[e^{X/2 + Y}] = \int_0^{\infty} \int_0^{\infty} e^{x/2 + y} \cdot 2 e^{-x} e^{-2y}\, dx\, dy
                   = \int_0^{\infty} \int_0^{\infty} 2 e^{-x/2} e^{-y}\, dx\, dy = 4.
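
The last two computations (E[X] = 1 and E[e^{X/2 + Y}] = 4) can be checked the same way, by integrating x f(x, y) and e^{x/2 + y} f(x, y) over the whole quadrant. A minimal sketch under the same assumptions:

    import numpy as np
    from scipy.integrate import dblquad

    f = lambda y, x: 2.0 * np.exp(-x) * np.exp(-2.0 * y)

    # E[X]: integrate x * f(x, y) over 0 < x, y < infinity
    ex, _ = dblquad(lambda y, x: x * f(y, x), 0, np.inf, lambda x: 0, lambda x: np.inf)
    print(ex)  # approximately 1.0

    # E[exp(X/2 + Y)]: integrate exp(x/2 + y) * f(x, y)
    eg, _ = dblquad(lambda y, x: np.exp(0.5 * x + y) * f(y, x), 0, np.inf,
                    lambda x: 0, lambda x: np.inf)
    print(eg)  # approximately 4.0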

Usage 4: Compute conditional densities

If X and Y are two continuous random variables with marginal densities f_X(x) and f_Y(y), then

    f_{X|Y}(x|y) = f(x, y) / f_Y(y),    f_{Y|X}(y|x) = f(x, y) / f_X(x).

In the last example, what is f_{X|Y}(x|y)? Given that Y = 2, what is the conditional distribution of X?

    f_{X|Y}(x|y) = 2 e^{-x} e^{-2y} / (2 e^{-2y}) = e^{-x}, for any y,

so the conditional distribution of X given Y = 2 has density e^{-x}.

Conditional expectation

    E[X | Y = y] = \int x f_{X|Y}(x|y)\, dx.

What is E[X | Y = 2]?

    E[X | Y = y] = \int_0^{\infty} x e^{-x}\, dx = 1 for any y, so in particular E[X | Y = 2] = 1.
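
Numerically, the conditional density does not even need to be worked out in closed form: one can divide \int x f(x, 2)\, dx by f_Y(2). A sketch under the same assumptions:

    import numpy as np
    from scipy.integrate import quad

    f = lambda x, y: 2.0 * np.exp(-x) * np.exp(-2.0 * y)

    y0 = 2.0
    fy, _ = quad(lambda x: f(x, y0), 0, np.inf)       # marginal density f_Y(2)
    num, _ = quad(lambda x: x * f(x, y0), 0, np.inf)  # integral of x * f(x, 2)
    print(num / fy)  # approximately 1.0 = E[X | Y = 2]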

Comments

Again, conditional expectation satisfies all of the properties of ordinary expectation; for example,

    E[\sum_{i=1}^{n} X_i | Y = y] = \sum_{i=1}^{n} E[X_i | Y = y].

Usage 5: Check independence

When X and Y are continuous, X and Y are independent if and only if

    f(x, y) = f_X(x) f_Y(y) for all x, y.

Are X and Y independent in the last example? They are, since

    f(x, y) = 2 e^{-x} e^{-2y} = e^{-x} \cdot 2 e^{-2y} = f_X(x) f_Y(y).
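
The factorization can be spot-checked on a grid of points; this is only a sketch (illustrative names, NumPy assumed), not a proof, since independence requires the identity to hold for all x and y.

    import numpy as np

    f  = lambda x, y: 2.0 * np.exp(-x) * np.exp(-2.0 * y)
    fx = lambda x: np.exp(-x)              # marginal of X from the earlier slide
    fy = lambda y: 2.0 * np.exp(-2.0 * y)  # marginal of Y

    xs, ys = np.meshgrid(np.linspace(0.1, 5.0, 50), np.linspace(0.1, 5.0, 50))
    print(np.allclose(f(xs, ys), fx(xs) * fy(ys)))  # True: the joint density factorizes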

Example

The joint density of X and Y is given by

    f(x, y) = (12/5) x (2 - x - y),  0 < x < 1, 0 < y < 1,

and f(x, y) = 0 otherwise. Compute the conditional density of X given that Y = y, where 0 < y < 1.

    f_{X|Y}(x|y) = f(x, y) / f_Y(y)
                 = (12/5) x (2 - x - y) / \int_0^1 (12/5) x (2 - x - y)\, dx
                 = 6 x (2 - x - y) / (4 - 3y).
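
A useful sanity check is that the resulting conditional density integrates to 1 in x for every fixed y in (0, 1). A short sketch (SciPy assumed, names illustrative):

    import numpy as np
    from scipy.integrate import quad

    cond = lambda x, y: 6.0 * x * (2.0 - x - y) / (4.0 - 3.0 * y)

    for y in (0.2, 0.5, 0.8):
        total, _ = quad(lambda x: cond(x, y), 0, 1)
        print(total)  # approximately 1.0 for each y, as a conditional density should be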

Example 1

Let f(x, y) = 24xy, where 0 < x < 1, 0 < y < 1, 0 < x + y < 1, and f(x, y) = 0 otherwise.

Show that f(x, y) is a joint probability density function.

    Clearly f(x, y) \ge 0, and \int_{x=0}^{1} \int_{y=0}^{1-x} f(x, y)\, dy\, dx = 1.

Find f_X(x).

    f_X(x) = \int_{y=0}^{1-x} f(x, y)\, dy = 12 x (1 - x)^2,  0 < x < 1.

Given X = 0.5, find f_{Y|X}(y | x = 0.5) and E[Y | X = 0.5].

    f_{Y|X}(y | x = 0.5) = f(0.5, y) / f_X(0.5) = (24 \cdot 0.5 \cdot y) / (12 \cdot 0.5 \cdot (1 - 0.5)^2) = 8y,  0 < y < 0.5.
    E[Y | X = 0.5] = \int_{y=0}^{0.5} y f_{Y|X}(y | x = 0.5)\, dy = \int_0^{0.5} 8 y^2\, dy = 1/3.
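
All three answers can be reproduced numerically by integrating over the triangle 0 < y < 1 - x. A minimal sketch under the usual assumptions (SciPy available, names illustrative):

    import numpy as np
    from scipy.integrate import quad, dblquad

    f = lambda y, x: 24.0 * x * y  # inner variable y first; support is 0 < y < 1 - x

    total, _ = dblquad(f, 0, 1, lambda x: 0, lambda x: 1 - x)
    print(total)  # approximately 1.0, so f is a valid joint density

    # Marginal f_X(x) at a few points versus the closed form 12 x (1 - x)^2
    for x in (0.25, 0.5, 0.75):
        fx, _ = quad(lambda y: 24.0 * x * y, 0, 1 - x)
        print(fx, 12.0 * x * (1.0 - x) ** 2)

    # E[Y | X = 0.5] via the conditional density 8y on (0, 0.5)
    ey, _ = quad(lambda y: y * 8.0 * y, 0, 0.5)
    print(ey)  # approximately 1/3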

Example 2

A man and a woman decide to meet at a certain location, and each of them independently arrives at a time uniformly distributed between 12 noon and 1 P.M. Then

    f(x, y) = (1/60)^2,  0 < x < 60, 0 < y < 60,

where x is the number of minutes after 12 noon when the man arrives and y is the same for the woman. (Why?) Find the probability that the first to arrive has to wait longer than 10 minutes.

Example 2: Solution

Let X and Y denote, respectively, the number of minutes past 12 at which the man and the woman arrive. X and Y are independent uniform random variables over (0, 60). The first to arrive waits longer than 10 minutes exactly when |X - Y| > 10, so the desired probability is

    P(X + 10 < Y) + P(Y + 10 < X)
    = \int\int_{x + 10 < y} f(x, y)\, dx\, dy + \int\int_{y + 10 < x} f(x, y)\, dx\, dy
    = \int_{10}^{60} \int_0^{y - 10} (1/60)^2\, dx\, dy + \int_{10}^{60} \int_0^{x - 10} (1/60)^2\, dy\, dx
    = 25/36.
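
Because X and Y are easy to simulate, a Monte Carlo check is natural here. This sketch (NumPy assumed, sample size chosen arbitrarily) estimates P(|X - Y| > 10) and compares it to 25/36:

    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.uniform(0, 60, 1_000_000)  # man's arrival, minutes after noon
    y = rng.uniform(0, 60, 1_000_000)  # woman's arrival, minutes after noon

    print(np.mean(np.abs(x - y) > 10), 25.0 / 36.0)  # both approximately 0.694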

Example 3

An accident occurs at a point X that is uniformly distributed on a road of length L. At the time of the accident, an ambulance is at a location Y that is also uniformly distributed on the road. Assuming that X and Y are independent, find the expected distance between the ambulance and the point of the accident.

Example 3: Solution

We need to compute E[|X - Y|]. The joint density function of X and Y is

    f(x, y) = 1/L^2,  0 < x < L, 0 < y < L.

Therefore,

    E[|X - Y|] = \int_0^L \int_0^L |x - y| \frac{1}{L^2}\, dy\, dx
               = \frac{1}{L^2} \int_0^L \left( \int_0^x (x - y)\, dy + \int_x^L (y - x)\, dy \right) dx
               = \frac{1}{L^2} \int_0^L \left( \frac{L^2}{2} + x^2 - xL \right) dx
               = \frac{L}{3}.
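
A Monte Carlo sketch confirms the L/3 answer for any particular road length (L = 5 below is an arbitrary choice; NumPy assumed):

    import numpy as np

    rng = np.random.default_rng(1)
    L = 5.0                           # hypothetical road length
    x = rng.uniform(0, L, 1_000_000)  # accident location
    y = rng.uniform(0, L, 1_000_000)  # ambulance location

    print(np.mean(np.abs(x - y)), L / 3.0)  # sample mean is close to L/3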