Examples: Joint Densities and Joint Mass Functions




AMS 3, Joe Mitchell

Example 1: X and Y are jointly continuous with joint pdf

$$f(x,y) = \begin{cases} cx^2 + \frac{xy}{3} & \text{if } 0 \le x \le 1,\ 0 \le y \le 2 \\ 0 & \text{otherwise.} \end{cases}$$

(a). Find c. (b). Find P(X + Y ≥ 1). (c). Find the marginal pdf's of X and of Y. (d). Are X and Y independent (justify!). (e). Find E(e^X cos Y). (f). Find cov(X, Y).

We start (as always!) by drawing the support set: the rectangle 0 ≤ x ≤ 1, 0 ≤ y ≤ 2.

(a). We find c by setting

$$1 = \int_0^1 \int_0^2 \left(cx^2 + \frac{xy}{3}\right) dy\,dx = \frac{2c}{3} + \frac{1}{3},$$

so c = 1.

(b). Draw a picture of the support (a 1-by-2 rectangle), and intersect it with the set {(x,y) : x + y ≥ 1}, which is the region above the line y = 1 − x. To compute the probability, we double integrate the joint density over this subset of the support:

$$P(X + Y \ge 1) = \int_0^1 \int_{1-x}^{2} \left(x^2 + \frac{xy}{3}\right) dy\,dx = \frac{65}{72}.$$

(c). We compute the marginal pdfs:

$$f_X(x) = \int_{-\infty}^{\infty} f(x,y)\,dy = \begin{cases} \int_0^2 \left(x^2 + \frac{xy}{3}\right) dy = 2x^2 + \frac{2x}{3} & \text{if } 0 \le x \le 1 \\ 0 & \text{otherwise,} \end{cases}$$

$$f_Y(y) = \int_{-\infty}^{\infty} f(x,y)\,dx = \begin{cases} \int_0^1 \left(x^2 + \frac{xy}{3}\right) dx = \frac{1}{3} + \frac{y}{6} & \text{if } 0 \le y \le 2 \\ 0 & \text{otherwise.} \end{cases}$$
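As a quick numeric sanity check (not part of the original handout; the helper names are my own), a midpoint-rule double sum over the support confirms that c = 1 normalizes the density and that P(X + Y ≥ 1) comes out to 65/72 ≈ 0.903:

```python
# Numeric check for Example 1: with c = 1, the pdf x^2 + x*y/3 on the
# rectangle [0,1] x [0,2] should integrate to 1, and the mass of the
# region x + y >= 1 should be 65/72.

def f(x, y):
    # joint pdf with c = 1, valid on 0 <= x <= 1, 0 <= y <= 2
    return x * x + x * y / 3.0

def midpoint_2d(g, n=400):
    # midpoint-rule double integral of g over the support [0,1] x [0,2]
    dx, dy = 1.0 / n, 2.0 / n
    total = 0.0
    for i in range(n):
        x = (i + 0.5) * dx
        for j in range(n):
            y = (j + 0.5) * dy
            total += g(x, y) * dx * dy
    return total

total_mass = midpoint_2d(f)
prob = midpoint_2d(lambda x, y: f(x, y) if x + y >= 1 else 0.0)
print(total_mass)   # close to 1
print(prob)         # close to 65/72 = 0.9027...
```

The indicator makes the second integrand discontinuous along the line x + y = 1, so the grid estimate is only accurate to roughly one cell width, which is plenty for a sanity check.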

(d). NO, X and Y are NOT independent. The support is a rectangle, so we need to check whether it is true that f(x,y) = f_X(x) f_Y(y) for all (x,y). We easily find counterexamples: f(0.2, 0.3) ≠ f_X(0.2) f_Y(0.3).

(e).

$$E(e^X \cos Y) = \int_0^1 \int_0^2 e^x (\cos y) \left(x^2 + \frac{xy}{3}\right) dy\,dx.$$

(f).

$$\mathrm{cov}(X,Y) = E(XY) - E(X)E(Y) = \int_0^1\!\int_0^2 xy\left(x^2+\frac{xy}{3}\right)dy\,dx - \left[\int_0^1\!\int_0^2 x\left(x^2+\frac{xy}{3}\right)dy\,dx\right]\left[\int_0^1\!\int_0^2 y\left(x^2+\frac{xy}{3}\right)dy\,dx\right].$$

Example 2: X and Y are jointly continuous with joint pdf

$$f(x,y) = \begin{cases} cxy & \text{if } x \ge 0,\ y \ge 0,\ x + y \le 1 \\ 0 & \text{otherwise.} \end{cases}$$

(a). Find c. (b). Find P(Y > X). (c). Find marginal pdf's of X and of Y. (d). Are X and Y independent (justify!).

We start (as always!) by drawing the support set: the triangle with vertices (0,0), (1,0), and (0,1).

(a). We find c by setting

$$1 = \int_0^1 \int_0^{1-x} cxy\,dy\,dx = \frac{c}{24},$$

so c = 24.

(b). Draw a picture of the support (a triangle), and intersect it with the set {(x,y) : y > x}, which is the region above the line y = x; this yields a triangle whose leftmost x-value is 0 and whose rightmost x-value is 1/2 (which is only seen by drawing the figure!). To compute the probability, we double integrate the joint density over this subset of the support:

$$P(Y > X) = \int_0^{1/2} \int_x^{1-x} 24xy\,dy\,dx = \frac{1}{2}.$$
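Parts (a) and (b) of Example 2 can also be checked numerically (an illustrative sketch, not from the handout; `grid_sum` is my own helper). Note that P(Y > X) = 1/2 is also forced by the symmetry of f(x,y) = 24xy and its triangular support under swapping x and y:

```python
# Numeric check for Example 2: f(x,y) = 24xy on the triangle
# x >= 0, y >= 0, x + y <= 1 should have total mass 1, and P(Y > X) = 1/2.

def f(x, y):
    # joint pdf with c = 24, zero outside the triangular support
    return 24.0 * x * y if x >= 0 and y >= 0 and x + y <= 1 else 0.0

def grid_sum(g, n=800):
    # midpoint sum over the unit square (which contains the support)
    h = 1.0 / n
    total = 0.0
    for i in range(n):
        x = (i + 0.5) * h
        for j in range(n):
            y = (j + 0.5) * h
            total += g(x, y) * h * h
    return total

mass = grid_sum(f)
p_y_gt_x = grid_sum(lambda x, y: f(x, y) if y > x else 0.0)
print(mass)       # close to 1
print(p_y_gt_x)   # close to 1/2
```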

(c). We compute the marginal pdfs:

$$f_X(x) = \int_{-\infty}^{\infty} f(x,y)\,dy = \begin{cases} \int_0^{1-x} 24xy\,dy = 12x(1-x)^2 & \text{if } 0 \le x \le 1 \\ 0 & \text{otherwise,} \end{cases}$$

$$f_Y(y) = \int_{-\infty}^{\infty} f(x,y)\,dx = \begin{cases} \int_0^{1-y} 24xy\,dx = 12y(1-y)^2 & \text{if } 0 \le y \le 1 \\ 0 & \text{otherwise.} \end{cases}$$

(d). NO, X and Y are NOT independent. The support is not a rectangle or generalized rectangle, so we know we can find points (x,y) where f(x,y) = f_X(x) f_Y(y) fails to be true. In particular, f(0.7, 0.7) = 0 while f_X(0.7) f_Y(0.7) > 0.

Example 3: X and Y are jointly continuous with joint pdf

$$f(x,y) = \begin{cases} cxy & \text{if } 0 \le x \le 1,\ 0 \le y \le 1 \\ 0 & \text{otherwise.} \end{cases}$$

(a). Find c. (b). Find P(|Y − 2X| ≤ 0.1). (c). Find marginal pdf's of X and of Y. (d). Are X and Y independent (justify!).

We start (as always!) by drawing the support set, which is just the unit square in this case.

(a). We find c by setting

$$1 = \int_0^1 \int_0^1 cxy\,dy\,dx = \frac{c}{4},$$

so c = 4.

(b). Draw a picture of the support (the unit square), and intersect it with the set {(x,y) : |y − 2x| ≤ 0.1} = {(x,y) : 2x − 0.1 ≤ y ≤ 2x + 0.1}, which is the region above the line y = 2x − 0.1 and below the line y = 2x + 0.1. (You will not be able to figure out the limits of integration without it!) To compute the probability, we double integrate the joint density over this subset of the support set:

$$P(|Y - 2X| \le 0.1) = \int_0^{0.1} \int_0^{(y+0.1)/2} 4xy\,dx\,dy + \int_{0.1}^{1} \int_{(y-0.1)/2}^{(y+0.1)/2} 4xy\,dx\,dy.$$
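The double integral in (b) is left for you to evaluate; as a check (an illustrative sketch, not part of the handout), it can be estimated by simulation. Since f(x,y) = 4xy factors as (2x)(2y), each variable has cdf t² on [0, 1] and can be sampled by inverse transform as √U with U uniform; the estimate should land near 0.067:

```python
# Monte Carlo estimate of P(|Y - 2X| <= 0.1) for Example 3.
# X and Y are independent, each with density 2t on [0,1], cdf t^2,
# so the inverse-transform sample is sqrt(U) with U ~ Uniform(0,1).
import math
import random

random.seed(7)
N = 200_000
hits = 0
for _ in range(N):
    x = math.sqrt(random.random())   # X has density 2x on [0,1]
    y = math.sqrt(random.random())   # Y has density 2y on [0,1]
    if abs(y - 2 * x) <= 0.1:
        hits += 1
estimate = hits / N
print(estimate)   # roughly 0.067
```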

(c). We compute the marginal pdfs:

$$f_X(x) = \int_{-\infty}^{\infty} f(x,y)\,dy = \begin{cases} \int_0^1 4xy\,dy = 2x & \text{if } 0 \le x \le 1 \\ 0 & \text{otherwise,} \end{cases}$$

$$f_Y(y) = \int_{-\infty}^{\infty} f(x,y)\,dx = \begin{cases} \int_0^1 4xy\,dx = 2y & \text{if } 0 \le y \le 1 \\ 0 & \text{otherwise.} \end{cases}$$

(d). YES, X and Y are independent, since

$$f_X(x) f_Y(y) = \begin{cases} (2x)(2y) = 4xy & \text{if } 0 \le x \le 1,\ 0 \le y \le 1 \\ 0 & \text{otherwise} \end{cases}$$

is exactly the same as f(x,y), the joint density, for all x and y.

Example 4: X and Y are independent continuous random variables, each with pdf

$$g(w) = \begin{cases} 2w & \text{if } 0 \le w \le 1 \\ 0 & \text{otherwise.} \end{cases}$$

(a). Find P(X + Y ≤ 1). (b). Find the cdf and pdf of Z = X + Y.

Since X and Y are independent, we know that

$$f(x,y) = f_X(x) f_Y(y) = \begin{cases} (2x)(2y) = 4xy & \text{if } 0 \le x \le 1,\ 0 \le y \le 1 \\ 0 & \text{otherwise.} \end{cases}$$

We start (as always!) by drawing the support set, which is the unit square in this case.

(a). Draw a picture of the support (the unit square), and intersect it with the set {(x,y) : x + y ≤ 1}, which is the region below the line y = 1 − x. To compute the probability, we double integrate the joint density over this subset of the support:

$$P(X + Y \le 1) = \int_0^1 \int_0^{1-x} 4xy\,dy\,dx = \frac{1}{6}.$$

(b). To compute the cdf of Z = X + Y, we use the definition of the cdf, evaluating each case by double integrating the joint density over the subset of the support corresponding to {(x,y) : x + y ≤ z}, with different cases depending on the value of z (the line y = z − x cuts the unit square differently for 0 ≤ z ≤ 1 than for 1 ≤ z ≤ 2):

$$F_Z(z) = P(Z \le z) = P(X + Y \le z) = P(Y \le -X + z) = \begin{cases} 0 & \text{if } z \le 0 \\ \int_0^z \int_0^{z-x} 4xy\,dy\,dx & \text{if } 0 \le z \le 1 \\ \int_0^{z-1}\int_0^1 4xy\,dy\,dx + \int_{z-1}^1 \int_0^{z-x} 4xy\,dy\,dx & \text{if } 1 \le z \le 2 \\ 1 & \text{if } z \ge 2. \end{cases}$$
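Carrying out the inner integrals for 0 ≤ z ≤ 1 gives F_Z(z) = z⁴/6 (at z = 1 this recovers the 1/6 from part (a)). A short simulation can confirm this closed form at an interior point (an illustrative sketch, not part of the handout):

```python
# Simulation check of F_Z(z) = z^4/6 for 0 <= z <= 1 in Example 4(b).
# Each of X and Y has cdf t^2 on [0,1], so sample sqrt(U), U ~ Uniform(0,1).
import math
import random

random.seed(1)
N = 200_000
z = 0.8
count_z = sum(
    1 for _ in range(N)
    if math.sqrt(random.random()) + math.sqrt(random.random()) <= z
)
empirical = count_z / N
analytic = z ** 4 / 6      # the claimed cdf value, about 0.0683
print(empirical, analytic)
```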

Example 5: X and Y are jointly continuous with joint pdf

$$f(x,y) = \begin{cases} e^{-(x+y)} & \text{if } x \ge 0,\ y \ge 0 \\ 0 & \text{otherwise.} \end{cases}$$

Let Z = X/Y. Find the pdf of Z.

The first thing we do is draw a picture of the support set (which in this case is the first quadrant).

To find the density f_Z(z), we start, as always, by finding the cdf, F_Z(z) = P(Z ≤ z), and then differentiating: f_Z(z) = F_Z′(z). Thus, using the definition, and a picture of the support, we start by handling the cases:

$$F_Z(z) = P(Z \le z) = P(X/Y \le z) = \begin{cases} 0 & \text{if } z < 0 \\ P(Y \ge (1/z)X) & \text{if } z > 0, \end{cases}$$

where we have used the fact that X and Y are both nonnegative (with probability 1), so multiplying both sides of the inequality by Y does not flip the inequality; note, however, that when we divide both sides by z, to obtain Y ≥ (1/z)X, we were making the assumption that z > 0 (otherwise the inequality would flip). Now, we consider the picture of the support together with the halfplane y ≥ (1/z)x, and double integrate the joint density over the portion of the support where y ≥ (1/z)x, obtaining

$$F_Z(z) = \begin{cases} 0 & \text{if } z < 0 \\ \int_0^\infty \int_{x/z}^\infty e^{-(x+y)}\,dy\,dx = \frac{z}{z+1} & \text{if } z > 0. \end{cases}$$

Then, to get the pdf, we take the derivative:

$$f_Z(z) = \begin{cases} 0 & \text{if } z < 0 \\ \dfrac{(z+1) - z}{(z+1)^2} = \dfrac{1}{(z+1)^2} & \text{if } z > 0. \end{cases}$$
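The cdf F_Z(z) = z/(z + 1) is easy to check by simulation (an illustrative sketch, not part of the handout): sample X and Y as independent Exp(1) variables by inverse transform and compare the empirical cdf of Z = X/Y at a few points:

```python
# Simulation check of F_Z(z) = z/(z+1) for Z = X/Y in Example 5,
# with X, Y independent Exponential(1).
import math
import random

random.seed(3)
N = 200_000

def exp1():
    # inverse-transform sample from Exp(1)
    return -math.log(random.random())

zs = [exp1() / exp1() for _ in range(N)]
for z in (0.5, 1.0, 3.0):
    empirical = sum(1 for v in zs if v <= z) / N
    analytic = z / (z + 1)
    print(z, empirical, analytic)
```

The three analytic values are 1/3, 1/2, and 3/4; note in particular that the median of Z is 1, as it must be by the symmetry of X and Y.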

Example 6: X and Y are independent, each with an exponential(λ) distribution. Find the density of Z = X + Y and of W = Y − X².

Since X and Y are independent, we know that f(x,y) = f_X(x) f_Y(y), giving us

$$f(x,y) = \begin{cases} \lambda e^{-\lambda x}\,\lambda e^{-\lambda y} & \text{if } x \ge 0,\ y \ge 0 \\ 0 & \text{otherwise.} \end{cases}$$

The first thing we do is draw a picture of the support set: the first quadrant.

(a). To find the density f_Z(z), we start, as always, by finding the cdf, F_Z(z) = P(Z ≤ z), and then differentiating: f_Z(z) = F_Z′(z). Thus, using the definition, and a picture of the support together with the halfplane x + y ≤ z, we get

$$F_Z(z) = P(Z \le z) = P(X + Y \le z) = P(Y \le -X + z) = \begin{cases} 0 & \text{if } z < 0 \\ \int_0^z \int_0^{z-x} \lambda e^{-\lambda x}\,\lambda e^{-\lambda y}\,dy\,dx = 1 - e^{-\lambda z} - \lambda z e^{-\lambda z} & \text{if } z \ge 0. \end{cases}$$

This gives the pdf

$$f_Z(z) = \begin{cases} 0 & \text{if } z < 0 \\ \lambda^2 z e^{-\lambda z} & \text{if } z \ge 0, \end{cases}$$

which is the pdf of a Gamma(2, λ). Thus, Z is a Gamma(2, λ) random variable.

(b). To find the density f_W(w), we start, as always, by finding the cdf, F_W(w) = P(W ≤ w), and then differentiating: f_W(w) = F_W′(w). Thus, using the definition, and a picture of the support together with the region specified by y ≤ x² + w, we get

$$F_W(w) = P(W \le w) = P(Y - X^2 \le w) = P(Y \le X^2 + w) = \begin{cases} \int_0^\infty \int_0^{x^2+w} \lambda e^{-\lambda x}\,\lambda e^{-\lambda y}\,dy\,dx & \text{if } w > 0 \\ \int_{\sqrt{-w}}^\infty \int_0^{x^2+w} \lambda e^{-\lambda x}\,\lambda e^{-\lambda y}\,dy\,dx & \text{if } w < 0. \end{cases}$$

Then, we differentiate to get f_W(w) = F_W′(w). (Go ahead and evaluate the integral, then take the derivative.)

Example 7: X and Y are jointly continuous, with (X, Y) uniformly distributed over the union of the two squares {(x,y) : 0 ≤ x ≤ 1, 0 ≤ y ≤ 1} and {(x,y) : 0 ≤ x ≤ 1, 3 ≤ y ≤ 4}. (a). Find E(Y). (b). Find the marginal densities of X and Y. (c). Are X and Y independent? (d). Find the pdf of Z = X + Y.

Solution to be provided (possibly in class).

Example 8: X and Y have joint density

$$f(x,y) = \begin{cases} x + y & \text{if } 0 \le x \le 1,\ 0 \le y \le 1 \\ 0 & \text{otherwise.} \end{cases}$$

Find the joint cdf, F_{X,Y}(x,y), for all x and y. Compute the covariance and correlation of X and Y.

Solution to be provided (possibly in class).
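The Gamma(2, λ) conclusion of Example 6(a) can be checked by simulation (an illustrative sketch, not part of the handout), comparing the empirical cdf of Z = X + Y against the closed form 1 − e^{−λz} − λz e^{−λz} at a test point:

```python
# Simulation check for Example 6(a): the sum of two independent
# Exponential(lam) variables should have cdf 1 - exp(-lam*z) - lam*z*exp(-lam*z).
import math
import random

random.seed(11)
lam = 2.0          # an arbitrary test rate
N = 200_000

def expo(rate):
    # inverse-transform sample from Exponential(rate)
    return -math.log(random.random()) / rate

z = 1.0
empirical = sum(1 for _ in range(N) if expo(lam) + expo(lam) <= z) / N
analytic = 1 - math.exp(-lam * z) - lam * z * math.exp(-lam * z)
print(empirical, analytic)   # both near 0.594
```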

Example 9: Suppose that X and Y have the joint mass function shown in the table below. (Here, X takes on possible values in the set {−2, 2}, and Y takes on values in the set {−2, 0, 2, 3.1}.)

p(x,y)   | y = −2 | y = 0 | y = 2 | y = 3.1
x = −2   |  .04   |  .08  |  .12  |  .16
x = 2    |  .06   |  .12  |  .18  |  .24

(a). (6 points) Compute P(|X + Y²| < 10). (b). (6 points) Find the marginal mass function of Y and plot it. (Be very explicit!) (c). (6 points) Compute var(X²Y) and cov(X,Y). (d). (2 points) Are X and Y independent? (Why or why not?)

Solution to be provided (possibly in class).

Example 10: Two fair dice are rolled. Let X be the larger of the two values shown on the dice, and let Y be the absolute value of the difference of the two values shown. Give the joint pmf of X and Y. Compute cov(X,Y), E(X), E(Y | X = 1), P(X > 2Y).

The sample space is the set S = {(1,1), (1,2), ..., (6,6)}; there are 36 equally likely outcomes. Note that X ∈ {1, 2, ..., 6} and Y ∈ {0, 1, ..., 5}.

First, p(1,0) = P(X = 1, Y = 0) = P({(1,1)}) = 1/36, where (1,1) is the outcome in which the first die is a 1 and the second die is also a 1 (so that the larger value is 1 and the difference of the two values is 0). Similarly, p(i,0) = P(X = i, Y = 0) = P({(i,i)}) = 1/36, for i = 1, 2, ..., 6, where (i,i) is the outcome in which the first die is an i and the second die is also an i.

Now, p(1,1) = p(1,2) = ··· = p(1,5) = 0, since, if the larger of the two dice shows 1, the difference cannot be 1 or more. Next, p(2,1) = P(X = 2, Y = 1) = P({(2,1), (1,2)}) = 2/36. Similarly, p(3,1) = p(3,2) = p(4,1) = p(4,2) = p(4,3) = p(5,1) = ··· = p(5,4) = p(6,1) = ··· = p(6,5) = 2/36, since each corresponding event is a subset of two outcomes from S. All other values of p(i,j) are 0. Check that the sum of all values p(i,j) is 1, as it must be!

Thus, in summary,

$$p(i,j) = \begin{cases} 0 & \text{if } j \ge i, \text{ with } i \in \{1, 2, \ldots, 6\},\ j \in \{1, \ldots, 5\} \\ \frac{1}{36} & \text{if } (i,j) \in \{(1,0), (2,0), \ldots, (6,0)\} \\ \frac{2}{36} & \text{otherwise.} \end{cases}$$

(It is convenient to arrange all these numbers in a table.) We can also compute

$$E(X) = \sum_{i=1}^{6} \sum_{j=0}^{5} i\,p(i,j) = \frac{161}{36},$$

$$\mathrm{cov}(X,Y) = E(XY) - [E(X)][E(Y)] = \sum_{i=1}^{6}\sum_{j=0}^{5} ij\,p(i,j) - \left[\sum_{i=1}^{6}\sum_{j=0}^{5} i\,p(i,j)\right]\left[\sum_{i=1}^{6}\sum_{j=0}^{5} j\,p(i,j)\right].$$
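Since the sample space of Example 10 has only 36 outcomes, the whole joint pmf can be built by exact enumeration with rational arithmetic (an illustrative sketch, not part of the handout), verifying p(i, 0) = 1/36, E(X) = 161/36, and the covariance sum directly:

```python
# Exact enumeration of the joint pmf of X = max and Y = |difference|
# for two fair dice (Example 10), using exact fractions.
from fractions import Fraction
from collections import defaultdict

pmf = defaultdict(Fraction)
for a in range(1, 7):
    for b in range(1, 7):
        x, y = max(a, b), abs(a - b)
        pmf[(x, y)] += Fraction(1, 36)   # each outcome has probability 1/36

total = sum(pmf.values())                        # must be 1
EX = sum(x * p for (x, y), p in pmf.items())     # 161/36
EY = sum(y * p for (x, y), p in pmf.items())
EXY = sum(x * y * p for (x, y), p in pmf.items())
cov = EXY - EX * EY
print(total, EX, cov)
```

Enumerating rather than simulating keeps every quantity exact, so the check on E(X) = 161/36 is an equality, not an approximation.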

Example 11: Alice and Bob plan to meet at a cafe to do AMS 3 homework together. Alice arrives at the cafe at a random time (uniform) between noon and 1:00pm today; Bob independently arrives at a random time (uniform) between noon and 2:00pm today. (a). What is the expected amount of time that somebody waits for the other? (b). What is the probability that Bob has to wait for Alice?

Let X be the number of hours past noon that Alice arrives at the cafe, and let Y be the number of hours past noon that Bob arrives at the cafe. Then we know that X is Uniform(0,1) and Y is Uniform(0,2). Since, by the stated assumption, X and Y are independent, we know that the joint density is given by

$$f(x,y) = \begin{cases} 1/2 & \text{if } 0 \le x \le 1,\ 0 \le y \le 2 \\ 0 & \text{otherwise.} \end{cases}$$

We begin (as always) by plotting the support set: it is simply a rectangle of width 1 and height 2.

(a). Let W = max{X,Y} − min{X,Y}; then W is the amount of time (in hours) that somebody has to wait. We want to compute E(W). Now, W is a function of X and Y, so we just use the law of the unconscious statistician:

$$E(W) = E(\max\{X,Y\} - \min\{X,Y\}) = \int\!\!\int [\max\{x,y\} - \min\{x,y\}]\, f(x,y)\,dx\,dy = \int_0^1\!\int_0^2 [\max\{x,y\} - \min\{x,y\}](1/2)\,dy\,dx.$$

Now, in order to write the function max{x,y} − min{x,y} explicitly, we break into two cases: if y < x, then max{x,y} − min{x,y} = x − y; if y > x, then max{x,y} − min{x,y} = y − x. Thus, we integrate to get

$$E(W) = \int_0^1 \int_0^x (x - y)(1/2)\,dy\,dx + \int_0^1 \int_x^2 (y - x)(1/2)\,dy\,dx = \frac{1}{12} + \frac{7}{12} = \frac{2}{3}.$$

Thus, the expected time waiting is 2/3 of an hour (or 40 minutes). (Note that it is wrong to reason like this: Alice expects to arrive at 12:30; Bob expects to arrive at 1:00; thus, we expect that Bob will wait 30 minutes for Alice.)

(b). We want to compute the probability that Bob has to wait for Alice, which is P(Y < X), which we do by integrating the joint density, f(x,y), over the region where y < x. Draw a picture! (Show the support (a rectangle), and the line y = x.)

$$P(Y < X) = \int_0^1 \int_0^x (1/2)\,dy\,dx = \frac{1}{4}.$$

Example 12: Suppose X and Y are independent, X is exponential with mean 0.5, and Y has density

$$f_Y(y) = \begin{cases} 3e^{-3y} & \text{if } y > 0 \\ 0 & \text{otherwise.} \end{cases}$$

Find the density of the random variable W = min{X,Y} and of the random variable Z = X + Y.
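Both answers in Example 11 are easy to confirm by simulation (an illustrative sketch, not part of the handout): draw Alice and Bob's arrival times directly and average the waiting time W = |X − Y|, which should come out near 2/3 hour, with P(Y < X) near 1/4:

```python
# Monte Carlo check for Example 11: X ~ Uniform(0,1) (Alice),
# Y ~ Uniform(0,2) (Bob), independent.
import random

random.seed(5)
N = 400_000
wait_total = 0.0
bob_waits = 0
for _ in range(N):
    x = random.random()         # Alice's arrival, hours past noon
    y = 2 * random.random()     # Bob's arrival, hours past noon
    wait_total += abs(x - y)    # W = max - min = |X - Y|
    if y < x:                   # Bob arrived first, so Bob waits
        bob_waits += 1

mean_wait = wait_total / N
p_bob = bob_waits / N
print(mean_wait)   # near 2/3
print(p_bob)       # near 1/4
```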