Chapter 5: Joint Probability Distributions

5-1 Two or More Random Variables
  5-1.1 Joint Probability Distributions
  5-1.2 Marginal Probability Distributions
  5-1.3 Conditional Probability Distributions
  5-1.4 Independence
  5-1.5 More Than Two Random Variables
5-2 Covariance and Correlation
5-3 Common Joint Distributions
  5-3.1 Multinomial Probability Distribution
  5-3.2 Bivariate Normal Distribution
5-4 Linear Functions of Random Variables
5-5 General Functions of Random Variables

Chapter Learning Objectives

After careful study of this chapter you should be able to:
1. Use joint probability mass functions and joint probability density functions to calculate probabilities
2. Calculate marginal and conditional probability distributions from joint probability distributions
3. Interpret and calculate covariances and correlations between random variables
4. Use the multinomial distribution to determine probabilities
5. Understand properties of a bivariate normal distribution and be able to draw contour plots for the probability density function
6. Calculate means and variances for linear combinations of random variables, and calculate probabilities for linear combinations of normally distributed random variables
7. Determine the distribution of a general function of a random variable

Concept of Joint Probabilities

Some random variables are not independent of each other; that is, their values are related to some degree:
- Urban atmospheric ozone and airborne particulate matter tend to vary together.
- Urban vehicle speeds and fuel consumption rates tend to vary inversely.
- The length (X) of an injection-molded part might not be independent of the width (Y); individual parts will vary due to random variation in materials and pressure.
A joint probability distribution describes the behavior of several random variables simultaneously. In the case of two random variables X and Y, the graph of the joint distribution is three-dimensional: x, y, and f(x, y).
The Joint Probability Distribution for a Pair of Discrete Random Variables

The joint probability mass function of the discrete random variables X and Y, denoted f_XY(x, y), satisfies:
(1) f_XY(x, y) >= 0
(2) sum over all x and y of f_XY(x, y) = 1
(3) f_XY(x, y) = P(X = x, Y = y)
A Joint Probability Distribution Example (Example 5-1)

The Joint Probability Distribution for a Pair of Continuous Random Variables

Let the joint probability density function of the continuous random variables X and Y, denoted f_XY(x, y), satisfy:
(1) f_XY(x, y) >= 0
(2) the integral of f_XY(x, y) over the whole plane equals 1
(3) P((X, Y) in R) = the integral of f_XY(x, y) over the region R, for any region R of two-dimensional space

A Joint Probability Distribution Example (Example 5-2)

Figure 5-4 The joint probability density function of X and Y is nonzero over the shaded region where x < y.
Figure 5-5 Region of integration for the probability that X < 1000 and Y < 2000 is darkly shaded.
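As a concrete check of the continuous case, the sketch below assumes the Example 5-2 density f_XY(x, y) = 6e-6 * exp(-0.001x - 0.002y) for 0 < x < y (the nonzero region of Figure 5-4), and evaluates P(X <= 1000, Y <= 2000) both in closed form and by a crude midpoint Riemann sum over the region of Figure 5-5:

```python
import math

# Assumed joint pdf from Example 5-2: nonzero only where 0 < x < y
def f(x, y):
    return 6e-6 * math.exp(-0.001 * x - 0.002 * y) if 0 < x < y else 0.0

# Closed form: integrate y from x to 2000, then x from 0 to 1000
exact = (1 - math.exp(-3)) - 3 * math.exp(-4) * (1 - math.exp(-1))

# Midpoint Riemann sum over the same region as a sanity check
h = 5.0
approx = sum(
    f(i * h + h / 2, j * h + h / 2) * h * h
    for i in range(200)      # x midpoints in (0, 1000)
    for j in range(400)      # y midpoints in (0, 2000)
)

print(round(exact, 4))   # 0.9155
print(round(approx, 4))  # close to the exact value (diagonal cells cost ~0.005)
```

The small gap between the two numbers comes from grid cells straddling the x = y boundary, where the midpoint rule misses part of the triangular sliver.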
The Marginal Probability Distribution

The individual probability distribution of a random variable is referred to as its marginal probability distribution. In general, the marginal probability distribution of X can be determined from the joint probability distribution of X and other random variables. For the case of two discrete random variables:

f_X(x) = P(X = x) = sum over y of f_XY(x, y)    and    f_Y(y) = P(Y = y) = sum over x of f_XY(x, y)

A Marginal Probability Distribution Example (Example 5-3)

Figure 5-6 Marginal probability distributions of X and Y from Fig. 5-1

The Marginal Probability Distribution

For the case of two continuous random variables:

f_X(x) = integral over y of f_XY(x, y) dy    and    f_Y(y) = integral over x of f_XY(x, y) dx

The Conditional Probability Distribution (discrete case)

The conditional probability mass function of Y given X = x is

f_{Y|x}(y) = f_XY(x, y) / f_X(x)    for f_X(x) > 0

Given this definition, verify that the joint pmf in Example 5-1 leads to the following conditional pmfs:

f_{Y|x}(y)                          x (# bars of signal strength)
y (# times city name is stated)       1        2        3
4                                   0.7500   0.4000   0.0909
3                                   0.1000   0.4000   0.0909
2                                   0.1000   0.1200   0.3636
1                                   0.0500   0.0800   0.4545

f_{X|y}(x)                          x (# bars of signal strength)
y (# times city name is stated)       1        2        3
4                                   0.5000   0.3333   0.1667
3                                   0.1176   0.5882   0.2941
2                                   0.0800   0.1200   0.8000
1                                   0.0357   0.0714   0.8929
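The marginal and conditional definitions above can be verified mechanically. The joint pmf below is the one consistent with the conditional tables (the Example 5-1 signal-strength example, assumed here); the code recovers the marginals and one conditional value:

```python
# Joint pmf consistent with the conditional tables above (assumed Example 5-1):
# x = number of bars of signal strength, y = number of times city name is stated
joint = {
    (1, 4): 0.15, (2, 4): 0.10, (3, 4): 0.05,
    (1, 3): 0.02, (2, 3): 0.10, (3, 3): 0.05,
    (1, 2): 0.02, (2, 2): 0.03, (3, 2): 0.20,
    (1, 1): 0.01, (2, 1): 0.02, (3, 1): 0.25,
}

# Marginals: sum the joint pmf over the other variable
f_x = {x: sum(p for (xi, _), p in joint.items() if xi == x) for x in (1, 2, 3)}
f_y = {y: sum(p for (_, yi), p in joint.items() if yi == y) for y in (1, 2, 3, 4)}

# Conditional pmf of Y given X = x: f_XY(x, y) / f_X(x)
def f_y_given_x(y, x):
    return joint[(x, y)] / f_x[x]

print({x: round(v, 4) for x, v in f_x.items()})   # {1: 0.2, 2: 0.25, 3: 0.55}
print(round(f_y_given_x(4, 1), 4))                # 0.75, matching the table
```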
Properties of Conditional pmfs

For a fixed x with f_X(x) > 0, f_{Y|x}(y) is itself a pmf: f_{Y|x}(y) >= 0 and the values sum to 1 over y. The conditional mean and variance are

E(Y|x) = sum over y of y f_{Y|x}(y)    and    V(Y|x) = sum over y of y^2 f_{Y|x}(y) - [E(Y|x)]^2

An Example of Conditional pmf Mean and Variance

Given the formulae above and the conditional pmfs below, verify that the conditional means and conditional variances are correct:

f_{Y|x}(y)                          x (# bars of signal strength)
y (# times city name is stated)       1        2        3
4                                   0.7500   0.4000   0.0909
3                                   0.1000   0.4000   0.0909
2                                   0.1000   0.1200   0.3636
1                                   0.0500   0.0800   0.4545
Conditional mean:                   3.5500   3.1200   1.8182
Conditional variance:               0.7475   0.8256   0.8760

f_{X|y}(x)                          x (# bars of signal strength)
y (# times city name is stated)       1        2        3      Cond. mean   Cond. variance
4                                   0.5000   0.3333   0.1667   1.6667       0.5556
3                                   0.1176   0.5882   0.2941   2.1765       0.3806
2                                   0.0800   0.1200   0.8000   2.7200       0.3616
1                                   0.0357   0.0714   0.8929   2.8571       0.1939

The Conditional Probability Distribution (continuous case)

An Example of Conditional Probability (Example 5-6)
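The first column of the f_{Y|x}(y) table can be checked directly with the conditional mean and variance formulas:

```python
# Conditional pmf of Y given X = 1, taken from the table above
f_y_given_1 = {4: 0.75, 3: 0.10, 2: 0.10, 1: 0.05}

# E(Y|x) = sum of y * f(y|x);  V(Y|x) = E(Y^2|x) - E(Y|x)^2
mean = sum(y * p for y, p in f_y_given_1.items())
var = sum(y ** 2 * p for y, p in f_y_given_1.items()) - mean ** 2

print(round(mean, 4), round(var, 4))   # 3.55 0.7475, matching the x = 1 column
```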
Conditional Mean and Variance (continuous case)

Independence of Two Random Variables

For random variables X and Y, if any one of the following properties holds, the others also hold, and X and Y are independent (properties 5-7):
(1) f_XY(x, y) = f_X(x) f_Y(y) for all x and y
(2) f_{Y|x}(y) = f_Y(y) for all x and y with f_X(x) > 0
(3) f_{X|y}(x) = f_X(x) for all x and y with f_Y(y) > 0
(4) P(X in A, Y in B) = P(X in A) P(Y in B) for any sets A and B in the range of X and Y

Rectangular Range for (X, Y)

A rectangular range for X and Y is a necessary, but not sufficient, condition for the independence of the variables. If the range of X and Y is not rectangular, then the range of one variable is limited by the value of the other, and the variables cannot be independent. If the range of X and Y is rectangular, then one of the properties of (5-7) must be demonstrated to prove independence.

An Example of Independence, discrete case (Example 5-10)

Figure 5-10 (a) Joint and marginal probability distributions of X and Y. (b) Conditional probability distribution of Y given X = x.
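Property (1) of (5-7) is easy to test exhaustively for a discrete joint pmf. The sketch below uses the colour/length conformance probabilities that appear in the tabular example on the next slide:

```python
# Joint pmf over (x = colour conforms, y = length conforms), values in {0, 1}
joint = {(0, 1): 0.0098, (1, 1): 0.9702, (0, 0): 0.0002, (1, 0): 0.0198}

xs = sorted({x for x, _ in joint})
ys = sorted({y for _, y in joint})

# Marginals by summing over the other variable
f_x = {x: sum(joint[(x, y)] for y in ys) for x in xs}
f_y = {y: sum(joint[(x, y)] for x in xs) for y in ys}

# Independence check: f_XY(x, y) == f_X(x) * f_Y(y) in every cell
independent = all(
    abs(joint[(x, y)] - f_x[x] * f_y[y]) < 1e-9 for x in xs for y in ys
)

print(independent)   # True: every cell factors into its marginals
```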
An Example of Independence (Example 5-6)

The example on the previous page could also be expressed in tabular form as follows:

f_XY(x, y)                x (colour conforms)
y (length conforms)         0        1       f_Y(y)
1                         0.0098   0.9702    0.98
0                         0.0002   0.0198    0.02
f_X(x)                    0.01     0.99

f_{Y|x}(y)                x (colour conforms)
y (length conforms)         0        1
1                         0.9800   0.9800
0                         0.0200   0.0200
Conditional mean:         0.9800   0.9800
Conditional variance:     0.0196   0.0196

f_{X|y}(x)                x (colour conforms)
y (length conforms)         0        1       Cond. mean   Cond. variance
1                         0.0100   0.9900    0.9900       0.0099
0                         0.0100   0.9900    0.9900       0.0099

An Example of Independence, continuous case (Example 5-12)

The Covariance Between Two Random Variables

Before we define covariance, we need the expected value of a function of two random variables:

E[h(X, Y)] = sum over x, y of h(x, y) f_XY(x, y)    (discrete case)
E[h(X, Y)] = integral of h(x, y) f_XY(x, y) dx dy   (continuous case)

Now we are ready to define covariance:

cov(X, Y) = sigma_XY = E[(X - mu_X)(Y - mu_Y)] = E(XY) - mu_X mu_Y

An Example (5-19) of Covariance for Discrete Random Variables

Figure 5-12 Discrete joint distribution of X and Y

f_XY(x, y)        x
y                   1      3      f_Y(y)
3                   -      0.30   0.30
2                   0.20   0.20   0.40
1                   0.10   0.20   0.30
f_X(x)              0.30   0.70
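The covariance for the Example 5-19 table can be computed with the shortcut formula cov(X, Y) = E(XY) - mu_X mu_Y:

```python
# Joint pmf from the Example 5-19 table above (the missing cell has probability 0)
joint = {(3, 3): 0.30, (1, 2): 0.20, (3, 2): 0.20, (1, 1): 0.10, (3, 1): 0.20}

mu_x = sum(x * p for (x, _), p in joint.items())       # E(X)
mu_y = sum(y * p for (_, y), p in joint.items())       # E(Y)
e_xy = sum(x * y * p for (x, y), p in joint.items())   # E(XY)
cov = e_xy - mu_x * mu_y                               # sigma_XY

print(round(mu_x, 4), round(mu_y, 4), round(cov, 4))   # 2.4 2.0 0.2
```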
What Does Covariance Signify?

Covariance is a measure of the strength of the linear relationship between two random variables. If the relationship is nonlinear, the covariance may not be useful. For example, in Fig. 5-13(d) there is clearly a relationship between the variables, yet it goes undetected by the covariance.

The Correlation Between Two Random Variables

rho_XY = cov(X, Y) / (sigma_X sigma_Y), with -1 <= rho_XY <= +1

Correlation is the covariance scaled by the standard deviations, making it dimensionless.

A Correlation (and Covariance) Example (Example 5-21)

f_XY(x, y)        x
y                   0      1      2      3      f_Y(y)
3                   -      -      -      0.40   0.40
2                   -      0.10   0.10   -      0.20
1                   -      0.10   0.10   -      0.20
0                   0.20   -      -      -      0.20
f_X(x)              0.20   0.20   0.20   0.40

Figure 5-14 Discrete joint distribution, f(x, y).
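Both the covariance and the correlation for the Example 5-21 table follow directly from the definitions:

```python
import math

# Joint pmf from the Example 5-21 table above (missing cells have probability 0)
joint = {(0, 0): 0.20, (1, 1): 0.10, (1, 2): 0.10,
         (2, 1): 0.10, (2, 2): 0.10, (3, 3): 0.40}

def mean(axis):
    return sum(k[axis] * p for k, p in joint.items())

def var(axis):
    return sum(k[axis] ** 2 * p for k, p in joint.items()) - mean(axis) ** 2

cov = sum(x * y * p for (x, y), p in joint.items()) - mean(0) * mean(1)
rho = cov / math.sqrt(var(0) * var(1))

print(round(cov, 4), round(rho, 4))   # 1.26 0.9265
```

The strong positive correlation reflects the fact that most of the probability mass sits on or near the line y = x.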
The Covariance and Correlation Between Independent Variables

If X and Y are independent random variables, then sigma_XY = 0 and rho_XY = 0 (the converse does not hold: zero covariance does not imply independence).

If X and Y are exactly linearly related (i.e., Y = aX + b for constants a and b), then the correlation rho_XY is either +1 or -1, with the same sign as the constant a:
- If a is positive, rho_XY = +1.
- If a is negative, rho_XY = -1.

An Example (Covariance Between Independent Variables) (Example 5-23)

Figure 5-16 Random variables with zero covariance from Example 5-23

Common Joint Distributions

There are two common joint distributions:
- The multinomial probability distribution (discrete), an extension of the binomial distribution
- The bivariate normal probability distribution (continuous), a two-variable extension of the normal distribution
Although joint distributions of more than two random variables exist, we do not deal with them here. There are many lesser-known and custom joint probability distributions, as you have already seen.
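A minimal sketch of the independence-implies-zero-covariance fact: build a joint pmf as the product of two marginals (the marginal numbers below are illustrative, not Example 5-23's), then verify the covariance vanishes:

```python
# Illustrative marginals (assumed, not from the book example)
f_x = {0: 0.3, 1: 0.7}
f_y = {1: 0.4, 2: 0.6}

# Independence: the joint pmf is the product of the marginals
joint = {(x, y): px * py for x, px in f_x.items() for y, py in f_y.items()}

mu_x = sum(x * px for x, px in f_x.items())
mu_y = sum(y * py for y, py in f_y.items())
cov = sum(x * y * p for (x, y), p in joint.items()) - mu_x * mu_y

print(round(cov, 10))   # 0.0 -- independence forces zero covariance
```

Note that Example 5-23 makes the complementary point: a distribution can have zero covariance without the variables being independent.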
The Multinomial Distribution

Suppose a random experiment consists of n independent trials, each of which results in one of k classes, with class i occurring with probability p_i on any trial (p_1 + ... + p_k = 1). If X_i is the number of trials that result in class i, then

P(X_1 = x_1, X_2 = x_2, ..., X_k = x_k) = [n! / (x_1! x_2! ... x_k!)] p_1^{x_1} p_2^{x_2} ... p_k^{x_k}

for x_1 + x_2 + ... + x_k = n. Each X_i has a binomial marginal distribution with E(X_i) = n p_i and V(X_i) = n p_i (1 - p_i).

The Bivariate Normal Distribution

The bivariate normal probability density function has five parameters: the means mu_X and mu_Y, the standard deviations sigma_X and sigma_Y, and the correlation rho between X and Y. Its contours of constant density are ellipses centred at (mu_X, mu_Y).

Linear Combinations of Random Variables

Given random variables X_1, X_2, ..., X_p and constants c_1, c_2, ..., c_p, the linear combination

Y = c_1 X_1 + c_2 X_2 + ... + c_p X_p

has mean E(Y) = c_1 E(X_1) + ... + c_p E(X_p) and variance

V(Y) = c_1^2 V(X_1) + ... + c_p^2 V(X_p) + 2 * sum over i < j of c_i c_j cov(X_i, X_j),

which reduces to V(Y) = sum of c_i^2 V(X_i) when the X_i are independent.
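The multinomial pmf is straightforward to evaluate directly. The counts and probabilities below are illustrative numbers, not taken from a book example:

```python
from math import factorial, prod

# Multinomial pmf: probability of observing counts x_i over k classes
# with class probabilities p_i, in n = sum(counts) independent trials
def multinomial_pmf(counts, probs):
    coeff = factorial(sum(counts))
    for x in counts:
        coeff //= factorial(x)          # multinomial coefficient, exact integer
    return coeff * prod(p ** x for x, p in zip(counts, probs))

# Illustrative: 20 trials, three classes with probabilities 0.6, 0.3, 0.1
p = multinomial_pmf([14, 4, 2], [0.6, 0.3, 0.1])
print(round(p, 4))
```

Summing the pmf over all valid count vectors for small n would return 1, which is a handy sanity check on the implementation.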
A Linear Combination of Random Variables Example (Example 5-31)

Mean and Variance of an Average (Equation 5-27)

If X-bar = (X_1 + X_2 + ... + X_n) / n with E(X_i) = mu for each i, then E(X-bar) = mu. If the X_i are also independent with V(X_i) = sigma^2 for each i, then V(X-bar) = sigma^2 / n.

Reproductive Property of the Normal Distribution

If X_1, X_2, ..., X_p are independent normal random variables with E(X_i) = mu_i and V(X_i) = sigma_i^2, then the linear combination Y = c_1 X_1 + ... + c_p X_p is normally distributed with E(Y) = c_1 mu_1 + ... + c_p mu_p and V(Y) = c_1^2 sigma_1^2 + ... + c_p^2 sigma_p^2.
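The reproductive property turns probability questions about sums of independent normals into single-normal calculations. The numbers below are illustrative, not from a book example: Y = X1 + X2 with X1 ~ N(2, 0.5^2) and X2 ~ N(5, 1.0^2) independent, so Y ~ N(7, 1.25):

```python
import math

# Standard normal cdf via the error function
def norm_cdf(z):
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Reproductive property: mean and variance of Y = X1 + X2 add component-wise
mu = 2.0 + 5.0                 # E(Y)
var = 0.5 ** 2 + 1.0 ** 2      # V(Y), independence assumed

p = norm_cdf((9.0 - mu) / math.sqrt(var))   # P(Y < 9) ~ 0.963
print(round(p, 4))
```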