Jointly Distributed Random Variables

COMP 245 STATISTICS
Dr N A Heard

Contents

1 Jointly Distributed Random Variables: Definition; Joint cdfs; Joint Probability Mass Functions; Joint Probability Density Functions
2 Independence and Expectation: Independence; Expectation; Conditional Expectation; Covariance and Correlation
3 Examples

1 Jointly Distributed Random Variables

1.1 Definition

Joint Probability Distribution

Suppose we have two random variables X and Y defined on a sample space S with probability measure P(E), E ⊆ S. Jointly, they form a map (X, Y): S → ℝ², where s ↦ (X(s), Y(s)).

From before, we know to define the marginal probability distributions P_X and P_Y by, for example,

    P_X(B) = P(X⁻¹(B)), B ⊆ ℝ.

We now define the joint probability distribution P_XY by

    P_XY(B_X, B_Y) = P{X⁻¹(B_X) ∩ Y⁻¹(B_Y)}, B_X, B_Y ⊆ ℝ.

So P_XY(B_X, B_Y), the probability that X ∈ B_X and Y ∈ B_Y, is given by the probability P of the set of all points in the sample space that get mapped both into B_X by X and into B_Y by Y.

More generally, for a single region B_XY ⊆ ℝ², find the collection of sample space elements

    S_{B_XY} = {s ∈ S | (X(s), Y(s)) ∈ B_XY}

and define

    P_XY(B_XY) = P(S_{B_XY}).
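The preimage definition above can be made concrete on a small finite sample space. The following sketch (not part of the original notes; the two-dice setup is a hypothetical choice) computes P_XY(B_X, B_Y) directly as the probability of the set of sample points mapped into B_X by X and into B_Y by Y:

```python
from fractions import Fraction

# Sample space S: ordered results of two fair dice, uniform measure P.
S = [(a, b) for a in range(1, 7) for b in range(1, 7)]
P = {s: Fraction(1, 36) for s in S}

# Two random variables on S: X = first die, Y = sum of both dice.
X = lambda s: s[0]
Y = lambda s: s[0] + s[1]

def P_XY(B_X, B_Y):
    """P_XY(B_X, B_Y) = P({s in S : X(s) in B_X and Y(s) in B_Y})."""
    return sum(P[s] for s in S if X(s) in B_X and Y(s) in B_Y)

print(P_XY({1}, {7}))              # P(X = 1, Y = 7) = 1/36 (only (1, 6))
print(P_XY(set(range(1, 7)), {7})) # marginal P(Y = 7) = 6/36 = 1/6
```

Taking B_X to be the whole range of X recovers the marginal P_Y, exactly as in the definition.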

1.2 Joint cdfs

Joint Cumulative Distribution Function

We thus define the joint cumulative distribution function as

    F_XY(x, y) = P_XY((−∞, x], (−∞, y]), x, y ∈ ℝ.

It is easy to check that the marginal cdfs for X and Y can be recovered by

    F_X(x) = F_XY(x, ∞), x ∈ ℝ,
    F_Y(y) = F_XY(∞, y), y ∈ ℝ,

and that the two definitions will agree.

Properties of a Joint cdf

For F_XY to be a valid cdf, we need to make sure the following conditions hold.

1. 0 ≤ F_XY(x, y) ≤ 1, ∀x, y ∈ ℝ;
2. Monotonicity: ∀x₁, x₂, y₁, y₂ ∈ ℝ,
   x₁ < x₂ ⟹ F_XY(x₁, y₁) ≤ F_XY(x₂, y₁) and y₁ < y₂ ⟹ F_XY(x₁, y₁) ≤ F_XY(x₁, y₂);
3. ∀x, y ∈ ℝ, F_XY(x, −∞) = 0, F_XY(−∞, y) = 0 and F_XY(∞, ∞) = 1.

Interval Probabilities

Suppose we are interested in whether the random variable pair (X, Y) lie in the interval cross product (x₁, x₂] × (y₁, y₂]; that is, if x₁ < X ≤ x₂ and y₁ < Y ≤ y₂. (Figure: the rectangle (x₁, x₂] × (y₁, y₂] in the (X, Y) plane.)

First note that

    P_XY((x₁, x₂], (−∞, y]) = F_XY(x₂, y) − F_XY(x₁, y).

It is then easy to see that P_XY((x₁, x₂], (y₁, y₂]) is given by

    F_XY(x₂, y₂) − F_XY(x₁, y₂) − F_XY(x₂, y₁) + F_XY(x₁, y₁).

1.3 Joint Probability Mass Functions

If X and Y are both discrete random variables, then we can define the joint probability mass function as

    p_XY(x, y) = P_XY({x}, {y}), x, y ∈ ℝ.

We can recover the marginal pmfs p_X and p_Y since, by the law of total probability, ∀x, y ∈ ℝ,

    p_X(x) = Σ_y p_XY(x, y),    p_Y(y) = Σ_x p_XY(x, y).
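The four-term inclusion-exclusion identity for rectangle probabilities can be checked numerically. This sketch (not from the notes) uses a hypothetical joint cdf, that of two independent Uniform(0, 1) variables, where F_XY(x, y) is just the product of the clamped coordinates:

```python
def F_XY(x, y):
    """Joint cdf of two independent Uniform(0,1) r.v.s (hypothetical example)."""
    clamp = lambda t: max(0.0, min(1.0, t))
    return clamp(x) * clamp(y)

def rect_prob(F, x1, x2, y1, y2):
    """P(x1 < X <= x2, y1 < Y <= y2) via the inclusion-exclusion identity
    F(x2,y2) - F(x1,y2) - F(x2,y1) + F(x1,y1)."""
    return F(x2, y2) - F(x1, y2) - F(x2, y1) + F(x1, y1)

# For independent uniforms this should equal (x2 - x1) * (y2 - y1).
print(rect_prob(F_XY, 0.2, 0.5, 0.1, 0.4))  # ≈ 0.3 * 0.3 = 0.09
```

The identity holds for any valid joint cdf; the uniform example merely makes the answer easy to verify by hand.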

Properties of a Joint pmf

For p_XY to be a valid pmf, we need to make sure the following conditions hold.

1. 0 ≤ p_XY(x, y) ≤ 1, ∀x, y ∈ ℝ;
2. Σ_x Σ_y p_XY(x, y) = 1.

1.4 Joint Probability Density Functions

On the other hand, if there exists f_XY: ℝ × ℝ → ℝ s.t.

    P_XY(B_XY) = ∫∫_{(x,y) ∈ B_XY} f_XY(x, y) dx dy, ∀B_XY ⊆ ℝ × ℝ,

then we say X and Y are jointly continuous and we refer to f_XY as the joint probability density function of X and Y. In this case, we have

    F_XY(x, y) = ∫_{t=−∞}^{y} ∫_{s=−∞}^{x} f_XY(s, t) ds dt, x, y ∈ ℝ.

By the Fundamental Theorem of Calculus we can identify the joint pdf as

    f_XY(x, y) = ∂²/∂x∂y F_XY(x, y).

Furthermore, we can recover the marginal densities f_X and f_Y:

    f_X(x) = d/dx F_X(x) = d/dx F_XY(x, ∞) = d/dx ∫_{y=−∞}^{∞} ∫_{s=−∞}^{x} f_XY(s, y) ds dy.

By the Fundamental Theorem of Calculus, and through a symmetric argument for Y, we thus get

    f_X(x) = ∫_{y=−∞}^{∞} f_XY(x, y) dy,    f_Y(y) = ∫_{x=−∞}^{∞} f_XY(x, y) dx.

Properties of a Joint pdf

For f_XY to be a valid pdf, we need to make sure the following conditions hold.

1. f_XY(x, y) ≥ 0, ∀x, y ∈ ℝ;
2. ∫_{x=−∞}^{∞} ∫_{y=−∞}^{∞} f_XY(x, y) dx dy = 1.
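In the discrete case, both the validity checks and the marginalisation formulas translate directly into code. This sketch (not from the notes; the 2×2 table of probabilities is a hypothetical example) checks the pmf conditions and recovers p_X and p_Y by summing out the other variable:

```python
# Hypothetical joint pmf of (X, Y), stored as a dict keyed by (x, y).
p_XY = {(0, 0): 0.1, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.4}

# Validity: each 0 <= p_XY(x, y) <= 1, and total mass 1.
assert all(0 <= p <= 1 for p in p_XY.values())
assert abs(sum(p_XY.values()) - 1.0) < 1e-12

def marginal(p_joint, axis):
    """Sum out the other coordinate (law of total probability)."""
    out = {}
    for (x, y), p in p_joint.items():
        k = x if axis == 0 else y
        out[k] = out.get(k, 0.0) + p
    return out

p_X = marginal(p_XY, 0)  # ≈ {0: 0.3, 1: 0.7}
p_Y = marginal(p_XY, 1)  # ≈ {0: 0.4, 1: 0.6}
```

The same idea applies in the continuous case with numerical integration in place of the sums.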

2 Independence and Expectation

2.1 Independence

Independence of Random Variables

Two random variables X and Y are independent iff, ∀B_X, B_Y ⊆ ℝ,

    P_XY(B_X, B_Y) = P_X(B_X) P_Y(B_Y).

More specifically, two discrete random variables X and Y are independent iff

    p_XY(x, y) = p_X(x) p_Y(y), ∀x, y ∈ ℝ;

and two continuous random variables X and Y are independent iff

    f_XY(x, y) = f_X(x) f_Y(y), ∀x, y ∈ ℝ.

Conditional Distributions

For two r.v.s X, Y we define the conditional probability distribution P_{Y|X} by

    P_{Y|X}(B_Y | B_X) = P_XY(B_X, B_Y) / P_X(B_X), B_X, B_Y ⊆ ℝ, P_X(B_X) ≠ 0.

This is the revised probability of Y falling inside B_Y given that we now know X ∈ B_X. Then we have

    X and Y are independent ⟺ P_{Y|X}(B_Y | B_X) = P_Y(B_Y), ∀B_X, B_Y ⊆ ℝ.

For discrete r.v.s X, Y we define the conditional probability mass function p_{Y|X} by

    p_{Y|X}(y | x) = p_XY(x, y) / p_X(x), x, y ∈ ℝ, p_X(x) ≠ 0,

and for continuous r.v.s X, Y we define the conditional probability density function f_{Y|X} by

    f_{Y|X}(y | x) = f_XY(x, y) / f_X(x), x, y ∈ ℝ, f_X(x) ≠ 0.

[Justification:

    P(Y ≤ y | X ∈ [x, x + dx)) = P_XY([x, x + dx), (−∞, y]) / P_X([x, x + dx))
                               = {F(x + dx, y) − F(x, y)} / {F(x + dx) − F(x)}
                               = [{F(x + dx, y) − F(x, y)}/dx] / [{F(x + dx) − F(x)}/dx].

Differentiating with respect to y,

    f(y | X ∈ [x, x + dx)) = [∂/∂y {F(x + dx, y) − F(x, y)}/dx] / [{F(x + dx) − F(x)}/dx],

so

    f(y | x) = lim_{dx→0} f(y | X ∈ [x, x + dx)) = f(x, y) / f(x).]

In either case, X and Y are independent ⟺ p_{Y|X}(y | x) = p_Y(y) or f_{Y|X}(y | x) = f_Y(y), ∀x, y ∈ ℝ.
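For a discrete pair, both the factorisation test for independence and the conditional pmf are a few lines of code. This sketch (not from the notes; the probability table is a hypothetical one chosen to factorise) checks p_XY(x, y) = p_X(x) p_Y(y) over all cells and computes p_{Y|X}(y | x):

```python
# Hypothetical joint pmf on {0, 1} x {0, 1}.
p_XY = {(0, 0): 0.3, (0, 1): 0.3, (1, 0): 0.2, (1, 1): 0.2}

# Marginals by summing out the other variable.
p_X = {x: sum(p for (a, b), p in p_XY.items() if a == x) for x in (0, 1)}
p_Y = {y: sum(p for (a, b), p in p_XY.items() if b == y) for y in (0, 1)}

def p_Y_given_X(y, x):
    """p_{Y|X}(y | x) = p_XY(x, y) / p_X(x), defined when p_X(x) != 0."""
    return p_XY[(x, y)] / p_X[x]

# Independence <=> the joint pmf equals the product of the marginals everywhere.
independent = all(abs(p_XY[(x, y)] - p_X[x] * p_Y[y]) < 1e-12
                  for (x, y) in p_XY)
print(independent)                  # this table does factorise
print(p_Y_given_X(1, 0), p_Y[1])    # equal, as independence requires
```

Changing any one cell of the table (while renormalising) breaks the factorisation, and the conditional pmf then differs from the marginal.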

2.2 Expectation

E{g(X, Y)}

Suppose we have a bivariate function of interest of the random variables X and Y, g: ℝ × ℝ → ℝ.

If X and Y are discrete, we define E{g(X, Y)} by

    E_XY{g(X, Y)} = Σ_x Σ_y g(x, y) p_XY(x, y).

If X and Y are jointly continuous, we define E{g(X, Y)} by

    E_XY{g(X, Y)} = ∫_{x=−∞}^{∞} ∫_{y=−∞}^{∞} g(x, y) f_XY(x, y) dx dy.

Immediately from these definitions we have the following:

If g(X, Y) = g₁(X) + g₂(Y), then

    E_XY{g₁(X) + g₂(Y)} = E_X{g₁(X)} + E_Y{g₂(Y)}.

If g(X, Y) = g₁(X) g₂(Y) and X and Y are independent, then

    E_XY{g₁(X) g₂(Y)} = E_X{g₁(X)} E_Y{g₂(Y)}.

In particular, considering g(X, Y) = XY for independent X, Y we have E_XY(XY) = E_X(X) E_Y(Y).

2.3 Conditional Expectation

E_{Y|X}(Y | X = x)

In general, E_XY(XY) ≠ E_X(X) E_Y(Y).

Suppose X and Y are discrete r.v.s with joint pmf p(x, y). If we are given the value x of the r.v. X, our revised pmf for Y is the conditional pmf p(y | x), for y ∈ ℝ. The conditional expectation of Y given X = x is therefore

    E_{Y|X}(Y | X = x) = Σ_y y p(y | x).

Similarly, if X and Y were continuous,

    E_{Y|X}(Y | X = x) = ∫_{y=−∞}^{∞} y f(y | x) dy.

In either case, the conditional expectation is a function of x but not the unknown Y.
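The discrete expectation formula, the product rule under independence, and the conditional expectation can all be checked on a small table. This sketch (not from the notes; the marginals are hypothetical, and the joint pmf is built as their product so that X and Y are independent by construction):

```python
# Hypothetical marginals; the joint pmf is their product, so X ⫫ Y.
p_X = {1: 0.5, 2: 0.5}
p_Y = {0: 0.25, 10: 0.75}
p_XY = {(x, y): p_X[x] * p_Y[y] for x in p_X for y in p_Y}

def E(g, p_joint):
    """E{g(X, Y)} = sum_x sum_y g(x, y) p_XY(x, y)."""
    return sum(g(x, y) * p for (x, y), p in p_joint.items())

EX  = E(lambda x, y: x, p_XY)      # 1.5
EY  = E(lambda x, y: y, p_XY)      # 7.5
EXY = E(lambda x, y: x * y, p_XY)  # equals EX * EY, by independence

def E_Y_given_X(x):
    """E(Y | X = x) = sum_y y p(y | x), a function of x alone."""
    px = sum(p for (a, _), p in p_XY.items() if a == x)
    return sum(y * p for (a, y), p in p_XY.items() if a == x) / px
```

Here E(Y | X = x) coincides with E(Y) for every x, which is exactly what independence predicts; for a dependent table it would genuinely vary with x.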

2.4 Covariance and Correlation

Covariance

For a single variable X we considered the expectation of g(X) = (X − µ_X)(X − µ_X), called the variance and denoted σ²_X. The bivariate extension of this is the expectation of g(X, Y) = (X − µ_X)(Y − µ_Y). We define the covariance of X and Y by

    σ_XY = Cov(X, Y) = E_XY[(X − µ_X)(Y − µ_Y)].

Correlation

Covariance measures how the random variables move in tandem with one another, and so is closely related to the idea of correlation. We define the correlation of X and Y by

    ρ_XY = Cor(X, Y) = σ_XY / (σ_X σ_Y).

Unlike the covariance, the correlation is invariant to the scale of the r.v.s X and Y. It is easily shown that if X and Y are independent random variables, then σ_XY = ρ_XY = 0.

3 Examples

Example 1

Suppose that the lifetime, X, and brightness, Y, of a light bulb are modelled as continuous random variables. Let their joint pdf be given by

    f(x, y) = λ₁ λ₂ e^{−λ₁x − λ₂y}, x, y > 0.

Are lifetime and brightness independent?

The marginal pdf for X is

    f(x) = ∫_{y=0}^{∞} f(x, y) dy = ∫_{y=0}^{∞} λ₁ λ₂ e^{−λ₁x − λ₂y} dy = λ₁ e^{−λ₁x}.

Similarly f(y) = λ₂ e^{−λ₂y}. Hence f(x, y) = f(x) f(y) and X and Y are independent.

Example 2

Suppose continuous r.v.s (X, Y) ∈ ℝ² have joint pdf

    f(x, y) = 1/π  if x² + y² ≤ 1,
              0    otherwise.

Determine the marginal pdfs for X and Y.

Well, x² + y² ≤ 1 ⟺ |y| ≤ √(1 − x²). So

    f(x) = ∫_{y=−√(1−x²)}^{√(1−x²)} (1/π) dy = (2/π) √(1 − x²).

Similarly f(y) = (2/π) √(1 − y²). Hence f(x, y) ≠ f(x) f(y), and X and Y are not independent.
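The covariance and correlation definitions are easy to evaluate for a discrete pair. This sketch (not from the notes; the 2×2 table is a hypothetical one in which Y tends to move with X) computes σ_XY and ρ_XY directly from the definitions in Section 2.4:

```python
import math

# Hypothetical joint pmf: mass concentrated on (0,0) and (1,1),
# so X and Y tend to move together.
p_XY = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

def E(g):
    """E{g(X, Y)} under the joint pmf above."""
    return sum(g(x, y) * p for (x, y), p in p_XY.items())

mu_X, mu_Y = E(lambda x, y: x), E(lambda x, y: y)
cov = E(lambda x, y: (x - mu_X) * (y - mu_Y))            # sigma_XY
sd_X = math.sqrt(E(lambda x, y: (x - mu_X) ** 2))        # sigma_X
sd_Y = math.sqrt(E(lambda x, y: (y - mu_Y) ** 2))        # sigma_Y
rho = cov / (sd_X * sd_Y)                                # rho_XY

print(cov, rho)  # ≈ 0.15 and ≈ 0.6 for this table
```

Replacing the table with a product of its marginals, as in Example 1's factorising pdf, drives both σ_XY and ρ_XY to zero, illustrating the independence result stated above.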


More information

AB2.14: Heat Equation: Solution by Fourier Series

AB2.14: Heat Equation: Solution by Fourier Series AB2.14: Heat Equation: Solution by Fourier Series Consider the boundary value problem for the one-dimensional heat equation describing the temperature variation in a bar with the zero-temperature ends:

More information

3 Multiple Discrete Random Variables

3 Multiple Discrete Random Variables 3 Multiple Discrete Random Variables 3.1 Joint densities Suppose we have a probability space (Ω, F,P) and now we have two discrete random variables X and Y on it. They have probability mass functions f

More information

Statistical Inference

Statistical Inference Statistical Inference Idea: Estimate parameters of the population distribution using data. How: Use the sampling distribution of sample statistics and methods based on what would happen if we used this

More information

Introduction to Probability

Introduction to Probability Introduction to Probability EE 179, Lecture 15, Handout #24 Probability theory gives a mathematical characterization for experiments with random outcomes. coin toss life of lightbulb binary data sequence

More information

Lecture 1: Entropy and mutual information

Lecture 1: Entropy and mutual information Lecture 1: Entropy and mutual information 1 Introduction Imagine two people Alice and Bob living in Toronto and Boston respectively. Alice (Toronto) goes jogging whenever it is not snowing heavily. Bob

More information

Exercises with solutions (1)

Exercises with solutions (1) Exercises with solutions (). Investigate the relationship between independence and correlation. (a) Two random variables X and Y are said to be correlated if and only if their covariance C XY is not equal

More information

Lecture 8. Confidence intervals and the central limit theorem

Lecture 8. Confidence intervals and the central limit theorem Lecture 8. Confidence intervals and the central limit theorem Mathematical Statistics and Discrete Mathematics November 25th, 2015 1 / 15 Central limit theorem Let X 1, X 2,... X n be a random sample of

More information

Random variables P(X = 3) = P(X = 3) = 1 8, P(X = 1) = P(X = 1) = 3 8.

Random variables P(X = 3) = P(X = 3) = 1 8, P(X = 1) = P(X = 1) = 3 8. Random variables Remark on Notations 1. When X is a number chosen uniformly from a data set, What I call P(X = k) is called Freq[k, X] in the courseware. 2. When X is a random variable, what I call F ()

More information

Topic 8 The Expected Value

Topic 8 The Expected Value Topic 8 The Expected Value Functions of Random Variables 1 / 12 Outline Names for Eg(X ) Variance and Standard Deviation Independence Covariance and Correlation 2 / 12 Names for Eg(X ) If g(x) = x, then

More information

Lecture 8: Random Walk vs. Brownian Motion, Binomial Model vs. Log-Normal Distribution

Lecture 8: Random Walk vs. Brownian Motion, Binomial Model vs. Log-Normal Distribution Lecture 8: Random Walk vs. Brownian Motion, Binomial Model vs. Log-ormal Distribution October 4, 200 Limiting Distribution of the Scaled Random Walk Recall that we defined a scaled simple random walk last

More information

Multivariable Calculus Practice Midterm 2 Solutions Prof. Fedorchuk

Multivariable Calculus Practice Midterm 2 Solutions Prof. Fedorchuk Multivariable Calculus Practice Midterm Solutions Prof. Fedorchuk. ( points) Let f(x, y, z) xz + e y x. a. (4 pts) Compute the gradient f. b. ( pts) Find the directional derivative D,, f(,, ). c. ( pts)

More information

Quiz 1(1): Experimental Physics Lab-I Solution

Quiz 1(1): Experimental Physics Lab-I Solution Quiz 1(1): Experimental Physics Lab-I Solution 1. Suppose you measure three independent variables as, x = 20.2 ± 1.5, y = 8.5 ± 0.5, θ = (30 ± 2). Compute the following quantity, q = x(y sin θ). Find the

More information

Contents. TTM4155: Teletraffic Theory (Teletrafikkteori) Probability Theory Basics. Yuming Jiang. Basic Concepts Random Variables

Contents. TTM4155: Teletraffic Theory (Teletrafikkteori) Probability Theory Basics. Yuming Jiang. Basic Concepts Random Variables TTM4155: Teletraffic Theory (Teletrafikkteori) Probability Theory Basics Yuming Jiang 1 Some figures taken from the web. Contents Basic Concepts Random Variables Discrete Random Variables Continuous Random

More information

Department of Mathematics, Indian Institute of Technology, Kharagpur Assignment 2-3, Probability and Statistics, March 2015. Due:-March 25, 2015.

Department of Mathematics, Indian Institute of Technology, Kharagpur Assignment 2-3, Probability and Statistics, March 2015. Due:-March 25, 2015. Department of Mathematics, Indian Institute of Technology, Kharagpur Assignment -3, Probability and Statistics, March 05. Due:-March 5, 05.. Show that the function 0 for x < x+ F (x) = 4 for x < for x

More information

Lesson 3. Conditional Probability Review

Lesson 3. Conditional Probability Review SA402 Dynamic and Stochastic Models Fall 2016 Assoc. Prof. Nelson Uhan Lesson 3. Conditional Probability Review Course standards covered in this lesson: B3 Joint probabilities, B4 Conditional probabilities,

More information

Statistics 100A Homework 7 Solutions

Statistics 100A Homework 7 Solutions Chapter 6 Statistics A Homework 7 Solutions Ryan Rosario. A television store owner figures that 45 percent of the customers entering his store will purchase an ordinary television set, 5 percent will purchase

More information

Lecture 10: Other Continuous Distributions and Probability Plots

Lecture 10: Other Continuous Distributions and Probability Plots Lecture 10: Other Continuous Distributions and Probability Plots Devore: Section 4.4-4.6 Page 1 Gamma Distribution Gamma function is a natural extension of the factorial For any α > 0, Γ(α) = 0 x α 1 e

More information

For a partition B 1,..., B n, where B i B j = for i. A = (A B 1 ) (A B 2 ),..., (A B n ) and thus. P (A) = P (A B i ) = P (A B i )P (B i )

For a partition B 1,..., B n, where B i B j = for i. A = (A B 1 ) (A B 2 ),..., (A B n ) and thus. P (A) = P (A B i ) = P (A B i )P (B i ) Probability Review 15.075 Cynthia Rudin A probability space, defined by Kolmogorov (1903-1987) consists of: A set of outcomes S, e.g., for the roll of a die, S = {1, 2, 3, 4, 5, 6}, 1 1 2 1 6 for the roll

More information

Worked examples Random Processes

Worked examples Random Processes Worked examples Random Processes Example 1 Consider patients coming to a doctor s office at random points in time. Let X n denote the time (in hrs) that the n th patient has to wait before being admitted

More information

ECE302 Spring 2006 HW5 Solutions February 21, 2006 1

ECE302 Spring 2006 HW5 Solutions February 21, 2006 1 ECE3 Spring 6 HW5 Solutions February 1, 6 1 Solutions to HW5 Note: Most of these solutions were generated by R. D. Yates and D. J. Goodman, the authors of our textbook. I have added comments in italics

More information

Data Modeling & Analysis Techniques. Probability & Statistics. Manfred Huber 2011 1

Data Modeling & Analysis Techniques. Probability & Statistics. Manfred Huber 2011 1 Data Modeling & Analysis Techniques Probability & Statistics Manfred Huber 2011 1 Probability and Statistics Probability and statistics are often used interchangeably but are different, related fields

More information

P (x) 0. Discrete random variables Expected value. The expected value, mean or average of a random variable x is: xp (x) = v i P (v i )

P (x) 0. Discrete random variables Expected value. The expected value, mean or average of a random variable x is: xp (x) = v i P (v i ) Discrete random variables Probability mass function Given a discrete random variable X taking values in X = {v 1,..., v m }, its probability mass function P : X [0, 1] is defined as: P (v i ) = Pr[X =

More information